Process Mining Loader is a webMethods.io Integration accelerator that lets users upload data into ARIS Process Mining. It supports both single-upload and batch-upload scenarios: it creates the table definition if it does not already exist and then uploads the data into ARIS Process Mining.
- webMethods.io Integration tenant
- ARIS Process Mining tenant
Setup ARIS Process Mining
- Log in to your ARIS Process Mining tenant.
- Create a system integration for the data ingestion API. Please refer to section 1.2.1 in the documentation.
- Create a connection in the data set. Please refer to section 1.2.2 in the documentation.
Setup webMethods.io ARIS Process Mining connector
- Log in to the webMethods.io tenant.
- Go to the Projects tab and select an existing project or create a new one.
- Go to the Connectors tab and select the ARIS Process Mining connector under the Predefined Connectors list.
- Configure Add Account for the ARIS Process Mining connector:
  - ARIS Cloud URL: your ARIS Process Mining tenant URL.
  - Project Room: your tenant name.
  - Data Set: name of a data set that exists in ARIS Process Mining.
  - Client ID & Client secret.
- Download the latest release Process_Mining.zip from the Releases section.
- Log in to the webMethods.io Integration tenant.
- Go to Projects and select the project.
- Click Import and choose Process_Mining.zip.
- Choose the ARIS Process Mining connector in the Connect to ARIS Process Mining drop-down.
- Click Import.
This workflow can be executed by invoking its Webhook URL from any HTTP REST client.
- Log in to the webMethods.io Integration tenant.
- Go to Projects, select the project, and open the workflow.
- Hover over the Webhook step and click the Settings icon.
- Copy the Webhook URL.
- Go to your REST client to invoke the workflow.
  - Request URL: the Webhook URL
  - Request Method: POST
  - Content-Type Header: application/json
  - Authentication: Basic
  - Request Payload: example payload below
{
  "RecordHeader": {
    "dataLoadKey": "dataKey001",
    "commit": "true"
  },
  "TableInput": {
    "TableName": "users",
    "Namespace": "demo",
    "ColumnData": [
      {
        "name": "demo",
        "id": 12345,
        "mail": "demo@softwareag.com"
      },
      {
        "name": "demo2",
        "id": 56789,
        "mail": "demo2@softwareag.com"
      }
    ]
  }
}
- dataLoadKey - Unique key set by the user for the table.
- commit - If set to "true", the data is committed in ARIS after the upload. Set to "false" for batch uploads, where the uploaded data is not committed until a final request with commit set to "true".
- TableName - ARIS table name.
- Namespace - ARIS table namespace.
- ColumnData - Array of records to be uploaded into ARIS.
- For datetime fields, transform the date into the format MM/dd/yyyy HH:mm:ss.SSS; only then is the field treated as a datetime field.
- For the initial upload, make sure all fields are present in the first record of ColumnData.
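The webhook call described above can be sketched in Python using only the standard library. The webhook URL and credentials below are placeholders you must replace with your own values; the `aris_datetime` helper is an illustrative name showing the MM/dd/yyyy HH:mm:ss.SSS pattern ARIS expects for datetime fields.

```python
import base64
import json
import urllib.request
from datetime import datetime

WEBHOOK_URL = "https://tenant.int-aws-us.webmethods.io/runflow/run/sync/example"  # placeholder
USERNAME = "user@example.com"  # tenant login username (placeholder)
PASSWORD = "secret"            # tenant login password (placeholder)

def aris_datetime(dt: datetime) -> str:
    """Format a datetime as MM/dd/yyyy HH:mm:ss.SSS, the pattern ARIS
    requires before a field is considered a datetime field."""
    return dt.strftime("%m/%d/%Y %H:%M:%S.") + f"{dt.microsecond // 1000:03d}"

def build_request(records, commit=True):
    """Build the POST request for the Process Mining workflow webhook."""
    payload = {
        "RecordHeader": {
            "dataLoadKey": "dataKey001",
            "commit": str(commit).lower(),  # "false" for intermediate batch uploads
        },
        "TableInput": {
            "TableName": "users",
            "Namespace": "demo",
            "ColumnData": records,
        },
    }
    auth = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
    return urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {auth}"},
        method="POST",
    )

records = [{"name": "demo", "id": 12345, "mail": "demo@softwareag.com",
            "created": aris_datetime(datetime(2023, 1, 5, 9, 30, 0))}]
req = build_request(records, commit=True)
# urllib.request.urlopen(req)  # uncomment to actually invoke the workflow
```

For a batch upload, send all but the last batch with `commit=False` and the final batch with `commit=True` so the data is committed only once everything is uploaded.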
Process Mining Jira Loader is a webMethods.io Integration accelerator that uploads Jira issues and their changes to ARIS Process Mining. It fetches issue and issue-history details from Jira, transforms them into arrays, and then invokes the Process Mining workflow through its Webhook URL to upload the data into ARIS.
- webMethods.io Integration tenant
- Process Mining workflow uploaded
- Jira software
- ARIS Process Mining
Setup webMethods.io Atlassian Jira connector
- Log in to the webMethods.io tenant.
- Go to the Projects tab and select an existing project or create a new one.
- Go to the Connectors tab and select the Atlassian Jira connector under the Predefined Connectors list.
- Configure Add Account for the Atlassian Jira connector:
  - Select Authorization Type as Connection from the drop-down list and click Next.
  - Jira server URL: your Jira server URL.
  - Username: your Jira login username.
  - API key: your Jira API token. Please refer to the "create an API token" section in the documentation.
  - Set Hostname Verifier to org.apache.http.conn.ssl.NoopHostnameVerifier.
  - Set Enable Connection Pooling to false.
  - Set Session management to none.
Setup webMethods.io Hypertext Transfer Protocol connector
- Log in to the webMethods.io tenant.
- Go to the Projects tab and select an existing project or create a new one.
- Go to the Connectors tab and select the Hypertext Transfer Protocol connector under the Predefined Connectors list.
  - URL: the Webhook URL of the Process Mining workflow.
  - Authentication Type: Basic.
  - Username: your tenant login username.
  - Password: your tenant login password.
- Install flow services
- Download the latest release sendJiraIssueDetails.zip from the Releases section. This zip file contains all the required flow services.
- Log in to the webMethods.io Integration tenant.
- Go to Projects and select the project.
- Click Import and choose sendJiraIssueDetails.zip.
- Choose the Atlassian Jira and Hypertext Transfer Protocol connectors in the respective drop-downs.
- Click Import.
- sendJiraIssueDetails - This flow service fetches the issue details from Jira, invokes the issueMapping flow service for the mapping transformation, and then invokes the sendDataToARIS flow service to upload the data into ARIS.
- createTableWithSampleData - This flow service loads sample data into ARIS before fetching the Jira details, so that the table structure is created with all the required fields. Its execution is controlled by the project parameter "loadSampleData".
- issueMapping - This flow service transforms the Jira issue details into arrays.
- sendDataToARIS - This flow service invokes the Process Mining workflow over HTTP to upload the data into ARIS.
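The transformation performed by issueMapping can be illustrated with a short Python sketch: one row for the issues table and one row per changelog item for the changes table. The output field names here are assumptions for illustration, not the exact mapping used by the flow service; the input shape follows the Jira issue-search response with the changelog expanded.

```python
def map_issue(issue):
    """Flatten one Jira issue (fetched with expand=changelog) into an
    issues-table row plus a list of changes-table rows.
    Output column names are illustrative assumptions."""
    fields = issue.get("fields", {})
    issue_row = {
        "key": issue["key"],
        "summary": fields.get("summary"),
        "status": fields.get("status", {}).get("name"),
        "created": fields.get("created"),
    }
    change_rows = []
    # Each changelog history can contain several changed items.
    for history in issue.get("changelog", {}).get("histories", []):
        for item in history.get("items", []):
            change_rows.append({
                "key": issue["key"],
                "changed_at": history.get("created"),
                "field": item.get("field"),
                "from": item.get("fromString"),
                "to": item.get("toString"),
            })
    return issue_row, change_rows
```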
- Install workflow
- Download the latest release Send_Issue_Details_to_ARIS.zip from the Releases section.
- Log in to the webMethods.io Integration tenant.
- Go to Projects and select the project.
- Click Import and choose Send_Issue_Details_to_ARIS.zip.
- Click Import.
- Send Issue Details to ARIS - This is the main workflow that invokes the sendJiraIssueDetails flow service with the configured project parameters.
- Configure Project Parameters
Go to Configurations > Workflow > Parameter and update the project parameters:
- jql - JQL query to filter the issue data.
- expand - Includes additional fields in the results. For this use case, "changelog" must be passed as the value so that the Jira issue history is included while fetching the data.
- tableNamespace - Name of the table namespace.
- issueTableName - Table name for the issues table.
- changesTableName - Table name for the changes table.
- dataLoadKey_changes - Unique key for loading data into the changes table.
- dataLoadKey_issue - Unique key for loading data into the issues table.
- batchRecordSize - Number of records per batch upload.
- loadSampleData - Whether to invoke the createTableWithSampleData flow service. If set to "true", the flow service is invoked during workflow execution.
- changesSample - Sample data with all the fields for the changes table.
- issueSample - Sample data with all the fields for the issues table.
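To show how the jql, expand, and batchRecordSize parameters map onto the Jira issue-search API, here is a minimal sketch using the Jira REST API v2 search endpoint and the standard library. The server URL and credentials are placeholders; the flow service's actual request may differ in detail.

```python
import base64
import urllib.parse
import urllib.request

JIRA_URL = "https://your-domain.atlassian.net"  # placeholder
USER = "user@example.com"                        # Jira username (placeholder)
API_TOKEN = "api-token"                          # Jira API token (placeholder)

def build_search_request(jql, start_at=0, max_results=50):
    """Build a Jira issue-search request with the changelog expanded,
    so issue history comes back alongside the issue fields."""
    query = urllib.parse.urlencode({
        "jql": jql,                # e.g. 'project = DEMO ORDER BY created'
        "expand": "changelog",     # includes the issue history in the results
        "startAt": start_at,       # pagination offset for batched fetches
        "maxResults": max_results, # page size, analogous to batchRecordSize
    })
    auth = base64.b64encode(f"{USER}:{API_TOKEN}".encode()).decode()
    return urllib.request.Request(
        f"{JIRA_URL}/rest/api/2/search?{query}",
        headers={"Authorization": f"Basic {auth}"},
    )

req = build_search_request("project = DEMO")
# urllib.request.urlopen(req)  # uncomment to actually query Jira
```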
This workflow can be executed by clicking the Run workflow button.
These tools are provided as-is and without warranty or support. They do not constitute part of the Software AG product suite. Users are free to use, fork and modify them, subject to the license agreement. While Software AG welcomes contributions, we cannot guarantee to include every contribution in the master project.