Initiates a batch job that manages the processing of multiple operations in a workflow.
All operations are defined in the body of the request package. These operations are known as tasks within the workflow. Each task is defined as a JSON object in the request package and processed as a separate job by the workflow engine.
The following operations can be defined as tasks in a batch job: Create Export Job, Create Geohaz Job, Create Grouping Job, Create Model Job, Recalculate with PATE, Create Risk Data Report, Create Bulk Edit Job.
Careful ordering of the tasks in a request package enables you to define end-to-end workflows.
User-defined Workflow
A user-defined workflow is a workflow that consists of a series of tasks that are processed in the specified order.
For example, a batch job can define a workflow that consists of multiple tasks that perform the following operations in sequence: 1) create new account exposures, 2) generate an exposure summary report for those accounts, 3) geocode and hazard those accounts, 4) export location exposure data for those accounts.
For each task in the workflow, the batch job initiates a separate job to process that request. Each job in the workflow must be completed before the next task in the workflow can be started. The batch job manages the polling of jobs and the passing of operation outputs (e.g. new object IDs) between tasks using task path variables.
{
  "name": "My Custom Workflow",
  "settings": {
    "combine-batch-edit-and-geohaz": false
  },
  "tasks": [
    {
      "label": "BulkAccounts",
      "operationUri": "/platform/riskdata/v1/exposures/6667/bulk-edit",
      "dependsOn": [],
      "continueOnFailure": false,
      "requestBody": {
        // bulk-edit request
      }
    },
    {
      "label": "SummaryReport",
      "operationUri": "/platform/riskdata/v1/reports",
      "dependsOn": ["BulkAccounts"],
      "continueOnFailure": true,
      "requestBody": {
        // reports request
      }
    },
    {
      "label": "GeohazAccounts",
      "operationUri": "/platform/geohaz/v1/jobs",
      "dependsOn": ["BulkAccounts"],
      "continueOnFailure": false,
      "requestBody": {
        // geohaz request
      }
    },
    {
      "label": "ModelAccount",
      "operationUri": "/platform/model/v1/jobs",
      "dependsOn": ["GeohazAccounts"],
      "continueOnFailure": true,
      "requestBody": {
        // cat modeling request
      }
    },
    {
      "label": "ExportLocationResults",
      "operationUri": "/platform/export/v1/jobs",
      "dependsOn": ["ModelAccount"],
      "continueOnFailure": true,
      "requestBody": {
        // export locations request
      }
    }
  ]
}
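The dependency-driven ordering in the example above amounts to a topological sort over the `dependsOn` graph. The sketch below is an illustrative client-side view of that ordering only; the workflow engine performs this scheduling on the server, and the function shown here is not part of the API.

```python
from collections import deque

def execution_order(tasks):
    """Return task labels in an order that respects dependsOn edges."""
    # Map each label to the set of labels it still waits on.
    remaining = {t["label"]: set(t.get("dependsOn", [])) for t in tasks}
    order = []
    ready = deque(label for label, deps in remaining.items() if not deps)
    while ready:
        label = ready.popleft()
        order.append(label)
        # Unblock any task that was waiting on the completed task.
        for other, deps in remaining.items():
            if label in deps:
                deps.remove(label)
                if not deps and other not in order and other not in ready:
                    ready.append(other)
    if len(order) != len(remaining):
        raise ValueError("Cycle or unknown label in dependsOn")
    return order

tasks = [
    {"label": "BulkAccounts", "dependsOn": []},
    {"label": "SummaryReport", "dependsOn": ["BulkAccounts"]},
    {"label": "GeohazAccounts", "dependsOn": ["BulkAccounts"]},
    {"label": "ModelAccount", "dependsOn": ["GeohazAccounts"]},
    {"label": "ExportLocationResults", "dependsOn": ["ModelAccount"]},
]
print(execution_order(tasks))
```

Note that `SummaryReport` and `GeohazAccounts` both depend only on `BulkAccounts`, so neither is required to wait for the other.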
Every task is defined by a `label`, `dependsOn`, `operationUri`, and `requestBody` parameter.

- The `label` parameter uniquely identifies a task within the workflow.
- The `dependsOn` parameter specifies an array of tasks that must be completed before this task can start. Tasks in the array are identified by `label`.
- The `operationUri` parameter specifies the resource URI of the operation performed by the task. If the operation path includes a variable (e.g. a resource ID), you may define the resource URI using a task path variable in place of the unknown value.
- The `requestBody` parameter defines the body of the operation request. If the request body needs to specify the ID of a resource created by a previous task, you may specify that ID using a task path variable.

The contents of a request body vary depending on the operation performed. To understand the `requestBody` requirements of a task operation, see the reference documentation for the corresponding operation:
| `operationUri` | Operation Documentation |
|---|---|
| `/platform/export/v1/jobs` | Create Export Job |
| `/platform/geohaz/v1/jobs` | Create Geohaz Job |
| `/platform/grouping/v1/jobs` | Create Grouping Job |
| `/platform/model/v1/jobs` | Create Model Job |
| `/platform/riskdata/v1/analyses/{analysisId}/pate` | Recalculate with PATE |
| `/platform/riskdata/v1/reports` | Create Risk Data Report |
| `/platform/riskdata/v1/exposures/{exposureId}/bulk-edit` | Create Bulk Edit Job |
Note

This operation supports the processing of a large number of exposures in a single request. Consequently, the request package may be quite large. The maximum supported payload is 5 MB. Request packages that define up to 1000 exposures may be submitted in JSON format:
Content-Type: application/json
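A client can check a request package against the 5 MB payload limit before submitting it. The limit below comes from the note above; the validation helper itself is a hypothetical client-side convenience, not part of the API.

```python
import json

MAX_PAYLOAD_BYTES = 5 * 1024 * 1024  # 5 MB limit from the note above

def validate_request_package(package: dict) -> None:
    """Raise ValueError if the serialized package exceeds the payload limit."""
    size = len(json.dumps(package).encode("utf-8"))
    if size > MAX_PAYLOAD_BYTES:
        raise ValueError(
            f"Request package is {size} bytes; "
            f"maximum supported payload is {MAX_PAYLOAD_BYTES} bytes"
        )

package = {"name": "My Custom Workflow", "tasks": []}
validate_request_package(package)  # a small package passes silently
```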
Task path variables
Batch jobs make heavy use of task path variables to identify and retrieve values (e.g. the IDs of exposures) that are created in the course of the workflow. Task path variables are the mechanism that enables you to create, geocode, and model an account in a single request.

A task path variable is a JSON query string that selects a new API resource (a resource with an unknown ID) based on a known value, e.g. the `label` applied to the task that created that resource.

For example, the `requestBody` of the `GeohazAccounts` task can use the `{{$.BulkAccounts.output.accounts.[?(@.label == 'Account1')].id}}` task path variable to select an account for geocoding and hazarding. The ID of this account is unknown, but the account is known to have been created by an earlier `BulkAccounts` task.
"requestBody": {
  "resourceType": "account",
  "settings": {
    "layers": [
      // Layers
    ]
  },
  "resourceUri": "/platform/riskdata/v1/exposures/{{exposure_id}}/accounts/{{$.BulkAccounts.output.accounts.[?(@.label == 'DemoAccount1')].id}}"
}
In this example, the `requestBody` of the `GeohazAccounts` task uses a task path variable to identify the account exposure that it needs to geohazard. This account cannot be identified by its ID number because it did not exist at the time the batch job was submitted; it was created in the course of the workflow by the `BulkAccounts` task.

During processing, the workflow engine "expands" the task path variable to retrieve the unique ID of the account using the path (`label.output.id`). The task path variable is a simple JSON query string enclosed in double curly braces.
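The expansion step can be illustrated with a toy resolver that matches the variable against the recorded outputs of completed tasks. This is purely illustrative: the shape of `task_outputs` and the `expand` helper are assumptions for the sketch, not the engine's documented behavior.

```python
import re

# Recorded outputs of completed tasks, keyed by task label (illustrative shape).
task_outputs = {
    "BulkAccounts": {
        "output": {
            "accounts": [
                {"label": "DemoAccount1", "id": 101},
                {"label": "DemoAccount2", "id": 102},
            ]
        }
    }
}

def expand(variable: str) -> str:
    """Expand a {{$.<label>.output.accounts.[?(@.label == '...')].id}} variable."""
    m = re.fullmatch(
        r"\{\{\$\.(\w+)\.output\.accounts\.\[\?\(@\.label == '([^']+)'\)\]\.id\}\}",
        variable,
    )
    if not m:
        raise ValueError(f"Unsupported task path variable: {variable}")
    task_label, account_label = m.groups()
    accounts = task_outputs[task_label]["output"]["accounts"]
    match = next(a for a in accounts if a["label"] == account_label)
    return str(match["id"])

print(expand("{{$.BulkAccounts.output.accounts.[?(@.label == 'DemoAccount1')].id}}"))
# prints 101
```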
Workflow optimization
If the `combine-batch-edit-and-geohaz` parameter is `true`, consecutive `bulk-edit` and `geohaz` tasks are combined and processed together as a single job, which can greatly reduce the runtime of a batch job.

In the following example, the `dependsOn` parameter of the `GeohazAccount` task identifies the `BatchAccount` task as its immediate predecessor. Consequently, the `bulk-edit` and `geohaz` operations defined by these tasks are processed as a single job:
{
  "name": "Optimization Example",
  "settings": {
    "combine-batch-edit-and-geohaz": true,
    "disable-underwriter-reports": true
  },
  "tasks": [
    {
      "label": "BatchAccount",
      "operationUri": "/platform/riskdata/v1/exposures/{exposureId}/bulk-edit",
      "dependsOn": [],
      "continueOnFailure": false,
      "requestBody": {
        // bulk-edit request
      }
    },
    {
      "label": "GeohazAccount",
      "operationUri": "/platform/geohaz/v1/jobs",
      "dependsOn": ["BatchAccount"],
      "continueOnFailure": false,
      "requestBody": {
        // geohaz request
      }
    },
    {
      "label": "ModelAccount",
      "operationUri": "/platform/model/v1/jobs",
      "dependsOn": ["GeohazAccount"],
      "continueOnFailure": false,
      "requestBody": {
        // cat modeling request
      }
    }
  ]
}
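The eligibility rule above (a `geohaz` task whose sole `dependsOn` entry is a `bulk-edit` task) can be sketched as a client-side check. The detection helper below is hypothetical; only the combine behavior itself is documented.

```python
def combinable_pairs(tasks):
    """Find (bulk-edit label, geohaz label) pairs eligible for combination."""
    by_label = {t["label"]: t for t in tasks}
    pairs = []
    for t in tasks:
        # A geohaz task qualifies when its only dependency is a bulk-edit task.
        if "/geohaz/" in t["operationUri"] and len(t.get("dependsOn", [])) == 1:
            parent = by_label.get(t["dependsOn"][0])
            if parent and parent["operationUri"].endswith("/bulk-edit"):
                pairs.append((parent["label"], t["label"]))
    return pairs

tasks = [
    {"label": "BatchAccount",
     "operationUri": "/platform/riskdata/v1/exposures/123/bulk-edit",
     "dependsOn": []},
    {"label": "GeohazAccount",
     "operationUri": "/platform/geohaz/v1/jobs",
     "dependsOn": ["BatchAccount"]},
]
print(combinable_pairs(tasks))  # [('BatchAccount', 'GeohazAccount')]
```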
UnderwriteIQ reports
This operation can generate two types of reports that support underwriting workflows:

- An exposure summary report (EXPOSURE_SUMMARY) shows a peril-specific overview of an exposure's total insurable value (TIV). This report is available for accounts, portfolios, and analyses. To learn more, see Create Risk Data Report.
- A report view is a collection of exposure-specific summary reports that lists metrics and statistics. To learn more about UnderwriteIQ report views, see Search Report Views.

This operation automatically generates a report view whenever it is used to create exposures in batch, i.e. whenever it includes a `bulk-edit` task. If the `disable-underwriter-reports` parameter is `true`, this operation does not automatically generate UnderwriteIQ reports.