UnderwriteIQ Batch Workflows

Understand Batch Workflows for UnderwriteIQ

Overview

Batch processing is a method of processing large volumes of data in batches, in an automated and unattended way. Batch processing enables UnderwriteIQ tenants to streamline their underwriting processes by defining end-to-end workflows that manage large volumes of data.

"A batch job is a routine that executes a program with little to no user interaction, often processing a large volume of data at once."

The Batch API enables UnderwriteIQ tenants to define and run customized batch jobs that implement end-to-end underwriting workflows. By defining data processing pipelines as user-defined workflows, risk management organizations can manage complex underwriting processes with a single request. For example, a single batch job can create accounts, geocode and look up hazard data for account locations, model account exposures, export location data, and generate an exposure summary report.

A user-defined workflow is a mechanism that enables you to package and manage multiple jobs as a single request. Each job is submitted and processed by the workflow engine separately and in the order specified in the batch job.

Batch jobs facilitate the management of end-to-end processes that span multiple underwriting operations by enabling underwriting clients to define automated, repeatable data processing pipelines. The custom workflow specifies the order of operations, manages all messaging, and passes the output of one task to the next task in the batch job. Clients can define and submit end-to-end workflows with a single API request and do not need to poll the platform for the status of each task.

Batch Job Requests

A user-defined workflow is defined in the body of a Create Batch Job request. All of the workflow's operations are defined in the body of the request package.

A Create Batch Job operation defines and initiates a batch job that manages the processing of multiple operations in a workflow.

The workflow consists of multiple operations that are identified as tasks within the batch job. Each task is defined as a JSON object in the request package and processed as a separate job by the workflow engine.

The following example is an outline of a Create Batch Job request object that defines an end-to-end underwriting workflow. The batch job automates the following operations: 1) creates a large number of accounts in bulk, 2) geocodes and looks up hazard data for account locations, 3) models account exposures, and 4) creates an exposure summary report for the new accounts.

Each operation is defined and managed as a task in the body of the request.

{
  "name": "Bulk Edit and Model Account Exposures",
  "settings": {
    "combine-batch-edit-and-geohaz": true,
    "disable-underwriter-reports": false
  },
  "tasks": [
    {
      // Step 1: Creates accounts in bulk
      "label": "BulkAccounts",
      "operationUri": "/platform/riskdata/v1/exposures/6667/bulk-edit",
      "dependsOn": [],
      "continueOnFailure": false,
      "requestBody": {
        // Bulk-Edit request that creates multiple account exposures
      }
    },
    {
      // Step 2: Geocodes and looks up hazard data for account locations
      "label": "GeohazAccounts",
      "operationUri": "/platform/geohaz/v1/jobs",
      "continueOnFailure": false,
      "dependsOn": ["BulkAccounts"],
      "requestBody": {
        // Geohaz request that geocodes and looks up hazard data.
      }
    },
    {
// Step 3: Cat models accounts
      "label": "ModelAccount",
      "operationUri": "/platform/model/v1/jobs",
      "continueOnFailure": true,
      "dependsOn": ["GeohazAccounts"],
      "requestBody": {
        // Models geocoded accounts
      }
    },
    {
// Step 4: Creates an exposure summary report
      "label": "RiskDataReport",
      "operationUri": "/platform/riskdata/v1/reports",
      "continueOnFailure": true,
      "dependsOn": ["ModelAccount"],
      "requestBody": {
        // Creates an exposure summary report that shows TIV for new accounts
      }
    }
  ]
}

Every task is defined by a label, dependsOn, operationUri, and requestBody parameter. In the example, the requestBody of each task object has been omitted for simplicity.

Parameter | Type | Description
label | String | Name of the task. Must be unique within the request.
operationUri | String | Resource URI of the operation performed by the task. May be defined as a task path variable.
dependsOn | Array | List of tasks that must precede the task. Tasks are identified by label in the array.
requestBody | Object | Body of the task. Matches the request body of the corresponding operation.
continueOnFailure | Boolean | If true, the batch job continues if the task fails. By default, false.
skipMissingVariables | Boolean | If true, the task is processed even if a task path variable is missing. By default, false.

The example does not display the content of the requestBody parameter, which defines the operation. The contents of a request body vary depending on the operation performed. For example, if you define a geohaz task, the requestBody must specify a request that is valid for a Create Geohaz Job operation. To understand the requestBody requirements for a task operation, see the reference documentation for the corresponding operation.

The following operations can be defined as tasks in a batch job: Create Export Job, Create Geohaz Job, Create Grouping Job, Create Model Job, Calculate Marginal Impact, Recalculate with PATE, Convert Event Rates and Losses, Create Risk Data Report, and Create Bulk Edit Job.

Supported Operations

The operationUri parameter specifies the operation run by each task. Tasks support the following operations:

Operation | Operation URI | Description
Create Export Job | /platform/export/v1/jobs | Exports exposure or result data in the specified format.
Create Geohaz Job | /platform/geohaz/v1/jobs | Geocodes or looks up hazard data for exposures.
Create Grouping Job | /platform/grouping/v1/jobs | Creates an analysis group, a collection of analysis results.
Create Model Job | /platform/model/v1/jobs | Cat models exposures using the ALM, DLM, or HD model.
Calculate Marginal Impact | /platform/riskdata/v1/analyses/{id}/marginal-impact | Calculates a marginal impact analysis that measures the effect of adding accounts to an existing portfolio as differential losses.
Recalculate with PATE | /platform/riskdata/v1/analyses/{id}/marginal-impact | Recalculates metrics and statistics in an existing analysis or analysis group based on treaty term adjustments.
Convert Event Rates and Losses | /platform/riskdata/v1/analyses/{id}/convert-event-rate-loss | Calculates a new analysis based on an existing portfolio-level EP result by applying event rate or event loss adjustments.
Create Risk Data Report | /platform/reports/v1/jobs | Creates a risk data report, an exportable summary of exposure or analysis data.
Create Bulk Edit Job | /platform/riskdata/v1/exposures/{id}/bulk-edit | Creates, updates, or deletes multiple resources (portfolios, accounts, locations, policies, or treaties) in the specified EDM.

Settings

The settings object accepts two optional parameters:

Parameter | Type | Description
combine-batch-edit-and-geohaz | Boolean | If true, the job supports workflow optimization. By default, false.
disable-underwriter-reports | Boolean | If true, suppresses generation of underwriting reports. By default, false.

Tasks

All operations are defined in the body of the request package. These operations are known as tasks within the workflow. Each task is defined as a JSON object in the request package and processed as a separate job by the workflow engine.

For example, the following GEOHAZ task geocodes and looks up hazard data for account locations:

{
  "tasks": [
    {
      "label": "geohaz1",
      "operationUri": "/platform/geohaz/v1/jobs",
      "dependsOn": ["bulkInsertAccountData"],
      "skipMissingVariables": true,
      "continueOnFailure": true,
      "requestBody": {
        "resourceType": "account",
        "settings": {
          "layers": [
            {
              "name": "geocode",
              "type": "geocode",
              "engineType": "RL",
              "version": "21.0",
              "layerOptions": {
                "skipPrevGeocoded": false,
                "aggregateTriggerEnabled": "true",
                "geoLicenseType": "0"
              }
            },
            {
              "name": "earthquake",
              "type": "hazard",
              "version": "21.0",
              "engineType": "RL",
              "layerOptions": {
                "skipPrevHazard": false,
                "overrideUserDef": false
              }
            }
          ]
        }
      }
    }
  ]
}

The operationUri and requestBody parameters may include task path variables.

For the workflow to work properly, it must be able to pass values output by one task to subsequent tasks in the workflow. If you create an account in a task, you must be able to retrieve the accountId of that new account, so that you can use that ID to perform operations on that account in subsequent tasks in the workflow. This is accomplished using task path variables.

Task Path Variables

Batch jobs make heavy use of task path variables to identify and retrieve values (e.g. the IDs of exposures) that are created in the course of the workflow. Task path variables are the mechanism that enables you to create, geocode, and model an account in a single request.

A task path variable is a JSON query string that selects a new API resource (a resource with an unknown ID) based on a known value, e.g. the label applied to the task that created that resource.

For example, the requestBody of the GeohazAccounts task can use the {{$.BulkAccounts.output.accounts.[?(@.label == 'DemoAccount1')].id}} task path variable to select an account for geocoding and hazard lookups. The ID of this account is unknown, but it is known to have been created by the earlier BulkAccounts task.

"requestBody": {
  "resourceType": "account",
  "settings": {
    "layers": [
      // Layers
    ]
  },
  "resourceUri": "/platform/riskdata/v1/exposures/{{exposure_id}}/accounts/{{$.BulkAccounts.output.accounts.[?(@.label == 'DemoAccount1')].id}}"
}

In this example, the requestBody of the GeohazAccounts task uses a task path variable to identify an account exposure that it needs to geocode and hazard. This account cannot be identified by its ID number because the account did not exist at the time that the batch job was submitted; it was created in the course of the workflow by the BulkAccounts task.

During processing, the workflow engine "expands" the task query to retrieve the unique ID number for the account using the path (label.output.id). The path variable is a simple JSON query string enclosed in double curly braces.
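The following sketch, written in Python with hypothetical task output and IDs, illustrates the kind of lookup the engine performs when it expands a task path variable; the actual expansion happens inside the workflow engine.

# A minimal sketch of how a task path variable such as
# {{$.BulkAccounts.output.accounts.[?(@.label == 'DemoAccount1')].id}}
# is conceptually resolved. The task output below is hypothetical;
# the real workflow engine performs this expansion internally.

# Hypothetical output of the BulkAccounts task, keyed by task label.
task_outputs = {
    "BulkAccounts": {
        "output": {
            "accounts": [
                {"label": "DemoAccount1", "id": 4401},
                {"label": "DemoAccount2", "id": 4402},
            ]
        }
    }
}

def expand(task_label: str, account_label: str) -> int:
    """Select the ID of the account created by the named task."""
    accounts = task_outputs[task_label]["output"]["accounts"]
    return next(a["id"] for a in accounts if a["label"] == account_label)

# The engine substitutes the resolved ID into the resourceUri of the
# dependent task before submitting it.
resource_uri = (
    "/platform/riskdata/v1/exposures/123/accounts/"
    f"{expand('BulkAccounts', 'DemoAccount1')}"
)
print(resource_uri)  # /platform/riskdata/v1/exposures/123/accounts/4401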

Workflow Optimization

If the combine-batch-edit-and-geohaz parameter is true, consecutive bulk-edit and geohaz tasks are combined and processed together as a single job, which can greatly reduce the runtime of a batch job.

In the following example, the dependsOn parameter in the GeohazAccount task identifies the BatchAccount task as its immediate predecessor. Consequently, the bulk-edit and geohaz operations defined by these tasks are processed as a single job:

{
  "name": "Optimization Example",
  "settings": {
    "combine-batch-edit-and-geohaz": true,
    "disable-underwriter-reports": true
  },
  "tasks": [
    {
      "label": "BatchAccount",
      "operationUri": "/platform/riskdata/v1/exposures/{exposureId}/bulk-edit",
      "dependsOn": [],
      "continueOnFailure": false,
      "requestBody": {
        // bulk-edit request
      }
    },
    {
      "label": "GeohazAccount",
      "operationUri": "/platform/geohaz/v1/jobs",
      "dependsOn": ["BatchAccount"],
      "continueOnFailure": false,
      "requestBody": {
        // geohaz request
      }
    },
    {
      "label": "ModelAccount",
      "operationUri": "/platform/model/v1/jobs",
      "dependsOn": ["GeohazAccount"],
      "continueOnFailure": false,
      "requestBody": {
        // cat modeling request
      }
    }
  ]
}

Examples

The following examples demonstrate three common underwriting workflows:

  • Example 1: This workflow creates accounts in bulk, geocodes and looks up hazard data for account locations, and then models those exposures.
  • Example 2: This workflow exports analysis results to an RDM database artifact.
  • Example 3: This workflow performs a marginal impact analysis.

Moody’s Insurance Solutions makes Postman Collections available to licensed tenants for download from the RMS Developers public workspace.

A Postman Collection is a set of related API operations that when run in sequence perform a standard workflow. Each Postman Collection defines data structures for evaluating and testing Intelligent Risk Platform features, and understanding how request parameters affect responses.

Example 1: Batch Edit and Model

The Example 1 batch workflow is a simple workflow that creates a new account from file, geocodes and looks up hazard data for account locations, and models those exposures using the DLM.

This workflow consists of four tasks:

  • /platform/riskdata/v1/exposures/{{exposure_id}}/bulk-edit
  • /platform/geohaz/v1/jobs
  • /platform/geohaz/v1/jobs
  • /platform/model/v1/jobs

The Create Batch Job operation initiates a BATCH job.
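A minimal sketch, in Python with the requests library, of how a client might submit this batch job; the /platform/batch/v1/jobs endpoint and the Authorization header are assumptions based on the job URL shown in Example 2, and the request body is the one shown later in this example.

# Minimal sketch: submit a batch job request (assumed endpoint and auth header).
import requests

BASE_URL = "https://api-euw1.rms-ppe.com"   # example host from this article
API_KEY = "YOUR_API_KEY"                    # placeholder credential

batch_job = {
    "name": "Bulk Edit and Model",
    "settings": {
        "disable-underwriter-reports": True,
        "combine-batch-edit-and-geohaz": True
    },
    "tasks": [
        # Task objects as shown in the request body below.
    ]
}

response = requests.post(
    f"{BASE_URL}/platform/batch/v1/jobs",
    json=batch_job,
    headers={"Authorization": API_KEY},
)
response.raise_for_status()

# A successful request returns 201 Created; the Location header holds the job URL.
job_url = response.headers["Location"]
print("Batch job submitted:", job_url)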


Postman Collection

Moody's Insurance Solutions makes sample code available in a Postman Collection that you can download from the RMS Developers public workspace: Batch Workflow (UIQ) Bulk Edit and Model

Label | Operation URI | dependsOn
bulkInsertAccountData | /platform/riskdata/v1/exposures/{id}/bulk-edit | (none)
geohaz1 | /platform/geohaz/v1/jobs | bulkInsertAccountData
geohaz2 | /platform/geohaz/v1/jobs | bulkInsertAccountData
model1 | /platform/model/v1/jobs | geohaz1

This workflow consists of four tasks that initiate four jobs: EXPOSURE-BATCHES, GEOHAZ, GEOHAZ, PROCESS.

{
  "name": "Bulk Edit and Model",
  "settings": {
    "disable-underwriter-reports": true,
    "combine-batch-edit-and-geohaz": true
  },
  "tasks": [
    {
      "continueOnFailure": false,
      "dependsOn": [],
      "skipMissingVariables": true,
      "label": "bulkInsertAccountData",
      "operationUri": "/platform/riskdata/v1/exposures/{{exposure_id}}/bulk-edit",
      "requestBody": {
        "accounts": []
      }
    },
    {
      "continueOnFailure": true,
      "dependsOn": ["bulkInsertAccountData"],
      "skipMissingVariables": true,
      "label": "geohaz1",
      "operationUri": "/platform/geohaz/v1/jobs",
      "requestBody": {
        "resourceType": "account",
        "settings": {
          "layers": [
            {
              "name": "geocode",
              "type": "geocode",
              "engineType": "RL",
              "version": "21.0",
              "layerOptions": {
                "skipPrevGeocoded": false,
                "aggregateTriggerEnabled": "true",
                "geoLicenseType": "0"
              }
            },
            {
              "name": "earthquake",
              "type": "hazard",
              "version": "21.0",
              "engineType": "RL",
              "layerOptions": {
                "skipPrevHazard": false,
                "overrideUserDef": false
              }
            }
          ]
        },
        "resourceUri": "/platform/riskdata/v1/exposures/{{exposure_id}}/accounts/{{$.bulkInsertAccountData.output.accounts.[?(@.label == 'demo_acct_1')].id}}"
      }
    },
    {
      // Model
      "continueOnFailure": true,
      "dependsOn": ["geohaz1"],
      "skipMissingVariables": true,
      "label": "model1",
      "operationUri": "/platform/model/v1/jobs",
      "requestBody": {
        "settings": {
          "currency": {
            "code": "{{anchor_currency_code}}",
            "scheme": "{{currency_scheme_code}}",
            "asOfDate": "2025-05-28",
            "vintage": "{{currency_vintage}}"
          },
          "modelProfileId": {{model_profile_id}},
          "outputProfileId": {{output_profile_id}},
          "eventRateSchemeId": {{event_rate_scheme_id}}
        },
        "type": "DLM",
        "resourceType": "account",
        "resourceUri": "/platform/riskdata/v1/exposures/{{exposure_id}}/accounts/{{$.bulkInsertAccountData.output.accounts.[?(@.label == 'demo_acct_1')].id}}"
      }
    }
  ]
}

Example 2: Batch Workflow that Exports Results

A batch job can model new account exposures and export the analysis results to an RDM.

This batch job defines a workflow that consists of five tasks that are executed in succession: a bulk-edit task that creates an account, two geohaz tasks that geocode and perform hazard lookups for account locations, and a model task that models account exposures using the DLM model.

Finally, it exports the data. The export task initiates an export job that exports result data to an RDM as a database artifact (BAK format). A maximum of 100 analysis results can be exported in a single request.

This workflow consists of five tasks:

  • /platform/riskdata/v1/exposures/{{exposure_id}}/bulk-edit
  • /platform/geohaz/v1/jobs
  • /platform/geohaz/v1/jobs
  • /platform/model/v1/jobs
  • /platform/export/v1/jobs


Postman Collection

Moody's Insurance Solutions makes sample code available in a Postman Collection that you can download from the RMS Developers public workspace: Batch Workflow with Export RDM Artifact

The Create Batch Job operation request specifies five tasks:

{
  "name": "Batch Job: Bulk-Edit, Geohaz, Model, Export Losses",
  "settings": {
    "disable-underwriter-reports": true,
    "combine-batch-edit-and-geohaz": true
  },
  "tasks": [
    {  // bulkInsertAccountData  },
    {  // geohaz1 },
    {  // geohaz2 },
    {  // model1 },
    {
      "continueOnFailure": true,
      "dependsOn": ["model1"],
      "skipMissingVariables": true,
      "label": "exportloss",
      "operationUri": "/platform/export/v1/jobs",
      "requestBody": {
        "exportType": "RDM",
        "resourceType": "analyses",
        "settings": {
          "fileExtension": "BAK",
          "sqlVersion": "2019",
          "rdmName": "Test_RDM_Export"
        },
        "resourceUris": [
          "/platform/riskdata/v1/analyses/{{$.model1.output.analysisId}}"
        ]
      }
    }
  ]
}

A successful request returns a 201 Created HTTP response code and the URL of the BATCH job in the Location header, e.g.
https://api-euw1.rms-ppe.com/platform/batch/v1/jobs/22880692. Use this URL to poll the status of the job.
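A minimal polling sketch in Python with the requests library follows; the status field name and its values are assumptions for illustration and are not confirmed by this article.

# Minimal sketch: poll a batch job until it reaches a terminal state.
# The "status" field name and its values are assumptions for illustration.
import time
import requests

API_KEY = "YOUR_API_KEY"   # placeholder credential
job_url = "https://api-euw1.rms-ppe.com/platform/batch/v1/jobs/22880692"

while True:
    response = requests.get(job_url, headers={"Authorization": API_KEY})
    response.raise_for_status()
    job = response.json()
    status = job.get("status", "UNKNOWN")
    print("Batch job status:", status)
    if status in ("FINISHED", "FAILED", "CANCELLED"):
        break
    time.sleep(30)   # wait before polling again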

Example 3: Batch Workflow with Marginal Impact

As with the previous examples, this workflow defines a bulk-edit task that creates an account, two geohaz tasks that geocode and perform hazard lookups for account locations, and a model task that models account exposures using the DLM model.

It then calculates a marginal impact analysis based on a reference analysis and creates an EXPOSURE_SUMMARY report for the new account.

This batch workflow is defined by a batch job that consists of six tasks.


Postman Collection

Moody's Insurance Solutions makes sample code available in a Postman Collection that you can download from the RMS Developers public workspace: Batch Workflow (UIQ) with Marginal Impact

The required analysisid path parameter specifies the ID of the reference analysis, a portfolio-level, ELT- or PLT-based analysis. The required marginalImpactAnalysisIds body parameter defines an array of account-level analysis results. For each analysis result, the operation returns differential losses that represent the difference between the original portfolio metrics and those of the updated portfolio that includes the account.

The example shows the last two tasks in the Create Batch Job operation request:

{
  "name": "Batch Job: Marginal Impact, Export Report",
  "settings": {
    "disable-underwriter-reports": true,
    "combine-batch-edit-and-geohaz": true
  },
  "tasks": [
    {  // bulkInsertAccountData },
    {  // geohaz1 },
    {  // geohaz2 },
    {  // model1 },
    {
      "continueOnFailure": true,
      "dependsOn": ["model1"],
      "skipMissingVariables": true,
      "label": "marginal1",
      "operationUri": "/platform/riskdata/v1/analyses/{{ref_analysis_id}}/marginal-impact",
      "requestBody": {
        "currency": {
          "currencyCode": "{{anchor_currency_code}}",
          "currencyScheme": "{{currency_scheme_code}}",
          "currencyAsOfDate": "2025-05-28",
          "currencyVintage": "{{currency_vintage}}"
        },
        "outputLevel": "ACCOUNT",
        "marginalImpactAnalysisIds": ["{{$.model1.output.analysisId}}"],
        "jobName": "Test_MI_Bala",
        "eventRateSchemeIds": [12]
      }
    },
    {
      "continueOnFailure": true,
      "dependsOn": ["bulkInsertAccountData"],
      "skipMissingVariables": true,
      "label": "expSummaryReport",
      "operationUri": "/platform/riskdata/v1/reports",
      "requestBody": {
        "reportType": "EXPOSURE_SUMMARY",
        "settings": {
          "perilCodes": ["eq"],
          "fileExtension": "JSON",
          "reportName": "expsummarytest"
        },
        "resourceUri": "/platform/riskdata/v1/exposures/{{exposure_id}}/accounts/{{$.bulkInsertAccountData.output.accounts.[?(@.label == 'demo_acct_1')].id}}",
        "resourceType": "account"
      }
    }
  ]
}