Import Exposures in Batch
Import large volumes of exposures and generate summary reports
Overview
Exposure Batch for UnderwriteIQ leverages the Risk Modeler API to import batches of exposure data in support of underwriting workflows. Underwriters and other users who are primarily interested in account-level modelling will benefit from the increased performance and usability of this process when importing exposure data.
Exposure Batch for UnderwriteIQ is best suited for importing exposures that are applicable to single accounts.
This workflow supports two methods for uploading large volumes of exposures to the Intelligent Risk Platform.
- Method 1: Exposure data is defined in the body of the request. The request JSON defines the relationships between accounts, locations, policies, and treaties in a hierarchical structure.
- Method 2: Exposure data is defined in the body of a JSON file that is uploaded to a storage bucket on AWS. The body of the Manage Exposures in Batch request specifies the uploadId of the uploaded file.
Exposure Batch for UnderwriteIQ is fast and easy to use. Unlike MRI, which is designed to enable large-scale data migration projects and emphasizes data throughput, batch processing is designed to quickly import smaller batches of exposure data in a single request. You do not need to upload and process multiple upload packages or define mappings between exposures.
Method 1: Upload Exposure Data in Request
In Method 1, the request body defines an object that consists of a set of nested arrays. The resource accepts portfolios, accounts, locations, policies, and treaties arrays. Where parent-child relationships exist between exposures (for example, between an account and its associated locations), child exposures may be defined within the definition of the parent exposure.
To take advantage of its performance benefits, limit the size and scale of the request package:
- Ensure that the request payload is no larger than 5MB in size.
- Ensure that the request defines no more than 1000 location exposure records.
Postman Collection
Moody's provides sample code in a Postman Collection that you can download from our GitHub repo: https://github.com/RMS/rms-developers/releases/tag/2023-04-uw
Step 1.1: Import exposures
The Manage Exposures in Batch resource enables you to create, update, or delete large numbers of exposures in a single request.
As with all requests that use Risk Modeler API resources, you must specify a host in the resource URL and provide a valid API key in the request header. The {host} variable identifies the environment hosting your UnderwriteIQ application instance (one of api-euw1.rms.com or api-use1.rms.com) and the apiKey specifies an API key token.
curl --location --request POST "https://{host}/riskmodeler/v3/exposurebatches?datasource={myEDM}" \
--header "Authorization: {api_key}" \
--header "X-Rms-Resource-Group-Id: {resource_group_id}" \
--data "{exposure_data}"
Each array in the request package defines one or more operations that create, edit, or delete an exposure.
An operation is defined by an optional operationType, a required number, and a required request body. The operationType specifies the action performed (INSERT, UPDATE, or DELETE) and defaults to INSERT. Because we are adding new exposures here, this parameter is not required. The number uniquely identifies an operation.
{
"accounts": [
{
"operationType": "INSERT",
"name": "NAEQ-JSON",
"number": "NAEQ-JSON",
"locations": [
{ location_object1 },
{ location_object2 },
{ location_object3 },
{ location_object4 }
]
}
],
...
}
Best Practice
For best results, omit the portfolios array and make the accounts array the top level of the exposure hierarchy. Within that array, limit the number of location exposures in the request to 1000 records or less.
Warning
Ensure that the request package is no larger than 5MB in size. Request packages that define up to 1000 exposures may be submitted in JSON format: Content-Type: application/json.
On success, the operation returns a 202 Accepted response and adds an EXPOSURE_BATCH_EDIT job to the workflow engine queue. The Location response header specifies the job ID as part of a URL that you can use to track the status of the job, e.g. https://{host}/riskmodeler/v1/workflows/9034518.
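For example, here is a minimal shell sketch that submits the batch, captures the Location header, and extracts the job ID for later polling. It assumes the exposure payload is saved in a local file named exposures.json; the header parsing is illustrative, so adapt it to your tooling.
# Submit the batch and capture the Location response header (illustrative sketch).
LOCATION=$(curl --silent --include --request POST "https://{host}/riskmodeler/v3/exposurebatches?datasource={myEDM}" \
--header "Authorization: {api_key}" \
--header "X-Rms-Resource-Group-Id: {resource_group_id}" \
--header "Content-Type: application/json" \
--data @exposures.json | grep -i '^location:' | tr -d '\r' | awk '{print $2}')
# The final path segment of the Location URL is the workflow (job) ID.
WORKFLOW_ID=${LOCATION##*/}
echo "Poll ${LOCATION} to track job ${WORKFLOW_ID}"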
Step 1.2: Poll job status
The Get job status operation enables you to view the status of the EXPOSURE_BATCH_EDIT job and provides a link to the exposure summary report once the job is complete.
The workflow ID is specified in the endpoint path.
curl --location --request GET 'https://{host}/riskmodeler/v1/workflows/9034518' \
--header 'Authorization: {api_key}'
If successful, the response returns a 200 status code and the status of the workflow job in the response body. Poll this operation until the job completes.
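A simple polling loop might look like the following sketch. It assumes the response body is JSON with a top-level status field and that values such as FINISHED and FAILED indicate terminal states; consult the Get job status reference for the actual field names and values.
# Poll the workflow job until it reaches a terminal state (illustrative sketch).
while true; do
  STATUS=$(curl --silent --request GET "https://{host}/riskmodeler/v1/workflows/9034518" \
    --header "Authorization: {api_key}" | jq -r '.status')   # assumed field name
  echo "Job status: ${STATUS}"
  case "${STATUS}" in
    FINISHED|FAILED) break ;;   # assumed terminal values
  esac
  sleep 10
done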
Method 2: Upload File to AWS
In Method 2, exposure data is defined in the body of a JSON file that is uploaded to a storage bucket on AWS, and the body of the Manage Exposures in Batch request specifies the uploadId of the uploaded file.
This method is similar to other Risk Modeler data migration workflows that leverage Amazon S3 storage buckets to facilitate the migration of large volumes of exposure data, e.g. the MRI workflow.
Note
The UnderwriteIQ dedicated interactive queue is not supported if exposures are imported using this method.
Step 2.1: Define JSON file of exposure data
Define the exposure data in a JSON file. The exposure data is structured in the same way as the request package described in Method 1.
Step 2.2: Upload exposure data to AWS
Use the Get Storage Bucket URL operation to initialize a storage bucket on Amazon S3 for the data upload.
The request must specify EXPOSURE_BATCH_EDIT as the dbType and JSON as the fileextension.
The response returns S3 credentials and the uploadId.
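The request body for that call might look like the sketch below; the exact parameter names and casing are assumptions based on the description above, so verify them against the Get Storage Bucket URL reference.
{
"dbType": "EXPOSURE_BATCH_EDIT",
"fileextension": "JSON"
}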
Step 2.3: Upload JSON file to AWS
Upload the JSON file to the storage bucket using Amazon S3 API operations.
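As a sketch, the upload can be performed with the AWS CLI using the temporary S3 credentials returned by Get Storage Bucket URL. The credential field names, bucket, and key placeholders below are illustrative assumptions; use the values returned in the previous step.
# Export the temporary credentials returned by Get Storage Bucket URL (placeholder names).
export AWS_ACCESS_KEY_ID="{accessKeyId}"
export AWS_SECRET_ACCESS_KEY="{secretAccessKey}"
export AWS_SESSION_TOKEN="{sessionToken}"
# Copy the JSON file of exposure data to the bucket and key provided in the response.
aws s3 cp exposures.json "s3://{bucket}/{key}"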
Step 2.4: Import exposures
The Manage Exposures in Batch resource enables you to create, update, or delete large numbers of exposures in a single request.
As with all requests that use Risk Modeler API resources, you must specify a host in the resource URL and provide a valid API key in the request header. The {host} variable identifies the environment hosting your UnderwriteIQ application instance (one of api-euw1.rms.com or api-use1.rms.com) and the apiKey specifies an API key token.
curl --location --request POST "https://{host}/riskmodeler/v3/exposurebatches?datasource={myEDM}" \
--header "Authorization: {api_key}" \
--header "X-Rms-Resource-Group-Id: {resource_group_id}" \
--data "{exposure_data}"
In Method 2, the request body does not define the exposure data directly. Instead, it specifies the uploadId of the JSON file that you uploaded to the AWS storage bucket.
{
"uploadId": "3ffc7d59-5c33-46ac-9511-5f4354979b69"
}
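Putting it together, the Method 2 call is the same POST shown above with the uploadId object as its payload. The following sketch uses the example uploadId; the Content-Type header is an assumption here.
curl --location --request POST "https://{host}/riskmodeler/v3/exposurebatches?datasource={myEDM}" \
--header "Authorization: {api_key}" \
--header "X-Rms-Resource-Group-Id: {resource_group_id}" \
--header "Content-Type: application/json" \
--data '{"uploadId": "3ffc7d59-5c33-46ac-9511-5f4354979b69"}'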
Note
The UnderwriteIQ dedicated interactive queue is not supported if the File Upload method is used to define exposures or if the number of location exposures specified in the request package exceeds 10000 exposures.
On success, the operation returns a 202 Accepted response and adds an EXPOSURE_BATCH_EDIT job to the workflow engine queue. The Location response header specifies the job ID as part of a URL that you can use to track the status of the job, e.g. https://{host}/riskmodeler/v1/workflows/9034518.
Step 2.5: Poll job status
The Get job status operation enables you to view the status of the EXPOSURE_BATCH_EDIT job and provides a link to the exposure summary report once the job is complete.
The workflow ID is specified in the endpoint path.
curl --location --request GET 'https://{host}/riskmodeler/v1/workflows/9034518' \
--header 'Authorization: {api_key}'
If successful, the response returns a 200 status code and the status of the workflow job in the response body. Poll this operation until the job completes.