MRI Import
Overview
The Import API defines resources for importing data into the Intelligent Risk Platform.
Multi-relational import (MRI) is a data migration process that enables Intelligent Risk Platform™ tenants to import exposure data into an EDM.
MRI utilizes flat files of account, location, and reinsurance exposure data. A flat file is a delimited text file that stores structured data in a plain text format. MRI leverages a separate mapping file to define the relationships between the account, location, and reinsurance exposures.
While you can use Import API services to securely connect to storage buckets on Amazon S3, you must use the Amazon S3 API or an AWS SDK to upload flat files of data to those storage buckets. Once you have uploaded the source files, you can use Risk Modeler API operations to import the exposure data in those flat files into an EDM.
Postman Collection
Moody's RMS makes a Postman Collection available for testing and evaluating the MRI Import workflow.
While this page describes the workflow step by step and documents the endpoints that constitute the process, the Platform: MRI collection enables Intelligent Risk Platform tenants to test the workflow for themselves.
The RMS Developer Portal provides collections for testing standard Platform workflows for importing and exporting exposures to and from the Intelligent Risk Platform. The site is freely available to the public. To learn more, see the RMS Developer Portal Team.
Step 1: Prepare source files
The first step is to create one or more source files that define the exposures to be imported. A flat file is a two-dimensional database in a plain text format. Each row of text in the file defines a record. The file uses a delimiter character to organize text into discrete columns of structured data. The flat file may use commas, semicolons, or tabs to delimit data values.
MRI Import enables you to import data from four different source files that are defined as flat files:
| Resource File | Description |
|---|---|
| account | A flat file that contains exposure data for one or more accounts and, optionally, policy data associated with each account. |
| location | A flat file that contains location exposure data. |
| reinsurance | A flat file of reinsurance exposure data. |
| mapping | A flat file that defines mappings between exposure data defined in different resource files. |
The following restrictions apply to the flat files:

- The file must have a unique name and be saved in the CSV, TXT, or XLSX format. For example, `accexp.csv` or `accexp.txt`.
- The file must contain `ACCTNUM` data. The `ACCTNUM` attribute is required in both the account source file and the location source file, enabling the MRI service to map locations to accounts.
- Each column in the flat file contains account attribute values separated by means of a text delimiter. To ensure accurate interpretation of any numbers that use commas, RMS recommends tab or semicolon delimiters. In Step 7, you will need to specify a delimiter value.
- The first line of text in the flat file may specify column headers, strings that identify the data attribute represented by a column of data. Column header data is ignored during the import process.
- If included, the `POLICYNUM` attribute cannot be blank or null.
- If included, the `POLICYTYPE` attribute cannot be blank or null. One of `1` (Earthquake), `2` (Windstorm), `3` (Winterstorm), `4` (Flood), `5` (Fire), `6` (Terrorism), or `7` (Workers Compensation).
- If included, policy limits, deductibles, and premiums must be specified as positive values. Negative numbers are not allowed. The same restriction applies to policy coverage limits, deductibles, and premiums.
- If included, the `INCEPTDATE` attribute cannot specify a value later than that of the `EXPIREDATE` value.
For a comprehensive description of account data requirements for MRI import, see the DLM Reference Guide on RMS Owl.
In Step 7 of this recipe, the `MRI_IMPORT` job will ingest account data from this file and import that data into the EDM you specify. To ensure that the platform correctly parses the file, you will need to specify the delimiter used to structure text in the file and indicate whether the file uses column headers.
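For illustration, a minimal semicolon-delimited account flat file might look like the following. The column names and values here are hypothetical examples, not an authoritative schema; consult the DLM Reference Guide for the required and optional columns.

```
ACCTNUM;POLICYNUM;POLICYTYPE;INCEPTDATE;EXPIREDATE
ACC-0001;POL-1000;1;2024-01-01;2024-12-31
ACC-0002;POL-1001;4;2024-03-15;2025-03-14
```

Because the first line contains column headers, you would indicate that when you configure the import job in Step 7.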
DATA VALIDATION
Intelligent Risk Platform™ validates the data specified in the account source file prior to importing that data into the EDM. The import process performs basic data validations to make sure that imported data is valid and does not create orphan records.
If a required data attribute value is missing or invalid, the platform throws a validation error.
Step 2: Create import folder
The Create Import Folder resource enables you to create a temporary storage location (a storage bucket) on AWS S3. An import folder is a logical path on AWS S3 to which you upload the files required to perform an import. Temporary credentials for the folder can be obtained through the POST /folders request. The import folder is disposed of immediately after the import job is complete.
A single MRI import folder can receive all of the source files for an import job; the fileTypes array in the request body identifies the account, location, reinsurance, and mapping files that you intend to upload.
The resource takes two required body parameters: the `folderType` parameter and a `properties` object that describes the files to be uploaded.
```bash
curl --request POST \
  --url https://{host}/platform/import/v1/folders \
  --header 'accept: application/json' \
  --header 'content-type: application/json'
```
All parameters are specified in the request body. The `folderType` parameter and `properties` object are required.
If the `folderType` is `MRI`, the `properties` object specifies the `fileExtension` of the data to be uploaded to the import folder and the `fileTypes` array, which lists the names of the files to be uploaded.
If the `fileExtension` parameter is `TXT` or `CSV`, the `fileTypes` parameter may define an array of values to identify the `accountsFile`, `locationsFile`, `reinsuranceFile`, and `mappingsFile` uploaded.
```json
{
  "folderType": "MRI",
  "properties": {
    "fileExtension": "csv",
    "fileTypes": [
      "accountsFile",
      "locationsFile",
      "mappingsFile",
      "reinsuranceFile"
    ]
  }
}
```
If successful, the operation returns a `201 Created` status code and `uploadDetails` for each file to be uploaded to this folder: `spreadsheet`, `accountsFile`, `locationsFile`, `reinsuranceFile`, or `mappingFile`.
```json
{
"folderType": "MRI",
"folderId": "string",
"uploadDetails": {
"spreadsheet": {
"uploadUrl": "string",
"fileUri": "import/folders/123213/files/1",
"presignParams": {
"accessKeyId": "string",
"secretAccessKey": "string",
"sessionToken": "string",
"path": "import/folders/123213",
"region": "string"
}
},
"accountsFile": {
"uploadUrl": "string",
"fileUri": "import/folders/123213/files/1",
"presignParams": {
"accessKeyId": "string",
"secretAccessKey": "string",
"sessionToken": "string",
"path": "import/folders/123213",
"region": "string"
}
},
"locationsFile": {
"uploadUrl": "string",
"fileUri": "import/folders/123213/files/1",
"presignParams": {
"accessKeyId": "string",
"secretAccessKey": "string",
"sessionToken": "string",
"path": "import/folders/123213",
"region": "string"
}
},
"reinsuranceFile": {
"uploadUrl": "string",
"fileUri": "import/folders/123213/files/1",
"presignParams": {
"accessKeyId": "string",
"secretAccessKey": "string",
"sessionToken": "string",
"path": "import/folders/123213",
"region": "string"
}
},
"mappingFile": {
"uploadUrl": "string",
"fileUri": "import/folders/123213/files/1",
"presignParams": {
"accessKeyId": "string",
"secretAccessKey": "string",
"sessionToken": "string",
"path": "import/folders/123213",
"region": "string"
}
}
}
}
```
The `presignParams` object returns the temporary security credentials that enable you to programmatically sign AWS requests. Signing helps to secure requests by verifying the identity of the requester and protecting the data in transit.
| Parameter | Description |
|---|---|
| accessKeyId | A base64-encoded S3 access key ID, a unique identifier for your S3 access key. |
| secretAccessKey | A base64-encoded S3 secret access key. The access key ID and secret access key enable you to sign AWS requests. |
| path | A base64-encoded path to the Amazon S3 bucket. For example, import/folders/123213. |
| sessionToken | A base64-encoded S3 session token. |
| region | A base64-encoded identifier of the AWS region that hosts the S3 bucket. |
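If you are scripting the workflow, the folder can also be created programmatically. The following Java 8 sketch issues the POST /folders request with HttpURLConnection and prints the raw response. The host and the Authorization header value are placeholders, and the authorization scheme shown is an assumption; adapt both to your tenant's authentication setup.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class CreateImportFolder {
    public static void main(String[] args) throws Exception {
        // Placeholder host and credentials; substitute your tenant's values.
        String host = "api-euw1.rms.com";
        String apiKey = "XXXXXXXXXXXX";

        String body = "{\"folderType\": \"MRI\", \"properties\": {"
                + "\"fileExtension\": \"csv\","
                + "\"fileTypes\": [\"accountsFile\", \"locationsFile\", \"mappingsFile\", \"reinsuranceFile\"]}}";

        URL url = new URL("https://" + host + "/platform/import/v1/folders");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", apiKey); // auth scheme is an assumption
        conn.setRequestProperty("content-type", "application/json");
        conn.setRequestProperty("accept", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // A 201 Created response carries the folderId and the uploadDetails
        // (including the presignParams) for each file you declared.
        System.out.println("HTTP " + conn.getResponseCode());
        try (InputStream is = conn.getInputStream();
             Scanner scanner = new Scanner(is, "UTF-8").useDelimiter("\\A")) {
            System.out.println(scanner.hasNext() ? scanner.next() : "");
        }
    }
}
```

HttpURLConnection keeps the sketch dependency-free; in practice you would use your preferred HTTP client and a JSON parser to extract the upload details.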
In Step 6, we will use Amazon SDK APIs to upload the source files to this folder. But first, we need to gather details about the destination EDM, its portfolios, and the available resource groups.
Step 3: Get EDM
The Get EDM API resource returns information about the EDMs available to an Intelligent Risk Platform tenant.
```bash
curl --request GET \
  --url 'https://{HOST}/platform/riskdata/v1/exposures?sort=exposureName%20ASC&limit=100&offset=0' \
  --header 'accept: application/json'
```
This operation supports response filtering based on the value of a subset of properties. Depending on the property, you may use a combination of comparison operators, list operators, and logical operators.
The response returns an array of EDM objects.
```json
[
{
"exposureName": "string",
"exposureId": 0,
"uri": "string",
"status": "string",
"databaseName": "string",
"metrics": {
"additionalProp": {}
},
"ownerName": "string",
"exposureSetId": 0,
"serverType": "PLATFORM",
"serverName": "string",
"serverId": 0,
"tagIds": [
0
]
}
]
```
The `exposureId` identifies the ID number of an EDM. In Step 4, we will use this ID to retrieve a list of the portfolios available on that EDM.
Step 4: Get Portfolios
The Search Portfolios API resource returns information about the portfolios within a specific EDM. Using the `exposureId` returned in Step 3, you can retrieve those portfolios by specifying the ID in the endpoint path, for example https://{HOST}/platform/riskdata/v1/exposures/10171/portfolios.
```bash
curl --request GET \
  --url https://api-euw1.rms.com/platform/riskdata/v1/exposures/{exposureId}/portfolios \
  --header 'accept: application/json'
```
This operation supports response filtering based on the value of a subset of properties. Depending on the property, you may use a combination of comparison operators, list operators, and logical operators.
The response returns an array of portfolio objects.
```json
[
{
"portfolioId": 0,
"portfolioName": "string",
"portfolioNumber": "string",
"description": "string",
"createDate": "string",
"stampDate": "string",
"uri": "string",
"geocodeVersion": "string",
"hazardVersion": "string",
"ownerName": "string",
"updatedBy": "string",
"tagIds": [
0
]
}
]
```
The `portfolioId` identifies the ID number of a portfolio.
Step 5: Get Resource Groups
The Get Resource Groups API resource returns a list of resource groups.
A resource group is a collection of Intelligent Risk Platform resources that is managed as a single unit under an entitlement.
The endpoint accepts one of four entitlement path parameters: `RI-RISKMODELER`, `RI-UNDERWRITEIQ`, `EXPOSUREIQ`, and `TREATYIQ`.
```bash
curl --request GET \
  --url https://{HOST}/platform/tenantdata/v1/entitlements/RI-RISKMODELER/resourcegroups \
  --header 'accept: application/json'
```
The response returns an array of resource group objects, including the `resourceGroupId` for each resource group.
```json
[
{
"resourceGroupId": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
"name": "string",
"entitlement": "RI-RISKMODELER",
"createdBy": "string",
"createdAt": "2024-02-07T01:26:17.883Z",
"modifiedBy": "string",
"modifiedAt": "2024-02-07T01:26:17.883Z",
"default": true,
"allAccess": true
}
]
```
Step 6: Upload files
The Moody's RMS Import API does not provide operations for uploading local files to AWS. Rather, you must use the Amazon S3 API or an Amazon SDK to upload the source files to the Amazon S3 bucket you created in Step 2.
In this procedure, you will use the Amazon S3 bucket path and temporary user credentials to upload account data to the MRI folder. First, you must decode the `accessKeyId`, `secretAccessKey`, `sessionToken`, and `path` (referred to below as the s3Path) values returned in Step 2 and pass the decoded values to an S3 client. The sample code is in Java 8.
```java
private static String base64Decode(String text) {
    // Decodes a base64-encoded value returned in presignParams (requires java.util.Base64).
    return new String(Base64.getDecoder().decode(text));
}
```
Pass the decoded `accessKeyId`, `secretAccessKey`, and `sessionToken` to the `getS3Client()` method to create an Amazon S3 client.
```java
private static AmazonS3 getS3Client(String accessKey, String secretKey, String sessionToken) {
    BasicSessionCredentials sessionCredentials = new BasicSessionCredentials(
            accessKey,
            secretKey,
            sessionToken);
    return AmazonS3ClientBuilder.standard()
            .withCredentials(new AWSStaticCredentialsProvider(sessionCredentials))
            // The region is hardcoded here; use the decoded region value returned in presignParams.
            .withRegion(Regions.EU_WEST_1)
            .build();
}
```
Amazon TransferManager is a high-level utility for managing transfers to Amazon S3 that makes extensive use of Amazon S3 multipart uploads.
Once you have the Amazon S3 client, you can pass the `s3Client`, `bucketName`, `key`, and `filePath` to the TransferManager.
```java
private static void upload(AmazonS3 s3Client, String bucketName, String key, String filePath) {
    try {
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3Client)
                .build();
        // TransferManager processes all transfers asynchronously,
        // so this call returns immediately.
        Upload upload = tm.upload(bucketName, key, new File(filePath));
        System.out.println("Object upload started");
        // Optionally, wait for the upload to finish before continuing.
        upload.waitForCompletion();
        System.out.println("Object upload complete");
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
    }
}
```
The parameters are derived from previous steps:

| Parameter | Description |
|---|---|
| bucketName | Extracted from the initial section of the decoded s3Path. If the s3Path is rms-mi/preview/tenant/50000/import/mri/3929, the bucketName is rms-mi. |
| key | Combines the remaining portion of the s3Path with the fileId and fileName in the pattern s3Path/fileId-fileName. For example, preview/tenant/50000/import/mri/3929/12373-fileName. |
| filePath | The absolute path to the file you want to upload. |
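Putting the pieces together, the following sketch shows how the three methods above might be combined. The encoded values and file names are hypothetical placeholders, and the snippet assumes that base64Decode, getS3Client, and upload are defined in the same class.

```java
// Add to the class that defines base64Decode, getS3Client, and upload.
public static void main(String[] args) {
    // Hypothetical base64-encoded values; use the presignParams returned in Step 2.
    String accessKey = base64Decode("<accessKeyId from presignParams>");
    String secretKey = base64Decode("<secretAccessKey from presignParams>");
    String sessionToken = base64Decode("<sessionToken from presignParams>");
    String s3Path = base64Decode("<path from presignParams>");

    // Split the decoded s3Path into the bucket name and the key prefix,
    // e.g. "rms-mi/preview/tenant/50000/import/mri/3929" -> "rms-mi" + "preview/...".
    int slash = s3Path.indexOf('/');
    String bucketName = s3Path.substring(0, slash);
    String keyPrefix = s3Path.substring(slash + 1);

    // The key follows the pattern s3Path/fileId-fileName; the values here are hypothetical.
    String key = keyPrefix + "/12373-accexp.csv";

    AmazonS3 s3Client = getS3Client(accessKey, secretKey, sessionToken);
    upload(s3Client, bucketName, key, "/data/accexp.csv");
}
```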
If successful, the Amazon API uploads the files to the import folder on AWS. From this folder, we can use the Import API to import that data into the Intelligent Risk Platform by creating an import job.
Step 7: Import files
The Create Import Job resource enables you to import the data that you have uploaded to the import folder on AWS S3 into the Intelligent Risk Platform.
```bash
curl --request POST \
  --url https://{host}/platform/import/v1/jobs \
  --header 'accept: application/json' \
  --header 'content-type: application/json'
```
All request parameters are specified in the request body:
```json
{
  "importType": "MRI",
  "settings": {
    "importFiles": {
      "accountsFileId": "accountfile",
      "locationsFileId": "locationsfile",
      "reinsuranceFileId": "reinsurancefile",
      "mappingFileId": "mappingfile"
    },
    "sheets": {
      "accountsSheetIndex": 67,
      "locationsSheetIndex": 89,
      "reinsuranceSheetIndex": 98
    },
    "folderId": "888888",
    "currency": "USD",
    "delimiter": "SEMICOLON",
    "skipLines": 0,
    "appendLocations": true,
    "geoHaz": false
  }
}
```
The required `importType` parameter specifies the import job type, one of `CEDE`, `EDM`, `OED`, or `MRI`. Depending on the `importType` specified, different settings properties are required. The optional `resourceUri` parameter specifies where the import is to be performed.
| Parameter | Description |
|---|---|
| settings | Properties that define the MRI import job. Because we are defining an MRI job, you must specify the accountsFileId and locationsFileId parameters as well as the folderId and delimiter parameters. Because we have also uploaded a reinsurance source file and a mapping file, we identify those files as well. |
| importFiles | Identifies the source files to import. The accountsFileId, locationsFileId, and reinsuranceFileId parameters identify the account, location, and reinsurance source files respectively. The MRI process only works if both an account source file and a location source file are uploaded to the import folder; consequently, the accountsFileId and locationsFileId must always be defined in MRI jobs. The mappingFileId parameter specifies the ID number of an import mapping file, which maps the columns in the uploaded source files to the corresponding columns in the EDM and defines the relationships between account, location, and reinsurance data. |
| sheets | The accountsSheetIndex, locationsSheetIndex, and reinsuranceSheetIndex values identify the worksheets to import when the source files are spreadsheets. |
| folderId | ID of the import folder created in Step 2. |
| currency | A default currency value that is applied if the currency column for an imported exposure is undefined or blank. The value specified does not overwrite currency values in the source files themselves. |
| delimiter | Delimiter used to structure data in the account, location, and reinsurance source files. One of TAB, COMMA, or SEMICOLON. Required. |
| skipLines | The number of lines to skip at the beginning of each source file, for example a row of column headers. |
| appendLocations | Indicates whether imported locations are appended to existing exposures. |
| geoHaz | Indicates whether imported locations are geocoded and assigned hazard data during import. |
If successful, the operation returns a `201 Created` HTTP response and initiates an `MRI_IMPORT` job. Use the Get Import Jobs resource to track the status of the job.
Step 8: Track Status
The Get job status operation enables you to track the status of the MRI import job you just ran. The workflow ID is specified in the endpoint path.
```bash
curl --location --request GET 'https://{host}/riskmodeler/v1/workflows/451289' \
  --header 'Authorization: XXXXXXXXXXXX'
```
A successful response returns the `Workflow` object, which provides detailed information about this workflow job, including the `submitTime`, `startTime`, `type`, and `status`.
When the job status is `FINISHED`, the exposure data will have been imported into the EDM.
```json
{
"id": 451289,
"userName": "[email protected]",
"status": "FINISHED",
"submitTime": "2020-07-01T17:58:30.310Z",
"startTime": "2020-07-01T17:58:33.608Z",
"endTime": "2020-07-01T18:03:25.335Z",
"name": "ACCOUNT",
"type": "MRI_IMPORT",
"jobs": [
{
"id": "c1469aad-4a36-4e75-9999-8daf3b0886a1",
"taskId": 0,
"workflowId": 451289,
"status": "Succeeded",
"submitTime": "2020-07-01T17:58:33.929Z",
"createdAt": "2020-07-01T17:58:30.309Z",
"name": "MRI_IMPORT",
"input": {
"name": "MRI_IMPORT"
},
"output": {
"importSummary": "Imported 1 Accounts and 2 Locations"
},
"priorJobs": [],
"percentComplete": 100
}
],
"summary": {
"validationDownloadLink": "Not Available",
"importSummary": "Imported 1 Accounts and 2 Locations",
"expirationDate": "Not Available"
},
"progress": 100,
"messages": []
}
```
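Because the import runs asynchronously, a client typically polls this endpoint until the workflow reaches a terminal state. The following Java 8 sketch illustrates one way to do this; the workflow ID, host, and Authorization value are placeholders, and the substring status check is a simplification (a real client would parse the JSON response).

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

public class TrackImportJob {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and credentials; substitute your own values.
        String endpoint = "https://api-euw1.rms.com/riskmodeler/v1/workflows/451289";
        String apiKey = "XXXXXXXXXXXX";

        while (true) {
            HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
            conn.setRequestProperty("Authorization", apiKey);
            conn.setRequestProperty("accept", "application/json");

            String response;
            try (Scanner scanner = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
                response = scanner.hasNext() ? scanner.next() : "";
            }

            // Simplification: inspect the raw body for a terminal status.
            if (response.contains("FINISHED") || response.contains("FAILED")) {
                System.out.println(response);
                break;
            }
            Thread.sleep(10_000); // wait 10 seconds between polls
        }
    }
}
```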
Now, we will verify that the exposure data was added by generating an exposure summary report for the portfolio.