MRI Import
Overview
Multi-relational import (MRI) is a data migration process that enables Intelligent Risk Platform™ tenants to import large volumes of account and location exposure data into portfolios in the Intelligent Risk Platform. Tenants can import account, location, and reinsurance cession data into existing portfolios in an EDM.
The MRI import workflow depends upon the definition of account, location, and reinsurance exposure data in structured text files that are uploaded to temporary storage (an import folder) on Amazon S3. Once the files are uploaded to the import folder, the Import API supports a process for ingesting this data from the folder into the specified EDM.
Like other Import API workflows, this process relies upon the Amazon S3 API to upload data to storage buckets. While you can use Import API operations to securely connect to storage buckets on Amazon S3, you must use an AWS client to upload flat files of data to those storage buckets. Once you have uploaded the source files, you can use Risk Modeler API operations to import the exposure data in those flat files into an EDM.
Step 1: Prepare Source Files
The MRI import process requires that data is defined in a particular format in source files. These source files are flat files: delimited text files that store structured data in a plain text format. MRI uses a separate mapping file to define the relationships between the account, location, and reinsurance exposures.
The first step is to create one or more source files that define the exposures to be imported.
A flat file is a two-dimensional database in a plain text format. Each row of text in the file defines a record. The file uses a delimiter character to organize text into discrete columns of structured data. The flat file may use commas, semicolons, or tabs to delimit data values.
The MRI import process supports importing data and settings from four different types of files:
| File Type | Required | Content | Format |
|---|---|---|---|
| accountsFile | Required | Account and policy data | CSV, TXT |
| locationsFile | Required | Location data | CSV, TXT |
| reinsuranceFile | Optional | Reinsurance cession data | CSV, TXT |
| mappingsFile | Optional | Table mappings | MFF |
To ensure that the platform correctly parses the file, you will need to specify the delimiter used to structure text in the file and indicate if the file uses column headers.
- The file must have a unique name and be saved in the CSV, TXT, or XLSX format. For example, accexp.csv or accexp.txt. When you create the import folder in Step 2, you specify the file format of the uploaded source files.
- The first line of text in the flat file may specify column headers, strings that identify the data attribute represented by a column of data. Column header data is ignored during the import process.
- Each column in the flat file contains attribute values separated by a text delimiter. To ensure accurate interpretation of numbers that use commas, Moody's recommends tab or semicolon delimiters. When you import the data in Step 4, you will need to identify the delimiter used in the source file.
For a comprehensive description of account data requirements for MRI import, see the DLM Reference Guide on Moody's RMS Support Center. For detailed information on import file requirements, see the MRI Import guide in Risk Modeler Help Center.
Account Data
The accountsFile is a flat file that contains a list of accounts and related policy data. Each row after the initial header row specifies detailed information about a specific account that will be added to the portfolio.
ACCNTNUM,ACCNTNAME,CEDANTID,CEDANTNAME,POLICYNUM,LOBNAME,POLICYTYPE,POLICYSTAT,INCEPTDATE,EXPIREDATE
acctnum1,acctname1,cedantid1,cedantnm1,policynm1,lobnam1,1,book,6/14/2018,6/14/2019
acctnum2,acctname2,cedantid2,cedantnm2,policynm2,lobnam1,1,book,6/14/2018,6/14/2019
acctnum3,acctname3,cedantid3,cedantnm3,policynm3,lobnam1,1,book,6/14/2018,6/14/2019
acctnum4,acctname4,cedantid4,cedantnm4,policynm4,lobnam1,1,book,6/14/2018,6/14/2019
The ACCNTNUM and POLICYNUM properties are required.
- If included, the POLICYNUM attribute cannot be blank or null.
- If included, the POLICYTYPE attribute cannot be blank or null. One of 1 (Earthquake), 2 (Windstorm), 3 (Winterstorm), 4 (Flood), 5 (Fire), 6 (Terrorism), 7 (Workers Compensation).
- If included, policy coverage limits, deductibles, and premiums must be specified as positive values. Negative numbers are not allowed.
- If included, the INCEPTDATE cannot specify a value later than that of the EXPIREDATE value.
For detailed information about the account data that can be imported including field names, data types, and EDM schema names, see Account File: Import Information.
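The validation rules above can be checked before upload. The following is a minimal, illustrative sketch; the helper class and method names are assumptions for this example and are not part of the Import API, which performs its own validation on import.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Minimal pre-upload sanity checks for an account record, mirroring the
// validation rules listed above. This helper is illustrative only.
public class AccountRowCheck {

    // Date format used in the sample accountsFile above, e.g. 6/14/2018.
    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("M/d/yyyy");

    // POLICYTYPE, if included, must be one of the codes 1-7 and not blank.
    static boolean validPolicyType(String policyType) {
        if (policyType == null || policyType.isEmpty()) return false;
        try {
            int code = Integer.parseInt(policyType);
            return code >= 1 && code <= 7;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    // Limits, deductibles, and premiums must not be negative.
    static boolean validAmount(double amount) {
        return amount >= 0;
    }

    // INCEPTDATE must not be later than EXPIREDATE.
    static boolean validDates(String incept, String expire) {
        return !LocalDate.parse(incept, FMT).isAfter(LocalDate.parse(expire, FMT));
    }
}
```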
Location Data
The locationsFile is a flat file that contains a list of location exposures that are tied to one or more accounts.
Each row after the initial header row specifies detailed information about a location. Note that the first column specifies the ACCNTNUM of the location.
ACCNTNUM,LOCNUM,LOCNAME,STREETNAME,CITY,STATE,STATECODE,POSTALCODE,LATITUDE,LONGITUDE,NUMBLDGS,BLDGHEIGHT
acctnum1,locnm1,newark,7575 gateway,newark,California,CA,94560,37.5412,-122.06061,5,12000
acctnum2,locnm1,newark,7575 gateway,newark,California,CA,94560,37.5412,-122.06061,5,12000
acctnum1,locnm2,newark,7575 gateway,newark,California,CA,94560,37.5412,-122.06061,5,12000
acctnum2,locnm2,newark,7575 gateway,newark,California,CA,94560,37.5412,-122.06061,5,12000
acctnum3,locnm1,newark,7575 gateway,newark,California,CA,94560,37.5412,-122.06061,5,12000
The ACCNTNUM property must be specified for each location in the locationsFile. The ACCNTNUM property maps locations to accounts and is required in both the accountsFile and locationsFile.
For detailed information about the location data that can be imported including field names, data types, and EDM schema names, see Location File: Import Information.
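Because ACCNTNUM is the join key between the two files, a useful consistency check before upload is to confirm that every location references a known account. A minimal sketch (the helper class and method names are illustrative, not part of the API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Finds location rows whose ACCNTNUM does not appear in the accounts file.
// Such rows would otherwise create orphan records on import.
public class OrphanLocationCheck {

    static List<String> orphanedAccountNumbers(Set<String> accountNumbers,
                                               List<String> locationAccountNumbers) {
        List<String> orphans = new ArrayList<>();
        for (String accntnum : locationAccountNumbers) {
            if (!accountNumbers.contains(accntnum)) {
                orphans.add(accntnum);
            }
        }
        return orphans;
    }
}
```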
Reinsurance Data
The reinsuranceFile is a flat file that contains reinsurance cession and coverage data that applies to the account and the location exposures.
The following is an example of a reinsuranceFile that uses the comma character as a delimiter.
EXPOSRNUM,EXPOSRTYPE,PRIORITY,LAYERNUM,PCNTREINS,LAYERAMT,LAYERCUR,EXCESSAMT,EXCESSCUR,REINSID,FACNAME,REINSTYPE,MAOLAMT
exposnum1,POL,1,0,25,15000,CAD,0,CAD,POL_SS,,T,0
exposnum2,POL,1,0,25,15000,CAD,0,CAD,POL_SS,,T,0
exposnum3,POL,1,0,25,15000,CAD,0,CAD,POL_SS,,T,0
The reinsuranceFile is optional. If uploaded, the EXPOSRNUM, EXPOSRTYPE, LAYERNUM, PCNTREINS, EXCESSAMT, and REINSTYPE properties are required. The REINSID property is required for facultative cessions.
For detailed information about reinsurance data including field names, data types, and EDM schema names, see Reinsurance File: Import Information.
Step 2: Create Import Folder
The Create Import Folder operation creates a storage bucket on Amazon S3.
An import folder is a temporary storage location, a logical path on Amazon S3, to which you upload the source files for an import. Temporary credentials for the folder can be obtained through the POST /folders request. The import folder is disposed of immediately after the import job is complete.
The operation takes two required body parameters: the folderType parameter and a properties object that defines the file extension of the uploaded source files.
curl --request POST \
--url https://{host}/platform/import/v1/folders \
--header 'accept: application/json' \
--header 'content-type: application/json'
All parameters are specified in the request body. The folderType and properties object are required.
If the folderType is MRI, the properties object specifies the fileExtension of the data to be uploaded to the import folder and the fileTypes array, which lists the names of the files to be uploaded.
If the fileExtension parameter is TXT or CSV, the fileTypes parameter may define an array of values to identify the accountsFile, locationsFile, reinsuranceFile, and mappingsFile uploaded.
{
"folderType": "MRI",
"properties": {
"fileExtension": "txt",
"fileTypes": ["accountsFile", "locationsFile", "reinsuranceFile"]
}
}
Alternatively, if the fileExtension parameter is XLSX, the fileTypes array specifies a single spreadsheet value.
{
"folderType": "mri",
"properties": {
"fileExtension": "xlsx",
"fileTypes": ["spreadsheet"]
}
}
If successful, the operation returns a 201 Created status code and uploadDetails for the files uploaded to the import folder.
{
"folderType": "MRI",
"folderId": "234741",
"uploadDetails": {
"accountsFile": {
"fileUri": "platform/import/v1/folders/234741/files/584999",
"presignParams": {
"accessKeyId": "xxxxxx",
"secretAccessKey": "xxxxxx",
"sessionToken": "xxxxxx",
"path": "xxxxxx",
"region": "xxxxxx"
},
"uploadUrl": "https://xxxxxx.s3.amazonaws.com/xxxxxx/import/platform/mri/234741/584999-accountsfile.csv"
},
"mappingFile": {
"fileUri": "platform/import/v1/folders/234741/files/585002",
"presignParams": {
"accessKeyId": "xxxxxx",
"secretAccessKey": "xxxxxx",
"sessionToken": "xxxxxx",
"path": "xxxxxx",
"region": "xxxxxx"
},
"uploadUrl": "https://xxxxxx.s3.amazonaws.com/xxxxxx/import/platform/mri/234741/585002-mappingfile.mff"
},
"locationsFile": {
"fileUri": "platform/import/v1/folders/234741/files/585000",
"presignParams": {
"accessKeyId": "xxxxxx",
"secretAccessKey": "xxxxxx",
"sessionToken": "xxxxxx",
"path": "xxxxxx",
"region": "xxxxxx"
},
"uploadUrl": "https://xxxxxx.s3.amazonaws.com/xxxxxx/import/platform/mri/234741/585000-locationsfile.csv"
},
"reinsuranceFile": {
"fileUri": "platform/import/v1/folders/234741/files/585001",
"presignParams": {
"accessKeyId": "xxxxxx",
"secretAccessKey": "xxxxxx",
"sessionToken": "xxxxxx",
"path": "xxxxxx",
"region": "xxxxxx"
},
"uploadUrl": "https://xxxxxx.s3.amazonaws.com/xxxxxx/import/platform/mri/234741/585001-reinsurancefile.csv"
}
}
}
The presignParams object returns the temporary security credentials that enable you to programmatically sign AWS requests. Signing helps to secure requests by verifying the identity of the requester and protecting the data in transit.
| Parameter | Description |
|---|---|
| accessKeyId | Base64-encoded S3 access key ID, a unique identifier for your S3 access key. |
| secretAccessKey | Base64-encoded S3 secret access key. The access key ID and secret access key enable you to sign AWS requests. |
| path | Base64-encoded path to the Amazon S3 bucket. |
| sessionToken | Base64-encoded session token returned by AWS Security Token Service. |
| region | Base64-encoded AWS region. |
In Step 3, we will use the AWS SDK to upload the source files to this folder. But first, we need details about the location of the source files.
Step 3: Upload Source Files
The Moody's RMS Import API does not provide operations for uploading local files to Amazon S3. Rather, you must use the Amazon S3 API or an AWS SDK to upload the source files to the Amazon S3 bucket (import folder) you created in Step 2.
In this procedure, you will use the Amazon S3 bucket path and temporary user credentials to upload account data to the MRI folder. First, you must decode the accessKeyId, secretAccessKey, sessionToken, and s3Path values returned in Step 2 and pass the decoded values to an S3 client. The sample code is in Java 8.
private static String base64Decode(String text) {
return new String(Base64.getDecoder().decode(text));
}
Pass the decoded accessKeyId, secretAccessKey, and sessionToken to the getS3Client() helper method to create an Amazon S3 client.
private static AmazonS3 getS3Client(String accessKey, String secretKey, String sessionToken) {
    BasicSessionCredentials sessionCredentials = new BasicSessionCredentials(
            accessKey,
            secretKey,
            sessionToken);
    return AmazonS3ClientBuilder.standard()
            .withCredentials(new AWSStaticCredentialsProvider(sessionCredentials))
            // Use the region returned in the decoded presignParams for your tenant.
            .withRegion(Regions.EU_WEST_1)
            .build();
}
Amazon TransferManager is a high-level utility for managing transfers to Amazon S3 that makes extensive use of Amazon S3 multipart uploads.
Once you have the Amazon S3 client, you can pass the s3Client, bucketName, key, and the filePath to the TransferManager.
private static void upload(AmazonS3 s3Client, String bucketName, String key, String filePath) {
    try {
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3Client)
                .build();
        // TransferManager processes all transfers asynchronously,
        // so this call returns immediately.
        Upload upload = tm.upload(bucketName, key, new File(filePath));
        System.out.println("Object upload started");
        // Optionally, wait for the upload to finish before continuing.
        upload.waitForCompletion();
        System.out.println("Object upload complete");
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
    }
}
The parameters are derived from previous steps:
| Parameter | Description |
|---|---|
| bucketName | The bucketName is the initial segment of the decoded s3Path. If the s3Path is rms-mi/preview/tenant/50000/import/mri/3929, the bucketName is rms-mi. |
| key | Combines the remaining portion of the s3Path with the fileId and fileName in the pattern s3Path/fileId-fileName. For example, preview/tenant/50000/import/mri/3929/12373-fileName. |
| filePath | The absolute path to the file you want to upload. |
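The derivation in the table reduces to simple string handling on the decoded s3Path. The following sketch uses the sample values from the table; the helper class itself is illustrative, not part of any SDK:

```java
// Splits the decoded s3Path into the bucket name (the first path segment)
// and builds the S3 object key in the pattern remainingPath/fileId-fileName.
public class S3PathParts {

    static String bucketName(String s3Path) {
        return s3Path.substring(0, s3Path.indexOf('/'));
    }

    static String objectKey(String s3Path, String fileId, String fileName) {
        String remaining = s3Path.substring(s3Path.indexOf('/') + 1);
        return remaining + "/" + fileId + "-" + fileName;
    }
}
```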
If successful, AWS uploads the files to the MRI folder on Amazon S3. From this folder, we can use the Import API to import that data into the Intelligent Risk Platform.
Step 4: Create Import Job
In this step, we will import the account and location data uploaded to the import folder into a portfolio in our EDM. This workflow assumes that both the EDM and portfolio already exist. If they do not exist, they must be created before you run this procedure.
The Create Import Job operation imports data from the specified import folder into the specified portfolio (i.e. resourceUri) on the Intelligent Risk Platform. This operation accepts a variety of parameters. In this example, we will specify the minimum number of parameters needed to import the data.
The required x-rms-resource-group-id header parameter must be specified in the header of the request. This parameter identifies how the resource quota is allocated for the job.
curl --request POST \
--url https://api-euw1.rms.com/platform/import/v1/jobs \
--header 'accept: */*' \
--header 'content-type: application/json' \
--header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
All other parameters are specified in the body of the request.
The request body accepts many different parameters, but only three parameters are required: the importType (MRI), the resourceUri (URI of a portfolio), and a settings object that specifies MRI import job-specific settings.
{
"importType": "MRI",
"resourceUri": "/platform/riskdata/v1/exposures/3095468/portfolios/1",
"settings": {
"folderId": "234741"
}
}
| Parameter | Type | Description |
|---|---|---|
| importFiles | Object | IDs of the accountsFileId, locationsFileId, reinsuranceFileId, and mappingFileId source files uploaded to the import folder. |
| sheets | Object | Index of the accountsSheetIndex, locationsSheetIndex, and reinsuranceSheetIndex sheets. |
| folderId | String | ID of the import folder. |
| currency | String | Default currency value applied when the currency column for an imported exposure is undefined or blank. The value specified does not overwrite currency values in the source files themselves. |
| delimiter | String | Delimiter used to structure data in the account, location, and reinsurance source files. One of TAB, COMMA, or SEMICOLON. |
| skipLines | String | Number of lines to skip at the beginning of source files if the flat files contain header rows. |
| appendLocations | Boolean | If true, imported location data is added to existing locations. If false, new locations replace existing locations. |
| geohaz | Boolean | If true, the Intelligent Risk Platform geocodes and performs hazard lookups for locations prior to import. For more information, see Create Geohaz Job. |
The importFiles object identifies the source files to import. The accountsFileId, locationsFileId, and reinsuranceFileId identify the account source file, location source file, and reinsurance source file respectively.
DATA VALIDATION
Intelligent Risk Platform™ validates the data specified in the account source file prior to importing that data into the EDM. The import process performs basic data validations to make sure that imported data is valid and does not create orphan records.
If a required data attribute value is missing or invalid, the platform throws a validation error. To learn more see Import Validations.
The MRI process works only if both an accounts source file and a locations source file are uploaded to the import folder. Consequently, the accountsFileId and locationsFileId must be defined in MRI jobs. The mappingFileId parameter specifies the ID number of an import mapping file, which maps the columns in the uploaded source files to IRP database columns. The mapping file defines both the relationships between account, location, and reinsurance data and the correspondence between columns in the uploaded flat files and the corresponding columns in the EDM.
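Putting these parameters together, a request body that sets the optional settings explicitly might look like the following. The folder and file IDs are the placeholder values returned in Step 2, and the nesting of the optional parameters under settings follows the folderId example above; consult the Import API reference for the authoritative schema.

```json
{
  "importType": "MRI",
  "resourceUri": "/platform/riskdata/v1/exposures/3095468/portfolios/1",
  "settings": {
    "folderId": "234741",
    "delimiter": "COMMA",
    "skipLines": "1",
    "currency": "USD",
    "appendLocations": false,
    "geohaz": false,
    "importFiles": {
      "accountsFileId": "584999",
      "locationsFileId": "585000",
      "reinsuranceFileId": "585001"
    }
  }
}
```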
If successful, the operation returns a 201 Created HTTP response and adds an MRI_IMPORT job to the job queue. The Location header returns the URL of the job, which enables you to poll the status of the job.
Step 5: Poll Import Job Status
The Get Import Job operation returns the status of the specified import job.
In this step, we will use the URL returned in header of the Create Import Job response to poll the status of the MRI_IMPORT job that we created in Step 4.
The job ID is specified in the endpoint path.
curl --request GET \
--url https://api-euw1.rms.com/platform/import/v1/jobs/89999 \
--header 'accept: application/json'
A successful response returns the Workflow object, which provides detailed information about this workflow job including the submittedAt, startedAt, type, and status.
When the job status is FINISHED, the exposure data will have been imported into the portfolio.
{
"jobId": "41517094",
"userName": "[email protected]",
"status": "RUNNING",
"submittedAt": "2026-02-20T22:43:29.531Z",
"startedAt": "2026-02-20T22:43:34Z",
"name": "MriTask",
"type": "MRI_IMPORT",
"progress": 0,
"entitlement": "RI-RISKMODELER",
"resourceGroupId": "dfbc3df7-a83e-4e84-8202-cb5978b8442a",
"priority": "medium",
"details": {
"resources": [
{
"uri": "/platform/riskdata/v1/exposures/3095468/portfolios/1"
}
],
"summary": ""
},
"tasks": [
{
"guid": "ad307c66-f129-4980-927f-e2729c7f6aa9",
"taskId": "1",
"jobId": "41517094",
"status": "Running",
"submittedAt": "2026-02-20T22:43:32.572Z",
"createdAt": "2026-02-20T22:43:29.522Z",
"name": "MRI_IMPORT",
"output": {},
"priorTaskGuids": [],
"percentComplete": 0
}
]
}
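In practice, you would call this endpoint repeatedly until the job reaches a terminal state. The following is a minimal sketch using java.net.http (Java 11+); the job URL and authorization header are placeholders, and FAILED and CANCELLED are assumed terminal statuses alongside the FINISHED status shown above.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative poller for the Get Import Job endpoint. The URL and API key
// are placeholders; status handling mirrors the response example above.
public class ImportJobPoller {

    // A job is done when it is no longer queued or running. FAILED and
    // CANCELLED are assumptions here; FINISHED appears in the doc above.
    static boolean isTerminal(String status) {
        return "FINISHED".equals(status)
                || "FAILED".equals(status)
                || "CANCELLED".equals(status);
    }

    // Fetches the Workflow JSON once. Extract the "status" field with your
    // preferred JSON library before passing it to isTerminal().
    static String pollOnce(HttpClient client, String jobUrl, String apiKey)
            throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(jobUrl))
                .header("accept", "application/json")
                .header("Authorization", apiKey)
                .GET()
                .build();
        return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }
}
```

Between polls, sleep for a few seconds to avoid hammering the API; the job's progress field can drive a progress indicator.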
Once the job is done and its status is FINISHED, we can use the Search Accounts by Portfolio operation to confirm that the new accounts and their child locations have been added to the specified portfolio.
Step 6: View Imported Accounts
In this step, we will verify that the imported account and location data have been added to the correct portfolio in our EDM.
The Search Accounts by Portfolio operation returns a list of all of the accounts belonging to a specific portfolio.
The required exposureId and portfolioId path parameters must be specified in the request. The exposureId parameter identifies the EDM and the portfolioId identifies the portfolio.
curl --request GET \
--url https://api-euw1.rms.com/platform/riskdata/v1/exposures/3095468/portfolios/1/accounts \
--header 'accept: application/json'
If successful, the response returns 200 OK and a list of accounts.
[
{
"accountId": 1,
"accountName": "77499-ARF119697501 36",
"accountNumber": "MultoBldg",
"cedant": {
"cedantId": "XXX",
"cedantName": ""
},
"producer": {
"producerId": "21600",
"producerName": "ATLAS INSURANCE AGENCY"
},
"underwriter": {
"underwriterId": 0,
"underwriterName": ""
},
"branch": {
"branchId": 1,
"branchName": "Ficoh"
},
"userId1": "",
"userId2": "",
"userId3": "",
"userId4": "",
"userText1": "",
"userText2": "",
"createDate": "2026-02-21T01:16:05.026Z",
"stampDate": "2026-02-21T01:12:47.503Z",
"isValid": false,
"uri": "/platform/riskdata/v1/exposures/3095468/portfolios/1/accounts/1",
"locationsCount": 0,
"geocodeVersion": "",
"hazardVersion": "",
"ownerName": "ToddAPIKey",
"resultsCount": 0,
"policyExpirationDate": "2027-02-20T01:12:49.160Z",
"policyExpirationStatus": "OPEN",
"totalTIV": 0.0,
"reportsCount": 0,
"tagIds": []
}
]
Developer Resources
For step-by-step instructions on using the Risk Modeler application to import data, see MRI Import in Risk Modeler Help Center.
Postman Collection
Moody's makes a Postman Collection available for testing and evaluating the MRI Import workflow.
Moody's RMS™ Developer Portal provides collections for testing standard Platform workflows for importing and exporting exposures to and from the Intelligent Risk Platform. The site is freely available to the public. To learn more, see the Moody's RMS Developer Portal.