OED Import

Import exposure data with OED

Overview

The Import API defines a standard process for importing exposure data into the Intelligent Risk Platform using several different file formats.

The OED (Open Exposure Data) schema is an exposure database format defined by Oasis that serves as an industry-standard schema for Oasis LMF-based models.

OED Import enables you to convert data received from cedants in this format to an EDM.

Step 1: Prepare Exposure Data

The first step is to create one or more flat files of exposure data. These flat files define the account, location, reinsurance, and reinsurance scope data that you want to upload, import, and transform into EDM data.

Exposure data is defined as a tabular dataset.

Imported exposure data, financial data, and secondary modifiers may be defined in four different flat files (in CSV or TXT format):

Exposure          | File Type            | EDM Tables
account           | accountsFile         | Portinfo, Accgrp, Policy, Polcvg, Hdsteppolicy
location          | locationsFile        | Property, Address, Loccvg, xxdet
reinsurance       | reinsuranceInfoFile  | Reinsinf
reinsurance scope | reinsuranceScopeFile | Reinsinf

Both CSV and TXT files are supported. Depending on the file format, you may create, for example, accountsFile.csv or accountsFile.txt.

Within each flat file, exposure attribute values are separated by a text delimiter: a comma, semicolon, or tab. To ensure that numbers containing commas are interpreted correctly, Moody's recommends tab delimiters.

For example, an accountsFile in CSV format might include the following data:

FlexiAccTestCase    PortNumber  ACCOUNTNUMBER   POLICYNUM   PERILCOVERED    AccCurrency LayerParticipation  LayerLimit  LayerAttachment
acc1    acc1    acc1   Layer1     AA1    GBP    0.3    500000   500000
acc1    acc1    acc1   Layer2     AA1    GBP    0.3    100000000    1000000
acc10   acc10   acc10  Layer1     WSS    GBP    0.3    500000   500000
acc10   acc10   acc10  Layer2     WSS    GBP    0.3    100000000    1000000
acc18   acc18   acc18  Layer1     BB1    GBP    0.3    500000   500000
acc18   acc18   acc18  Layer2     BB1    GBP    0.3    100000000    1000000
acc25   acc25   acc25  Layer1     WTC    GBP    0.3    500000   500000
acc25   acc25   acc25  Layer2     WTC    GBP    0.3    100000000    1000000
acc96   acc96   acc96  Layer1     AA1    GBP    0.3    500000   500000
acc96   acc96   acc96  Layer2     AA1    GBP    0.3    100000000    1000000

In Step 6 of this procedure (Import Data), you will need to specify the delimiter value used to structure data within the uploaded flat files.

OED Import Templates

Moody's makes OED import templates for accounts, locations, and reinsurance data available for download in the ExposureIQ application.

Each OED template identifies the appropriate tables. OED templates are available in both CSV and TXT formats. You can download import templates from the Import panel in the ExposureIQ application.

Flat files have the following restrictions:

  • Maximum file size: 300MB
  • Maximum number of accounts: 1.5 million records
  • Maximum number of locations: 1.5 million records

Once you have the flat files of exposure data, you can create an import folder on Amazon S3 to which this data can be uploaded.

Step 2: Prepare Mapping File

The OED import workflow supports custom column mappings by means of a mappingsFile.

The Import API uses a mapping engine based on the OED 1.1.5 schema definition to map OED data to EDM data.

The import folder optionally accepts a mappings file in MFF format (mappingsFile) that defines column mappings between the uploaded OED exposure data files (accountsFile, locationsFile, reinsuranceInfoFile, reinsuranceScopeFile) and table columns in the EDM. If a mappingsFile is uploaded to the import folder, the column mappings specified in that file override the default mappings used by the mapping engine.

During import, the Import API maps the exposure data, financial data, and secondary modifiers defined in the OED flat files to the corresponding EDM tables. The OED-to-EDM mapping engine supports the mapping of OED version 1.1.5 data only.

  • The mapping engine does not validate the data stored in OED import files.
  • The mapping engine may fail while mapping the data to EDM if the data includes invalid values or missing records.
  • The mapping engine defines default mappings for construction codes, occupancy codes, and secondary modifiers that you cannot change.

Recommendation

Moody's has tested the mapping engine with a variety of portfolios, but it is important that you test the engine on your portfolios and evaluate the mapped data before utilizing the tool for production workflows.

Column mapping enables you to ensure that imported exposure data is written to the correct EDM table column. The mappingsFile (for example, mappingFile.mff) defines mappings between OED columns and EDM table columns using the format OEDColumn=EDMColumn. For example,
PortNumber=PORTNUMBER

The OED column PortNumber is mapped to the EDM table column PORTNUMBER. The value specified in the PortNumber column of the accountsFile will be written to the PORTNUMBER column of the corresponding EDM table.

In the example mappingsFile, account, location, and reinsurance mappings are grouped together:

'This mapping file is created by Risk Modeler. Please do not modify it.'
[general]
[accounts]
PortNumber=PORTNUMBER
AccNumber=ACCOUNTNUMBER
PolNumber=POLICYNUM
PolPerilsCovered=PERILCOVERED
AccCurrency=ACCCURRENCY
LayerParticipation=LAYERPARTICIPATION
LayerLimit=LAYERLIMIT
LayerAttachment=LAYERATTACHMENT
PolPeril=POLPERIL
PolDed6All=POLDED6ALL
PolDedType6All=POLDEDTYPE6ALL
PolMinDed6All=POLMINDED6ALL
PolMaxDed6All=POLMAXDED6ALL
PolLimit6All=POLLIMIT6ALL
PolLimitType6All=POLLIMITTYPE6ALL
CondPeril=CONDPERIL
CondDed6All=CONDDED6ALL
CondDedType6All=CONDDEDTYPE6ALL
CondMinDed6All=CONDMINDED6ALL
CondMaxDed6All=CONDMAXDED6ALL
CondLimit6All=CONDLIMIT6ALL
CondLimitType6All=CONDLIMITTYPE6ALL
CondPriority=CONDPRIORITY
AccDed6All=ACCDED6ALL
AccDedType6All=ACCDEDTYPE6ALL
AccMinDed6All=ACCMINDED6ALL
AccMaxDed6All=ACCMAXDED6ALL
AccLimit6All=ACCLIMIT6ALL
AccLimitType6All=ACCLIMITTYPE6ALL
[locations]
AreaCode=CONDTAG
PortNumber=PORTNUMBER
AccNumber=ACCNUMBER
LocNumber=LOCNUMBER
CountryCode=COUNTRYCODE
Latitude=LATITUDE
Longitude=LONGITUDE
PostalCode=POSTALCODE
GeogScheme1=GEOGSCHEME1
GeogName1=GEOGNAME1
GeogScheme2=GEOGSCHEME2
GeogName2=GEOGNAME2
ConstructionCode=CONSTRUCTIONCODE
NumberOfBuildings=NUMBEROFBUILDINGS
LocCurrency=LOCCURRENCY
LocPerilsCovered=LOCPERILSCOVERED
BuildingTIV=BUILDINGTIV
OtherTIV=OTHERTIV
ContentsTIV=CONTENTSTIV
BITIV=BIBI
LocPeril=LOCPERIL
LocDed1Building=LOCDED1BUILDING
LocDedType1Building=LOCDEDTYPE1BUILDING
LocDed3Contents=LOCDED3CONTENTS
LocDedType3Contents=LOCDEDTYPE3CONTENTS
LocDed4BI=LOCDED4BI
LocDedType4BI=LOCDEDTYPE4BI
LocDed5PD=LOCDED5PD
LocDedType5PD=LOCDEDTYPE5PD
LocDed6All=LOCDED6ALL
LocDedType6All=LOCDEDTYPE6ALL
LocLimit1Building=LOCLIMIT1BUILDING
LocLimitType1Building=LOCLIMITTYPE1BUILDING
LocLimit3Contents=LOCLIMIT3CONTENTS
LocLimitType3Contents=LOCLIMITTYPE3CONTENTS
LocLimit4BI=LOCLIMIT4BI
LocLimitType4BI=LOCLIMITTYPE4BI
LocLimit5PD=LOCLIMIT5PD
LocLimitType5PD=LOCLIMITTYPE5PD
LocLimit6All=LOCLIMIT6ALL
LocLimitType6All=LOCLIMITTYPE6ALL
YearBuilt=YEARBUILT
NumberOfStoreys=NUMBEROFSTOREYS
Basement=BASEMENT
LocDedType2Other=LOCDEDTYPE2OTHER
LocDed2Other=LOCDED2OTHER
LocLimitType2Other=LOCLIMITTYPE2OTHER
LocLimit2Other=LOCLIMIT2OTHER
LocMinDed6All=LOCMINDED6ALL
[reinsuranceInfo]
InuringPriority=PRIORITY
ReinsNumber=REINSNUMBER
ReinsName=REINSNAME
ReinsLayerNumber=REINSLAYERNUMBER
ReinsPeril=REINSPERIL
ReinsInceptionDate=REINSINCEPTIONDATE
ReinsExpiryDate=REINSEXPIRYDATE
RiskLimit=RISKLIMIT
RiskAttachment=RISKATTACHMENT
OccLimit=OCCLIMIT
OccAttachment=OCCATTACHMENT
OccFranchiseDed=OCCFRANCHISEDED
OccReverseFranchise=OCCREVERSEFRANCHISE
AggLimit=AGGLIMIT
AggAttachment=AGGATTACHMENT
AggPeriod=AGGPERIOD
PlacedPercent=PLACEDPERCENT
ReinsCurrency=REINSCURRENCY
ReinsType=REINSTYPE
AttachmentBasis=ATTACHMENTBASIS
Reinstatement=REINSTATEMENT
ReinstatementCharge=REINSTATEMENTCHARGE
ReinsPremium=REINSPREMIUM
DeemedPercentPlaced=DEEMEDPERCENTPLACED
ReinsFXrate=REINSFXRATE
TreatyShare=TREATYSHARE
CededPercent=CEDEDPERCENT
RiskLevel=RISKLEVEL
[reinsuranceScope]
PortNumber=PORTNUMBER
ReinsNumber=REINSURANCE NUMBER
AccNumber=ACCNUMBER
PolNumber=POLNUMBER
LocGroup=LOCGROUP
LocNumber=LOCNUMBER
ProducerName=PRODUCERNAME
LOB=LOB
CountryCode=COUNTRYCODE
ReinsTag=REINSTAG
CededPercent=CEDEDPERCENT
RiskLevel=RISKLEVEL
[pbe]

Step 3: Create OED Import Folder

The Create Upload Folder operation creates a temporary storage bucket (called an import folder) on Amazon S3 for importing the data.

The OED import folder is an AWS storage bucket that is defined to accept a specified set of files in support of OED import workflows.

curl --request POST \
--url https://{{host}}/platform/import/v1/folders \
--header 'Authorization: XXXXXXXXXX' \
--header 'accept: application/json' \
--header 'content-type: application/json'

All parameters are specified in the request body. Depending on the folderType specified, the operation supports different fileExtension and fileTypes parameters.

The fileTypes parameter defines a list of files to upload to the OED import folder.

{
  "folderType": "OED",
  "properties": {
    "fileExtension": "CSV",
    "fileTypes": [
      "accountsFile",
      "locationsFile",
      "reinsuranceInfoFile",
      "reinsuranceScopeFile"
      "mappingsFile"
    ]
  }
}

In this example, we are uploading OED data stored in four CSV flat files to an OED import folder. The optional mappingsFile defines mappings between the OED data and EDM schema tables.

If successful, the response returns a 201 Created HTTP response code.

This response also returns a folderId in the Location response header. This identifies the import folder's location on Amazon S3 and will be used in Step 4 to upload the CSV files to the OED import folder using the AWS API, and again in Step 6 to identify the folder when you import the data.
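
For example (assuming the same URI pattern as the job URI shown later in Step 6), the Location header returns a value such as:
https://{{host}}/platform/import/v1/folders/37709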

For each file, the response returns the information needed to upload that file to the import folder using the AWS API: fileUri, accessKeyId, secretAccessKey, sessionToken, path, region, and uploadUrl:

{
  "uploadDetails": {
    "accountsFile": {
      "fileUri": "platform/import/v1/folders/37709/files/102226",
      "presignParams": {
        "accessKeyId": "xxxxx",
        "secretAccessKey": "xxxxx",
        "sessionToken": "xxxxx",
        "path": "xxxxx",
        "region": "xxxxx"
      },
      "uploadUrl": "https://rms-tenants-ppe-eu-west-1.s3.amazonaws.com/4000540/import/platform/oed/37709/102226-accountsfile.csv"
    },
    "reinsuranceScopeFile": {
      "fileUri": "platform/import/v1/folders/37709/files/102229",
      "presignParams": {
        "accessKeyId": "xxxxx",
        "secretAccessKey": "xxxxx",
        "sessionToken": "xxxxx",
        "path": "xxxxx",
        "region": "xxxxx"
      },
      "uploadUrl": "https://rms-tenants-ppe-eu-west-1.s3.amazonaws.com/4000540/import/platform/oed/37709/102229-reinsurancescopefile.csv"
    },
    "mappingFile": {
      "fileUri": "platform/import/v1/folders/37709/files/102230",
      "presignParams": {
        "accessKeyId": "xxxxx",
        "secretAccessKey": "xxxxx",
        "sessionToken": "xxxxx",
        "path": "xxxxx",
        "region": "xxxxx"
      },
      "uploadUrl": "https://rms-tenants-ppe-eu-west-1.s3.amazonaws.com/4000540/import/platform/oed/37709/102230-mappingfile.mff"
    },
    "locationsFile": {
      "fileUri": "platform/import/v1/folders/37709/files/102227",
      "presignParams": {
        "accessKeyId": "xxxxx",
        "secretAccessKey": "xxxxx",
        "sessionToken": "xxxxx",
        "path": "xxxxx",
        "region": "xxxxx"
      },
      "uploadUrl": "https://rms-tenants-ppe-eu-west-1.s3.amazonaws.com/4000540/import/platform/oed/37709/102227-locationsfile.csv"
    },
    "reinsuranceInfoFile": {
      "fileUri": "platform/import/v1/folders/37709/files/102228",
      "presignParams": {
        "accessKeyId": "xxxxx",
        "secretAccessKey": "xxxxx",
        "sessionToken": "xxxxx",
        "path": "xxxxx",
        "region": "xxxxx"
      },
      "uploadUrl": "https://rms-tenants-ppe-eu-west-1.s3.amazonaws.com/4000540/import/platform/oed/37709/102228-reinsuranceinfofile.csv"
    }
  },
  "folderType": "OED",
  "folderId": "37709"
}
Property        | Type   | Description
fileUri         | String | Resource URI for the file, e.g. platform/import/v1/folders/37709/files/102228.
presignParams   | Object | Contains temporary security credentials for the file: accessKeyId, secretAccessKey, sessionToken, path, and region.
accessKeyId     | String | Base64-encoded access key for the file.
secretAccessKey | String | Base64-encoded secret access key for the file.
sessionToken    | String | Base64-encoded session token for the file.
path            | String | Base64-encoded path for the file.
region          | String | Base64-encoded AWS region for the file.
uploadUrl       | String | Amazon S3 URL for the file.

In Step 4, we will use these security credentials to upload the files to the import folder on Amazon S3.

Step 4: Upload Files

The Import API does not provide operations for uploading local files to AWS. Rather, you must use the Amazon S3 API or an AWS SDK to upload the flat files to the Amazon S3 bucket you created in Step 3.

In this procedure, you will use the Amazon S3 bucket path and temporary user credentials to upload the exposure data to the OED import folder. First, you must decode the accessKeyId, secretAccessKey, sessionToken, and path (s3Path) values returned in Step 3 and pass the decoded values to an S3 client. The sample code is in Java 8.

private static String base64Decode(String text) {
    return new String(Base64.getDecoder().decode(text));
}

Pass the decoded accessKeyId, secretAccessKey, and sessionToken to the getS3Client() method to create an Amazon S3 client.

private static AmazonS3 getS3Client(String accessKey, String secretKey, String sessionToken){
    BasicSessionCredentials sessionCredentials = new BasicSessionCredentials(
            accessKey,
            secretKey,
            sessionToken);

    return AmazonS3ClientBuilder.standard()
            .withCredentials(new AWSStaticCredentialsProvider(sessionCredentials))
            .withRegion(Regions.EU_WEST_1)
            .build();
}
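
For example, the decoded values from the Step 3 response can be wired into the client as follows (a minimal sketch; the encoded* variable names are illustrative and hold the Base64-encoded values returned by the Create Upload Folder operation):

// Decode the presignParams values and build the S3 client.
AmazonS3 s3Client = getS3Client(
        base64Decode(encodedAccessKeyId),
        base64Decode(encodedSecretAccessKey),
        base64Decode(encodedSessionToken));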

Amazon TransferManager is a high-level utility for managing transfers to Amazon S3 that makes extensive use of Amazon S3 multipart uploads.

Once you have the Amazon S3 client, you can pass the s3Client, bucketName, key, and the filePath to the TransferManager.

private static void upload(AmazonS3 s3Client, String bucketName, String key, String filePath) {
    try {
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3Client)
                .build();

        // TransferManager processes all transfers asynchronously,
        // so this call returns immediately.
        Upload upload = tm.upload(bucketName, key, new File(filePath));
        System.out.println("Object upload started");

        // Optionally, wait for the upload to finish before continuing.
        upload.waitForCompletion();
        System.out.println("Object upload complete");
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
    }
}

The parameters are derived from previous steps:

Parameter  | Description
bucketName | Extracted from the initial section of the decoded s3Path. If the s3Path is rms-mi/preview/tenant/50000/import/mri/3929, the bucketName is rms-mi.
key        | Combines the remaining portion of the s3Path with the fileId and fileName in the pattern s3Path/fileId-fileName. For example, preview/tenant/50000/import/mri/3929/12373-fileName.
filePath   | The absolute path to the file you want to upload.
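
The following helper is a minimal sketch (not part of the Import API) that derives bucketName and key from the decoded path and reuses the upload() method shown above; the fileId and fileName values come from the Create Upload Folder response:

private static void uploadFile(AmazonS3 s3Client, String decodedPath, String fileId,
                               String fileName, String filePath) {
    // Assumes the decoded path has the form "bucketName/prefix/...", as in the example above.
    int firstSlash = decodedPath.indexOf('/');
    String bucketName = decodedPath.substring(0, firstSlash);
    String key = decodedPath.substring(firstSlash + 1) + "/" + fileId + "-" + fileName;

    // For example: uploadFile(s3Client, base64Decode(encodedPath), "102226",
    // "accountsfile.csv", "/data/accountsFile.csv");
    upload(s3Client, bucketName, key, filePath);
}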

If successful, AWS will upload the files to the OED folder on Amazon S3. From this folder, we can use the Import API to import that data into the Intelligent Risk Platform. But before we can do that, we need to identify an appropriate data server and create an exposure set on that data server.

Step 5: Create Exposure Set

An exposure set is a securable that controls access to a collection of exposure, hazard, and financial information, and the analyses based on that data. The owner of the exposure set may share access to the exposure set and its child data (exposures, exposure variations, and exposure analyses) with user groups.

Imported data is secured by an exposure set. The Create Exposure Set operation creates an exposure set; all parameters are specified in the request body.
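
The following request sketch assumes the exposure sets collection endpoint implied by the resource URI returned below (platform/riskdata/v1/exposuresets); refer to the Create Exposure Set operation reference for the authoritative definition:

curl --request POST \
     --url https://{{host}}/platform/riskdata/v1/exposuresets \
     --header 'Authorization: XXXXXXXXXX' \
     --header 'accept: application/json' \
     --header 'content-type: application/json'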

{
  "exposureSetName": "New ping exposure set",
  "groups": [
    {
      "groupId": "ad422c53-d970-4e90-b50e-c14aa18bb2bd"
    },
    {
      "groupId": "bd40473e-e4ab-4093-8008-ddf85a005046"
    }
  ]
}

If successful, the response returns a 201 Created HTTP response code and the resource URI of the new exposure set, e.g.
https://api-euw1.rms-ppe.com/platform/riskdata/v1/exposuresets/3555242.

The Get Exposure Set operation returns detailed information about this exposure set:
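
For example, a request sketch that assumes the resource URI returned above (substitute your own host and exposure set ID):

curl --request GET \
     --url https://{{host}}/platform/riskdata/v1/exposuresets/3555242 \
     --header 'Authorization: XXXXXXXXXX' \
     --header 'accept: application/json'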

{
    "exposureSetId": 3555242,
    "guid": "9cde4489-9ace-493a-9e09-147c71a73546",
    "exposureSetName": "New ping exposure set",
    "ownerName": "ToddPPEAPIKEY",
    "groups": [
        {
            "groupId": "ad422c53-d970-4e90-b50e-c14aa18bb2bd",
            "groupName": "Admin"
        },
        {
            "groupId": "bd40473e-e4ab-4093-8008-ddf85a005046",
            "groupName": "All Roles"
        }
    ]
}

Use the Get Exposure Set operation to retrieve the ID of the exposure set; the exposureSetId is used to specify the resourceUri of the import job in the next step.

Step 6: Import Data

The Import Job operation imports data previously uploaded to an import folder on Amazon S3 into an EDM.

The request accepts a required x-rms-resource-group-id header that identifies the ID of the resource group to which this job is assigned for resource management.

curl --request POST \
     --url https://{{host}}/platform/import/v1/jobs \
     --header 'Authorization: XXXXXXXXXX' \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --header 'x-rms-resource-group-id: ba672cd9-xxxx-4cb3-9882-bbeb5f8478c6'

All parameters are specified in the body of the request. The request body defines the import job by specifying the import type, the URI of the exposure set, and the import job settings.

The required resourceUri identifies an exposure set that acts as the securable for the imported data.

{
  "importType": "OED",
  "resourceUri": "/platform/riskdata/v1/exposuresets/3555242",
  "settings": {
    "folderId": "37731",
    "exposureName": "SA_OED_EIQ1",
    "geoHaz": false,
    "currency": "USD",
    "delimiter": "COMMA"
  }
}

The importType, resourceUri, and settings parameters are required.

The settings object specifies information about the imported data.

Attribute    | Type    | Description
folderId     | String  | ID of the OED import folder on Amazon S3. Returned by the Create Upload Folder operation.
exposureName | String  | Unique name of the EDM.
geoHaz       | Boolean | If true, locations are geocoded and hazarded on import. For more information, see Create Geohaz Job.
currency     | String  | Three-letter ISO 4217 currency code that identifies the default currency, for example, USD.
delimiter    | String  | Text delimiter used to delimit column values in the uploaded flat files. One of COMMA or TAB.
portfolio    | Object  | Portfolio into which the exposure data is imported, identified by portfolioName, portfolioNumber, and description.
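
For example, a settings object that also specifies the optional portfolio object might look like this (a sketch; the portfolio values are illustrative placeholders):

{
  "folderId": "37731",
  "exposureName": "SA_OED_EIQ1",
  "geoHaz": false,
  "currency": "USD",
  "delimiter": "COMMA",
  "portfolio": {
    "portfolioName": "OED Portfolio",
    "portfolioNumber": "1",
    "description": "Portfolio imported from OED flat files"
  }
}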

If successful, the response returns a 201 Created HTTP response code. This response indicates that the API has created an IMPORT job and added that job to the job queue. The response also returns a URI in the Location response header that enables you to poll the status of this job. In Step 7, you will use this jobId to poll the status of the job.

e.g. https://api-euw1.rms-ppe.com/platform/import/v1/jobs/23389070

Step 7: Poll Job Status

The Get Job Status operation enables you to track the status of an import job.

The request takes a single path parameter, the ID of the import job.

curl --request GET \
     --url https://{{host}}/platform/import/v1/jobs/778 \
     --header 'Authorization: XXXXXXXXXX' \
     --header 'accept: application/json'
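
Because import jobs run asynchronously, a client typically polls this operation until the job reaches a terminal state. The following Java sketch polls the job URI returned in Step 6; the check for a terminal status is an assumption (the exact status field and values are defined by the Get Job Status response schema, which is not shown here):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Issues a GET request against the job URI (the Location value returned in Step 6)
// and returns the raw JSON response body.
private static String getJobStatus(String jobUrl, String authToken) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(jobUrl).openConnection();
    conn.setRequestMethod("GET");
    conn.setRequestProperty("Authorization", authToken);
    conn.setRequestProperty("accept", "application/json");
    StringBuilder body = new StringBuilder();
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
        String line;
        while ((line = reader.readLine()) != null) {
            body.append(line);
        }
    }
    return body.toString();
}

// Polls the job until it reaches a terminal state.
private static void pollUntilDone(String jobUrl, String authToken) throws Exception {
    while (true) {
        String body = getJobStatus(jobUrl, authToken);
        System.out.println(body);
        // Assumption: the response reports a status that eventually reaches a terminal
        // value such as FINISHED or FAILED; adjust this check to the actual response schema.
        if (body.contains("FINISHED") || body.contains("FAILED")) {
            return;
        }
        Thread.sleep(30_000); // wait 30 seconds between polls
    }
}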