2026.03.b


Highlights

The 2026.03.b release adds the Peril Converter API and enhancements to the Accumulations API, Export API, and Risk Data API.

  • The Peril Converter API enables Intelligent Risk Platform tenants to update peril coverage (peril-specific policies and location coverages) in bulk for the location exposures associated with account or portfolio resources.
  • The Accumulations API now supports a batch process that enables tenants to create accumulation analyses in bulk.
  • The Export API introduces the ability to estimate the size of the loss data exported in RDM and RDM Data Bridge export jobs. RDM export jobs now support exporting to Parquet files.

Learn More

Accumulations API

Create Bulk Accumulations

The Create Accumulations in Batch operation (POST /platform/accumulation/v1/jobs/create-bulk-batches) initializes a batch job that generates accumulation analyses in bulk.

This operation accepts an array of resources (portfolios or portfolio variations) and an array of accumulation analysis settings, which include a list of accumulation profiles. For each unique pairing of a resource and accumulation profile, the batch process creates an ACCUMULATION job that produces an accumulation analysis. The settings object specifies both a list of accumulation profiles and a set of accumulation job configurations that identify the events, the financial perspectives, and currency schemes used in the analysis.
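
The fan-out described above can be sketched as follows; the URIs and profile IDs are illustrative values taken from the sample request, and the job creation itself happens server-side.

```python
# Illustrative sketch of the batch fan-out: one ACCUMULATION job is created
# for each unique (resource, accumulation profile) pairing. The URIs and
# profile IDs below are sample values, not live resources.
resources = [
    "/platform/riskdata/v1/exposures/2721091/portfolios/20",
    "/platform/riskdata/v1/exposureVariations/3716838",
]
profile_ids = ["15", "52", "70"]

jobs = [
    {"resourceUri": resource, "profileId": profile}
    for resource in resources
    for profile in profile_ids
]
print(len(jobs))  # 2 resources x 3 profiles -> 6 jobs
```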

An accumulation is a type of analysis that identifies areas of concentrated property or workers compensation exposure. Accumulations calculate the exposed limit, i.e. the maximum loss that can be incurred from a single deterministic event, enabling you to assess worst-case scenarios.

The client must specify a resource group ID in the x-rms-resource-group-id header parameter. The resource group ID identifies the resource group that submitted the job.


curl --request POST \
     --url https://api-euw1.rms.com/platform/accumulation/v1/bulk-batches \
     --header 'accept: */*' \
     --header 'content-type: application/json' \
     --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' \

A resource group ID must be specified in the x-rms-resource-group-id header parameter of requests that initiate computationally expensive, long-running processes that consume significant system resources. Resource groups are a mechanism for tracking the resource quotas allocated to Intelligent Risk Platform tenants. A tenant is provisioned ("seeded") one resource group per entitlement (e.g. RI-RISKMODELER or RI-EXPOSUREIQ). To learn more, see Resource Management.

The required settings and resources parameters are specified in the body of the request.

{
  "settings": [
    {
      "currency": {
        "currency": "USD",
        "currencySchemeName": "RMS Default",
        "currencyVersion": "RL25"
      },
      "eventInfo": {
        "eventDate": 1771329226558,
        "eventDateBehavior": "ignore"
      },
      "financialPerspectives": ["GR", "GU"],
      "name": null,
      "notes": "",
      "profileIds": ["15", "52", "70", "178", "182", "189", "233"],
      "tagIds": [],
      "applyProfileTags": true
    }
  ],
  "resources": [
    {
      "resourceType": "portfolio",
      "resourceUri": "/platform/riskdata/v1/exposures/2721091/portfolios/20",
      "portfolioProperties": {
        "variationName": "ABC Commercial_2026-02-23T16:41:38.362Z",
        "treatyIds": []
      }
    },
    {
      "resourceType": "portfolioVariation",
      "resourceUri": "/platform/riskdata/v1/exposureVariations/3716838"
    }
  ]
}


Settings

The settings array specifies a list of accumulation analysis configurations that apply to the accumulation analyses, including the currency scheme, event information, financial perspectives, and a list of accumulation profiles. Each settings object consists of currency, eventInfo, financialPerspectives, name, notes, profileIds, tagIds, and applyProfileTags.

Each accumulation profile specifies general analysis settings (peril, geocoding version, etc.), scope (regions, layers, or targets), damages (damage factors and filters), and optional workers compensation settings.

| Parameter | Type | Description |
| --- | --- | --- |
| currency | Object | Currency scheme and version of exchange rates used in the accumulation analysis. |
| eventInfo | Object | Information about the event, including the eventDate and eventDateBehavior (one of ignore, location, policy, policyAndLocation, treaty, treatyAndLocation, treatyAndPolicy, treatyAndPolicyAndLocation). |
| financialPerspectives | Array | List of financial perspectives. See Financial Perspectives. |
| name | String | Name of the accumulation analysis. |
| notes | String | Description of the analysis. |
| profileIds | Array | List of accumulation profile IDs. |
| tagIds | Array | List of tags to apply to accumulations. See Tags. |
| applyProfileTags | Boolean | If true, profile tags are applied to the accumulation analysis. |

The eventInfo object defines the basis for including events in the accumulation analysis. The eventDate filters locations, policies, and treaties by date; events can be excluded based on the effective dates of the applicable locations, policies, and treaties. The eventDateBehavior specifies the basis for including or excluding events: one of ignore, location, policy, policyAndLocation, treaty, treatyAndLocation, treatyAndPolicy, or treatyAndPolicyAndLocation.
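
The eventDate in the sample request (1771329226558) appears to be a Unix epoch timestamp in milliseconds; a minimal helper for producing that format, under that assumption:

```python
from datetime import datetime, timezone

def to_event_date(dt: datetime) -> int:
    # Assumption: eventDate is a Unix epoch timestamp in milliseconds,
    # inferred from the sample value 1771329226558 (mid-February 2026).
    return int(dt.timestamp() * 1000)

# Example: an event date of 2026-02-17 12:00 UTC
event_date = to_event_date(datetime(2026, 2, 17, 12, 0, tzinfo=timezone.utc))
print(event_date)  # 1771329600000
```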

Resources

The resources array defines a list of portfolio resources on which to run accumulations. One job is created for each pairing of a resource and an accumulation profile.

| Parameter | Type | Description |
| --- | --- | --- |
| resourceType | String | Type of resource. One of portfolio or portfolioVariation. If portfolio, portfolioProperties is required. |
| resourceUri | String | URI of the portfolio or portfolio variation resource, e.g. /platform/riskdata/v1/exposures/2721091/portfolios/20 |
| portfolioProperties | Object | Portfolio-specific settings, including treatyIds and variationName. Supports working excess, surplus share, and quota share treaties. Required if resourceType is portfolio. |

The portfolioProperties object defines portfolio-specific properties that are included in the analysis. Running a portfolio accumulation analysis automatically creates a new portfolio variation; the variationName specifies the name of the portfolio variation created from the accumulation analysis. The treatyIds array defines a list of treaties to include in the analysis and supports working excess, surplus share, and quota share treaties.

The response contains a per-resource status indicating whether each job was accepted or rejected.

Export API

Create Export Task

The Create Export Task operation (POST /platform/export/v1/task) creates a task, a long-running process that performs a resource-intensive operation.

An RDM_ESTIMATE task estimates the size of the loss data (loss tables, EP metrics, and statistics) included in an RDM or RDM_DATABRIDGE export.

This operation can be used to calculate the size of RDM data prior to running an RDM or RDM_DATABRIDGE job. Both export types enable you to export loss data as a results data module (RDM).

All parameters are specified in the body of the request. Depending on the export type (RDM or RDM_DATABRIDGE), the request accepts different parameter values.

RDM exports

The RDM export type creates a DOWNLOAD_RDM job that exports the specified loss data to an RDM (results data module) as a database artifact (BAK or MDF) or a Parquet file.

The RDM_ESTIMATE task estimates the size of the exported loss data. The request accepts four parameters: taskType, exportType, resourceUris, and exportLossFormat:

{
    "taskType": "RDM_ESTIMATE",
    "exportType": "RDM",
    "resourceUris": [
      "/platform/riskdata/v1/analyses/183322"
    ],
    "exportLossFormat": "PLT"
}

The taskType, exportType, and resourceUris parameters are required. The exportLossFormat parameter is required only if the job exports HD analysis results.

| Parameter | Type | Description |
| --- | --- | --- |
| taskType | String | Type of export task, i.e. RDM_ESTIMATE. |
| exportType | String | Type of export, i.e. RDM or RDM_DATABRIDGE. |
| resourceUris | Array | List of results to export, identified by resource URI, e.g. /platform/riskdata/v1/analyses/183322 |
| exportLossFormat | String | Format of exported HD analysis data, i.e. ELT or PLT. |

RDM_DATABRIDGE exports

The RDM_DATABRIDGE export type creates a DOWNLOAD_RDM job that exports analysis result data to a new or existing RDM database on Data Bridge.

The RDM_ESTIMATE task estimates the size of the exported loss data. The request accepts four parameters: taskType, exportType, resourceUris, and settings. The settings object takes different parameters depending on whether the export job exports data to a new or existing database.

If the export job exports data to a new RDM, the settings object specifies the serverId of a managed server instance on Data Bridge:


{
    "taskType": "RDM_ESTIMATE",
    "exportType": "RDM_DATABRIDGE",
    "resourceUris": [
      "/platform/riskdata/v1/analyses/183322"
    ],
    "settings": { "serverId": 11 }
}

If the export job exports data to an existing RDM, the settings object must specify both the serverId of a Data Bridge server and the ID of the RDM database on that server:

{
  "taskType": "RDM_ESTIMATE",
  "exportType": "RDM_DATABRIDGE",
  "resourceUris": ["/platform/riskdata/v1/analyses/183322"],
  "settings": {
    "serverId": 11,
    "databaseId": 23044712
  }
}

The taskType, exportType, resourceUris, and settings parameters are all required.

| Parameter | Type | Description |
| --- | --- | --- |
| taskType | String | Type of export task, i.e. RDM_ESTIMATE. |
| exportType | String | Type of export, i.e. RDM or RDM_DATABRIDGE. |
| resourceUris | Array | List of results to export, identified by resource URI, e.g. /platform/riskdata/v1/analyses/183322 |
| settings | Object | Configuration that specifies the managed database server, or the server and database. |

If successful, returns a 201 HTTP response and adds the RDM_ESTIMATE task to the task queue. The task URL (/platform/export/v1/tasks/{taskUUID}) is returned in the Location header and can be used to track the status of the task.
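
A small helper for pulling the task UUID out of that Location header so it can be passed to Get Export Task (a sketch; the header format follows the task URL above):

```python
def task_uuid_from_location(location: str) -> str:
    # The Location header holds the task URL, e.g.
    # /platform/export/v1/tasks/{taskUUID}; the UUID is the final segment.
    return location.rstrip("/").rsplit("/", 1)[-1]

uuid = task_uuid_from_location(
    "/platform/export/v1/tasks/231ff646-5e79-4afb-9dc1-e8981ed48c60"
)
print(uuid)  # 231ff646-5e79-4afb-9dc1-e8981ed48c60
```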

Get Export Task

The Get Export Task operation (GET /platform/export/v1/tasks/{taskUUID}) returns information about the specified export task.

A task is a long-running process that consumes significant system resources (e.g. CPU, memory, I/O ). Unlike jobs, tasks do not affect the tenant's resource quotas.

This operation takes a single path parameter that specifies the UUID of a task.

This operation now returns information about a specific RDM_ESTIMATE task, which estimates the size of the loss data to be exported in a DOWNLOAD_RDM job. The response returns the current status of the task: one of PENDING, IN_PROGRESS, COMPLETED, or FAILED.

{
  "taskUuid": "231ff646-5e79-4afb-9dc1-e8981ed48c60",
  "taskType": "RDM_ESTIMATE",
  "taskName": "rdm-estimate-platform-1-analyses",
  "status": "COMPLETED",
  "createdAt": "2026-03-09T21:24:48.099394",
  "updatedAt": "2026-03-09T21:24:50.199686",
  "expiresAt": "2026-03-16T21:24:48.077713",
  "createdBy": "[email protected]",
  "output": {
    "errors": [],
    "log": {
      "rdmEstimate": {
        "mdfEstimateSize": "124 MB",
        "estimatedSizeBytes": 130701634,
        "numberOfAnalyses": 1,
        "analyses": {
          "acAnalyses": [],
          "hdAnalyses": [],
          "rlAnalyses": ["117740"]
        },
        "totalTables": 13,
        "topTablesBreakdown": "rdm_port_0_1_32263.parquet:61 MB, rdm_anlsevent_0_1_32361.parquet:7 MB, rdm_metadata:78 KB, rdm_anlsregions_0_1_23.parquet:66 KB, rdm_anlspersp_0_1_4.parquet:11 KB, rdm_anlstreaty_0_1_3.parquet:8 KB, rdm_trtyidmap_0_1_3.parquet:8 KB, rdm_treatydesc_0_1_3.parquet:8 KB, rdm_portstats:8 KB, rdm_ratescheme_0_1_1.parquet:2 KB",
        "totalProcessingTimeSeconds": 2,
        "validationStatus": "SUCCESS",
        "validationWarnings": [],
        "diskUsagePercentage": 0.0,
        "currentAvailableSpaceBytes": 2319194259456,
        "currentAvailableSpaceSize": "2 TB",
        "availableSpaceAfterExportBytes": 2319063557822,
        "availableSpaceAfterExportSize": "2 TB",
        "totalSpaceBytes": 4829741056000,
        "totalSpaceSize": "4 TB"
      }
    }
  }
}

If the status of the task is COMPLETED, the response returns estimates of the size of the exported results data. The data returned depends on the export type (RDM or RDM_DATABRIDGE) and the fileExtension of the exported data. RDM exports support exporting to BAK, MDF, or Parquet files; RDM_DATABRIDGE exports support exporting to Parquet files only.

The log object returns details about the size of the export job in the rdmEstimate object, which may include the following properties:

| Property | Type | Description |
| --- | --- | --- |
| rdmEstimate | Object | Collection of size-estimate properties, including mdfEstimateSize, estimatedSizeBytes, numberOfAnalyses, and analyses. |
| mdfEstimateSize | String | Estimated size of the MDF file. |
| estimatedSizeBytes | Number | Estimated size in bytes. |
| numberOfAnalyses | Number | Number of analysis results exported. |
| analyses | Object | Collection of lists (acAnalyses, hdAnalyses, and rlAnalyses) that return the IDs of the ALM, HD, and DLM results in the export. |
| totalTables | Number | Number of database tables exported. |
| topTablesBreakdown | String | Comma-separated list of table names and sizes, e.g. rdm_policy:2 MB. |
| totalProcessingTimeSeconds | Number | Time taken to estimate the size of the exported data. |
| validationStatus | String | Status of the estimation job, e.g. SUCCESS. |
| validationWarnings | Array | Array of warnings. |
| diskUsagePercentage | Number | Percentage of Data Bridge disk space used by the exported losses. |
| currentAvailableSpaceBytes | Number | Data Bridge disk space currently available, in bytes. |
| currentAvailableSpaceSize | String | Data Bridge disk space currently available, as a human-readable size. |
| availableSpaceAfterExportBytes | Number | Data Bridge disk space available after the export, in bytes. |
| availableSpaceAfterExportSize | String | Data Bridge disk space available after the export, as a human-readable size. |
| totalSpaceBytes | Number | Total Data Bridge disk space, in bytes. |
| totalSpaceSize | String | Total Data Bridge disk space, as a human-readable size. |
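
The byte counts and human-readable sizes in the sample response are consistent with base-1024 units truncated to whole numbers (130701634 bytes → 124 MB). A sketch that reproduces that formatting and checks whether an estimate fits in the available Data Bridge space; the rounding rule is an assumption inferred from the sample:

```python
def human_size(n: float) -> str:
    # Base-1024 units truncated to a whole number, matching the sample
    # response (130701634 bytes -> "124 MB"); the exact rounding the
    # service applies is an assumption.
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if n < 1024 or unit == "TB":
            return f"{int(n)} {unit}"
        n /= 1024

def fits_on_databridge(estimated_bytes: int, available_bytes: int) -> bool:
    # Compare estimatedSizeBytes against currentAvailableSpaceBytes.
    return estimated_bytes <= available_bytes

print(human_size(130701634))                         # 124 MB
print(fits_on_databridge(130701634, 2319194259456))  # True
```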

Create Export Job

The Create Export Job operation (POST /platform/export/v1/jobs) supports the definition and submission of many different types of export jobs.

The RDM export type enables the tenant to export loss data (loss tables, EP metrics, and statistics). Unlike the RESULTS export type, which exports loss data only, the RDM export type includes information about the exposures underlying those losses (e.g. resolution levels) that provides context for understanding those losses.

Formerly, this export type supported exporting loss data to BAK or MDF database artifacts only. It now also supports exporting RDM loss data to Parquet files, which facilitates risk analysis outside the Intelligent Risk Platform. Tenants can import loss data that includes key exposure data, so there is no need to import the EDM.

Apache Parquet is a column-oriented data storage format that is optimized for use with big data processing frameworks. Parquet offers data compression and encoding schemes that deliver greater performance for handling complex data in bulk. The primary advantage for Intelligent Risk Platform tenants is that it facilitates data migration processes by capturing a greater range of loss data, including metadata that maps losses to the exposures that created those losses. For a list of loss data that is exported to RDM tables, see Exporting DLM, ALM, HD Results to RDM in Help Center.

The fileExtension body parameter now accepts a value of PARQUET:

{
  "exportType": "RDM",
  "resourceType": "analyses",
  "settings": {
    "fileExtension": "PARQUET",
    "sqlVersion": "2019"
  },
  "resourceUris": ["/platform/riskdata/v1/analyses/{{analysis_id}}"]
}


Note

The rdmName and sqlVersion parameters are not required in RDM export requests that export data to Parquet files.

These parameters are still required if the job exports data to BAK or MDF database artifacts.

If successful, the request creates a new DOWNLOAD_RDM job and returns a 201 Created HTTP status code. The Location header returns the URL of the DOWNLOAD_RDM job, which can be used to poll the status of the job and download the file when the job is finished.

Get Export Job

The Get Export Job operation (GET /platform/export/v1/jobs/{jobId}) returns information about the specified export job including the status.

This operation now supports DOWNLOAD_RDM export jobs that write loss data to Parquet files rather than BAK or MDF database artifacts.

The client can use this operation to poll the status of the job. A successful response returns the job object, which provides detailed information about this job including the submitTime, startTime, type, and its status (e.g. PENDING, QUEUED, RUNNING, FINISHED).

Once the status of the DOWNLOAD_RDM job is FINISHED, the response returns a pre-signed Amazon S3 URL to a downloadable Parquet file.

Peril Converter API

The Peril Converter API enables Intelligent Risk Platform tenants to update peril coverage (peril-specific policies and location coverages) in bulk for the location exposures associated with account or portfolio resources.

A location is a property, building, business, or other asset that may be damaged by catastrophe events. Each location exposure is defined by location coverage, which specifies the liability of the underwriter for damages to entities (buildings, building contents, businesses) at a specific location due to catastrophe events of a particular peril type, e.g. earthquake (EQ), fire (FR), flood (FL), terrorism (TR), severe convective storm (CS), windstorm (WS), winterstorm (WT).

Peril coverage is defined in a policy or HD step policy.

Portfolios or accounts with peril coverages that match the specified parameter settings are overwritten with the updated peril coverages. If the createBackup option is specified, the peril converter job creates a copy of the specified portfolios or accounts and applies the specified peril coverage settings to the copied resources. The copied portfolio or account is given the same name as the original with the string _Copy appended. Copied portfolios include all the accounts in the portfolio. Peril conversion can also optionally include sub-policy terms, endorsements, policy restrictions, and sublimit restrictions.
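
The backup naming rule can be sketched as a one-liner (illustrative; the _Copy suffix comes from the description above):

```python
def backup_name(original: str) -> str:
    # createBackup copies keep the original name with "_Copy" appended.
    return original + "_Copy"

print(backup_name("ABC Commercial"))  # ABC Commercial_Copy
```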

For detailed information about the RMS EDM tables that are updated, see EDM Tables Updated by Conversion.

Create Peril Converter

The Create Peril Converter Job operation (POST /platform/perilconverter/v1/jobs) creates a PERIL_CONVERTER job that converts the perils covered by the policies assigned to a specified list of accounts and portfolios.

A PERIL_CONVERTER job updates the location coverage applied to a list of exposures (account and portfolio resources) by converting the existing peril (as defined in the sourcePeril parameter) into another peril or list of perils (as defined in the targetPerils parameter). Both the original perils (sourcePeril) and updated perils (targetPerils) can be defined by peril and newCauseOfLoss values (for policies and step policies) or by causeOfLoss values alone (for HD step policies).

This operation accepts four required parameters: the x-rms-resource-group-id header parameter and the resourceUris, resourceTypes, and settings body parameters.

The request must pass a valid resource group ID in the required x-rms-resource-group-id header parameter.


curl --request POST \
     --url https://api-euw1.rms.com/platform/perilconverter/v1/jobs \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' \

All other parameters are specified in the body of the request. The resourceUris, resourceTypes, and settings parameters are all required.

{
  "resourceUris": [
    "/platform/riskdata/v1/exposures/{exposureId}/portfolios/2",
    "/platform/riskdata/v1/exposures/1002106/accounts/83"
  ],
  "resourceTypes": ["portfolios", "accounts"],
  "settings": {
    "sourcePeril": {
      "peril": 2,
      "newCauseOfLoss": 27
    },
    "targetPerils": [
      {
        "peril": 4,
        "newCauseOfLoss": 28
      }
    ],
    "countryCodesFilter": ["CA"],
    "includeSubPolicyConditions": true,
    "includePolicyReinsurance": true,
    "overwriteExistingCoverage": false,
    "createBackup": false
  }
}

The settings object may require additional parameters depending on how peril coverage is defined in the portfolio or account. The countryCodesFilter, includeSubPolicyConditions, includePolicyReinsurance, overwriteExistingCoverage, and createBackup parameters are all optional.

| Parameter | Type | Description |
| --- | --- | --- |
| sourcePeril | Object | Identifies the peril or cause of loss to update, based on the specified peril and newCauseOfLoss values or a causeOfLoss value. |
| targetPerils | Array | List of objects. Each object identifies the peril or cause of loss to apply, based on the specified peril and newCauseOfLoss values or a causeOfLoss value. |
| countryCodesFilter | Array | List of ISO2A country codes, e.g. CA. If specified, the job updates peril coverage for location exposures in the specified countries only. |
| includeSubPolicyConditions | Boolean | If true, the peril conversion applies to policy conditions, e.g. policy restrictions and sublimit restrictions. |
| includePolicyReinsurance | Boolean | If true, the peril conversion applies to policy cessions, e.g. facultative cessions or surplus share treaties. |
| overwriteExistingCoverage | Boolean | If true, overwrites existing coverage of the target perils with the specified source peril. For example, if the target peril is severe convective storm (3) and the exposure already includes severe convective storm policies and location coverages, the exposures are updated with new policies and location coverages from the conversion. |
| createBackup | Boolean | If true, the peril converter creates a copy of each account or portfolio with the updated peril coverage settings. The copy shares the same name as the original, with the string _Copy appended. A copied portfolio includes all the accounts in the portfolio. |

The sourcePeril and targetPerils objects select and update the perils covered by each policy in the request. The sourcePeril object identifies the current peril value that is to be converted, based on the specified peril and newCauseOfLoss values. The targetPerils object identifies a list of new peril values that will replace the current values.
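
As a sketch, the settings body from the example above can be assembled like this; the peril codes (windstorm 2 → flood 4) and newCauseOfLoss codes (water 27 → wind/water 28) are the values used in the sample request, and the helper itself is illustrative:

```python
def conversion_settings(source_peril: dict, target_perils: list, **options) -> dict:
    # Builds the settings body for a PERIL_CONVERTER request; optional
    # keys (countryCodesFilter, createBackup, ...) pass through as-is.
    return {"sourcePeril": source_peril, "targetPerils": list(target_perils), **options}

settings = conversion_settings(
    {"peril": 2, "newCauseOfLoss": 27},    # windstorm / water
    [{"peril": 4, "newCauseOfLoss": 28}],  # flood / wind, water
    countryCodesFilter=["CA"],
    createBackup=False,
)
```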

Policies and Non-HD Step Policies

If peril coverage of the specified resourceUris is defined by a policy or step policy, the settings object can identify source and target perils by peril value (for DLM models) or by a combination of peril and newCauseOfLoss values (for HD models).

{
  "resourceUris": ["/platform/riskdata/v1/exposures/{exposureId}/portfolios/2"],
  "resourceTypes": ["portfolios"],
  "settings": {
    "sourcePeril": {
      "peril": 2,
      "newCauseOfLoss": 27
    },
    "targetPerils": [
      {
        "peril": 4,
        "newCauseOfLoss": 28
      }
    ]
  }
}

The peril and newCauseOfLoss parameters accept integers that identify the peril. The request must specify a peril value. The newCauseOfLoss parameter is optional and applies to HD models only.

The peril parameter represents an EDM schema field (RMS_EDM.policy.POLICYTYPE) that identifies the perils that impact a policy or step policy regardless of peril model. The peril parameter can be specified for both DLM and HD models.

| Value | Code | Description |
| --- | --- | --- |
| 1 | EQ | Earthquake |
| 2 | WS | Windstorm/hurricane |
| 3 | CS/WT | Severe convective storm/winterstorm |
| 4 | FL | Flood |
| 5 | FR | Fire |
| 6 | TR | Terrorism |
| 7 | WC | Workers compensation/human casualty |

The newCauseOfLoss parameter represents an EDM schema field (RMS_EDM.policy.NEWCAUSEOFLOSS) that identifies the perils that impact a policy or step policy. The newCauseOfLoss parameter can be specified for HD models only.

| Value | Code | Description |
| --- | --- | --- |
| 0 | | Default to POLICYTYPE |
| 1 | EQ | Earthquake |
| 2 | WS | Windstorm |
| 3 | CS | Severe Convective Storm |
| 4 | FL | Flood |
| 5 | FR | Fire |
| 6 | TR | Terrorism |
| 26 | WI | Wind |
| 27 | WA | Water |
| 28 | WI,WA | Wind, Water |
| 66 | CONV | Conventional |
| 67 | CBRN | Chemical, Biological, Radiological, Nuclear |
| 76 | CONV,EQ | Conventional, Earthquake |
| 77 | CBRN,EQ | CBRN, Earthquake |
| 78 | TR,EQ | Terrorism, Earthquake |

HD Step Policies

If peril coverage of the specified resourceUris is defined by an HD step policy, the settings object can identify source and target perils by causeOfLoss values.

{
  "resourceUris": ["/platform/riskdata/v1/exposures/{exposureId}/portfolios/2"],
  "resourceTypes": ["portfolios"],
  "settings": {
    "sourcePeril": {
      "causeOfLoss": 27
    },
    "targetPerils": [
      {
        "causeOfLoss": 28
      }
    ]
  }
}

The causeOfLoss parameter represents an EDM schema field (RMS_EDM.hdsteppolicy.CAUSEOFLOSS) that identifies the perils that impact an HD step policy. The causeOfLoss parameter can be specified for HD models only.

| Value | Code | Description |
| --- | --- | --- |
| 1 | EQ | Earthquake |
| 2 | WS | Windstorm |
| 3 | CS | Severe Convective Storm |
| 4 | FL | Flood |
| 5 | FR | Fire |
| 26 | WI | Wind |
| 27 | WA | Water |
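
The causeOfLoss codes above can be transcribed into a small lookup table for sanity-checking HD step policy settings before submitting a job (illustrative only):

```python
# Transcribed from the CAUSEOFLOSS code table for HD step policies.
CAUSE_OF_LOSS = {
    1: "EQ",   # Earthquake
    2: "WS",   # Windstorm
    3: "CS",   # Severe Convective Storm
    4: "FL",   # Flood
    5: "FR",   # Fire
    26: "WI",  # Wind
    27: "WA",  # Water
}

print(CAUSE_OF_LOSS[27])  # WA
```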

If successful, returns 201 Created and adds a PERIL_CONVERTER non-model job to the job queue for processing. The response header...

Search Peril Converter Jobs

The Search Peril Converter Jobs operation (GET /platform/perilconverter/v1/jobs) returns a list of peril converter jobs.

A PERIL_CONVERTER job is a type of non-model job that adds peril coverage to the location exposures associated with an account or portfolio. Clients with the RI-UNDERWRITEIQ entitlement can use the Create Peril Converter operation to update location coverage of accounts and portfolios in bulk.

[
  {
    "jobId": "string",
    "priority": "verylow",
    "userName": "string",
    "status": "QUEUED",
    "submittedAt": "2020-01-01T00:00:00.000Z",
    "startedAt": "2020-01-01T00:00:00.000Z",
    "endedAt": "2020-01-01T00:00:00.000Z",
    "name": "string",
    "type": "string",
    "progress": 0,
    "details": {
      "resources": [
        {
          "uri": "string" //inputted resourceUris
        }
      ],
      "summary": "string"
    },
    "tasks": [
      {
        "taskId": 0,
        "guid": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
        "jobId": "string",
        "status": "CANCELED",
        "submittedAt": "2020-01-01T00:00:00.000Z",
        "createdAt": "2020-01-01T00:00:00.000Z",
        "name": "string",
        "percentComplete": 0,
        "priorTaskGuids": ["3fa85f64-5717-4562-b3fc-2c963f66afa6"],
        "output": {
          "summary": "string",
          "errors": [
            {
              "message": "string"
            }
          ],
          "log": {
            "newResources": [
              {
                "resourceUri": "string",
                "exposureResourcename": "string"
              }
            ],
            "countryCodesFilter": "Not Available",
            "includeSubPolicyConditions": true,
            "includePolicyReinsurance": true,
            "overwriteExistingCoverage": true,
            "totalLocations": "6",
            "totalPolicies": "6",
            "sourcePeril": "Earthquake",
            "targetPerils": "Flood"
          }
        }
      }
    ]
  }
]

Get Peril Converter Job

The Get Peril Converter Job operation (GET /platform/perilconverter/v1/jobs/{jobId}) returns information about a specific peril converter job.

{
  "jobId": "string",
  "priority": "verylow",
  "userName": "string",
  "status": "QUEUED",
  "submittedAt": "2020-01-01T00:00:00.000Z",
  "startedAt": "2020-01-01T00:00:00.000Z",
  "endedAt": "2020-01-01T00:00:00.000Z",
  "name": "string",
  "type": "string",
  "progress": 0,
  "details": {
    "resources": [
      {
        "uri": "string" //inputted resourceUris
      }
    ],
    "summary": "string"
  },
  "tasks": [
    {
      "taskId": 0,
      "guid": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
      "jobId": "string",
      "status": "CANCELED",
      "submittedAt": "2020-01-01T00:00:00.000Z",
      "createdAt": "2020-01-01T00:00:00.000Z",
      "name": "string",
      "percentComplete": 0,
      "priorTaskGuids": ["3fa85f64-5717-4562-b3fc-2c963f66afa6"],
      "output": {
        "summary": "string",
        "errors": [
          {
            "message": "string"
          }
        ],
        "log": {
          "newResources": [
            {
              "resourceUri": "string",
              "exposureResourcename": "string"
            }
          ],
          "countryCodesFilter": "Not Available",
          "includeSubPolicyConditions": true,
          "includePolicyReinsurance": true,
          "overwriteExistingCoverage": true,
          "totalLocations": "6",
          "totalPolicies": "6",
          "sourcePeril": "Earthquake",
          "targetPerils": "Flood"
        }
      }
    }
  ]
}

Update Peril Converter Job

The Update Peril Converter Job (PATCH /platform/perilconverter/v1/jobs/{jobId}) updates the status of a peril converter job.

Reference Data API

Get Rollup Metadata

The Get Rollup Metadata operation (GET /platform/referencedata/v1/rollupmetadata/{metadataUuid}) returns the specified rollup metadata.

The metadataUuid path parameter is required.

If successful, the response returns 200 OK and the specified metadata object:

{
  "uuid": "0b0c15d8-bebe-403c-91f2-adf55cc3c40d",
  "name": "customTreatyLabel12",
  "label": "Country",
  "metadataType": "TREATY",
  "inputType": "LIST",
  "possibleValues": [
    "region 1",
    "region 2",
    "region 5",
    "region 3 ",
    "region 4",
    "region 1 8",
    "region 0",
    "region ",
    "sdfr",
    "resaf",
    "sagv",
    "region 6 ",
    "region 51",
    "region 6",
    "region 7",
    "region 55555"
  ]
}

The metadata object consists of the following properties:

| Property | Type | Description |
| --- | --- | --- |
| uuid | String | UUID of the rollup metadata, e.g. 0b0c15d8-bebe-403c-91f2-adf55cc3c40d. |
| name | String | Name of the rollup metadata. |
| label | String | Label of the rollup metadata. |
| metadataType | String | Type of rollup metadata. One of PORTFOLIO (primary insurance program), PROGRAM (reinsurance program), or TREATY (program treaty). |
| inputType | String | Type of input. One of LIST or STRING. |
| possibleValues | Array | List of values that are applicable if inputType is LIST. |

Update Rollup Metadata

The Update Rollup Metadata operation (PUT /platform/referencedata/v1/rollupmetadata/{metadataUuid}) updates the specified rollup metadata.

This operation supports updating the label, inputType, and possibleValues of the specified metadata object.

The metadataUuid path parameter is required.

{
  "label": "Todd_Test_Update",
  "possibleValues": ["Option1", "Option2", "Option3"]
}

All body parameters are optional:

| Parameter | Type | Description |
| --- | --- | --- |
| label | String | Label that identifies the metadata. |
| inputType | String | Type of input. One of LIST or STRING. |
| possibleValues | Array | List of values that are applicable if inputType is LIST. |

If successful, returns 204 No Content.

Risk Data API

Search Analysis Results

The Search Analysis Results operation (GET /platform/riskdata/v1/analysis) returns a list of analysis results.

This operation supports response filtering based on the values of a subset of properties. Depending on the property, you may use a combination of comparison operators, list operators, and logical operators. To learn more, see Response Filtering.

This operation supports filtering responses by entitlement and resourcegroupId property values:

| Property | Type | Comparison | List |
| --- | --- | --- | --- |
| entitlement | String | = | IN |
| resourcegroupId | String | = | |

Responses can now be filtered by entitlement values, and lists of analysis results can be sorted by entitlement in ASC or DESC order.
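
A hedged sketch of building such a filtered request URL; the exact filter grammar and the filter/sort parameter names are assumptions for illustration, not confirmed by this release note:

```python
from urllib.parse import urlencode

# Assumption: filters are passed as a query-string expression using the
# comparison (=) and list (IN) operators noted above; the "filter" and
# "sort" parameter names are illustrative.
params = {
    "filter": "entitlement IN ('RI-RISKMODELER')",
    "sort": "entitlement ASC",
}
url = "https://api-euw1.rms.com/platform/riskdata/v1/analysis?" + urlencode(params)
```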

This operation now returns the resourcegroupId in the response.

The resource group ID identifies the resource group that submitted a job. A resource group ID must be specified in the x-rms-resource-group-id header parameter of requests that initiate computationally expensive, long-running processes that consume significant system resources. Resource groups are a mechanism for tracking the resource quotas allocated to Intelligent Risk Platform tenants. A tenant is provisioned ("seeded") one resource group per entitlement (e.g. RI-RISKMODELER or RI-EXPOSUREIQ).

[
  {
    "analysisId": 5760256,
    "analysisName": "RL181G_APFL_FM_A_107",
    "createDate": "2026-03-05T01:56:28",
    "resourcegroupId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    ...
  }
]

Get Analysis Result

The Get Analysis Result operation (GET /platform/riskdata/v1/analysis/{analysisId}) returns the specified analysis result.

This operation now returns the entitlement and resourcegroupId properties in the response.

Property | Type | Description
entitlement | String | Entitlement used by the client to run the model job that produced the analysis result (e.g. RI-RISKMODELER).
resourcegroupId | String | ID of the resource group used by the client to run the model job that produced the analysis result.

The resource group ID identifies the resource group that submitted a job. A resource group ID must be specified in the x-rms-resource-group-id header parameter of requests that initiate computationally expensive, long-running processes that consume significant system resources. Resource groups are a mechanism for tracking the resource quotas allocated to Intelligent Risk Platform tenants. A tenant is provisioned ("seeded") one resource group per entitlement (e.g. RI-RISKMODELER or RI-EXPOSUREIQ).

[
  {
    "analysisId": 5760256,
    "analysisName": "RL181G_APFL_FM_A_107",
    "createDate": "2026-03-05T01:56:28",
    "resourcegroupId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    ...
  }
]

Search EDMs

The Search EDMs operation (GET /platform/riskdata/v1/exposures) returns a list of EDMs (exposure data modules).

An exposure data module (EDM) is a database structure that stores the detailed exposure, hazard, and financial information for use in catastrophe modeling. Exposure data stored in an EDM may be used for risk analysis or for migration to other Intelligent Risk Platform products.

This operation now accepts the optional X-INCLUDE-TOTAL-COUNT header parameter, which accepts a Boolean value. If true, the response returns the total number of EDMs in the X-TOTAL-COUNT response header. If false or unspecified, the X-TOTAL-COUNT response header is not returned.
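A minimal sketch of reading the count header. The response headers are simulated as a plain dict here; with urllib, `response.headers.get(...)` behaves the same way:

```python
# Read the total count from the response headers of a request sent with
# "X-INCLUDE-TOTAL-COUNT: true". Headers are simulated as a plain dict.
def total_count(headers):
    """Return the X-TOTAL-COUNT value as an int, or None if absent."""
    value = headers.get("X-TOTAL-COUNT")
    return int(value) if value is not None else None

# Header present when the request asked for the count:
assert total_count({"X-TOTAL-COUNT": "1287"}) == 1287
# Header absent when X-INCLUDE-TOTAL-COUNT was false or unspecified:
assert total_count({}) is None
```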

Get EDM

The Get EDM operation (GET /platform/riskdata/v1/exposures/{exposureId}) returns the specified EDM (exposure data module).

An exposure data module (EDM) is a database structure that stores the detailed exposure, hazard, and financial information for use in catastrophe modeling. Exposure data stored in an EDM may be used for risk analysis or for migration to other Intelligent Risk Platform products.

This operation now accepts the optional X-INCLUDE-TOTAL-COUNT header parameter, which accepts a Boolean value. If true, the response returns the total number of exposure variations in the X-TOTAL-COUNT response header. If false or unspecified, the X-TOTAL-COUNT response header is not returned.

Search Exposure Variations

The Search Exposure Variations operation (GET /platform/riskdata/v1/exposurevariations) returns a list of exposure variations.

An exposure variation is a snapshot of an account, aggregate portfolio, or portfolio exposure. Exposure variations enable tenants to better manage exposure data by ensuring that analyses, reports, and data exports accurately reflect the state of an exposure at a point in time.

This operation now accepts the optional X-INCLUDE-TOTAL-COUNT header parameter, which accepts a Boolean value. If true, the response returns the total number of exposure variations in the X-TOTAL-COUNT response header. If false or unspecified, the X-TOTAL-COUNT response header is not returned.

Move EDM

The Move EDM operation (POST /platform/riskdata/v1/exposures/{exposureId}/move) supports moving exposure data modules (EDMs) between the Intelligent Risk Platform and Data Bridge.

This operation accepts three required parameters: the exposureId path parameter, the x-rms-resource-group-id header parameter, and the serverId body parameter.

Parameter | Description
exposureId | ID of an exposure data module (EDM) on the Intelligent Risk Platform.
x-rms-resource-group-id | ID of the resource group charged with the operation. To learn more, see Resource Management.
serverId | ID of a managed SQL Server instance on Data Bridge.

If successful, the operation adds a MOVE_DATABASE job to the workflow queue and returns a 202 Accepted HTTP response code with a Location header that contains the job URL. Use this URL to poll the status of the job. For details, see Get Risk Data Job.

To perform this operation, the client must have the IC-DATABRIDGE entitlement and one of the RI-EXPOSUREIQ, RI-RISKMODELER, or RI-UNDERWRITEIQ entitlements.
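A sketch of building the move request with Python's standard library. The exposure ID, resource group ID, and serverId below are placeholder values, and authentication is omitted:

```python
import json
import urllib.request

# Placeholder IDs -- substitute real values.
exposure_id = 12345
resource_group_id = "6d3c943e-e18d-44b9-8d66-728e1acb96be"

req = urllib.request.Request(
    url=f"https://api-euw1.rms.com/platform/riskdata/v1/exposures/{exposure_id}/move",
    # serverId identifies a managed SQL Server instance on Data Bridge.
    data=json.dumps({"serverId": 7}).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "x-rms-resource-group-id": resource_group_id,
    },
    method="POST",
)
# On a 202 Accepted response, the Location header carries the job URL:
#   job_url = response.headers["Location"]
# Poll that URL (Get Risk Data Job) until the MOVE_DATABASE job finishes.
```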

Get Risk Data Job

The Get Risk Data Job operation now returns information about MOVE_DATABASE jobs.

The Risk Data API now supports operations that enable Intelligent Risk Platform tenants to move databases (i.e. data modules) between the Intelligent Risk Platform and Data Bridge. The Move EDM operation creates a MOVE_DATABASE job that can move databases from the Intelligent Risk Platform to Data Bridge or from Data Bridge to the Intelligent Risk Platform.

The Get Risk Data Job operation enables the client to poll the status of these jobs and to view information about the database once moved.

[
  {
    "jobId": "20893064",
    "userName": "[email protected]",
    "status": "FINISHED",
    "submittedAt": "2026-01-01T00:01:01.123Z",
    "startedAt": "2026-01-01T00:05:00.123Z",
    "endedAt": "2026-01-01T00:06:01.123Z",
    "name": "Move Database:20260131_EDM01",
    "type": "MOVE_DATABASE",
    "progress": 100,
    "priority": "veryhigh",
    "entitlement": "RI-RISKMODELER",
    "resourceGroupId": "6d3c943e-e18d-44b9-8d66-728e1acb96be",
    "details": {
      "resources": [],
      "summary": "MOVE_DATABASE is successful",
      "log": {
        "sourceServerName": "sql-instance-1",
        "destinationServerName": "databridge-1",
        "databaseType": "EDM",
        "databaseName": "20260131_EDM01_xyz",
        "edmName": "20260131_EDM01"
      }
    },
    "warnings": []
  }
]

The response returns a log object that provides the databaseName, databaseType, destinationServerName, edmName, and sourceServerName of the moved database.

Property | Type | Description
databaseName | String | Name of the database.
databaseType | String | Type of the database, e.g. EDM or RDM.
destinationServerName | String | Name of the destination database server.
edmName | String | Name of the data module.
sourceServerName | String | Name of the source database server.

To perform this operation, the client must have the IC-DATABRIDGE entitlement and one of the RI-EXPOSUREIQ, RI-RISKMODELER, or RI-UNDERWRITEIQ entitlements.
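The polling pattern can be sketched as follows. Here `fetch_job` stands in for a GET on the job URL, and the FAILED and CANCELLED terminal statuses are assumptions, not names confirmed by this release note:

```python
# Poll a MOVE_DATABASE job until it reaches a terminal state.
# FAILED and CANCELLED are assumed terminal statuses.
TERMINAL = {"FINISHED", "FAILED", "CANCELLED"}

def poll(fetch_job, max_attempts=10):
    """Call fetch_job() until the job status is terminal, then return the job."""
    for _ in range(max_attempts):
        job = fetch_job()
        if job["status"] in TERMINAL:
            return job
    raise TimeoutError("job did not finish within the polling budget")

# Simulated job states, ending with the log of the moved database:
states = iter([
    {"status": "RUNNING", "progress": 40},
    {"status": "FINISHED", "progress": 100,
     "details": {"log": {"databaseName": "20260131_EDM01_xyz",
                         "destinationServerName": "databridge-1"}}},
])
done = poll(lambda: next(states))
print(done["details"]["log"]["databaseName"])   # -> 20260131_EDM01_xyz
```

In production the delay between attempts would grow (e.g. exponential backoff) rather than polling in a tight loop.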

Download Rollup Contributory Metrics

The Download Rollup Contributory Metrics operation (POST /platform/riskdata/v1/analyses/{analysisId}/download-rollup-contributory-metrics) returns a downloadable file that contains contributory metrics for a specified position in a business hierarchy rollup analysis.

Contributory metrics measure the degree to which a particular position (PORTFOLIO, PROGRAM, or GROUP) in a business hierarchy produces losses in key return periods for your reporting zones and contributes to your global peril numbers, both overall and by country.

In TreatyIQ, contributory metrics are collected in a Contract Contribution Report. This report enables you to identify which policies (primary insurance) are producing losses. Using this report, you may now break down the aggregate risk positions of a business hierarchy analysis.

The required analysisId path parameter identifies a business hierarchy rollup analysis. This operation extracts the requested metrics from this analysis.

The body parameters specify the position analyzed and the granularity of the returned contributory metrics.

{
  "positionUuid": "f67a1745-c9b6-4f42-ac36-a78268091159",
  "contributionOf": ["Contract"],
  "contributionTo": ["Continent"],
  "fileExtension": "JSON"
}

The positionUuid and fileExtension parameters are required.

Parameter | Type | Description
positionUuid | String | UUID of a position, i.e. a PORTFOLIO (risk source), PROGRAM (program), or GROUP (collection of diverse positions). If PORTFOLIO, contributions are segmented by account where available. If PROGRAM, contributions are segmented by program treaty. If GROUP, contributions are segmented by both accounts and program treaties.
contributionOf | Array | List of granularities for contributory metric segmentation, e.g. Contract, Admin1, Country.
contributionTo | Array | List of granularities for contributory metrics segmented by program treaty, e.g. Peril, Continent, Country.
fileExtension | String | Extension of the downloadable contributory metrics file. One of CSV or JSON.

If successful, returns 200 OK and the presigned URL of the downloadable file in the response body:

[
  {
    "downloadUrl": "https://rms-tenants-xxxx-xxxx.s3.xxx.amazonaws.com/tiq/analysis/xxx/xxx/contribtory-metrics/xxx.json"
  }
]
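A sketch of extracting and using the presigned URL. The response body below is the redacted example response, and `urlretrieve` is one possible way to fetch the file:

```python
import json

# The response body is a list holding a single presigned URL.
# Presigned URLs require no extra auth headers but expire after a
# server-defined period, so download promptly.
response_body = """[
  {"downloadUrl": "https://rms-tenants-xxxx-xxxx.s3.xxx.amazonaws.com/tiq/analysis/xxx/xxx/contribtory-metrics/xxx.json"}
]"""

download_url = json.loads(response_body)[0]["downloadUrl"]
# urllib.request.urlretrieve(download_url, "contributory-metrics.json")
# would save the file locally. (Not executed here.)
```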

Get EP Metrics

The Get EP Metrics operation (GET /platform/riskdata/v1/analysis/{analysisId}/ep) returns EP curves for the specified EP analysis job.

EP analysis may be computed based on ALM, DLM, or HD models. Exceedance probability (EP) analysis takes the full range of possible events and losses into consideration during analysis. These losses are expressed as occurrence exceedance probability curves (OEP curves) and aggregate exceedance probability curves (AEP curves).

This operation now returns the exposureResourceId, exposureResourceNumber, and exposureResourceType properties.

[
  {
    "jobId": 41212223,
    "epType": "AEP",
    "perspectiveCode": "GU",
    "exposureResourceId": 264,
    "exposureResourceType": "POLICY",
    "exposureResourceNumber": "ACC_91POL_264",
    "value": {
      "returnPeriods": [500.0],
      "positionValues": [1.2654241000524742e10]
    }
  }
]

These properties are returned regardless of the model type (ALM, DLM, HD) used in the analysis:

Property | Type | Description
exposureResourceId | Number | ID of the exposure resource modeled.
exposureResourceNumber | String | User-defined ID for the exposure resource.
exposureResourceType | String | Type of exposure resource modeled, e.g. ACCOUNT, AGGPORTFOLIO, LOCATION, POLICY, PORTFOLIO, STEP_POLICY, TREATY, UNRECOGNIZED.
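The returnPeriods and positionValues arrays in each curve are parallel, so a sketch of flattening them into rows for downstream reporting (ep_rows mirrors the example response above):

```python
# Zip the parallel returnPeriods / positionValues arrays into
# (epType, returnPeriod, loss) rows, one per point on each EP curve.
ep_rows = [
    {
        "epType": "AEP",
        "perspectiveCode": "GU",
        "exposureResourceType": "POLICY",
        "exposureResourceNumber": "ACC_91POL_264",
        "value": {"returnPeriods": [500.0],
                  "positionValues": [1.2654241000524742e10]},
    }
]

pairs = [
    (row["epType"], rp, loss)
    for row in ep_rows
    for rp, loss in zip(row["value"]["returnPeriods"],
                        row["value"]["positionValues"])
]
```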

Search Program Sets

The Search Program Sets operation (GET /platform/riskdata/v1/programsets) returns a list of program sets.

This operation now returns the thumbprint of each program set.

Get Program Set

The Get Program Set operation (GET /platform/riskdata/v1/programsets/{id}) returns the specified program set.

This operation now returns the thumbprint of each program set.

Search Programs

The Search Programs operation (GET /platform/riskdata/v1/programs) returns a list of programs.

This operation now returns the thumbprint of each program.

Get Program

The Get Program operation (GET /platform/riskdata/v1/programs/{id}) returns the specified program.

This operation now returns the thumbprint of each program.

Search Program Variations

The Search Program Variations operation (GET /platform/riskdata/v1/programvariations) returns a list of program variations.

This operation now returns the thumbprint of each program variation.

Get Program Variation

The Get Program Variation operation (GET /platform/riskdata/v1/programvariations/{id}) returns the specified program variation.

This operation now returns the thumbprint of each program variation.

Search Share Sets

The Search Share Sets operation (GET /platform/riskdata/v1/sharesets) returns a list of share sets.

This operation now returns the thumbprint of each share set.

Get Share Set

The Get Share Set operation (GET /platform/riskdata/v1/sharesets/{id}) returns the specified share set.

This operation now returns the thumbprint of each share set.

Search Business Hierarchy Sets

The Search Business Hierarchy Sets operation (GET /platform/riskdata/v1/businesshierarchysets) returns a list of business hierarchy sets.

This operation now returns the thumbprint of each business hierarchy set.

Get Business Hierarchy Set

The Get Business Hierarchy Set operation (GET /platform/riskdata/v1/businesshierarchysets/{id}) returns the specified business hierarchy set.

This operation now returns the thumbprint of each business hierarchy set.

Search Business Hierarchies

The Search Business Hierarchies operation (GET /platform/riskdata/v1/businesshierarchies) returns a list of business hierarchies.

This operation now returns the thumbprint of each business hierarchy.

Get Business Hierarchy

The Get Business Hierarchy operation (GET /platform/riskdata/v1/businesshierarchy) returns the specified business hierarchy.

This operation now returns the thumbprint of each business hierarchy.

Search Business Hierarchy Variations

The Search Business Hierarchy Variations operation (GET /platform/riskdata/v1/businesshierarchyvariations) returns a list of business hierarchy variations.

This operation now returns the thumbprint of each business hierarchy variation.

Get Business Hierarchy Variation

The Get Business Hierarchy Variation operation (GET /platform/riskdata/v1/businesshierarchyvariation) returns the specified business hierarchy variation.

This operation now returns the thumbprint of the business hierarchy variation.

Tenant Data API

Search Tenant Jobs

The Search Tenant Jobs operation (GET /platform/tenantdata/v1/jobs) returns a list of jobs run by the tenant across all entitlements and Platform APIs.

The Intelligent Risk Platform now automatically "archives" jobs that are more than 30 days old. By default, archived jobs are returned only if the query explicitly requests them.

The optional includeArchivedJobs query parameter determines whether the request includes or excludes older jobs.

curl --request GET \
     --url 'https://api-euw1.rms.com/platform/tenantdata/v1/jobs?includeArchivedJobs=true' \
     --header 'accept: application/json'

The includeArchivedJobs parameter accepts a Boolean value. If true, the response returns information about archived jobs. If false or unspecified, the response returns information about jobs run in the last thirty days only.

This operation returns a 200 or 202 response code. If the response is ready, the operation returns a 200 OK HTTP status code and a list of jobs matching the request query. If the response is not ready after 30 seconds, it returns a 202 Accepted HTTP status code; the client can retry the request to retrieve the response.
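A sketch of that retry pattern. Here `send` stands in for an authenticated GET on /platform/tenantdata/v1/jobs and is simulated with canned responses:

```python
# Retry-on-202 pattern: retry while the server returns 202 Accepted,
# stop on 200 OK.
def fetch_jobs(send, max_retries=5):
    for _ in range(max_retries):
        status, body = send()
        if status == 200:
            return body          # response ready: list of jobs
        assert status == 202     # not ready yet; retry
    raise TimeoutError("job list not ready after retries")

# Simulated server: first poll not ready, second poll returns the jobs.
responses = iter([(202, None), (200, [{"jobId": "20893064"}])])
jobs = fetch_jobs(lambda: next(responses))
```

In practice a short pause between retries (the server already holds each request for up to 30 seconds) keeps the polling load low.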