2026.04.d


Highlights

The 2026.04.d release adds enhancements to the Admin Data API, Risk Data API, and Tenant Data API.

  • The Admin Data API now supports operations for managing databases. New operations support moving databases between servers, reorganizing database indexes, updating database statistics, and shrinking, archiving, and deleting databases.
  • The Risk Data API introduces improvements to marginal impact analysis and the enrichment of location building attribute data.
  • The Tenant Data API now supports operations for managing VPN connections.

Models

Australia Bushfire Climate Change HD Model

In this update, Moodyʼs RMS is releasing the Australia Bushfire Climate Change HD Model, which encapsulates the current state of the science for the impact of climate change on bushfire in the region. This HD Climate Change Model covers the same geographical scope as the reference model for this region and is valid for results generated with the RMS 2025 Stochastic Rates - Urban Conflagration Reference and RMS 2025 Stochastic Rates - Urban Conflagration High simulation sets. It is a separately licensed extension to the existing reference model.

The climate change model captures conditioned views that combine the following parameters:

  • SSP 1-2.6, SSP 2-4.5, SSP 3-7.0, SSP 5-8.5
  • Time horizons from 2030 to 2100 in five-year intervals

Climate change model analyses adjust the losses generated by the existing reference model toward a given climate-conditioned view. You do not need to rerun HD analyses to produce new losses; rather, you execute a post-processing step on the results of these analyses. The climate change model analyses recalculate average annual loss (AAL) and exceedance probability (EP) loss.

Deprecations

Microsoft to End Support of SQL 2016 in July 2026

Microsoft will end its extended support of SQL Server 2016 on July 14, 2026. As a result, the Intelligent Risk Platform™ will end support for SQL Server 2016 in June 2026.

Admin Data API

Move Database

The Move Database operation (POST /platform/admindata/v1/databases/{databaseId}/move) transfers a database between servers.

The operation accepts three required parameters.

The x-rms-resource-group-id header parameter specifies the resource group to charge with the resource quota. This operation counts against the specified resource group's nonmodeledConcurrentJobs quota.

The databaseId path parameter specifies the database to move, and the serverId body parameter specifies the destination server.

{
  "serverId": 10
}

If successful, the operation adds a MOVE_DATABASE job to the workflow queue and returns a 202 Accepted HTTP status code. The ID of the job is returned in the Location header of the response. Use the Get Admin Data Job operation to poll the status of the job.
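As a sketch, the request above can be constructed as follows. This uses Python's standard library for illustration; the database ID, server ID, and resource group ID are hypothetical placeholder values, not real ones.

```python
import json
import urllib.request

BASE_URL = "https://api-euw1.rms.com/platform/admindata/v1"
database_id = 42  # hypothetical database to move

# Build (but do not send) the Move Database request.
req = urllib.request.Request(
    url=f"{BASE_URL}/databases/{database_id}/move",
    method="POST",
    headers={
        "Content-Type": "application/json",
        "x-rms-resource-group-id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    },
    data=json.dumps({"serverId": 10}).encode("utf-8"),
)

print(req.method, req.full_url)
print(json.loads(req.data))
```

Sending the request additionally requires an authentication header appropriate to your tenant, which is omitted here.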

Re-index Database

The Reindex Database operation (POST /platform/admindata/v1/databases/{databaseId}/reindex) reorganizes the index of the specified PLATFORM or DATABRIDGE database.

This operation reduces fragmentation by reordering existing pages to match their logical order. Reorganizing an index is a lightweight operation that improves performance without a full rebuild of the index.

This operation takes two parameters; both are required:

  • The x-rms-resource-group-id header parameter specifies the resource group to charge with the resource quota. This operation counts against the specified resource group's ConcurrentDataAdminJobs quota.
  • The databaseId path parameter identifies the database to re-index.

If successful, the operation adds a REINDEX_DATABASE job to the workflow queue and returns a 201 Created HTTP status code. The ID of the job is returned in the Location header of the response. Use the Get Admin Data Job operation to poll the status of the job.

The client must belong to a user group assigned the Data Admin role to perform this operation.
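Each of these database operations returns the job ID in the Location header, which can then be passed to the job-polling operation. A minimal sketch of extracting that ID (the Location value shown is hypothetical):

```python
from urllib.parse import urlparse

# Hypothetical Location header value returned by a database operation.
location = "https://api-euw1.rms.com/platform/admindata/v1/jobs/98765"

# The job ID is the last path segment; pass it to the Get Admin Data Job
# operation (GET /platform/admindata/v1/jobs/{jobsId}) to poll job status.
job_id = urlparse(location).path.rstrip("/").split("/")[-1]
print(job_id)  # → 98765
```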

Update Database Statistics

The Update Database Statistics operation (POST /platform/admindata/v1/databases/{databaseId}/statistics/update) updates statistics for the specified PLATFORM or DATABRIDGE database.

Moody's Insurance Solutions recommends that Intelligent Risk Platform tenants periodically update database statistics in order to improve performance and optimize queries.

This operation takes two parameters; both are required:

  • The x-rms-resource-group-id header parameter specifies the resource group to charge with the resource quota. This operation counts against the specified resource group's ConcurrentDataAdminJobs quota.
  • The databaseId path parameter identifies the database whose statistics are updated.

If successful, the operation adds a DATABASE_UPDATE_STATS job to the workflow queue and returns a 201 Created HTTP status code. The ID of the job is returned in the Location header of the response. Use the Get Admin Data Job operation to poll the status of the job.

The client must belong to a user group assigned the Data Admin role to perform this operation.

Search Databases

The Search Databases operation (GET /platform/admindata/v1/databases) returns a list of the tenant's databases.

This operation enables a Data Admin to view detailed information about all of the tenant's databases (EDM, RDM, and UNKNOWN) across the Intelligent Risk Platform and Data Bridge.

This operation supports response filtering based on the value of a subset of database properties. Depending on the property, you may use a combination of comparison operators, list operators, and logical operators.

Property          Comparison             List
createDate        =, !=, >=, <=
customServerName  =, !=, LIKE, NOT LIKE  IN, NOT IN
databaseId        =, !=, LIKE, NOT LIKE  IN, NOT IN
databaseName      =, !=, LIKE, NOT LIKE  IN, NOT IN
databaseType      =, !=, LIKE, NOT LIKE  IN, NOT IN
lastModified      =, !=, >=, <=
owner             =, !=, LIKE, NOT LIKE  IN, NOT IN
securableId       =, !=, >=, <=          IN, NOT IN
securableName     =, !=, LIKE, NOT LIKE  IN, NOT IN
securableType     =, !=, LIKE, NOT LIKE  IN, NOT IN
serverName        =, !=, LIKE, NOT LIKE  IN, NOT IN
serverType        =, !=, LIKE, NOT LIKE  IN, NOT IN
sizeInMb          =, !=, >=, <=          IN, NOT IN

This operation supports sorting by the following properties: createDate, databaseId, databaseName, databaseType, lastModified, owner, securableId, securableName, securableType, serverName, serverType, and sizeInMb.
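As an illustrative sketch, a filtered and sorted Search Databases query might be assembled like this. The exact query parameter names (filter, sort) and the filter expression syntax are assumptions based on the operator table above, not confirmed parameter names.

```python
from urllib.parse import urlencode

# Hypothetical query: EDM databases of at least 1 GB, largest first.
params = {
    "filter": "databaseType = 'EDM' AND sizeInMb >= 1000",
    "sort": "sizeInMb DESC",
}
url = "https://api-euw1.rms.com/platform/admindata/v1/databases?" + urlencode(params)
print(url)
```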

The response returns detailed information about each database matching the specified query parameters:

[
  {
    "databaseId": 1,
    "databaseName": "_20231009_EDM_two_porfolios_Snqz",
    "databaseType": "EDM",
    "status": "READY",
    "serverType": "platform",
    "serverName": "sql-instance-1",
    "securableId": 178362,
    "securableType": "exposure set",
    "securableName": "20231009_EDM_two_porfolios",
    "sizeInMb": "11004",
    "schemaVersion": 25,
    "owner": "user.name@company.com",
    "createDate": "2025-01-01 01:01:00.00",
    "lastModified": "2025-06-12 19:58:45.00",
    "thumbprint": "EUW1.20260129.3"
  }
]

The client must belong to a user group assigned the Data Admin role to perform this operation.

Get Database

The Get Database operation (GET /platform/admindata/v1/databases/{databaseId}) returns information about the specified database.

The databaseId path parameter is the only parameter accepted in the request.

[
  {
    "databaseId": 1,
    "databaseName": "_20231009_EDM_two_porfolios_Snqz",
    "databaseType": "EDM",
    "status": "READY",
    "serverType": "platform",
    "serverName": "sql-instance-1",
    "securableId": 178362,
    "securableType": "exposure set",
    "securableName": "20231009_EDM_two_porfolios",
    "sizeInMb": "11004",
    "schemaVersion": 25,
    "owner": "user.name@company.com",
    "createDate": "2025-01-01 01:01:00.00",
    "lastModified": "2025-06-12 19:58:45.00",
    "thumbprint": "EUW1.20260129.3"
  }
]

Property       Type    Description
createDate     String  Date the database was created.
databaseId     Number  ID of the database.
databaseName   String  Name of the database.
databaseType   String  Type of database. One of EDM, RDM, UNKNOWN.
lastModified   String  Date the database was last modified.
owner          String  Owner of the database.
schemaVersion  Number  Schema version of the database.
securableId    Number  ID of the securable.
securableName  String  Name of the securable.
securableType  String  Type of securable. One of PLATFORM, DATA_BRIDGE.
serverName     String  Name of the server.
serverType     String  Type of server.
sizeInMb       String  Size of the database in MB.
status         String  Status of the database. One of ACTIVE or INACTIVE.
thumbprint     String  Intelligent Risk Platform™ build used to create the EDM, including the deployment region, creation date, and unique build number.

The client must belong to a user group assigned the Data Admin role to perform this operation.

Delete Database

The Delete Database operation (DELETE /platform/admindata/v1/databases/{databaseId}) deletes the specified database.

This operation supports the deletion of any hosted database on the Intelligent Risk Platform or Data Bridge (registered or unregistered).

The operation takes two required parameters. The databaseId path parameter identifies the database to be deleted. The x-rms-resource-group-id header parameter specifies the ID of the resource group performing the operation.

If successful, the operation adds a DELETE_DATABASE job to the workflow queue and returns a 201 Created HTTP status code. The job is a type of nonmodeledConcurrentJob that counts against the resource quota of the specified resource group. The ID of the job is returned in the Location header of the response. Use the Get Admin Data Job operation to poll the status of the job.

The client must belong to a user group assigned the Data Admin role to perform this operation.

Get Admin Data Job

The Get Admin Data Job operation (GET /platform/admindata/v1/jobs/{jobsId}) returns the specified Admin Data API job.

This operation now returns the details about ARCHIVE_DATABASE, DELETE_DATABASE, and SHRINK_DATABASE jobs.

The client must belong to a user group assigned the Data Admin role to perform this operation.

Update Admin Data Job

The Update Admin Data Job operation (PATCH /platform/admindata/v1/jobs/{jobsId}) updates the priority or status of the specified Admin Data API job.

The client must belong to a user group assigned the Data Admin role to perform this operation.

Archive Database

The Archive Database operation (POST /platform/admindata/v1/databases/{databaseId}/archive) creates an archive of the specified database.

This operation enables a Data Admin to create an archive of the specified database. An archive is a copy of the database that is saved to the Data Vault.

The required databaseId path parameter identifies the database to archive. The optional expirationDate body parameter determines whether the archive is permanent or temporary.

{
  "expirationDate": "2035-12-31T00:00:00.000Z"
}

If successful, the operation adds an ARCHIVE_DATABASE job to the workflow queue and returns a 201 Created HTTP status code. The ID of the job is returned in the Location header of the response. Use the Get Admin Data Job operation to poll the status of the job.

The client must belong to a user group assigned the Data Admin role to perform this operation. This operation requires the RI-DATAVAULT entitlement.
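A sketch of building the request body for a temporary archive that expires 90 days after an assumed reference date. The assumption that omitting expirationDate produces a permanent archive follows the description above; the dates themselves are illustrative.

```python
import json
from datetime import datetime, timedelta, timezone

# Hypothetical reference date; a real client would use datetime.now(timezone.utc).
reference = datetime(2026, 4, 1, tzinfo=timezone.utc)
expiration = reference + timedelta(days=90)

# Format the expiration date in the ISO 8601 style shown in the example above.
body = json.dumps({"expirationDate": expiration.strftime("%Y-%m-%dT%H:%M:%S.000Z")})
print(body)  # → {"expirationDate": "2026-06-30T00:00:00.000Z"}
```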

Shrink Database

The Shrink Database operation (POST /platform/admindata/v1/databases/{databaseId}/shrink) shrinks the size of the data and log files on the specified database.

This operation accepts two required parameters: the databaseId path parameter and the x-rms-resource-group-id header parameter.

The request must pass a valid resource group ID in the required x-rms-resource-group-id header parameter. A resource group is a mechanism for allocating resource quotas to Intelligent Risk Platform tenants. To learn more, see Resource Management.


curl --request POST \
     --url https://api-euw1.rms.com/platform/admindata/v1/databases/56/shrink \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'

The request body accepts two parameters: truncateLogsOnly and timeoutInHours.

{
  "truncateLogsOnly": true,
  "timeoutInHours": 3
}

Both body parameters are optional:

Parameter         Type     Description
truncateLogsOnly  Boolean  If true, only the logs are truncated. Defaults to false.
timeoutInHours    Number   Number of hours before the job times out. If unspecified, the job times out in two hours.

If successful, the operation adds a SHRINK_DATABASE job to the workflow queue and returns a 201 Created HTTP status code. The ID of the job is returned in the Location header of the response. Use the Get Admin Data Job operation to poll the status of the job.

The client must belong to a user group assigned the Data Admin role to perform this operation.

Import API

Create Import Job

The Create Import Job operation (POST /platform/import/v1/jobs) creates an import job.

The EDM import type now supports the optional preserveName parameter. If true, the imported EDM keeps its original name, that is, the name of the uploaded EDM. If false or undefined, the EDM uses the name specified in the exposureName parameter.
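The naming behavior above can be summarized with a small sketch. The helper function is purely illustrative (not part of the API) and the EDM names are hypothetical.

```python
# Illustrative model of the preserveName logic: preserveName=true keeps the
# uploaded EDM's original name; otherwise exposureName wins.
def resolved_edm_name(uploaded_name, exposure_name, preserve_name=False):
    return uploaded_name if preserve_name else exposure_name

print(resolved_edm_name("orig_edm", "renamed_edm", preserve_name=True))  # → orig_edm
print(resolved_edm_name("orig_edm", "renamed_edm"))                      # → renamed_edm
```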

Risk Data API

Search Analysis Results

The Search Analysis Results operation (GET /platform/riskdata/v1/analyses) returns a list of analysis results.

This operation now supports filtering analysis results by entitlement and resourceGroupId.

Property         Data Type  Comparison  List
entitlement      String     =           IN
resourceGroupId  String     =

To learn more about query string parameters see Filtering Responses.

Response data can be sorted by entitlement.

The response now returns three new properties:

Property                  Data Type  Description
resourceGroupId           string     UUID of the resource group to which the model job was allocated.
groupingSetId             int64      ID of the grouping set used in the analysis.
groupSetSimulationPeriod  int64      Simulation period of the grouping set used in the analysis.
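As a sketch, a query using the new filters might be assembled like this. The query parameter name (filter), the entitlement value, and the resource group UUID are assumptions for illustration only.

```python
from urllib.parse import urlencode

# Hypothetical filter combining the new entitlement and resourceGroupId filters.
query = urlencode({
    "filter": "entitlement IN ('RI-RISKMODELER') "
              "AND resourceGroupId = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'"
})
print("https://api-euw1.rms.com/platform/riskdata/v1/analyses?" + query)
```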

Get Enriched Location Data

The Get Enriched Location Data operation (GET /platform/riskdata/v1/exposures/{exposureId}/locations/{id}/enrichdetails) returns enhanced building attribute data for the specified location.

The response returns standard property data to all tenants. Tenants that have licensed enriched exposure attributes receive additional data, including extended building and hazard attributes, roof date components, architecture and roof system identifiers, cladding, garaging, tree density, deck and slope attributes, foundation and engineering identifiers, and related flags. Data is returned only if property values are available for the location.

Enhanced Risk Data (ERD) provides detailed information about the location, including primary characteristics (e.g. construction and numOfStories attributes) and secondary modifier data (e.g. floorOccupancy attributes) based on CAPE Property Intelligence and other data products. ERD data is returned in addition to standard location property data: tenants that have licensed ERD receive both ERD data and ESDB data, while tenants that have not licensed ERD receive standard property data only. The ERD data product is available with a separate license.

The properties returned in the response depend on the tenant's Exposure Enrichment license. Depending on the license, the response may return wind, roof, cladding, and foundation properties.

Different data is returned depending on the tenant's license:

License   Description
Standard  Standard property data includes XXXX from the ExposureSource Database (ESDB).
ERD       Enhanced Risk Data (ERD) includes XXX from CAPE Property Intelligence and other data sources.

For tenants that have licensed ERD‑EEE‑US, the response combines existing location properties and ERD data.

If successful, the response returns 200 OK HTTP response status and all enriched detail properties that are available for the specified location.

The following response identifies extended location properties that are available with an enriched exposure license:

{
  "locId": 1,
  "airConditioningDescription": "E",
  "buildingHeight": 400,
  "buildingValuationLowerRange": 240000.0,
  "buildingValuationUpperRange": 540000.0,
  "construction": "ATC1",
  "constructionSource": "E",
  "currencyCode": "USD",
  "esdbVersion": "11.0",
  "floorArea": 1500,
  "floorAreaSource": "E",
  "floorOccupancy": "5",
  "floorOccupancyDescription": "2~Partial",
  "foundationSystemDescription": "0~Unknown",
  "heatingTypeDescription": "Forced Air",
  "isHD": 1,
  "isPrimaryBuildingFlag": 1,
  "multipleBuildingFlag": 0,
  "numberOfBaths": 0,
  "numberOfBedrooms": 0,
  "numberOfUnits": 0,
  "numOfStories": 100,
  "numOfStoriesSource": "E",
  "occupancy": "ATC1",
  "occupancy2": "ATC2",
  "occupancy3": "ATC3",
  "occupancySource": "E",
  "pool": "1",
  "yearBuilt": 1998,
  "yearBuiltSource": "E",
  "roofYear": [2005, 1, 1],
  "wsArchitectureId": 6,
  "wfArchitectureId": 6,
  "wsRoofAgeId": 3,
  "toRoofAgeId": 3,
  "frRoofAgeId": 3,
  "wsRoofSystemId": 7,
  "frRoofCoveringId": 7,
  "wsRoofGeometryId": 3,
  "toRoofGeometryId": 3,
  "wsCladTypeId": 1,
  "toCladTypeId": 1,
  "wfCladSysId": 1,
  "toGaragingId": 7,
  "toTreeDensityId": 2,
  "wfd2v": 100,
  "wfDeckSysId": 2,
  "wfSlopeId": 39,
  "wsRoofMaintenanceId": 4,
  "wsRoofEquipmentId": 7,
  "toRoofEquipmentId": 7,
  "wsFoundationId": 5,
  "flFoundationTypeId": 5,
  "eqEngineeringFoundationId": 1,
  "eqPoundingId": 1,
  "eqShapeConfigureId": 1,
  "multipleBuildingFlagDescription": "Multiple Buildings",
  "wfd2vFlagId": 13,
  "wfd2vUnitId": 2,
  "wfSlopeFlagId": 13
}

Calculate Marginal Impact

The Calculate Marginal Impact operation (POST /platform/riskdata/v1/analyses/{analysisId}/marginal-impact) runs a marginal impact analysis, which measures the effect of adding accounts to an existing portfolio as differential losses.

This operation now supports calculating smoothed marginal impact metrics.

The optional settings object accepts a pltMethodology parameter that defines an array of analysis options.

{
  "outputType": "account",
  "currency": {
    "currencyAsOfDate": "2025-05-28",
    "currencyCode": "USD",
    "currencyScheme": "RMS",
    "currencyVintage": "RL25"
  },
  "marginalImpactAnalysisIds": [266014],
  "jobName": "smoothed_marginal_impact",
  "rateSchemes": [278],
  "outputLevel": "ACCOUNT",
  "settings": {
    "pltMethodology": ["SMOOTHED"]
  }
}

The settings object is optional. If omitted, the job returns STANDARD metrics.

The pltMethodology parameter defines a list of options for calculating the marginal impact of a PLT analysis. One of STANDARD, SMOOTHED, or BOTH:

Value     Type    Description
STANDARD  String  The job calculates standard EP metrics, which can produce noisy values. Standard metrics use a single period for AEP or a period/event pairing for OEP.
SMOOTHED  String  The job calculates smoothed EP metrics, which are better suited for PLT-based marginal impact analysis.
BOTH      String  The job calculates both standard and smoothed EP metrics.

Get Marginal EP Metrics

The Get Marginal EP Metrics operation (GET /platform/riskdata/v1/analyses/{analysisId}/marginal-ep) returns EP metrics for the specified DLM analysis.

This operation now supports the optional epType query parameter, which enables the client to filter EP metrics by exceedance probability metric type. One of AEP (aggregate EP), CEP (stochastic conditional EP), OEP (occurrence EP), TCE_AEP (tail conditional expectation AEP), or TCE_OEP (tail conditional expectation OEP).

This operation now returns up to 50 EP return periods. Previously, this operation returned the standard set of EP points per curve: 2, 5, 10, 25, 50, 100, 200, 250, 500, 1000, 5000, 10000, and 50000.

This operation now returns smoothed metrics. Standard EP metrics are computed from a single period (AEP) or period/event (OEP), which can result in metrics that are unstable, noisy and, ultimately, ill-suited to decision making. Smoothed EP metrics produce marginal impact metrics that are better suited to PLT-based analysis.

Smoothed marginal metrics are available across all HD-supported financial perspectives. For details, see Calculate Marginal Impact.
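As a sketch, a request filtered to occurrence EP might be built as follows; the analysis ID is illustrative.

```python
from urllib.parse import urlencode

analysis_id = 266014  # hypothetical analysis ID
url = (
    f"https://api-euw1.rms.com/platform/riskdata/v1/analyses/"
    f"{analysis_id}/marginal-ep?" + urlencode({"epType": "OEP"})
)
print(url)
```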

Get Exposure Variation Overview

The Get Exposure Variation Overview operation (GET /platform/riskdata/v1/exposurevariations/{exposureVariationId}/overview) returns ESG data.

Environmental, social, and governance (ESG) scores are calculated at the account level and reveal the combined impact on a portfolio.

A high ESG score indicates that the company follows best practices effectively in all environmental, social, and governance areas with a minimal negative impact on its employees or the environment. A mid-range score means that the company meets best practices in most ESG categories and has a low negative impact on its employees or the environment. A low score indicates that the company does not follow best practices, negatively impacting its employees and the environment.

{
  "esgData": {
    "e": 0.0,
    "s": 0.0,
    "orbisDate": "",
    "esg": 0.0,
    "dqs": 0.0,
    "g": 0.0,
    "esgWeightedFieldType": "",
    "numberOfAccountsByParameter": 0
  }
}

Tenant Data API

Search Tenant Jobs

The Search Tenant Jobs operation (GET /platform/tenantdata/v1/jobs) returns a list of all of the jobs run by the tenant across all of the Platform APIs.

This operation now supports filtering jobs by date. The optional startTime and endTime query parameters define a date range that filters the jobs returned by date.

Both parameters accept dates in ISO 8601 format, e.g. 2025-01-01T00:00:00.000Z. If both the startTime and endTime parameters are omitted, the response returns jobs from the previous 30 days only.

Parameter  Description
startTime  Date of the earliest job returned, in ISO 8601 format, e.g. 2025-01-01T00:00:00.000Z.
endTime    Date of the latest job returned, in ISO 8601 format, e.g. 2025-01-01T00:00:00.000Z.
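A sketch of building a date-range query for jobs run in the first quarter of 2025; the date values are illustrative.

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

# Hypothetical date range: Q1 2025, formatted in the ISO 8601 style shown above.
start = datetime(2025, 1, 1, tzinfo=timezone.utc)
end = datetime(2025, 4, 1, tzinfo=timezone.utc)
fmt = "%Y-%m-%dT%H:%M:%S.000Z"

query = urlencode({"startTime": start.strftime(fmt), "endTime": end.strftime(fmt)})
print("https://api-euw1.rms.com/platform/tenantdata/v1/jobs?" + query)
```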

Search VPN Connections

The Search VPN Connections operation (GET /platform/tenantdata/v1/vpn-connections) returns a list of VPN connections.

This operation now returns the vpnConnectionName and awsVpnConnectionId properties.

Create VPN Connection

The Create VPN Connection operation (POST /platform/tenantdata/v1/vpn-connections) creates a VPN connection.

This operation now accepts the optional vpnConnectionName parameter, which defines the name of the VPN connection.
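A minimal sketch of a request body using the new parameter; the connection name is hypothetical, and any other parameters the operation requires are omitted here for brevity.

```python
import json

# Build a Create VPN Connection body with the optional vpnConnectionName.
body = json.dumps({"vpnConnectionName": "primary-office-vpn"}, indent=2)
print(body)
```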

Get VPN Connection

The Get VPN Connection operation (GET /platform/tenantdata/v1/vpn-connections/{connectionId}) returns detailed information about the specified VPN connection.

This operation now returns the vpnConnectionName and awsVpnConnectionId properties.

Update VPN Connection

The Update VPN Connection operation (PATCH /platform/tenantdata/v1/vpn-connections/{connectionId}) updates the specified VPN connection.

This operation now accepts the optional vpnConnectionName parameter, which updates the name of the VPN connection.