Platform Jobs
Overview
The Intelligent Risk Platform manages long-running processes as platform jobs. A platform job is a process that manages, creates, or transforms data resources on the Intelligent Risk Platform.
This includes all catastrophe modeling processes, workflows that geocode locations or look up hazard data, bulk edit operations, and all processes that import or export exposure or analysis result data.
This page identifies the different types of jobs that can be run on the Intelligent Risk Platform, discusses operations for polling the status of these jobs and retrieving response objects once they are complete, and describes how resource groups are used to allocate and manage tenant resource quotas.
Platform Job Types
A platform job is a system-defined task that manages a time-consuming or computationally expensive workflow process.
Many standard Intelligent Risk Platform workflows (e.g. catastrophe modeling, bulk or batch operations, data migration tasks) are managed and processed as platform jobs.
Each Platform API exposes operations for managing different types of platform jobs, i.e. jobs that perform API-specific processes. For example, the Export API provides operations for managing different types of export jobs, e.g. DOWNLOAD_EDM, DOWNLOAD_EXPOSURE_RESOURCE, and DOWNLOAD_LOCATION_RESULTS.
For each job type, the table lists the Platform API and the operation or operations that can create jobs of that type:
Job Type | API | Operation |
---|---|---|
ALM | Model API | Create Model Job |
ARCHIVE_EDM | Risk Data API | Archive EDM |
BATCH | Batch API | Create Batch Job |
BH_ROLLUP_DOWNLOAD_CSV | Export API | Create Export Job |
BULK_EDIT | Risk Data API | Exposure Bulk Edit, Create Import Job |
BULK_GEOHAZ | Bulk Geohaz API | Create Bulk Geohaz Job |
CEDE_IMPORT | Import API | Create Import Job |
CLIMATE_CHANGE | Risk Data API | Calculate Climate Change |
CONVERT_ACCOUNT_CURRENCY | Currency Conversion API | Create Currency Conversion Job |
CONVERT_AGGREGATE_PORTFOLIO_CURRENCY | Currency Conversion API | Create Currency Conversion Job |
CONVERT_EVENT_RATE_LOSS | Currency Conversion API | Convert Event Rates and Losses |
CONVERT_EVENT_RATE_LOSS | Risk Data API | Convert Event Rates and Losses |
CONVERT_PORTFOLIO_CURRENCY | Currency Conversion API | Create Currency Conversion Job |
CONVERT_RESULT_CURRENCY | Currency Conversion API | Create Currency Conversion Job |
CONVERT_TREATY_CURRENCY | Currency Conversion API | Create Currency Conversion Job |
COPY_ACCOUNT | Copy API | Copy Job |
COPY_AGGREGATE_PORTFOLIO | Copy API | Copy Job |
COPY_PORTFOLIO | Copy API | Copy Job |
COPY_TO_ARCHIVE | Copy API | Create Archives from Snapshot |
CREATE_EDM | Risk Data API | Create EDM |
DATA_BRIDGE_DATABASE_SYNC | Data Bridge API | Data Bridge API |
DATA_BRIDGE_NOTIFICATION | Data Bridge API | Data Bridge API |
DELETE_ACCOUNT | Risk Data API | Delete Account |
DELETE_AGGREGATE_PORTFOLIO | Risk Data API | Delete Aggregate Portfolio |
DELETE_ARCHIVE | Risk Data API | Delete Archive |
DELETE_EDM | Risk Data API | Delete EDM |
DELETE_PORTFOLIO | Risk Data API | Delete Portfolio |
DEREGISTER_DATA_BRIDGE_EDM | Risk Data API | Deregister EDM |
DLM | Model API | Create Model Job |
DOWNLOAD_EDM | Export API | Create Export Job |
DOWNLOAD_EXPOSURE_RESOURCE | Export API | Create Export Job |
DOWNLOAD_LOCATION_RESULTS | Export API | Create Export Job |
DOWNLOAD_PEQT | Export API | Create Export Job |
DOWNLOAD_RDM | Export API | Create Export Job |
DOWNLOAD_REPORT | Export API | Create Export Job |
DOWNLOAD_RESULTS | Export API | Create Export Job |
EDM2EDM | Export API | Create Export Job |
EDM | Import API | Create Import Job |
EDM_UPGRADE | Risk Data API | Upgrade EDM Data Version |
ENRICH_EXPOSURE | Enrich Exposure API | Create Enrich Exposure Job |
EXPOSURE_BATCH_EDIT_WITH_GEOHAZ | Batch API | Create Batch Job |
EXPOSURE_BATCH_EDIT | Batch API | Create Batch Job |
GEOCODING | Geohaz API | Geocode Location |
GEOHAZ | Geohaz API | Create Geohaz Job |
GROUPING | Grouping API | Create Group Job |
HAZARD | Geohaz API | Create Geohaz Job |
HD_ALM_GROUPING | Grouping API | Create Grouping Job |
HD_ALM | Model API | Create Model Job |
HD_GROUPING | Grouping API | Create Grouping Job |
HD_MAP_PERSPECTIVE | Risk Data API | Convert Financial Perspectives |
HD_MARGINAL_IMPACT | Risk Data API | Create Marginal Impact |
HD | Model API | Create Model Job |
MAP_PERSPECTIVE | Risk Data API | Convert Financial Perspectives |
MARGINAL_IMPACT | Risk Data API | Create Marginal Impact |
MRI_IMPORT | Import API | Create Import Job |
OED_IMPORT | Import API | Create Import Job |
PATE | Risk Data API | Recalculate with PATE |
PLT_TO_ELT | Risk Data API | Convert PLT to ELT Result |
RDM | Import API | Create Import Job |
RDM_DATABRIDGE | Import API | Create Import Job |
REGISTER_DATA_BRIDGE_EDM | Risk Data API | Register EDM |
RENAME_ANALYSIS | Risk Data API | Rename Analysis Result |
REPORT_GENERATION | Risk Data API | Create Risk Data Report |
RERUN_EP | Risk Data API | Recalculate EP and Statistics |
RESTORE_ARCHIVE | Admin Data API | Restore Database from Archive |
SIMULATE_LOSSES | Risk Data API | Simulate PLT Analysis |
STOCHASTIC_CEP | STEP API | Create STEP Job |
UPDATE_METRICS | Risk Data API | |
UPLOAD_EDM | Import API | Create Import Job |
UPLOAD_RDM | Import API | Create Import Job |
Some operations can create different types of jobs. For example, the Create Import Job operation may initiate a CEDE_IMPORT, EXPOSURE_BULK_EDIT, MRI_IMPORT, OED_IMPORT, RDM, or RDM_DATABRIDGE job depending on the parameter values specified. Such operations typically accept a type or jobtype parameter that specifies the type of job to be submitted.
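For example, a request to the Create Import Job operation that initiates an MRI_IMPORT job might look like the following sketch. The endpoint path is inferred from the import job URI pattern used elsewhere on this page, the body is a placeholder rather than the full Import API schema, and the name of the job-type parameter depends on the operation; confirm all of these against the Import API reference.

# Submit an import job, specifying the type of job to create (sketch only).
curl --request POST \
  --url https://api-euw1.rms.com/platform/import/v1/jobs \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' \
  --data '{ "type": "MRI_IMPORT", "...": "job-specific body parameters" }'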
User-defined workflows
A workflow is a standard, repeatable process that transforms or processes data. A workflow generally consists of multiple subprocesses performed in a specific sequence (a pipeline), in which the output of one step is used as the input to the next.
Intelligent Risk workflows (e.g. cat modeling, data migration) generally comprise multiple API requests that utilize several different API operations.
The Create Batch Job operation enables clients to define and initiate user-defined workflow jobs that consist of multiple sequenced operations. The job manages both the sequencing of the operations and the reporting of status for each operation and for the job itself. All job attributes are specified in the request body.
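For illustration only, a batch job request might sequence two operations, with the second depending on the output of the first, as sketched below. The path and the operations and dependsOn properties shown here are hypothetical; see the Batch API reference for the actual request schema.

# Define a user-defined workflow job with two sequenced operations (hypothetical schema).
curl --request POST \
  --url https://api-euw1.rms.com/platform/batch/v1/jobs \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' \
  --data '
{
  "name": "import-then-geohaz",
  "operations": [
    { "label": "import-edm", "...": "import operation parameters" },
    { "label": "geohaz", "dependsOn": ["import-edm"], "...": "geohaz operation parameters" }
  ]
}
'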
Managing Jobs
Platform APIs expose operations for creating, fetching, and updating different types of platform jobs.
- A Create Job operation creates a platform job that performs a certain task and adds that job to the workflow engine queue for processing.
- A Search Job operation returns a list of jobs submitted using a particular Platform API. These operations typically support the filtering and sorting of platform jobs within a specific API.
- A Get Job operation returns detailed information about a specific job. This operation can be used to poll the status of a job, debug a failed job, or retrieve the output of a completed job.
- An Update Job operation updates the status or priority of a platform job.
Creating Jobs
Most Platform APIs expose one or more operations for creating platform jobs.
In general, these operations do not accept path parameters or query parameters. Most optional and required parameters are specified in the body of the request in JSON format.
Unlike most Platform API operations, operations that create platform jobs require that the request pass a resource group ID in the x-rms-resource-group-id header parameter:
curl --request POST \
  --url https://api-euw1.rms.com/platform/model/v1/jobs \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' \
  --data '{ "...": "body parameters" }'
A header parameter is a parameter passed in an HTTP request header that specifies information about the client making the request or about the requested resource. Platform APIs require that client applications specify the ID of a resource group in the x-rms-resource-group-id header parameter in every request that creates a model job or non-model job.
The Intelligent Risk Platform manages and tracks all job-related resource quotas via resource groups. The resource group manages and tracks the number of jobs that the tenant runs. It defines the maximum number of concurrent model jobs, the maximum number of concurrent non-model jobs, and the total number of jobs that the tenant may run during the day. For more information, see Resource Groups.
Workflow Engine Queue
On success, operations that create platform jobs return a 202 Accepted response and add the job to the workflow engine queue. The Location response header specifies the job ID as part of a URL that you can use to track the status of the job.
Each Platform API exposes distinct resources for tracking the status of its jobs. Platform jobs have their own namespace per API, but are globally identifiable. For example, an import job is identified by its URI, /platform/import/v1/jobs/123, and job ID 123 is also globally unique. (Operations at the tenant level are scheduled for a future release.)
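For example, a client can capture the Location header from the 202 Accepted response and reuse it as the polling URL. The following shell sketch reuses the model job request shown earlier on this page and parses the response headers with awk purely for illustration:

# Submit a job and capture the polling URL from the Location response header.
JOB_URL=$(curl --silent --include --request POST \
  --url https://api-euw1.rms.com/platform/model/v1/jobs \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header 'x-rms-resource-group-id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' \
  --data '{ "...": "body parameters" }' \
  | tr -d '\r' | awk 'tolower($1) == "location:" { print $2 }')

echo "Poll this URL for job status: ${JOB_URL}"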
Viewing Lists of Jobs
Search job operations return a list of platform jobs created using a specific Platform API.
For example, the Search Risk Data Jobs operation returns a list of jobs created using Risk Data API operations. For each job, the response returns metadata about that job including its ID, status, and progress:
[
  {
    "jobId": "54",
    "userName": "user@example.com",
    "status": "QUEUED",
    "submittedAt": "2024-01-01T00:00:00.000Z",
    "startedAt": "2024-01-01T00:00:00.000Z",
    "endedAt": "2024-01-01T00:00:00.000Z",
    "name": "string",
    "type": "string",
    "progress": 0,
    "priority": "medium",
    "entitlement": "RI-RISKMODELER",
    "resourceGroupId": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "details": {
      "resources": [
        {
          "uri": "string"
        }
      ],
      "summary": "string"
    },
    "warnings": [
      {
        "message": "string"
      }
    ]
  }
]
The properties returned by a Search Job operation differ by API.
Property | Description |
---|---|
details | Object that returns detailed information about the job. The contents of this object differ depending on the API. |
endedAt | Date and time that job ended. |
entitlement | Entitlement of the principal that created the job. |
jobId | ID of the job. Use the ID to view detailed information about a specific job. |
name | User-defined name of the job. |
priority | Priority of the job. |
progress | Current progress of the job. |
resourceGroupId | Resource group ID specified in the job request. |
startedAt | Date and time that job was started. |
status | Status of the job. One of CANCELLED , CANCELLING , CANCEL_REQUESTED , FAILED , FINISHED , PENDING , QUEUED , or RUNNING . |
submittedAt | Date and time that the job was submitted. |
type | Type of the job. Some Platform APIs support multiple job types, e.g. the Risk Data API. |
userName | Login of the principal that submitted the job. |
warnings | Array of warning messages returned by some APIs. |
These operations support response filtering based on the values of a subset of properties. Depending on the property, you may use a combination of comparison operators, list operators, and logical operators. To learn more, see Response Filtering.
Search job operations return information only about platform jobs created using that API, e.g. the Search Risk Data Jobs operation returns information only about platform jobs created using Risk Data API operations. There is no platform operation that returns a list of all platform jobs across all Platform APIs.
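For example, the following sketch lists Risk Data jobs and narrows the response by status. The path and the filter query parameter syntax shown here are assumptions for illustration; confirm both against the Risk Data API reference and the Response Filtering topic.

# List Risk Data jobs, keeping only jobs that are currently running (filter syntax is illustrative).
curl --request GET \
  --url 'https://api-euw1.rms.com/platform/riskdata/v1/jobs?filter=status%3DRUNNING' \
  --header 'accept: application/json'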
Viewing Job Details
Get Job operations return detailed information about a specific job.
For example, the Get Risk Data Job operation returns information about a single job created using Risk Data API operations. The response returns metadata about that job including its ID, status, and progress:
{
  "jobId": "string",
  "userName": "string",
  "status": "QUEUED",
  "submittedAt": "string",
  "startedAt": "string",
  "endedAt": "string",
  "name": "string",
  "type": "string",
  "progress": 0,
  "details": {
    "resources": [
      {
        "uri": "string"
      }
    ],
    "summary": "string"
  },
  "tasks": [
    {
      "taskId": "string",
      "guid": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
      "jobId": "string",
      "status": "QUEUED",
      "submittedAt": "string",
      "createdAt": "string",
      "name": "string",
      "percentComplete": 0,
      "priorTaskGuids": ["3fa85f64-5717-4562-b3fc-2c963f66afa6"],
      "output": {
        "summary": "string",
        "errors": [
          {
            "message": "string"
          }
        ],
        "log": {
          "additionalProp": "string"
        }
      }
    }
  ]
}
The properties returned by a Get Job operation differ by API.
Property | Description |
---|---|
details | Object that returns detailed information about the job. The contents of this object differ depending on the API. |
endedAt | Date and time that job ended. |
entitlement | Entitlement of the principal that created the job. |
jobId | ID of the job. Use the ID to view detailed information about a specific job. |
name | User-defined name of the job. |
priority | Priority of the job. |
progress | Current progress of the job. |
resourceGroupId | Resource group ID specified in the job request. |
startedAt | Date and time that job was started. |
status | Status of the job. One of CANCELLED , CANCELLING , CANCEL_REQUESTED , FAILED , FINISHED , PENDING , QUEUED , or RUNNING . |
submittedAt | Date and time that the job was submitted. |
tasks | A list of tasks processed within a job. |
type | Type of the job. Some Platform APIs support multiple job types, e.g. the Risk Data API. |
userName | Login of the principal that submitted the job. |
warnings | Array of warning messages returned by some APIs. |
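Because the response includes a tasks array with per-task output, a Get Job request is a convenient way to debug a failed job. The sketch below uses jq, for illustration, to print any task-level error messages; the path shown is an assumption, so substitute the jobs path of the API you are using and the ID of your job.

# Fetch a single job and print any task error messages (useful when a job has FAILED).
curl --silent --request GET \
  --url https://api-euw1.rms.com/platform/riskdata/v1/jobs/123 \
  --header 'accept: application/json' \
  | jq '.tasks[]?.output.errors[]?.message'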
Polling Jobs
Every Platform API that exposes an operation for initiating a job also supports operations for "polling" the status of that job.
Polling is a technique that enables applications to make non-blocking requests. It is useful when the application depends on services that perform long-running processes, such as Intelligent Risk Platform jobs.
API requests that create platform jobs typically return a 202 Accepted HTTP status code and a URI that enables polling of the job status.
A successful response returns the job object, which provides detailed information about the job, including its submittedAt and startedAt timestamps, its type, its details, and its status. The possible status values are listed below, followed by a minimal polling sketch:
Status | Definition |
---|---|
PENDING | The job is pending. Followed by the QUEUED status. |
QUEUED | The job has been added to the queue. Followed by the RUNNING status. |
RUNNING | The platform is processing the job. Followed by the FINISHED or FAILED status. |
FINISHED | The job is finished. |
FAILED | The job has failed. |
CANCEL_REQUESTED | The platform has received a request to cancel the job. Followed by the CANCELLING status. |
CANCELLING | The platform is cancelling the job. Followed by the CANCELLED status. |
CANCELLED | The job has been cancelled. |
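A minimal polling loop is sketched below. It assumes the polling URL captured earlier is stored in JOB_URL and uses jq to read the status property; a production client should also handle request errors and apply a back-off between requests.

# Poll the job URL until the job reaches a terminal status.
while true; do
  STATUS=$(curl --silent --header 'accept: application/json' "${JOB_URL}" | jq -r '.status')
  echo "Job status: ${STATUS}"
  case "${STATUS}" in
    FINISHED|FAILED|CANCELLED) break ;;
  esac
  sleep 30
done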
Updating Jobs
Most Platform APIs offer an operation for managing a job after it has been added to the workflow engine queue.
These operations can be used to change the priority of a job or to cancel a job.
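For illustration, a cancellation request might look like the sketch below. The HTTP method, path, and body property shown here are hypothetical; check the Update Job operation of the relevant Platform API for the actual request format.

# Request cancellation of a queued or running job (method, path, and body are illustrative only).
curl --request PATCH \
  --url https://api-euw1.rms.com/platform/riskdata/v1/jobs/123 \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --data '{ "status": "CANCEL_REQUESTED" }'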
Retrieving Job Data
Operations that submit platform jobs do not return the job's output in the response.
On success, these operations return a 202 Accepted response and add a job to the workflow engine queue. The Location response header specifies the job ID as part of a URL that identifies a specific job resource. Client applications can "poll" this URI to track the status of the job.
Once the job is complete and its status is set to FINISHED, the URI can be used to view information about the changes made by the job, e.g. the objects created or the data available for download.
The typical response object of a FINISHED job returns information about the job:
{
  "userName": "user@example.com",
  "status": "FINISHED",
  "name": "ExBatchEdit",
  "type": "EXPOSURE_BATCH_EDIT",
  "progress": 100,
  "priority": "medium",
  "entitlement": "RI-RISKMODELER",
  "resourceGroupId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "jobs": [],
  "summary": {},
  "output": {},
  "messages": []
}
In addition to the standard metadata about the job (e.g. jobId, name, type, entitlement), the response returns information about the resources created by the job. The data returned in the response depends on the type of the job.
Property | Type | Description |
---|---|---|
details | Object | Includes a resources array and a summary. |
jobs | Array | Information about successful subtasks performed as part of the job. |
summary | Object | Summary information about the job and resources inserted, edited, or deleted by the job. |
output | Object | Information about the resources created by the job or link to download resources. |
messages | Array | Warnings returned by the platform during processing. |
warnings | Array | Array of message objects. |
tasks | Array | Array of task objects, each of which includes an output object. |
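Once a job reports FINISHED, the same job URI can be read to collect its results. The sketch below assumes the polling URL from the earlier examples is stored in JOB_URL and uses jq for illustration; the exact shape of the details, output, and summary objects varies by job type.

# List resource URIs created by the job and show its output object.
curl --silent --header 'accept: application/json' "${JOB_URL}" \
  | jq '{ resources: [.details.resources[]?.uri], output: .output }'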