Get storage bucket URL

Fetches the URI of a storage bucket on Amazon S3 together with temporary security credentials. Use this URI to upload exposure or result data to the storage bucket and to migrate that data from the storage bucket to a cloud-based EDM or RDM.

The operation takes three required query parameters: filename, dbtype, and fileextension.

  • The filename query parameter specifies the name of the file. The file name must include the appropriate file extension, for example filename.bak.
  • The dbtype query parameter specifies the database type of the data source. One of ANLSHAZARD, DLMPROFILES, EDM, EVENTINFO, GEOCODE, GEOGRAPHY, MAPCONFIG, NONE, RDM, REPORT, SYSTEMDATA, TABDATA, EXPOSURE_BATCH_EDIT, TARGET, USERCONFIG, UWG, VULNERABILITY, WEB.
  • The fileextension query parameter specifies the file format of the specified database file. One of BAK, CSV, JSON, MDF, or PARQUET.
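As a minimal sketch, the three required query parameters can be assembled into the request URL like this. The base path shown here is hypothetical; consult the Risk Modeler API reference for the actual route.

```python
from urllib.parse import urlencode

# Hypothetical base path; the real endpoint is defined by the Risk Modeler API.
BASE = "https://api.example.com/riskmodeler/v1/storage"

def build_storage_url(filename: str, dbtype: str, fileextension: str) -> str:
    """Assemble the GET request URL with the three required query parameters."""
    query = urlencode({
        "filename": filename,            # must include the extension, e.g. mydb.bak
        "dbtype": dbtype,                # e.g. EDM, RDM, EXPOSURE_BATCH_EDIT
        "fileextension": fileextension,  # BAK, CSV, JSON, MDF, or PARQUET
    })
    return f"{BASE}?{query}"

print(build_storage_url("mydb.bak", "EDM", "BAK"))
```

The same three parameters apply to every workflow below; only their values change.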

Data Migration Recipe

This operation may be used in tandem with AWS services and other Risk Modeler API operations to manage the transfer of on-premises EDMs or RDMs to the Intelligent Risk Platform.

  1. Create a database artifact.
  2. Get the storage bucket URI and temporary credentials.
  3. Upload the database artifact to S3 using Amazon S3 APIs.
  4. Import the EDM or RDM.

For step-by-step instructions, see Import EDMs or Import RDMs.
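The steps above can be partly sketched in code. Assuming the response returns an s3:// URI (the exact response field names are not shown in this section), the URI must be split into a bucket and key before the Amazon S3 APIs can be called; that parsing needs only the standard library:

```python
from urllib.parse import urlparse

def split_s3_uri(uri: str) -> tuple[str, str]:
    """Split an s3://bucket/key URI into (bucket, key) for use with S3 APIs."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")

# Illustrative URI; the real value comes from the operation's response.
bucket, key = split_s3_uri("s3://example-bucket/uploads/mydb.bak")

# With an S3 SDK such as boto3 (not shown), the temporary credentials from
# step 2 would be passed to a session, followed by an upload of the database
# artifact to this bucket and key.
```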

Import Batch Exposures Recipe

This operation may be used as part of a workflow that imports large volumes of exposure data in batch.

To enable this workflow, specify EXPOSURE_BATCH_EDIT as the dbtype and JSON as the fileextension.

This workflow consists of four steps:

  1. Define exposure data in JSON file.
  2. Use this operation to retrieve the storage bucket URI, uploadId, and temporary credentials.
  3. Upload the JSON file to S3 using Amazon S3 APIs.
  4. Import the exposure data using the Manage Exposures in Batch operation. The request package specifies the appropriate uploadId.
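A sketch of steps 1 and 4, assuming illustrative field names: the exposure JSON schema and the shape of the batch request are defined by the Risk Modeler documentation, not by this example.

```python
import json

# Illustrative exposure payload; the actual JSON schema is defined by the
# Risk Modeler batch-exposure documentation.
exposures = {
    "accounts": [
        {"accountNumber": "A-100", "locations": [{"streetAddress": "1 Main St"}]}
    ]
}

# Step 1: write the exposure data to a JSON file for upload to S3.
with open("exposures.json", "w") as f:
    json.dump(exposures, f)

# Step 4: after the file is uploaded (step 3), the Manage Exposures in Batch
# request references the uploadId returned alongside the storage bucket URI.
batch_request = json.dumps({"uploadId": "<uploadId from step 2>"})
```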