Fetches the URI of a storage bucket on S3 and temporary security credentials. Use this URI to upload exposure or result data to the storage bucket; the uploaded data can then be migrated from the storage bucket to a cloud-based EDM or RDM.
The operation takes three required query parameters: `filename`, `dbtype`, and `fileextension`.

- The `filename` query parameter specifies the name of the file. The file name must include the correct file extension as well, e.g. `filename.bak`.
- The `dbtype` query parameter specifies the database type of the data source. One of `ANLSHAZARD`, `DLMPROFILES`, `EDM`, `EVENTINFO`, `GEOCODE`, `GEOGRAPHY`, `MAPCONFIG`, `NONE`, `RDM`, `REPORT`, `SYSTEMDATA`, `TABDATA`, `EXPOSURE_BATCH_EDIT`, `TARGET`, `USERCONFIG`, `UWG`, `VULNERABILITY`, `WEB`.
- The `fileextension` query parameter specifies the file format of the specified database file. One of `BAK`, `CSV`, `JSON`, `MDF`, or `PARQUET`.
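The request below is a minimal sketch of how these query parameters might be passed. The host, endpoint path (`/v1/uploads`), authorization header, and response field names are illustrative assumptions rather than the documented contract; consult the Risk Modeler API reference for the exact route and schema.

```python
import requests

# Placeholder values -- substitute your own host, API key, and file details.
BASE_URL = "https://api.example.com/riskmodeler"   # hypothetical host
API_KEY = "<your-api-key>"

# Hypothetical endpoint path; check the API reference for the actual route.
response = requests.get(
    f"{BASE_URL}/v1/uploads",
    headers={"Authorization": API_KEY},
    params={
        "filename": "myexposures.bak",   # file name, including its extension
        "dbtype": "EDM",                 # database type of the data source
        "fileextension": "BAK",          # file format of the database file
    },
    timeout=30,
)
response.raise_for_status()

# Assumed to contain the S3 URI plus temporary security credentials.
upload_info = response.json()
print(upload_info)
```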
Data Migration Recipe
This operation may be used in tandem with AWS and other Risk Modeler API operations to manage the transfer of on-premises EDMs or RDMs to the Intelligent Risk Platform.
- Create a database artifact.
- Get the storage bucket URI and temporary credentials.
- Upload the database artifact to S3 using Amazon S3 APIs (see the sketch after this list).
- Upload the EDM or RDM to the Intelligent Risk Platform.
For step-by-step instructions, see Import EDMs or Import RDMs.
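As a sketch of the S3 upload step, the snippet below uses boto3 with the temporary credentials returned by this operation. The field names on `upload_info` (`s3Path`, `accessKeyId`, `secretAccessKey`, `sessionToken`) are assumptions for illustration only; map them to the properties actually returned in the response.

```python
import boto3

# Continuing from the earlier sketch: upload_info is the parsed response of
# this operation. The field names below are assumed for illustration only.
upload_info = {
    "s3Path": "s3://example-bucket/uploads/edm.bak",
    "accessKeyId": "<temporary-access-key>",
    "secretAccessKey": "<temporary-secret-key>",
    "sessionToken": "<temporary-session-token>",
}

# Split the S3 URI into bucket and object key.
s3_uri = upload_info["s3Path"]
bucket, _, key = s3_uri.removeprefix("s3://").partition("/")

s3 = boto3.client(
    "s3",
    aws_access_key_id=upload_info["accessKeyId"],
    aws_secret_access_key=upload_info["secretAccessKey"],
    aws_session_token=upload_info["sessionToken"],  # required for temporary credentials
)

# Upload the local database artifact (e.g. the .bak file) to the storage bucket.
s3.upload_file("edm.bak", bucket, key)
```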
Import Batch Exposures Recipe
This operation may be used as part of a workflow that imports large volumes of exposure data in batch.
To enable this workflow, specify `EXPOSURE_BATCH_EDIT` as the `dbtype` and `JSON` as the `fileextension`.
This workflow consists of four steps:
- Define exposure data in a JSON file.
- Use this operation to retrieve the storage bucket URI, `uploadId`, and temporary credentials.
- Upload the JSON file to S3 using Amazon S3 APIs.
- Import exposure data using the Manage Exposures in Batch operation. The request package specifies the appropriate `uploadId` (see the sketch after this list).
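A condensed sketch of this workflow is shown below. The endpoint paths, request body shape, and response field name `uploadId` are illustrative assumptions, not the documented contract; the S3 upload itself follows the same boto3 pattern shown in the data migration recipe above.

```python
import requests

BASE_URL = "https://api.example.com/riskmodeler"   # hypothetical host
API_KEY = "<your-api-key>"
HEADERS = {"Authorization": API_KEY}

# Step 1: exposure data defined in a local JSON file (schema not shown here).
json_file = "exposures.json"

# Step 2: retrieve the storage bucket URI, uploadId, and temporary credentials,
# using EXPOSURE_BATCH_EDIT and JSON as required for this workflow.
resp = requests.get(
    f"{BASE_URL}/v1/uploads",                       # hypothetical path
    headers=HEADERS,
    params={
        "filename": json_file,
        "dbtype": "EXPOSURE_BATCH_EDIT",
        "fileextension": "JSON",
    },
    timeout=30,
)
resp.raise_for_status()
upload_info = resp.json()
upload_id = upload_info["uploadId"]                 # assumed field name

# Step 3: upload exposures.json to S3 with boto3, as in the earlier sketch.

# Step 4: trigger the Manage Exposures in Batch operation, passing the uploadId.
batch_resp = requests.post(
    f"{BASE_URL}/v1/exposurebatches",               # hypothetical path
    headers=HEADERS,
    json={"uploadId": upload_id},                   # assumed request body shape
    timeout=30,
)
batch_resp.raise_for_status()
```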