Import and Export Data
Using the provided BatchJob called Export in the C3 Agentic AI Platform, you can export data to a file. Before exporting, you can perform some action on the Type instance that holds the data. Data can be exported as JSON, XML, or CSV.
To use Export, start by specifying the details in an instance of BatchExportSpec. You can run the following examples on the windTurbine application included with the C3 Agentic AI Platform.
To use these examples as a tutorial, create an application with windTurbine as a dependency, generate demo data by running WindTurbineDataGenerator.createDemoData(), then follow the examples as written.
Examples
Export all the Type instance data
var exportSpec = BatchExportSpec.make({
targetType: WindTurbine,
numFiles: 100,
fileUrlOrEncodedPathPrefix: "TestBatchExportWindTurbine"
});
var job = Export.startExport(exportSpec);
// Check job status
job.status();
// View File URLs when job is complete
job.get().fileList.urls

If job.get().fileList.urls returns an error or an empty array, verify that there is data in the Type being exported and check the Export job run for errors.

To view the content in the files, use the following code example:
// Change index to view other files
var index = 0;
var urls = job.get().fileList.urls;
// Make a File, then read content from the file
// The second parameter of readString() is the end character index, which limits
// how much content is read; otherwise, the entire file is read into memory
// (proceed with caution)
var content = FileSystem.makeFile(urls[index]).readString(0, 1000);

Export selected fields of a Type instance into a CSV file
When exporting data to a CSV, ensure that you provide contentType and csvHeader. If you do not provide csvHeader, no content is written to the resulting file(s).
var exportSpec = BatchExportSpec.make({
targetType: WindTurbine,
numFiles: 100,
fileUrlOrEncodedPathPrefix: "TestCsvExportWindTurbine",
contentType: "csv",
csvHeader: "id, location, manufacturer.id", // include fields to export
filter: "contains(id, 'TURBINE')"
});
var job = Export.startExport(exportSpec);
// Check job status
job.status();
// View File URLs when job is complete
job.get().fileList.urls

For nested fields, ensure that you de-nest them appropriately (using dot notation) in the csvHeader field. For example, to export certain fields from Meta such as created and createdBy, define the BatchExportSpec as shown:
var exportSpec = BatchExportSpec.make({
targetType: "WindTurbine",
numFiles: 100,
fileUrlOrEncodedPathPrefix: "TestCsvExportWindTurbine",
contentType: "csv",
csvHeader: "id, meta.created, meta.createdBy", // include fields to export
filter: "contains(id, 'TURBINE')"
});

If you only include meta in the csvHeader, the resulting field in the export is undefined.
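Conceptually, each dot-notation entry in csvHeader selects a nested value from the exported object. The following standalone sketch is illustrative only (the platform resolves these paths internally); it shows how a header entry such as meta.created maps to a nested field, and why a bare meta entry does not yield a scalar value:

```javascript
// Illustrative only: resolves a dot-notation path against a plain object,
// mimicking how a csvHeader entry like "meta.created" selects a nested value.
function resolvePath(obj, path) {
  return path.split(".").reduce(
    (value, key) => (value == null ? undefined : value[key]),
    obj
  );
}

var turbine = {
  id: "TURBINE-1",
  meta: { created: "2022-01-01", createdBy: "admin" }
};

resolvePath(turbine, "meta.created"); // "2022-01-01"
resolvePath(turbine, "meta");         // the whole meta object, not a scalar
```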
Export results of evalMetrics as JSON
var expressions = ["GeneratorRotationSpeedDiff", "GearOilTemperatureDiff"];
var start = "2022-01-01";
var end = "2022-05-01";
var interval = "DAY";
var filter = "id == 'TURBINE-1'";
var evalSpec = EvalMetricsSpec.make({
expressions: expressions,
start: start,
end: end,
interval: interval
});
var actionParams = {"spec": evalSpec};
var exportSpec = BatchExportSpec.make({
targetAction: PartiallyAppliedAction.make({moduleName: "windTurbine", typeName: "WindTurbine", actionName: "evalMetrics", args: actionParams}),
filter: filter,
numFiles: 100,
contentType: "json",
fileUrlOrEncodedPathPrefix: "TestBatchExportEvalMetrics"
});
var job = Export.startExport(exportSpec);
// Check job status
job.status();
// View File URLs when job is complete
job.get().fileList.urls

Specifying BatchExportSpec for Export
The following table describes a few key fields of BatchExportSpec for Export. For a full list of fields and their descriptions, run the command c3ShowType(BatchExportSpec) from Console.
| Field in BatchExportSpec | Data type | Description |
|---|---|---|
| targetType | Type | Type to be exported. Do not use this field if targetAction is specified. |
| targetAction | PartiallyAppliedAction | Contains the target Type, the action name, and the arguments for the action. The output is serialized to the default filesystem. You cannot specify a filter in the args object; the filter must be provided in BatchExportSpec. |
| fileUrlOrEncodedPathPrefix | string | Prefix for exported files. Provide either this field or fileList. If you provide fileList, this field is overridden. If you provide neither, the job fails with an error. The field fileUrlOrEncodedPathPrefix is a unique identifier of where the files are persisted in the default filesystem. To delete any existing files at the fileUrlOrEncodedPathPrefix, set deleteExisting to true. See the "Specifying fileUrlOrEncodedPathPrefix" section in this document for more details. |
| fileList | FileList | List of files to which the objects are exported. If you do not provide this field, the list of files is generated from fileUrlOrEncodedPathPrefix. |
| numFiles | int | Number of files to export the data to. numFiles is also the number of batch jobs to spawn, because one batch job writes to one file. Specifying numFiles breaks up the data almost, but not exactly, uniformly. For example, if you have 10 objects and specify numFiles as 10, one object per file is not guaranteed: most files may receive 1 object, some 2, and another 3. |
| numObjPerFile | int | Number of IDs to process in one batch (which results in one file). Provide either numFiles or numObjPerFile, but not both, because each parameter determines the other. If both are provided and cannot be satisfied together, the job throws an error. |
| contentType | string | Serialization format and content type of the files to export (default: "json"). |
| contentEncoding | string | Content encoding to use while exporting files (used for compression). |
| deleteExisting | boolean | If set to true, deletes any existing files with the same fileUrlOrEncodedPathPrefix. |
| filter | string | Filter expression. |
| order | string | Order expression. |
| limit | int | Limits the total number of objects to export. |
| csvHeader | string | Header row used when exporting data as CSV. Required when contentType is "csv". |
| xmlInclude | string | Include spec used when exporting data as XML. |
| jsonInclude | string | Include spec used when exporting data as JSON. |
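Because numFiles and numObjPerFile determine each other for a given object count, only one of them may be supplied. The relationship is roughly the following standalone sketch (illustrative only; the platform's actual batching is internal and may distribute objects slightly unevenly):

```javascript
// Roughly how numFiles and numObjPerFile relate for a given total object count.
// Illustrative only; the platform's real batching logic is internal.
function numFilesFor(totalObjs, numObjPerFile) {
  return Math.ceil(totalObjs / numObjPerFile);
}

function numObjPerFileFor(totalObjs, numFiles) {
  return Math.ceil(totalObjs / numFiles);
}

numFilesFor(1000, 250);      // 4 files
numObjPerFileFor(1000, 100); // about 10 objects per file
```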
Job instance
startExport() returns a job instance on which you can call any BatchJob method; for example, setMaxConcurrency(100):

var job = Export.startExport(exportSpec);
job.setMaxConcurrency(100); // and other BatchJob functions

Import
Using Import, you can import data into the C3 Agentic AI Platform by reading files that were previously exported. The Import Type deserializes the files and then calls the callback action specified in the spec. Import is also a BatchJob.
To use Import, start by specifying the details in an instance of BatchImportSpec. See the following examples.
You can export Stored Calculated fields along with standard persisted fields. However, when importing Type data, the data from calculated fields is not persisted. Instead, you must manually refresh the calculated fields to populate them.
Examples
Import data of a Type instance
The following code works for any export file type (JSON, CSV, XML) that exists at the fileUrlOrEncodedPathPrefix specified.
var importSpec = BatchImportSpec.make({
targetType: WindTurbine,
fileUrlOrEncodedPathPrefix: "TestBatchExportWindTurbine"
});
var job = Import.startImport(importSpec);
// Check job status
job.status()

After the job completes, you can fetch on the targetType to ensure the data was imported correctly.
Specifying BatchImportSpec for Import
The following table describes a few key fields of BatchImportSpec for Import. For a full list of fields and their descriptions, run the command c3ShowType(BatchImportSpec) from Console.
| Field in BatchImportSpec | Data type | Description |
|---|---|---|
| targetType | Type | Type to which the data is imported. Do not use this field if targetAction is specified. |
| targetAction | PartiallyAppliedAction | Action that is called with the deserialized objects. The action is called with just the imported data and ignores the provided arguments. Do not provide this field if you provide targetType. |
| fileUrlOrEncodedPathPrefix | string | Path used to determine the files to import. If the provided path is not a URL, it is treated as a file prefix. Important: It is highly recommended to always use fileList with Import to ensure that only the expected files/objects are imported. |
| fileList | FileList | List of files from which the objects are imported. |
Job instance
startImport() returns a job instance on which you can call any BatchJob method; for example, setMaxConcurrency(100):

var job = Import.startImport(importSpec);
job.setMaxConcurrency(100); // and other BatchJob functions

Specifying fileUrlOrEncodedPathPrefix
The parameter fileUrlOrEncodedPathPrefix controls where data is exported to and imported from. To target a specific bucket and location, provide a fully qualified URL.
If fileUrlOrEncodedPathPrefix is not a URL, then fileUrlOrEncodedPathPrefix is processed as a prefix for a file, and the Export exports to or Import imports from the following path of the default file system:
// Root URL for default file system
var rootUrl = FileSystem.rootUrl();
// Format of file name
"<rootUrl>/<fileUrlOrEncodedPathPrefix>/<fileUrlOrEncodedPathPrefix>-<RANDOM_HASH>-<FILE_NUMBER>.<extension>"

Examples
The following are a few fileUrlOrEncodedPathPrefix examples with AWS S3 as the default file system:
sampleTest
The path used to export to or import from is:
s3://<defaultBucketName>/<ENV_NAME>/<APP_NAME>/fs/sampleTest/sampleTest-<RANDOM_HASH>-<FILE_NUMBER>.<extension>

For example, a sample file could be:
s3://c3--platform/testenv/windturbine/fs/sampleTest/sampleTest-PGDMPB0CMS-0.json

sampleTest/somePrefix
The path used to export to or import from is:
s3://<defaultBucketName>/<ENV_NAME>/<APP_NAME>/fs/sampleTest/somePrefix-<RANDOM_HASH>-<FILE_NUMBER>.<extension>

For example, a sample file could be:
s3://c3--platform/testenv/windturbine/fs/sampleTest/somePrefix-PGDMPB0CMS-0.json

s3://c3--platform/testenv/windturbine/fs/sampleFolder/
This puts all files in the folder specified by the path, with no file name prefix. For example, a sample file could be:
s3://c3--platform/testenv/windturbine/fs/sampleFolder/PGDMPB0CMS-0.json

If the path separator / is missing from the end, it is treated as a file prefix and not a folder. See the following example.
s3://c3--platform/testenv/windturbine/fs/sampleFolder/samplePrefix
This puts all the files in the folder sampleFolder, and all files have the prefix samplePrefix. For example, a sample file could be:
s3://c3--platform/testenv/windturbine/fs/sampleFolder/samplePrefix-PGDMPB0CMS-0.json
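The trailing-slash rule above can be summarized in a small standalone sketch (illustrative only; the hash and file number are hypothetical placeholder values, and the platform generates the real ones):

```javascript
// Illustrative only: shows how a trailing "/" changes the resulting file name.
// A URL ending in "/" is treated as a folder (no file-name prefix); otherwise,
// the final path segment becomes the file-name prefix.
function resultingFileName(urlOrPrefix, hash, fileNumber, extension) {
  if (urlOrPrefix.endsWith("/")) {
    // Folder: files go directly inside, named "<hash>-<n>.<ext>"
    return urlOrPrefix + hash + "-" + fileNumber + "." + extension;
  }
  // Prefix: files are named "<prefix>-<hash>-<n>.<ext>"
  return urlOrPrefix + "-" + hash + "-" + fileNumber + "." + extension;
}

resultingFileName("s3://c3--platform/testenv/windturbine/fs/sampleFolder/",
                  "PGDMPB0CMS", 0, "json");
// → ".../sampleFolder/PGDMPB0CMS-0.json"
resultingFileName("s3://c3--platform/testenv/windturbine/fs/sampleFolder/samplePrefix",
                  "PGDMPB0CMS", 0, "json");
// → ".../sampleFolder/samplePrefix-PGDMPB0CMS-0.json"
```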