In a single file load, one file at a time is loaded into the target database. Source data files are uploaded to the Agile Data Engine default file load bucket/container, and a notification per file is created in the notify bucket/container.
Files must be named according to the naming convention described in this article. The parameter values given in a file name guide Agile Data Engine in the execution of the file load.
File naming convention
The naming convention for source files:
Parameter details and guidelines:
source entity name
Name of the SOURCE entity. If a SOURCE entity is not defined, the name of the STAGE entity is used.
timestamp in milliseconds
File creation time, which ensures that the file name is unique.
file batch id
Identifier of the data batch from the process that generated the file. The batch id is essential when using the Run ID Logic.
full scan indicator
Describes whether the file contains the full data from the source or only a part of it. When set to true, the target table is truncated before the file load.
field delimiter
Field delimiter of a CSV formatted data file.
number of header lines to skip
Number of header rows in a CSV formatted data file that should be skipped in the file load.
file format
Data file type. Note that the supported formats depend on the target database product.
compression
Data file compression. Note that the supported compression methods depend on the target database product.
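To illustrate how the parameters above come together in a file name, the following sketch composes one programmatically. The token keywords and their order (table, batch, fullscanned, delim, skiph) are assumptions for this example; always follow the exact naming convention documented for your environment.

```python
# Sketch: compose a source data file name from the load parameters
# described above. The token keywords and their order are illustrative
# assumptions, not the authoritative naming convention.

def build_file_name(
    entity_name: str,          # source entity name (lowercase)
    timestamp_ms: int,         # file creation time in milliseconds
    batch_id: int,             # file batch id
    fullscanned: bool,         # when True, target table is truncated before load
    delimiter: str = "comma",  # field delimiter of a CSV file
    skip_headers: int = 1,     # number of header lines to skip
    file_format: str = "csv",  # data file type
    compression: str = "gz",   # data file compression
) -> str:
    parts = [
        "table", entity_name.lower(),
        str(timestamp_ms),
        "batch", str(batch_id),
        "fullscanned", str(fullscanned).lower(),
        "delim", delimiter,
        "skiph", str(skip_headers),
        file_format,
        compression,
    ]
    return ".".join(parts)

name = build_file_name("CUSTOMER", 1654087930123, 1, True)
```

Note that the helper lowercases the entity name, in line with the guideline below to use lowercase letters in folder and file names.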
Use lowercase letters in folder and filenames.
OPT_FILE_FORMAT_OPTIONS can be used to override the file format options given in the filename. It can also be used to define format options specific to the target database management system that the naming standard does not support.
OPT_DATA_FILE_LOCATION can be used to override the file location in Snowflake and BigQuery. In this case, the data file can be placed in a location other than the default file load bucket/container.
Single CSV file load into a staging table
SOURCE and STAGE entities created in the Designer and deployed into a Runtime environment:
Parameter values for the filename:
File uploaded to (Azure):
File uploaded to (AWS):
Notification file uploaded to (Azure):
In AWS the notification file is automatically created:
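The upload steps above can be sketched in Python as follows. The bucket name, key layout, and local path are hypothetical placeholders for illustration; in AWS only the data file needs to be uploaded, since the notification file is created automatically.

```python
# Sketch: upload a source data file for a single file load into the
# default file load bucket (AWS). Bucket name and key layout are
# hypothetical placeholders, not the actual ADE defaults.

def object_key(entity_name: str, file_name: str) -> str:
    # Hypothetical layout: one folder per source entity, lowercase names.
    return f"{entity_name.lower()}/{file_name.lower()}"

def upload(local_path: str, bucket: str, key: str) -> None:
    # Requires boto3 and valid AWS credentials; shown for illustration only.
    import boto3
    boto3.client("s3").upload_file(local_path, bucket, key)

key = object_key(
    "CUSTOMER",
    "table.customer.1654087930123.batch.1.fullscanned.true.delim.comma.skiph.1.csv.gz",
)
# upload("customer.csv.gz", "my-ade-file-load-bucket", key)  # hypothetical bucket
```

In Azure, a corresponding notification file would additionally be uploaded to the notify container, as described above.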
Once deployed and enabled, the workflow is executed according to its schedule, or it can be triggered manually from Workflow Orchestration.