OPT_INSERT_UPDATE_BY_ID_AS_OVERWRITE
Target databases: Snowflake, Databricks SQL
OPT_INSERT_UPDATE_BY_ID_AS_OVERWRITE is a load option that enables 1-1 loading with a key: instead of separate insert and update loads, Agile Data Engine generates a single INSERT OVERWRITE statement that rewrites the target table.
Usage
OPT_INSERT_UPDATE_BY_ID_AS_OVERWRITE: Boolean
Default: false
Notes
OPT_INSERT_UPDATE_BY_ID_AS_OVERWRITE can be used in table loads that have a single entity mapping (i.e. one source entity) and attributes with the following attribute types defined (see the illustrative table sketch after these notes):
DV_LOAD_TIME
either DV_HASHKEY or DV_REFERENCING_HASHKEY
DV_UPDATE_TIME (optional)
Hint: This option can also be used with Run ID logic.
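To illustrate these requirements, the sketch below shows a hypothetical target table whose columns could be mapped to the required attribute types. The column names (latest_entries_key, borough, zone, dv_load_time, dv_update_time) are assumptions made for this example only; the actual structure is defined by the entity and its attribute mappings in Agile Data Engine.

-- Illustrative sketch only; names are assumptions, not generated by Agile Data Engine
CREATE TABLE tgt.latest_entries (
    latest_entries_key  VARCHAR(32) NOT NULL,  -- mapped as DV_HASHKEY (or DV_REFERENCING_HASHKEY)
    borough             VARCHAR(100),          -- regular attributes from the source entity
    zone                VARCHAR(100),
    dv_load_time        TIMESTAMP,             -- mapped as DV_LOAD_TIME
    dv_update_time      TIMESTAMP              -- mapped as DV_UPDATE_TIME (optional)
);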
Examples
Insert, Update Load by ID as OVERWRITE option
This example presents how an entity load would work with this logic enabled.
Assumptions:
Data is staged to the table stage.stg_taxi_zone_lookup and will be loaded into tgt.latest_entries
The feature is enabled for the load by setting the option:
OPT_INSERT_UPDATE_BY_ID_AS_OVERWRITE: true
Resulting DML generated by Agile Data Engine:
INSERT OVERWRITE INTO tgt.LATEST_ENTRIES
-- Collect rows from the target whose key does not exist in the source
SELECT
    ....
FROM
    tgt.LATEST_ENTRIES trg
WHERE
    NOT EXISTS (
        SELECT
            1
        FROM
            (
                SELECT DISTINCT
                    <key formula> AS latest_entries_key
                FROM
                    stage.STG_TAXI_ZONE_LOOKUP src_entity
            ) src
        WHERE
            trg.latest_entries_key = src.latest_entries_key
    )
UNION ALL
-- Rows from the source
SELECT DISTINCT
    ...
FROM
    (
        SELECT
            <key formula> AS latest_entries_key
            ...
        FROM
            stage.STG_TAXI_ZONE_LOOKUP src_entity
    ) src
-- LEFT OUTER JOIN is used if both DV_LOAD_TIME and DV_UPDATE_TIME are defined (to preserve the initial insertion time)
LEFT OUTER JOIN tgt.LATEST_ENTRIES trg ON (
    trg.latest_entries_key = src.latest_entries_key
);
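To make the pattern more concrete, below is a Snowflake-flavored, filled-in sketch of the same statement. The key formula (an MD5 hash of locationid) and the columns borough, zone, dv_load_time and dv_update_time are assumptions for this illustration only; the actual column list and key formula are derived from the entity mapping, so the SQL generated by Agile Data Engine will differ.

-- Illustrative sketch only; key formula and column names are assumptions
INSERT OVERWRITE INTO tgt.latest_entries
-- Keep target rows whose key is not present in the staged data
SELECT
    trg.latest_entries_key,
    trg.borough,
    trg.zone,
    trg.dv_load_time,
    trg.dv_update_time
FROM
    tgt.latest_entries trg
WHERE
    NOT EXISTS (
        SELECT 1
        FROM (
            SELECT DISTINCT MD5(CAST(src_entity.locationid AS VARCHAR)) AS latest_entries_key
            FROM stage.stg_taxi_zone_lookup src_entity
        ) src
        WHERE trg.latest_entries_key = src.latest_entries_key
    )
UNION ALL
-- Insert new and changed rows from the staged data
SELECT DISTINCT
    src.latest_entries_key,
    src.borough,
    src.zone,
    COALESCE(trg.dv_load_time, CURRENT_TIMESTAMP()) AS dv_load_time,  -- preserve the initial insertion time
    CURRENT_TIMESTAMP() AS dv_update_time
FROM (
    SELECT
        MD5(CAST(src_entity.locationid AS VARCHAR)) AS latest_entries_key,
        src_entity.borough,
        src_entity.zone
    FROM stage.stg_taxi_zone_lookup src_entity
) src
LEFT OUTER JOIN tgt.latest_entries trg ON (
    trg.latest_entries_key = src.latest_entries_key
);

Because the whole load is a single INSERT OVERWRITE, the target table is rewritten on every run: the first branch carries over target rows whose keys are absent from the staged data, and the second branch supplies the inserted and updated rows from the source.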