!!! Writing data to the brand table
Recording data in the brand table, version 4.6.6.6.
In the transformation result, all the data to be recorded in the brand table is already present, with the field names, types, and primary keys configured. Now we can construct the Automatic Box, using the Productivity button.
When we click it, the Automatic Box already has a name and a description. In Connections, we need to define the source and the destination: the source is a text file, and the destination is SQL Server.
Clicking on Source, we see that the transformation is already configured. In Target, however, we click on Objects, then Save, and select the Data Warehouse database. Click Save. Under Properties, we define the table where the data will be recorded, which in this case is Jean Brandt. Click Save, then New, and finally Close.
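To make the configuration concrete, here is a minimal Python sketch of what this source-to-destination mapping amounts to: reading a delimited text file and inserting its rows into the destination table in the Data Warehouse. The connection string, file name, delimiter, and column names (CodeBrand, BrandName, CodeCategory) are illustrative assumptions; in practice the Automatic Box performs this load for us.

```python
# Sketch of the Automatic Box's work: read a delimited text file and
# insert its rows into the destination table in the Data Warehouse.
# Connection details, file path, and column names are assumptions.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=DataWarehouse;"
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

with open("brands.txt", newline="", encoding="utf-8") as src:
    reader = csv.DictReader(src, delimiter=";")
    for row in reader:
        cursor.execute(
            "INSERT INTO [Jean Brandt] (CodeBrand, BrandName, CodeCategory) "
            "VALUES (?, ?, ?)",
            row["CodeBrand"], row["BrandName"], row["CodeCategory"],
        )

conn.commit()
conn.close()
```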
Returning to the list of Automatic Boxes, we may not see the newly created one at first, but after refreshing it appears. This Automatic Box was built with the Productivity button, and its source was already configured from the container selected when we used that button.
We can now check the Jean Brandt table: if we look at its contents, it is still empty. Returning to the PIS, we click on the Automatic Box Load Jean Brandt and execute it; going back to SQL Server and running the query again shows that the Jean Brandt table is now populated.
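The same before-and-after check can be scripted. A small sketch, reusing the hypothetical connection details from above:

```python
# Quick row count on the destination table: 0 before running the
# Automatic Box, greater than 0 after the load completes.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=DataWarehouse;"
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

cursor.execute("SELECT COUNT(*) FROM [Jean Brandt]")
print("Rows in Jean Brandt:", cursor.fetchone()[0])

conn.close()
```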
Additionally, if we look at the Data Warehouse schema, we see a foreign key between the Jean Category and Jean Brandt tables. It links the primary key Code Category in Jean Category to the foreign key Code Category in Jean Brandt. When the data was added, the load respected this foreign key.
Why? First, we loaded the data in the correct order, Jean Category first and then Jean Brandt. Second, the Code Category in the Jean Brandt table was filled by a lookup that retrieved the code previously stored in the Category table. As a result, the ETL process developed in the PIS did not violate referential integrity in the destination tables.
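A sketch of that idea, under the same assumptions as before: Jean Category is loaded first, and each brand row resolves its Code Category through a lookup against that table before being inserted, so the value written to the foreign key column always exists in the parent table. The natural key used for the lookup (CategoryName) is a hypothetical column name.

```python
# Sketch of the lookup step: resolve each brand's category code from
# the already-loaded Jean Category table, then insert into Jean Brandt.
# Column names and the natural key (CategoryName) are assumptions.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=DataWarehouse;"
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

# 1. Jean Category was loaded first, so its codes already exist.
cursor.execute("SELECT CategoryName, CodeCategory FROM [Jean Category]")
category_lookup = {name: code for name, code in cursor.fetchall()}

# 2. Each brand row takes its CodeCategory from the lookup, so the
#    inserted value always matches an existing primary key.
with open("brands.txt", newline="", encoding="utf-8") as src:
    for row in csv.DictReader(src, delimiter=";"):
        code_category = category_lookup[row["CategoryName"]]
        cursor.execute(
            "INSERT INTO [Jean Brandt] (CodeBrand, BrandName, CodeCategory) "
            "VALUES (?, ?, ?)",
            row["CodeBrand"], row["BrandName"], code_category,
        )

conn.commit()
conn.close()
```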
This is a crucial point: we must understand the destination tables and program the loading process so that it does not introduce data inconsistencies. Thank you, and see you in the next video.