Creating the staging tables
The Information Store does not ingest data directly from your data source. Instead, ingestion takes place from staging tables that you create and populate. This abstraction makes it easier for you to align your data with the Information Store, and allows i2 Analyze to validate your mappings before ingestion.
Before you begin
The staging tables that you use to ingest data into the Information Store must conform to a specific structure. For more information about the staging table structure, see Information Store staging tables.
About this task
The simplest approach to Information Store ingestion is to create a staging table for every entity type, and every entity-link-entity type combination, that you identified in your data. The i2 Analyze deployment toolkit and the ETL toolkit both have a command for creating one staging table at a time.
The deployment toolkit command looks like this:
setup -t createInformationStoreStagingTable
-p schemaTypeId=type_identifier
-p databaseSchemaName=staging_schema
-p tableName=staging_table_name
The equivalent ETL toolkit command looks like this:
createInformationStoreStagingTable
-stid type_identifier
-sn staging_schema
-tn staging_table_name
In both cases, type_identifier is the identifier of one of the entity types or link types from the i2 Analyze schema that is represented in your data source.
staging_schema is the name of the database schema to contain the staging tables. (If you are using PostgreSQL or SQL Server, the schema must exist before you run the command. If you are using Db2, the command creates the database schema if it does not exist.)
staging_table_name is the name of the staging table itself, which must be unique, and must not exceed 21 characters in length.
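On PostgreSQL, for example, you can satisfy the prerequisite by creating the staging schema ahead of time with a single statement. The schema name below matches the databaseSchemaName value used later in this topic; the statement is a sketch, and on SQL Server you would use a plain CREATE SCHEMA statement instead, because that platform does not support the IF NOT EXISTS clause for schemas:

```sql
-- PostgreSQL: create the schema that will hold the staging tables,
-- before running createInformationStoreStagingTable.
CREATE SCHEMA IF NOT EXISTS "IS_Staging";
```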
Important: Many of the commands that are associated with the ingestion process modify the database that hosts the Information Store. By default, the commands use the database credentials that you specified during deployment in the credentials.properties file. To use different credentials in the deployment toolkit, add importName and importPassword parameters to the list that you pass to the command. To use different credentials in the ETL toolkit, modify the DBUsername and DBPassword settings in the Connection.properties file.
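For example, the corresponding settings in the ETL toolkit's Connection.properties file might look like this. The user name and password shown are illustrative placeholders, not defaults from any real deployment:

```
DBUsername=etl_ingest
DBPassword=etl_ingest_password
```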
Procedure
If you are using the deployment toolkit, open a command prompt and navigate to the toolkit\scripts directory. If you are using the ETL toolkit, navigate to the etltoolkit directory.
For each entity type or link type that you identified for ingestion, run the createInformationStoreStagingTable command. For example:
setup -t createInformationStoreStagingTable -p schemaTypeId=ET5 -p databaseSchemaName=IS_Staging -p tableName=E_Person
By convention, you create all of the staging tables for the same source in the same database schema, which has the name IS_Staging in this example. It is also conventional to name the staging table itself similarly to the display name of the entity type or link type to which the table corresponds. In this case, the staging table is for the Person entity type.
Note: When the i2 Analyze schema allows the same link type between several different entity types, create separate staging tables for each combination:
setup -t createInformationStoreStagingTable -p schemaTypeId=LAC1 -p databaseSchemaName=IS_Staging -p tableName=L_Access_To_Per_Acc
setup -t createInformationStoreStagingTable -p schemaTypeId=LAC1 -p databaseSchemaName=IS_Staging -p tableName=L_Access_To_Per_Veh
This example illustrates an Access To link type (with identifier LAC1) that can make connections from Person entities to Account entities, or from Person entities to Vehicle entities. The commands create staging tables with different names based on the same link type.
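If you have many types to stage, you can drive the command from a simple list. The following sketch only prints the deployment toolkit commands for review rather than running them; the type identifiers and table names are the examples from this topic, and any real list would come from your own schema:

```shell
#!/bin/sh
# Generate one createInformationStoreStagingTable command per type.
# Each input line is: schema type identifier, staging table name.
cmds=$(while read -r type_id table_name; do
  echo "setup -t createInformationStoreStagingTable" \
       "-p schemaTypeId=$type_id" \
       "-p databaseSchemaName=IS_Staging" \
       "-p tableName=$table_name"
done <<EOF
ET5 E_Person
LAC1 L_Access_To_Per_Acc
LAC1 L_Access_To_Per_Veh
EOF
)
# Review the generated commands before running them from toolkit\scripts.
printf '%s\n' "$cmds"
```

Printing the commands first lets you confirm that every staging table name is unique and within the 21-character limit before anything touches the database.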
Results
At the end of this procedure, you have a set of staging tables that are ready to receive your data before ingestion takes place. The next task is to make your data ready to populate the staging tables.