Creating the staging tables
The Information Store does not ingest data directly from your data source. Instead, ingestion takes place from staging tables that you create and populate. This abstraction makes it easier for you to align your data with the Information Store, and allows i2 Analyze to validate your data before ingestion.
The simplest approach to Information Store ingestion is to create a staging table for every entity type and every entity-link-entity type combination that you identified in your data. The i2 Analyze deployment toolkit and the ETL toolkit both have a command for creating one staging table at a time.
The deployment toolkit command looks like this:
setup -t createInformationStoreStagingTable
-p schemaTypeId=type_identifier
-p databaseSchemaName=staging_schema
-p tableName=staging_table_name
The equivalent ETL toolkit command looks like this:
createInformationStoreStagingTable
-stid type_identifier
-sn staging_schema
-tn staging_table_name
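For example, you might run one of the following commands to create a staging table for an entity type. The values ET5, IS_Staging, and E_Person are placeholders for illustration only; substitute the type identifier from your own i2 Analyze schema and your own schema and table names.

```shell
# Deployment toolkit (values are illustrative)
setup -t createInformationStoreStagingTable -p schemaTypeId=ET5 -p databaseSchemaName=IS_Staging -p tableName=E_Person

# ETL toolkit (same illustrative values)
createInformationStoreStagingTable -stid ET5 -sn IS_Staging -tn E_Person
```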
In both cases, type_identifier is the identifier of one of the entity types or link types from the i2 Analyze schema that is represented in your data source. staging_schema is the name of the database schema to contain the staging tables. (If you are using Db2, the command creates the database schema if it does not exist. If you are using SQL Server, the schema must exist.) staging_table_name is the name of the staging table itself, which must be unique, and must not exceed 21 characters in length.
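Because you typically need one staging table per type, it can help to script the command generation. The following is a minimal Python sketch under stated assumptions: the type identifiers and table names are hypothetical, and the helper only builds deployment toolkit command strings while enforcing the 21-character limit on staging table names.

```python
# Sketch: build one createInformationStoreStagingTable command per schema type.
# The type identifiers and table names below are illustrative placeholders,
# not values from any real i2 Analyze schema.

MAX_TABLE_NAME_LENGTH = 21  # staging table names must not exceed 21 characters


def staging_table_command(type_id, schema, table_name):
    """Build a deployment toolkit command line for one staging table."""
    if len(table_name) > MAX_TABLE_NAME_LENGTH:
        raise ValueError(
            f"Staging table name '{table_name}' exceeds "
            f"{MAX_TABLE_NAME_LENGTH} characters"
        )
    return (
        "setup -t createInformationStoreStagingTable"
        f" -p schemaTypeId={type_id}"
        f" -p databaseSchemaName={schema}"
        f" -p tableName={table_name}"
    )


# Hypothetical entity and link types identified in the data source
types = {
    "ET1": "E_Person",
    "ET2": "E_Vehicle",
    "LT1": "L_Person_Vehicle",
}

commands = [
    staging_table_command(tid, "IS_Staging", name)
    for tid, name in types.items()
]
```

Each string in commands can then be run through the deployment toolkit, one invocation per staging table.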
To use different credentials in the deployment toolkit, add importName and importPassword parameters to the list that you pass to the command. To use different credentials in the ETL toolkit, modify the DBUsername and DBPassword settings in the Connection.properties file.
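For example, the credential entries in Connection.properties might look like the following. The values shown are placeholders, and the file can contain other connection settings that are not shown here.

```properties
DBUsername=etl_user
DBPassword=etl_password
```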
At the end of this procedure, you have a set of staging tables that are ready to receive your data before ingestion takes place. The next task is to prepare your data for loading into those staging tables.