
Showing posts from September, 2007

Values to be used for Staging Table mapping Keys

Hint: I recommend always mapping the Staging Table keys as numeric (leading zero-filled) values to ensure that the multi-threaded functionality in the CC&B batch processes can be utilised as part of your conversion run. Whilst the mapping tool allows for alphanumeric (or trailing space-filled) mappings, these will restrict you to single-threaded instances of all conversion batch tasks. Where the mapping depends on combining multiple source entities into a single concatenated value, always attempt to ensure that the resulting field is numeric, e.g. when concatenating Source Account and Service Identifier (e.g. 12345 'Electricity'), choose notional numeric values for each service type and zero-fill each part to result in a CC&B staging value of '0012345001', i.e. LPAD(Account_Number, 7, '0') || LPAD(DECODE(Service_Type, 'Electricity', '1', 'Gas', '2', 'Water', '3'), 3, '0').
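
As a minimal sketch of the same expression in an extract query (LEGACY_SERVICES, ACCOUNT_NUMBER and SERVICE_TYPE are assumed source names, not part of the toolkit):

-- Sketch only: source table and column names are assumptions.
SELECT LPAD(TO_CHAR(ls.ACCOUNT_NUMBER), 7, '0')
       || LPAD(DECODE(ls.SERVICE_TYPE, 'Electricity', '1',
                                       'Gas',         '2',
                                       'Water',       '3'), 3, '0') AS STAGING_KEY
  FROM LEGACY_SERVICES ls;
-- ACCOUNT_NUMBER = 12345 with SERVICE_TYPE = 'Electricity' yields '0012345001'.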

Inserting values > 4000 characters into CXC_* tables

Question Received: We are encountering the following error ('Error encountered when trying to build a particular SQL statement'). The output of the error message is as follows:

SQLException in com.oracle.ugbu.cc.run.builder.SqlBuildStatementRow
Message: ORA-01461: can bind a LONG value only for insert into a LONG column
SQLState: 72000
ErrorCode: 1461
Statement causing error:
INSERT INTO CXC_SQL_BUILD_STMT ( CXC_SQL_BUILD_ID ,CXC_STEP_SEQ ,CXC_TOP_DOWN_SEQ ,CXC_STATEMENT_SEQ ,STMT_EXEC_CLASS ,CXC_SQL_TEXT ,ROW_VERSION ) VALUES (?,?,?,?,?,?,?)

Answer: This error occurs when the SQL text being inserted into CXC_SQL_BUILD_STMT exceeds 4000 characters. Consider using pre-compiled PL/SQL functions or temporary staging tables to ensure that individual SQL statements are not overly complex.
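
As a minimal sketch of the first suggestion (the function name and the service-code logic are assumptions for illustration only), moving bulky inline DECODE/CASE logic into a pre-compiled function keeps the generated statement well under the 4000-character limit:

-- Sketch only: CM_SERVICE_CODE and its logic are hypothetical; the point is
-- that the mapping can call a short function name instead of repeating a
-- long expression in the generated SQL.
CREATE OR REPLACE FUNCTION CM_SERVICE_CODE (p_service_type IN VARCHAR2)
  RETURN VARCHAR2
  DETERMINISTIC
IS
BEGIN
  RETURN CASE p_service_type
           WHEN 'Electricity' THEN '001'
           WHEN 'Gas'         THEN '002'
           WHEN 'Water'       THEN '003'
           ELSE '999'
         END;
END CM_SERVICE_CODE;
/
-- The mapping can then reference CM_SERVICE_CODE(SERVICE_TYPE) rather than
-- the full expression, keeping each individual statement short.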

Problems connecting to distributed databases using Oracle10g.

Question Received: We have encountered issues connecting to distributed databases using the internal CC&B subsystems (and TNSping), resulting in the following errors:

***********************************************************************
Fatal NI connect error 12557, connecting to:
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=CCBTEST)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=CCBCONVT)(CID=(PROGRAM=C:\spl\CCBCONVT\runtime\CLBDSYNC.exe)(HOST=CCBTEST)(USER=s_cissys))))

VERSION INFORMATION:
TNS for 32-bit Windows: Version 10.2.0.2.0 - Production
Time: 04-SEP-2007 16:26:20
Tracing not turned on.
Tns error struct:
ns main err code: 12557
TNS-12557: TNS:protocol adapter not loadable
ns secondary err code: 12560
nt main err code: 527
TNS-00527: Protocol Adapter not loadable
nt secondary err code: 0
nt OS err code: 0

and

--------------------------------------------------------------------------
--Executing Data Synchronization

Use of Lookups in Conversion Mapping

Question Received: When we develop the data migration, the data conversions will require lookup tables to allow translation of data from the legacy system to CC&B. Where would we specify the lookup tables? We think it may be in the Input Table Definition. Can the Constant Value or Predicate columns be used to specify a SQL statement or script that can translate a value via a lookup table? Does the lookup table have to be declared to the Mapper, or does it just have to be in the Oracle database? Answer: The lookups and all other 'temporary' tables used as part of the conversion effort can be defined at either the input or staging table level; however, if you are building them at the staging table level you should follow the standard CC&B convention of prefixing the tables with 'CM_'. The Constant Value column can definitely be used to build SQL or PL/SQL components to be used for the mapping, and these tables do not need to be defined within the Mapper unless you are also
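
As a rough sketch (the table and column names here are assumptions, not part of the toolkit), a legacy-to-CC&B translation lookup built at the staging level might look like this:

-- Sketch only: CM_SVC_TYPE_LOOKUP and its columns are hypothetical names,
-- following the CM_ prefix convention mentioned above.
CREATE TABLE CM_SVC_TYPE_LOOKUP (
  LEGACY_SVC_TYPE  VARCHAR2(30) NOT NULL,
  CCB_SA_TYPE_CD   CHAR(8)      NOT NULL,
  CONSTRAINT CM_SVC_TYPE_LOOKUP_PK PRIMARY KEY (LEGACY_SVC_TYPE)
);

INSERT INTO CM_SVC_TYPE_LOOKUP VALUES ('Electricity', 'E-RES');
INSERT INTO CM_SVC_TYPE_LOOKUP VALUES ('Gas',         'G-RES');
INSERT INTO CM_SVC_TYPE_LOOKUP VALUES ('Water',       'W-RES');

-- A translation expression of this shape could then be supplied where the
-- mapping accepts SQL, e.g.
--   (SELECT l.CCB_SA_TYPE_CD FROM CM_SVC_TYPE_LOOKUP l
--     WHERE l.LEGACY_SVC_TYPE = <source service type column>)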

Linking instances together

Question Received: Is there a mechanism for linking the creation of several instances in several tables together? This means having a mapping for each row in each table created. We may need several instances of several mappings to be run in sequence. Some rows may also be optional based upon a business rule. Answer: The concept of linking the mappings together (and the order in which each is executed) is handled by Conversion Central based on the sequence number of the mapping sequence attached to the source definition and the related column mappings.

Staging to Prod schema validation

Question Received: What is required to migrate data from the final staging tables into the production schema? The documentation refers to validation, creating keys, and various types of balances. What quality does the data need to have? There are outstanding issues which raise questions about some data items, and configuration is still not complete for every data item. Answer: In order to migrate the data from staging to Production, it is expected that the following will hold true:

1. All validation performed via the front-end validation routines will be met by the converted data (all table fields will meet the business needs and code-based tables contain values defined in the Config tables).
2. All primary keys on the staging tables are unique.
3. All foreign keys on the staging tables relate to parent records (no orphan foreign keys exist).
4. All inter-table relationships have been satisfied for a particular business object (e.g. a Person has at least one record on Person ID, and a Bill,
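
A rough sketch of the kind of checks implied by points 2 and 3, using CI_ACCT / CI_SA as an example parent/child pair (the column names are assumptions; apply the same pattern to each staging table):

-- 2. Duplicate primary keys in a staging table
SELECT ACCT_ID, COUNT(*)
  FROM CI_ACCT
 GROUP BY ACCT_ID
HAVING COUNT(*) > 1;

-- 3. Orphan foreign keys (child rows with no parent)
SELECT sa.SA_ID, sa.ACCT_ID
  FROM CI_SA sa
 WHERE NOT EXISTS (SELECT 1 FROM CI_ACCT a WHERE a.ACCT_ID = sa.ACCT_ID);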

Multiple Output Rows from a single Input Row

Question Received: How do we use the CC&B Conversion Mapping tools to create several rows in a staging table for one row in a source input table? For example, if we have an account with one service in the legacy system, this may be represented by several service agreements in CC&B. We create one CI_ACCT row for the Legacy System Account. We then need to create several CI_SA rows under it. In some cases we will know the specific details of each row that must be created with a specific type of service, e.g. water usage or drainage. In other cases we may have an unknown number of rows to create. Answer: Creating multiple staging rows in CC&B for a single source row is accomplished by specifying multiple mapping sources (relating to the same input table). I am not sure that I understand how you could have an 'unknown' number of transactions to create; there has to be a business or conversion rule that specifies what records of defined types are required.
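
Purely as an illustration of the fan-out (outside the mapping tool itself, and with assumed legacy table and rule-table names), one input row driving several CI_SA staging rows is equivalent to a join against the conversion rule set:

-- Sketch only: LEGACY_ACCOUNT and CM_SA_TYPE_RULES are assumed names; in the
-- tool the fan-out is configured as multiple mapping sources against the same
-- input table, but the resulting row set looks like this join.
SELECT la.ACCOUNT_NUMBER,
       r.SA_TYPE_CD
  FROM LEGACY_ACCOUNT la
  JOIN CM_SA_TYPE_RULES r
    ON r.LEGACY_SERVICE = la.SERVICE        -- one rule row per CI_SA to create
 ORDER BY la.ACCOUNT_NUMBER, r.SA_TYPE_CD;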

Iterative Conversion

Question Received: I am toying with the idea of migrating the current information first and the remainder of the history at a later date. I have heard that the conversion tool copes with this, but I haven't found any information on how it does. Answer: The Toolkit will support iterative conversions as long as the original master data key tables (the CK_* tables) are not cleared down from Staging (the already converted transactional data would need to be cleared down) and the Production instance being migrated into is actually Production. (We have migrated into a pre-prod instance in the past and then unloaded this and loaded it into the real PROD instance, but this will not work for your situation; you need to be migrating directly into your intended environment.) In this case the migration tool will still know all about the original keys and the generated keys for the primary objects (Account, SA, etc.) and as such it will be able to link the data converted as part of a second pass
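
As a loose sketch of why the retained key tables matter (the CK_ACCT name and its column names are assumptions; the real layout of the CK_* tables may differ), second-pass historical data can resolve to the keys generated in the first pass:

-- Sketch only: assumed column names OLD_KEY_VAL (legacy key) and NEW_KEY_VAL
-- (CC&B-generated key); LEGACY_HISTORY_EXTRACT is a hypothetical second-pass
-- source table.
SELECT hist.LEGACY_ACCT_NO,
       ck.NEW_KEY_VAL AS ACCT_ID        -- key generated during the first pass
  FROM LEGACY_HISTORY_EXTRACT hist      -- second-pass (historical) source data
  JOIN CK_ACCT ck
    ON ck.OLD_KEY_VAL = hist.LEGACY_ACCT_NO;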