tag:blogger.com,1999:blog-52899985570863779882024-03-12T14:23:59.126+11:00Stu's Braindump.. are we there yet?Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.comBlogger39125tag:blogger.com,1999:blog-5289998557086377988.post-29567748636239635852021-03-19T09:21:00.000+11:002021-03-19T09:21:12.417+11:00I'm Back..<p> It has been far too long between posts, but I am back!!! Still working as a member of the Oracle Utilities Black Belt Team, and still trying to post at a somewhat regular cadence (although some would argue that a gap of about 10 years is not very regular).</p><p><br /></p><p>Watch this space for more posts on the use of Oracle Utilities products and recommendations on optimising implementations.</p>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-48785953944655526312011-04-15T16:16:00.003+10:002011-04-18T14:23:54.617+10:00Moving Home.I have moved my blog back within the Oracle blog-space. 
Please update your links and RSS feeds to point to my new home at <a href="http://blogs.oracle.com/stusbraindump">http://blogs.oracle.com/stusbraindump</a>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-79964989608325366652011-01-07T10:56:00.000+11:002011-01-07T10:56:14.279+11:00Controlling your VersionsA handy hint from one of my colleagues (thanks SG) is to populate the VERSION field on each record with a unique number for each of your various data sources (i.e. 1000, 2000, 3000) to allow for easy identification of converted data at a later date, as well as allowing for identification of the associated conversion stream which created this record (assuming each source has a different set of mapping rules).Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-10703875223132964722010-10-26T15:09:00.004+11:002010-10-26T16:35:23.142+11:00Embedding web-based external applications within the framework.My current project contained a requirement to embed an external application within the CC&B framework and initiate calls to this application supplying a number of context elements to ensure that the correct information is displayed. The client also requested that the application be 'skinned' to maintain the CC&B look-and-feel.<br />
<br />
This is how we implemented it..<br />
<ol><li>A number of Business Objects to manage retrieval of data from some of the base CC&B objects in a flattened format (i.e. Characteristics mapped to descriptive elements, unused fields not defined, etc).</li>
<li>A UI Map which forms the canvas for 'painting' the application within our framework.</li>
<li>A Query Zone to ensure that the required data associated with the Context Zones is retrieved (using Zone type F1-DE-QUERY).</li>
<li>A Business Service to wrap the call to the Query Zone into a usable service.</li>
<li>A Script to initiate the call to the Business Service with the associated Context Identifiers (from the global context).</li>
<li>A Portal used to render the solution within the Framework. </li>
<li>An Application Zone to link the Business Service to the UI Map (using Zone type F1-MAPEXPL) and the Portal.</li>
<li>A Navigation Key to reference the Portal's Service Program.</li>
<li>A Navigation Option to link the new functionality into the Menu subsystem.</li>
<li>Context Menu Items to link the new functionality to the Service Point and Service Agreement entities (our link to the external application is based on the Geographic Identifiers on the SP).</li>
<li>The required User Group entries to ensure that access is provided to the new Application Service associated with the Menu defined above.</li>
</ol>We also provided the application developer with cloned copies of the Oracle Utilities CSS files to ensure a consistent look and feel.<br />
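As a rough illustration of the context hand-off described above, the call to the external application ultimately amounts to building a URL from values held in the global context. Everything below (host, path and parameter names) is invented for illustration and is not part of the base product:

```python
from urllib.parse import urlencode, urlunsplit

def build_external_url(host, path, context):
    """Assemble the call to the embedded application, passing the
    context elements (e.g. the SP's Geographic Identifier) as query
    parameters. All names here are hypothetical examples."""
    query = urlencode(context)
    return urlunsplit(("https", host, path, query, ""))

# Hypothetical geographic identifier and service point id from the
# global context:
url = build_external_url(
    "gis.example.com", "/viewer",
    {"geoId": "1234567890", "spId": "0012345678"},
)
```

In the real solution this URL construction sits in the Script/UI Map layer rather than in external code, but the shape of the hand-off is the same.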
<br />
The application is then rendered as:<br />
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQT3AfUOzzJd7t2zofyCmJiBqX3JyCYPRHv41F0f72FnuvPZmzFqL4LibsmjSSyA9FHCZGwnqGZ0XR_7Odsvjz_WB0XLd3oev1S81eZJ-kX76TvGCl1YEsYa3ZoBvNgG3TZGSFuLWqpIZA/s1600/embedded.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="190" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQT3AfUOzzJd7t2zofyCmJiBqX3JyCYPRHv41F0f72FnuvPZmzFqL4LibsmjSSyA9FHCZGwnqGZ0XR_7Odsvjz_WB0XLd3oev1S81eZJ-kX76TvGCl1YEsYa3ZoBvNgG3TZGSFuLWqpIZA/s400/embedded.jpg" width="400" /></a></div><br />
<br />
<br />
We have used similar methods to render our online bill viewer within a pop-up window: an in-house developed webpage calls our CC&B Bill Extract algorithm, so the same code serves both the online bill view and the batch bill extract routines, with no need for additional software such as Business Objects or DOC1.<br />
<br />
Feel free to drop me a message if you want any further detail.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com5tag:blogger.com,1999:blog-5289998557086377988.post-67571235271628115282010-03-19T07:22:00.001+11:002010-03-19T07:24:10.011+11:00Somewhat off-topic..I have been unable to access one of my Windows Live accounts (via Messenger and Mail) for the last couple of weeks with every attempt failing with an 80048820 error stating that the service was unreachable. Other accounts on the same machine connect successfully, and attempts to log on from different devices result in the same message (for this account only).<br />
<br />
It appears that Microsoft have changed their approach to what is allowed in your Live Profile Name, and no longer support special characters (such as &).<br />
<br />
A quick review of <a href="http://windowslivehelp.com/thread.aspx?threadid=a65ac76d-902f-4b9a-ba4d-3450597bdfb5%20">this page</a> should point you in the right direction if you are experiencing the same issues.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-75592810242329152302010-03-03T10:40:00.000+11:002010-03-03T10:40:02.218+11:00Updates on the ShortenSpotA blog run by my colleague, Anthony Shorten, contains some very handy updates in regards to Production Configuration and the new Batch Submission methods released recently. I strongly recommend that you pay a visit to his blog.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-75281063255930565862010-02-02T12:15:00.000+11:002010-02-02T12:15:29.773+11:00I can't remember if I had a memory leak..Memory leaks seem to be one of our biggest issues on my current project (CC&B 2.2.0 on Windows Server x64 2003 R2). A review of the components experiencing these leaks has led us to upgrade the following (with thanks to Josh, Ken and Andre for their assistance):<br />
<ul><li>Sun Java has now been patched from 1.5.0_09 to 1.5.0_22</li>
<li>BEA Weblogic 10.0 has been patched from MP1 to MP2</li>
<li>Single Fix <b>8882447</b> is recommended if your implementation includes complex plug-in scripts, since these appeared to suffer from a defect in the caching of PreparedXQuery elements, resulting in them consuming large amounts of memory.</li>
<li>Review all custom batch modules to ensure that they make use of the <b>createJobWorkForEntityQuery</b> method to build the JobWork object, or use <b>createJobWorkForQueryIterator</b> and <b>createWorkUnit</b> if you need to add supplemental data. These methods cache the thread work units to a temporary disk file to reduce memory consumption, instead of managing the ThreadWorkUnits list inside the code. Further details are available in the PFD for Single Fix <b>7447321</b>.</li>
</ul>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-9670508287466208212009-12-21T16:28:00.001+11:002009-12-22T08:31:37.134+11:00My Oracle Support Community linkThe "My Oracle Support" Community Customer Care and Billing page is now available at <a href="https://communities.oracle.com/portal/server.pt/community/customer_care_and_billing">https://communities.oracle.com/portal/server.pt/community/customer_care_and_billing</a> for all registered "My Oracle Support" users.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-45872983648958508032009-12-03T08:53:00.000+11:002009-12-03T08:53:11.278+11:00Service with a Smile..Oracle Utilities Framework 2.2.0 SP6 (Patch number 9042811) and Customer Care & Billing 2.2.0 SP6 (Patch number 9042819) are now available for download from My Oracle Support. Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-2917586427554403882009-11-19T09:42:00.001+11:002009-11-19T09:43:07.310+11:00Handy Links<div style="font-family: inherit;">A couple of handy My Oracle Support (Metalink) links for TUGBU customers...<br />
<br />
ID 804664.1 - <a href="https://support.oracle.com/CSP/ui/flash.html#tab=KBHome%28page=KBHome&id=%28%29%29,%28page=KBNavigator&id=%28viewingMode=1143&bmDocID=804664.1&from=BOOKMARK&bmDocType=HOWTO&bmDocDsrc=DOCUMENT&bmDocTitle=Important%20Patches%20for%20Customer%20Care%20&%20Billing%20V2.1.0%29%29">Important Patches for Customer Care & Billing V2.1.0</a><br />
ID 804612.1 - <a href="https://support.oracle.com/CSP/ui/flash.html#tab=KBHome%28page=KBHome&id=%28%29%29,%28page=KBNavigator&id=%28viewingMode=1143&bmDocID=804612.1&from=BOOKMARK&bmDocType=HOWTO&bmDocDsrc=DOCUMENT&bmDocTitle=Important%20Patches%20for%20Customer%20Care%20&%20Billing%20V2.2.0%29%29">Important Patches for Customer Care & Billing V2.2.0</a><br />
ID 804706.1 - <a href="https://support.oracle.com/CSP/ui/flash.html#tab=KBHome%28page=KBHome&id=%28%29%29,%28page=KBNavigator&id=%28viewingMode=1143&bmDocID=804706.1&from=BOOKMARK&bmDocType=HOWTO&bmDocDsrc=DOCUMENT&bmDocTitle=Important%20Patches%20for%20Enterprise%20Taxation%20Management%20V2.1.5%29%29">Important Patches for Enterprise Taxation Management V2.1.5</a><br />
<br />
</div>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-29569413412940256352009-10-14T14:24:00.005+11:002009-10-28T13:16:35.572+11:00Budget Types<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKQ-0aUeiJI2i3STGct0G2FNzqhogmpFiA-mdLujZDmSnGb5tBG3NxGFWzPlPXFZqC8E_MPZdQQRcMT2Mo0_MHhEE-6zqZz7ZcfaMRvF29Tm17zOnRO1jXG7KcpL85qQ6yEXrmJ0Zp7DYm/s1600-h/Budgets+Overview.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKQ-0aUeiJI2i3STGct0G2FNzqhogmpFiA-mdLujZDmSnGb5tBG3NxGFWzPlPXFZqC8E_MPZdQQRcMT2Mo0_MHhEE-6zqZz7ZcfaMRvF29Tm17zOnRO1jXG7KcpL85qQ6yEXrmJ0Zp7DYm/s320/Budgets+Overview.jpg" /></a><br />
</div>I have generated the following slide in an attempt to ease some of the confusion around what each of the various budget subsystems does in CC&B. It should be noted that in the past we have 'enhanced' the Direct Debit extract to ensure that the balance left on the account at due date is not collected from customers' bank accounts when they have specific types of NBB plans in force; this allows us to configure both Balance-on-due-date and Bill-smoothing type structures within the NBB subsystem.<br />
<br />
As can be seen from this table, there is no single plan type which will cater for all permutations of a client's payment offerings, and as a result most implementations make use of a combination of these components to deliver the required functionality.<br />
<br />
(edit: minor update to reflect the fact that Payment Plan entries are not deleted once they are processed, whereas NBB entries are).Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-75224613974081827142009-09-23T11:30:00.002+10:002009-10-15T09:18:33.832+11:00Better than a Tardis..Changing the System Date in CC&B (to cater for testing requirements where the system has to be artificially rolled backwards/forwards) used to be done via the Database initialisation parameters and the Batch Run Date parameter. But it appears that FEATURE CONFIGURATION now supports this requirement.<br />
<br />
Simply create a "General System Configuration" Feature Configuration instance (we tend to call ours CMTIMETRAVEL, but there is no restriction on this in the system), and define Option "System Override Date" with the required value.<br />
<br />
Once again, be aware that it is not recommended that you roll your dates back and forward, since this makes investigation of defects a bit more complicated, but it definitely has its place in relation to test execution (especially Collection and Severance testing, where the process runs across a large number of days and test cycles are typically defined for a much smaller period).Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-21007302982930326222009-09-18T09:14:00.001+10:002009-09-18T09:16:29.635+10:00Updated LinksA quick update to include "Gaffen Central" in my links zone. Allan's Blog tends to focus on Technical Architect aspects of Oracle UGBU FW/CC&B installs & upgrades and his page is well worth a visit.<br />
<br />
I have also included a couple of RSS registration links for readers who would like to subscribe to my blog via RSS, and changed the Top Tags zone to make it a bit easier to read.<br />
<br />
Enjoy.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-6978007131556875552009-09-18T09:08:00.002+10:002009-10-15T09:18:21.778+11:00Bundle Corrections.We have identified an issue with the use of bundling and UI Maps, whereby the resulting Bundle was wrapping the UI Map in a set of CDATA tags, but only after it had already removed all formatting from the source records, resulting in corruption of the UI Map HTML fragments. <br />
<br />
This has now been corrected by implementing Single Fix 8228025 on FW/CC&B 2.2.0.<br />
<br />
If you are using the Bundling subsystem in your implementation, I strongly recommend that you implement this patch.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com4tag:blogger.com,1999:blog-5289998557086377988.post-58980511918992885392009-09-08T11:15:00.005+10:002009-09-09T16:17:04.104+10:00A Little Bundle of Joy..The Bundle entity was introduced in CC&B 2.2.0 as a single fix (Patch 7009383 and part of Service Pack 2), and can prove useful in relation to the management of the more complex "Cool Tool" reference data components. This entity supports simple Version Control structures by taking a snapshot of the data elements at a point in time, and wrapping this as an XML component for implementation in a target environment.<br /><br /><span style="font-weight:bold;">1. A quick overview of the concepts...</span><br />1.1. Base functionality in bundling supports some of the more complex data structures.. i.e. Data Area, Script, Business Service, etc.<br />1.2. We can extend this functionality by:<br /> 1.2.1 generating a custom BO for the Maintenance Object and associating this with a complete schema (by clicking on the Generate Schema button on the dashboard)<br /> 1.2.2 navigating to the Maintenance Object associated with the underlying records we are trying to migrate, and defining the following MO Options.. <br /> 1.2.2.1 Eligible for Bundling = 'Y'<br /> 1.2.2.2 Physical BO = the business object created in 1.2.1 above.<br /> 1.2.2.3 FK Reference = the primary key of the Maintenance Object (if none exists, create one in the Foreign Key Metadata).<br />NB: Ensure that the custom Business Objects created above exist in all environments and are named consistently.<br /> <br /><span style="font-weight:bold;">2. To Create an Export Bundle:</span><br />2.1. Log on to the Source Environment (e.g. CONTROL).<br />2.2. Navigate to Admin Menu, Export Bundle+.<br />2.3. Provide a descriptive Name for your bundle.<br />2.4. Save the bundle.<br />2.5. 
Navigate to the components to be added to the bundle (via standard Admin menu navigation).<br />2.6. Retrieve the records which form part of this bundle and click on the Add button in the Bundle zone on the dashboard (the Bundling dashboard option only appears when you are maintaining an object against which bundling is supported; see 1.1 and 1.2 above).<br />2.7. Navigate to the Bundle Export function, retrieve your bundle and click on the 'Bundle' button.<br /> <br /><span style="font-weight:bold;">3. To Create an Import Bundle:</span><br />3.1. Log on to the target environment.<br />3.2. Navigate to Admin Menu, Import Bundle+.<br />3.3. Provide a descriptive Name for your bundle.<br />3.4. Cut-and-paste the contents of the 'Bundle Entities' component from the Export Bundle created in 2 above into your new bundle created in 3.3 above.<br />3.5. Save the bundle.<br />3.6. Click on 'Apply' to submit this bundle to your environment.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-27111403603215211682009-09-01T14:29:00.007+10:002009-10-15T09:18:09.313+11:00Config Lab<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUty9X4t7JL0uou4-63F0R0MMlyfCx0aZ-CWr7JD5JzmwMbFdG3_wRhYfXQv76lRCk9PTGtXrF29AktgKFN0VGECcDdbCWAvK1E0k-gLEldeoUbFVLbmQoZ-dwM3xe1pqkl8mkJK3YiK7G/s1600-h/ConfigLab.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUty9X4t7JL0uou4-63F0R0MMlyfCx0aZ-CWr7JD5JzmwMbFdG3_wRhYfXQv76lRCk9PTGtXrF29AktgKFN0VGECcDdbCWAvK1E0k-gLEldeoUbFVLbmQoZ-dwM3xe1pqkl8mkJK3YiK7G/s320/ConfigLab.jpg" /></a><br />
</div><br />
The Oracle Utilities Software Configuration Management guidelines have a number of recommendations which should be adhered to.. not least of which is the implementation of a structured method of deploying config/reference data in a manner which allows the on-site team to guarantee the environment content (code and data) for each release, before users start using it.<br />
<br />
In the past I have always followed these guidelines in regards to migration of code and associated technical reference data (batch controls, Algorithm Types, Lookups, etc), but I have been less diligent in tracking all Config through the recommended migration paths, and have tended to adopt a Hub-and-Spoke approach to pushing reference data out from the Master copy environment.<br />
<br />
Unfortunately this means that the management of this data then becomes a bit of a nightmare for the Config Team, and although the Bundling subsystem developed as part of 2.2 goes some way to providing version control functionality for some reference data, it is largely restricted to the new "Cool Tools" functionality (i.e. Scripts, Zones, UI Maps, etc.), though this can be extended for some data types via the Business Object configuration.<br />
<br />
As a result of this, I have recommended that some of our local installs switch over to a reference data migration path which more closely aligns with the standard cascading migration, as detailed above.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-2324923159987312162009-09-01T14:23:00.004+10:002009-10-15T09:17:53.271+11:00Synchronise AccountsThe Synchronise Account functionality has been available in CC&B for quite a while now, and whilst we tend to shy away from using it, it has its place in environment management. <br />
<br />
With the recent performance fixes to the CLBDSYNC module implemented on FW 2.2, this functionality now allows us to transfer subsets of records between environments in a timely manner.<br />
<br />
Whilst it is not designed to be used for large sets of data, it does allow us to pull records from our PROD copy instance back into the Pre-Prod environments to provide a basis for problem investigation. <br />
<br />
Definitely something for you to investigate/consider as part of your on-going environment management effort.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-34282534761557130952009-05-01T12:40:00.002+10:002009-05-01T12:44:39.774+10:00Links...I have updated the link to Anthony's blog to reflect the fact that this has now moved from blogspot.com back within Oracle.com.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-87824155044210011312009-04-24T08:16:00.008+10:002009-10-17T16:04:15.921+11:00My Generation..I am aware of a number of performance issues with converting a large amount of transactional data, specifically around the Key Generation routines and the fact that these jobs cannot be multi-threaded (resulting in massive rollback segments as a single SQL statement attempts to process all transactions in one hit).<br />
<br />
As a result, I have looked into the ability to tune this conversion phase and recommend the following:<br />
1. Review the code in CIPVBSGK (via AppViewer) for an example of how CC&B conversion generates keys.<br />
2. Consider dropping all indexes on the CK_* tables before running KeyGen to reduce the amount of I/O (rebuild them all as soon as KeyGen has completed, so that other dependent tables do not suffer slowdown as a result of full table scans).<br />
3. Ensure that Rollback Segments, Transaction tables (i.e. Bills, FTs, Payments, Meter Read, etc), Master Data tables (i.e. Account, SA, etc) and Indexes are allocated their own table-spaces and I/O channels at the database level to ensure maximum throughput (my personal preference is to split the Meter Read and Interval-related Data sets out to yet another table-space and I/O channel, but this will be dependent on the disk structure on your machines).<br />
4. Assess whether your partitioning is efficient (or even needed). Partitioning is done to speed up DB accesses where the application is I/O bound; it is not required in order for CC&B to run. If the Database is configured correctly on a single table-space, feel free to leave it as-is.<br />
5. Be aware that running the validation processes for every x records does expose you to the risk that not all errors will be encountered, but it does allow you to quickly identify the big-ticket items that are incorrect in your mapping and get these out of the way easily. I recommend starting out validating every x records as part of your initial conversion runs, and setting the value to 1 for all of your dress-rehearsal iterations.<br />
<br />
Failing this, some sites have elected to customise the KeyGen. In the event that this is your preferred approach, take care to:<br />
1. Consider running custom SQL to perform your key generation, aligned with your partition ranges (i.e. 10 concurrent SQL statements with high and low ranges aligned with your 10 partitions, or a multiple of this split (e.g. 20 or 30 SQL statements each performing bite-sized chunks of each partition range)).<br />
2. Consider running the base KeyGen process in De-duplication mode after your custom SQL has completed to ensure that no duplicates are encountered (if they are, this process will 'bounce' the SQL-generated value to another unique value).<br />
3. Note that no manipulation of the input keys is expected as part of your Extract or Load processes in order to complete any of this functionality. <br />
4. Make sure that KeyGen (or your custom SQL) is executed in the order specified by the online help to ensure that inherited keys are generated correctly (e.g. Account should be run first, since most transactional data inherits a component of the primary key from the Account Id).<br />
5. Ensure that you create the initial "null-key" value on the CK_* table to support optional foreign key logic performed as part of the insert into PROD.<br />
6. Ensure that your key fields are all zero-filled numeric values with no trailing spaces (see my earlier posting in relation to <a href="http://stusbraindump.blogspot.com/2007/09/values-to-be-used-for-staging-table.html">Staging Table Keys</a>). <br />
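The zero-filled key format from the last point can be sketched as a small helper; this is illustrative only, and the width of 10 matches the typical CC&B key columns, but verify this against your own schema:

```python
def format_staging_key(value, width=10):
    """Left-pad a numeric key to the column width with zeros.
    An unpadded value, or one carrying trailing spaces, will not
    join correctly against the generated CK_* entries."""
    key = str(int(value)).zfill(width)  # int() also strips any stray spaces
    if len(key) > width:
        raise ValueError(f"key {value} exceeds column width {width}")
    return key

# e.g. format_staging_key(123) and format_staging_key("42 ")
# both yield fixed-width, space-free keys.
```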
<br />
Regardless of the approach adopted: <br />
1. Consider bringing down the Staging CC&B application during your runs to reduce the risk of table/record locking by individuals logged on to the environment while the batches are running (feel free to bring it up at various points in the overall Conversion process to perform sanity checks, but these must be well-defined milestones in the conversion plan, and the application should then be shut down again once this phase has completed).<br />
2. The performance of the Validation steps seems to be affected, to a large degree, by the existence of Characteristics on entities; as a result, expect your runs to take longer if your data model calls for a large number of characteristics.<br />
3. Financial Transactions that are converted with REDUNDANT_SW = 'Y' will normally be exempt from all further processing in CC&B (with the exception of bill segment processing, direct queries and archiving); as a result, the actual key value assigned is of less importance than for active transactions.<br />
4. Cross-partition population of transactions for a single account will not "break" CC&B; it will simply incur a minor performance hit when retrieving these transactions.<br />
5. Because historic Bill Segments are normally accessed by the next bill creation process regardless of the REDUNDANT_SW, it is recommended that inherited key structures be retained for this entity.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-17177927500729455472008-12-04T09:12:00.007+11:002009-10-15T09:13:30.607+11:00Database Upgrades and their impact on Conversion<span style="color: #660000;">Hint: It appears that the current strategy for applying Single Fixes and Service Packs no longer supports running these scripts against a staging schema (at least as of CC&B 2.x.x). As a result it is recommended that you drop and rebuild the Conversion schema from the Production instance once this has been upgraded with the latest patch-sets.<br />
<br />
We have also noted that application of these single fixes may require a redelivery of all custom Java code from the development environment, since the Java compilation is not performed as part of the patch install process (unlike the Cobol implementation process).</span>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com4tag:blogger.com,1999:blog-5289998557086377988.post-13892307878547203402008-11-24T13:21:00.004+11:002009-10-15T09:13:19.972+11:00Foreign Key Characteristic ValuesWe have encountered issues where Conversion runs in CC&B 2.2.0 (and possibly previous versions) result in Foreign Key characteristics being populated with a value which contains trailing spaces. Whilst CC&B handles these correctly, it appears that the majority of reporting steps defined as part of the reconciliation processes do not do the required trimming of values, resulting in no match being found.<br />
<br />
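The mismatch is easy to reproduce in miniature. The table and column names below are invented, and SQLite stands in for the database; Oracle's TRIM serves the same purpose in the real reconciliation SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE conv_char (char_val TEXT)")   # hypothetical converted FK characteristic
conn.execute("CREATE TABLE recon (expected_val TEXT)")   # hypothetical reconciliation extract
conn.execute("INSERT INTO conv_char VALUES ('1234567890   ')")  # padded by conversion
conn.execute("INSERT INTO recon VALUES ('1234567890')")

# Without TRIM the padded value never matches, so the record looks "missing".
naive = conn.execute(
    "SELECT COUNT(*) FROM conv_char c JOIN recon r ON c.char_val = r.expected_val"
).fetchone()[0]

# With TRIM in the selection criteria the join succeeds.
trimmed = conn.execute(
    "SELECT COUNT(*) FROM conv_char c JOIN recon r ON TRIM(c.char_val) = r.expected_val"
).fetchone()[0]
```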
I recommend that all reconciliation reporting of Foreign Key values includes the necessary 'TRIM' function calls in the selection criteria.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-79323472290869639762008-10-20T11:01:00.006+11:002009-10-15T09:13:07.063+11:00Running Keygen for Partial Conversion Runs<span style="color: #660000;">Hint: Whilst the majority of the CC&B Conversion process can be run as a subset of steps dependent on the type of conversion run being executed (i.e. Convert on Person will only require that the Validation and Production steps specific to the Person entity be executed), it is important to note that once a project moves onto a conversion run involving a more complete 'V' structure, the need to run all KeyGen steps becomes mandatory.</span><br />
<br />
<span style="color: #660000;">KeyGen will attempt to allocate a new CC&B key value for each record on the associated parent table, but it also creates a single 'blank' key value which is used by all Production Insert steps to link to optional key values. A prime example of this is the CI_ACCT MAILING_PREM_ID: this field is optional on the CI_ACCT record, but unless the associated CI_PREM KeyGen has been executed to create a ' ' entry on the CK_PREM table, none of the CI_ACCT records will be transferred from Staging to Production.</span><br />
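A minimal sketch of why that blank entry matters, using invented, heavily simplified tables in SQLite (the real Production Insert SQL is considerably more involved):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ci_acct (acct_id TEXT, mailing_prem_id TEXT)")
conn.execute("CREATE TABLE ck_prem (old_prem_id TEXT, new_prem_id TEXT)")
conn.execute("INSERT INTO ci_acct VALUES ('0001', ' ')")  # account with no mailing premise

def insert_step():
    # The Production Insert joins every account to CK_PREM,
    # even when the optional key is blank.
    return conn.execute(
        "SELECT a.acct_id, k.new_prem_id FROM ci_acct a "
        "JOIN ck_prem k ON a.mailing_prem_id = k.old_prem_id"
    ).fetchall()

before = insert_step()   # no blank entry: the account is silently dropped
conn.execute("INSERT INTO ck_prem VALUES (' ', ' ')")  # KeyGen's 'blank' key value
after = insert_step()    # the account is now carried across
```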
<br />
<span style="color: #660000;">As a result, I recommend that once a site reaches the point where they are considering moving data from Staging to Production, they include the full suite of KeyGen steps in the Conversion run schedule (regardless of how many of the CC&B data structures they are converting over). Given that these runs are being executed against a set of empty source tables, it is not anticipated that the overall run times will be extended to an unmanageable level.</span>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0tag:blogger.com,1999:blog-5289998557086377988.post-63969648826461290992008-09-02T08:15:00.005+10:002009-10-15T09:12:54.306+11:00Threading...<span style="color: black; font-size: 100%;"><span style="font-family: arial;">A common question that I get asked is.. how do we force multi-threaded conversion jobs to split the key ranges evenly across each thread instance? </span><br />
<br />
<span style="font-family: arial;">Unfortunately I don't have an easy answer. The base conversion jobs seem to take a simplistic approach of using the low and high override values (if supplied, otherwise they use values of 0000000000 and 9999999999) and then divide these by the number of threads to determine the ranges for each instance.</span><br />
<br />
<span style="font-family: arial;">But.. We have found that we can force a predefined range of values and simulate threading by having separate batch controls defined for each thread, with predefined high/low values, and these can then be run concurrently through standard batch submission processes.. e.g..<br />
if we want to run VAL-ACCT in 4 threads and ensure that a known range of source keys between 0000000001 and 0000100000 is split between the 4 threads, we clone the existing batch control into<br />
VAL-ACCT1, with OVRD-LOW-ID = 0000000001 and OVRD-HIGH-ID = 0000025000<br />
VAL-ACCT2, with OVRD-LOW-ID = 0000025001 and OVRD-HIGH-ID = 0000050000<br />
VAL-ACCT3, with OVRD-LOW-ID = 0000050001 and OVRD-HIGH-ID = 0000075000<br />
VAL-ACCT4, with OVRD-LOW-ID = 0000075001 and OVRD-HIGH-ID = 0000100000<br />
<br />
These four jobs can then be submitted alongside each other with Thread Number and Thread Count both set to 1, with no chance of them being rejected by the framework (which normally prevents the same batch control from running more than once at a time). The actual number of instances and the thread ranges of each will need to be adjusted in the same way as we normally do for our optimisation tasks; it is just a little harder, since we have to adjust the batch control parameters rather than just the thread counts.<br />
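The range arithmetic behind those cloned batch controls can be sketched as a helper; this is illustrative only, and the resulting OVRD values still have to be keyed onto each cloned batch control by hand:

```python
def split_key_range(low, high, threads, width=10):
    """Divide an inclusive key range into contiguous per-thread
    (OVRD-LOW-ID, OVRD-HIGH-ID) pairs, zero-filled to the key width.
    Any remainder is spread across the first few threads."""
    total = high - low + 1
    size, extra = divmod(total, threads)
    ranges, start = [], low
    for i in range(threads):
        end = start + size - 1 + (1 if i < extra else 0)
        ranges.append((str(start).zfill(width), str(end).zfill(width)))
        start = end + 1
    return ranges

# split_key_range(1, 100000, 4) reproduces the VAL-ACCT1..4 values above.
```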
</span></span>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com5tag:blogger.com,1999:blog-5289998557086377988.post-11026670995431712372008-07-30T12:41:00.008+10:002009-10-15T09:12:43.012+11:00Tidy Balances.. I don't really hate itI have been asked why I tend to avoid running the Tidy Balances Conversion process, and the short answer is that 'I have never found a need to..'.<br />
<br />
The Tidy Balances process has been designed to accept an input file of Accounts and balance 'buckets', and will produce adjustments of a specific type to ensure that the account has aged balances of these values. While this could be handy in situations where minimal transaction data is being converted into CC&B via the traditional extract-transform-and-load means (ie. convert all source data into CC&B Adjustments and Financial Transactions and run the required Validation and KeyGen processes), it does not fit well with my experience of bringing 2-5 years of history over (which seems to be a common requirement on CC&B projects) and needing to create an opening balance for the Service Agreements which is essentially just a placeholder transaction, no longer subject to any aging or collection processing.<br />
<br />
The effort required to create the Tidy Balances input table (i.e. summarise all historic transactions older than x days/months and generate a record on the work file for each account) can easily be serviced via the ConversionCentral data sources and some careful SQL, resulting in less effort required from the extraction team. The selection criteria for extracted data included in the standard mapping into Adjustments is normally of the form:<br />
<br />
<span style="font-size: 85%; font-weight: bold;"><span style="font-family: courier new;">Select Source_Account, Adjustment_Type, Adjustment_Amount,....</span><br />
<span style="font-family: courier new;">From Source System Tables</span><br />
<span style="font-family: courier new;">Where Transaction_Date >= Conversion_Cutoff_Date</span></span><br />
<br />
My approach to ensuring that all transactional data is processed in the run is to include a separate ConversionCentral data source which generates a single 'Opening Balance' Adjustment record as:<br />
<span style="font-size: 85%; font-weight: bold;"><br />
<span style="font-family: courier new;">Select Source_Account, 'Opening Balance Adjustment Code', SUM(Adjustment_Amount),....</span><br />
<span style="font-family: courier new;">From Source System Tables</span><br />
<span style="font-family: courier new;">Where Transaction_Date < Conversion_Cutoff_Date</span><br />
<span style="font-family: courier new;">Group By Source_Account</span><br />
<br />
</span><span style="font-size: 100%;">With a bit of manipulation to attach this adjustment to the earliest converted bill, we ensure that these transactions are not reprocessed by the first CC&B bill, whilst still ensuring that the Account and SA balances conform to expected values.<br />
<br />
Given that these adjustments are older than 120 days, they are normally not subject to traditional aging, and as such this solution proves to be easier (and quicker) than running a full Tidy Balances process.<br />
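The two selections above can be mirrored in a small Python sketch (the record shape and names are invented for illustration): detail rows on or after the cutoff are converted individually, while everything older collapses into a single opening-balance amount per source account:

```python
from datetime import date

def split_at_cutoff(transactions, cutoff):
    """Return (detail rows to convert individually,
    opening-balance totals keyed by source account)."""
    detail = [t for t in transactions if t["tx_date"] >= cutoff]
    opening = {}
    for t in transactions:
        if t["tx_date"] < cutoff:
            acct = t["account"]
            opening[acct] = opening.get(acct, 0) + t["amount"]
    return detail, opening

# Three source transactions for one account, cutoff at 2008-01-01:
txns = [
    {"account": "A1", "tx_date": date(2006, 5, 1), "amount": 120.0},
    {"account": "A1", "tx_date": date(2007, 2, 1), "amount": -20.0},
    {"account": "A1", "tx_date": date(2008, 3, 1), "amount": 55.0},
]
detail, opening = split_at_cutoff(txns, date(2008, 1, 1))
```

Because the cutoff is exclusive on one side and inclusive on the other, every source transaction lands in exactly one of the two outputs, so the converted Account and SA balances still sum to the source totals.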
</span>Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com3tag:blogger.com,1999:blog-5289998557086377988.post-70300684287581224222008-07-16T09:04:00.006+10:002009-10-15T09:12:33.197+11:00A picture is worth a thousand words...<div style="text-align: center;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUB90Xf8e_1_apqif91YcjARk5dgSxuyIS46zxY7gB2mPHFaXvsegpfYOFZsx2H7ZtKz2RGMXYPXSvH4jV6SFOVwp7ML44mt2FCitXvG1FwayGlixY9MdtRkVmjQ26G-dXmJecxzsYJpx2/s1600-h/Cordaptix+Conversion+Process.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUB90Xf8e_1_apqif91YcjARk5dgSxuyIS46zxY7gB2mPHFaXvsegpfYOFZsx2H7ZtKz2RGMXYPXSvH4jV6SFOVwp7ML44mt2FCitXvG1FwayGlixY9MdtRkVmjQ26G-dXmJecxzsYJpx2/s320/Cordaptix+Conversion+Process.jpg" /></a><br />
</div><br />
</div>The attached image provides an overview of how the CC&B Conversion process hangs together. Hopefully this proves useful to you in understanding the various components of the process. For in-depth detail of the Server-side conversion modules, check the CC&B online help.Stuart Ramagehttp://www.blogger.com/profile/11557427196523528065noreply@blogger.com0