Core Conversion Misstep #3: Oops, We Didn’t Check the Core Data
This blog is the third in a series about things that are frequently overlooked or forgotten when a Credit Union is considering a Core Conversion. Subsequent blogs will be posted weekly and highlight each item in-depth. Make sure to follow along with the series on Olenick.com.
Your credit union’s core conversion project is either on the horizon or warmly breathing down the back of your neck. Hopefully, you have already made your test and defect management repository decision and thought about the functionality you will performance test.
The next post in this blog series of often-overlooked components of a core conversion is about validating the integrity of the core data. It may seem safe to assume that your new core will have the right data in the right place, but that assumption rarely holds without verification. Because the core is such a crucial system for your organization, it is critically important that you check it.
WHY DATA INTEGRITY IS IMPORTANT IN YOUR CONVERSION
Your core is the brain of your credit union and a conversion is the equivalent of a transplant. Numerous functions need to be tested to ensure they were not altered. More than likely, your credit union has a large number of systems integrated with your core, receiving and sending member and transaction data. Bad data can have a negative snowball effect as it cascades from application to application. So, let’s make sure it’s accurate.
To validate your new core’s data, two things are reviewed for defects:
- First, that all of the old core data is moved to the new core and lands in the correct fields.
- Second, that the new core application is looking at the correct fields when using or displaying information.
Let’s use member name and address as an example. It is critically important that your members’ names and addresses are correctly placed in your new core, as these fields are used for greeting your member appropriately, authenticating and mailing important documents, among other things. Having a joint member name appearing in the primary member name record or a property address in a billing address field could have disastrous consequences.
It’s easier than you think to make mistakes when mapping data between systems. Software vendors use different names for similar fields and the “correct” field is not always obvious. In one system, your primary member’s name may be stored in the database as “Member_Name”, but in the other core, it could be stored as “Acct_Name”. Likewise, “BillingAddress” in one system database could easily be “Address1” in another. Database tables often contain similar or identically named fields yet have different data in them.
Testing enables you to verify that the field level data in your new core matches your old, trusted core system. To do this, you need to know where in the old database members’ names and addresses are being stored, and where in the new core database it is going to be copied to and stored.
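The idea of a field mapping can be sketched in a few lines of code. The table and field names below (`Member_Name`, `Acct_Name`, and so on) are the hypothetical examples from above, not real core schemas:

```python
# Hypothetical field names; your actual core schemas will differ.
# Maps each old-core (table, field) location to its new-core equivalent.
FIELD_MAPPING = {
    ("Members", "Member_Name"): ("Accounts", "Acct_Name"),
    ("Members", "BillingAddress"): ("Accounts", "Address1"),
}

def new_core_location(old_table, old_field):
    """Look up where an old-core field should land in the new core."""
    return FIELD_MAPPING.get((old_table, old_field))

# The primary member name moves from Members.Member_Name
# to Accounts.Acct_Name in the new core.
target = new_core_location("Members", "Member_Name")
```

Testing then amounts to checking, for each entry in this mapping, that the value stored at the old location matches the value stored at the new one.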
PLANNING FOR TESTING YOUR DATA
The first step is to obtain two data dictionaries – one for your old core and one for your new one. A data dictionary is a listing of every field in the core application and that field’s location in the database. It may look something like this:
| Table Name | Field Name | Description |
|---|---|---|
| MemberInfo | MemberID | Unique identifier for the member record. |
| MemberInfo | FirstName | Member’s first name as used in communication, labels, etc. |
| MemberInfo | LastName | Member’s last name as used in communication, labels, etc. |
| AccountInfo | Number | Account number for this account. Member owner is determined by AccountInfo.MemberID. |
The dictionary for the new core will come from your new software vendor partner. It is highly likely that your IT department already has a data dictionary for your old core. Hopefully, they’ve been maintaining this document over the years, as you have changed and repurposed fields. If IT does not have one or if it’s not current, contact your old core software vendor.
Next, you’ll need to map the database. To do this, you’ll bring together people who are knowledgeable about your old core’s database and the vendor. That group will put the two data dictionaries next to each other and go down the list, field by field, deciding where your old core data should go in the new core.
There will be many hundreds of fields in dozens of tables to go through so bring lots of coffee and have lunch catered.
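One way to keep that session organized is to generate a mapping worksheet ahead of time: one row per old-core field, with empty new-core columns for the group to fill in as they work down the list. The sketch below assumes the old data dictionary is available as a CSV export; the column names and sample fields are hypothetical:

```python
import csv
import io

# Hypothetical CSV export of the old core's data dictionary.
# A real export from your vendor will have its own format.
OLD_DICTIONARY = """table,field,description
MemberInfo,FirstName,Member's first name
MemberInfo,LastName,Member's last name
"""

def build_mapping_worksheet(old_dictionary_csv):
    """Produce one worksheet row per old-core field, with blank
    new-core columns for the mapping team to fill in field by field."""
    rows = []
    for row in csv.DictReader(io.StringIO(old_dictionary_csv)):
        rows.append({
            "old_table": row["table"],
            "old_field": row["field"],
            "description": row["description"],
            "new_table": "",   # decided during the mapping session
            "new_field": "",
            "notes": "",
        })
    return rows

worksheet = build_mapping_worksheet(OLD_DICTIONARY)
```

The completed worksheet then doubles as the mapping document referenced throughout the rest of the project.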
Some organizations choose to create test cases so they can formally capture test steps, screenshots, and the outcome. Others choose to update the database mapping document they created with the results of testing.
Either way, you will want to test many scenarios, covering a wide array of members, accounts, cards, and transactions. As an example, interest rate yield may be correct on a savings account record, but incorrect on a checking account record. Seeing it correctly on one share type does not mean it is correct in all of them.
Identify test scenarios that move data from your new core to an integrated system or from another system into your core. Add a step to your test case to validate the accuracy of the data after it has moved.
The team is likely to run into situations where it’s not perfectly clear how to map fields. Your new software vendor should provide consultation. Ultimately, you are responsible for approving the mapping results and the vendor’s recommendation could be incorrect. This is why testing the results is so important.
HOW TO EXECUTE TESTING
Once the new core has been installed in your test environment and loaded with a copy of your production data, testers can begin comparing the fields that were mapped. Update your repository with the outcome of your testing.
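A minimal sketch of that field-level comparison looks like the following. It assumes records from both cores have already been extracted into dictionaries keyed by member ID; in a real project you would query the two databases directly, and all names and data here are hypothetical:

```python
# Hypothetical mapping of old-core field names to new-core field names.
FIELD_MAPPING = {"Member_Name": "Acct_Name", "BillingAddress": "Address1"}

# Sample records pulled from each core's test database, keyed by member ID.
old_core = {101: {"Member_Name": "Pat Jones", "BillingAddress": "12 Oak St"}}
new_core = {101: {"Acct_Name": "Pat Jones", "Address1": "12 Oak Street"}}

def compare_member(member_id):
    """Return one mismatch record per mapped field whose values differ
    between the old core and the new core."""
    mismatches = []
    for old_field, new_field in FIELD_MAPPING.items():
        old_value = old_core[member_id][old_field]
        new_value = new_core[member_id][new_field]
        if old_value != new_value:
            mismatches.append((member_id, old_field, old_value, new_value))
    return mismatches

# Here the name matches but the billing address was altered in transit,
# so one mismatch is reported for logging as a defect.
defects = compare_member(101)
```

Each mismatch becomes a candidate defect to log against the conversion.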
Ideally, you will test every single field, but if you have a compressed schedule, you’ll need to prioritize the fields. Know that this will introduce risk – some non-critical fields could end up incorrectly mapped. There are organizations that can provide short-term testing capacity, among other services, to ensure you are reducing your project risk.
If you need to prioritize the fields, work with each department that will be using the new core to identify the ten most commonly-used screens and the critical fields within them. Many departments use the same critical screens and fields so de-duplicate the consolidated list from all departments. Be cautious when de-duping: field names can appear identical even though they’re in different locations in the database.
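That caution can be enforced mechanically: de-duplicate the consolidated list on the full database location, a (table, field) pair, rather than on the field name alone. The department and field names below are hypothetical:

```python
# Priority-field requests gathered from each department, as
# (department, table, field) tuples. Names are hypothetical examples.
requests = [
    ("Lending", "MemberInfo", "FirstName"),
    ("Collections", "MemberInfo", "FirstName"),  # true duplicate
    ("Cards", "CardHolder", "FirstName"),        # same name, different table
]

def dedupe_priority_fields(requests):
    """Keep one entry per unique (table, field) database location,
    preserving the order in which locations first appear."""
    seen, unique = set(), []
    for department, table, field in requests:
        location = (table, field)
        if location not in seen:
            seen.add(location)
            unique.append(location)
    return unique

# Two distinct locations survive: FirstName exists in both MemberInfo
# and CardHolder, so de-duping on the name alone would wrongly drop one.
priority_fields = dedupe_priority_fields(requests)
```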
As you move through your testing, you are likely to find defects. Sometimes the root cause is the technical conversion itself: right data, wrong field. In that case, the mapping is updated to direct the data to the correct field. In other cases, the application must be changed instead. This could mean renaming an existing field to match the old core or repurposing an unused field, and then the mapping does not need to be updated.
Log your defects and assign them to the new core provider, so they can diagnose, correct, and reassign to you for retest.
Your team will spend months testing your core and related applications, and many changes will be made over that time to address defects and other issues. It will be tempting to forget about data mapping after you complete your test cycle and resolve the defects, but you should review this again during mock conversions to see if other database changes negatively impact your field mapping. Focus on the highest priority fields versus testing everything again.
SUPPORTING PROCESSES IN PRODUCTION
Once your core conversion project is finally complete, there will be some level of post-production maintenance needed for the data dictionary and mapping. Someone must always know what data you have in your core and how it is configured in the database. Your IT department likely set a precedent for this while maintaining your old core.
The documents you created during this test process – the dictionary and mapping document – are important artifacts for future core upgrades and integration projects. Hand them off to IT and verify that there is clear ownership for their ongoing maintenance.
Awareness of potential data pitfalls and planning to test them are more than half of the battle. The integrity of your core data is absolutely crucial to project success. Assume there are issues, plan to find and correct them, and then breathe a sigh of relief that you’re on the right path.
In addition to a test repository, performance testing, and data integrity, downstream testing is often overlooked during core conversion planning. Not sure what that means? Stay tuned for our next blog post to learn more about the mistakes that occur when data moves between software applications.
Olenick is a global software testing firm with headquarters in Chicago, IL.