Today’s wealth management industry faces unprecedented regulatory pressure: banks, superannuation funds and investment funds are being held to ever higher standards of cost and fee disclosure, transparency and data automation. For these businesses, maintaining quality data in their core platform is fundamental to future success.

Regulatory demands, fund consolidations and technological advances are driving financial institutions to replace their current systems, which involves large-scale data migrations. Faced with moving large volumes of complex legacy data, these transformation projects often find that the migration stream becomes the critical path and, if not managed properly, can cause major delays and budget blowouts.

To help program leaders within financial services organisations become better prepared to deliver consistent, predictable and affordable outcomes, this guide identifies the five essential ingredients for a successful data migration.

1) Plan early

A key mistake in data migrations is underestimating how long they will take and commencing the migration stream too late. Activities should be well understood in the planning phase, with key activities and milestones mapped out, such as availability of source data, target system configuration, and environment and resource availability.

The migration stream often ends up on the critical path because a dependent stream is delayed. To mitigate this, a level of contingency should be built into the plan for any “unknowns”, with risks understood and alternative paths considered.

Any programme involving large-scale or complex data should plan for multiple trial runs and checkpoints, giving the team enough time to smooth out any technical issues with the execution and to confirm the transformed data meets quality expectations. At least one full dress rehearsal, in which participants take up their ‘go-live’ roles, should also be included.

Lastly, a governance structure should be in place to support escalations, decision-making and sign-offs.

2) Create a migration strategy

Creating a migration strategy at the commencement of a project is essential. The effort in producing the strategy can be considerable, particularly for large and complex projects where multiple stakeholders are involved. However, this is significantly outweighed by the value it brings to a project. A migration strategy should consider:

  • Scope of the data to be transferred e.g. how much historical data should come across, key customer data, closed accounts.
  • Whether it will be a ‘single event’ migration or incremental loads split by logical groupings, such as product type. This decision often turns on whether the migration involves large volumes of similar data models or complex data sets from different source systems.
  • Expectations of source data quality and agreed remediation paths e.g. fix in the source system, transform in the target.
  • Testing phases and cycles e.g. data validations on load, front-end tests, UAT.
  • A quality policy covering:
    • Definition of quality
    • Key data elements
    • Success criteria
    • Reconciliation strategy: ‘How do I know my data is clean?’
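One way to make these strategy decisions concrete and reviewable is to capture them as a structured document that stakeholders can sign off. The sketch below shows one possible shape for such a document; all field names and values are hypothetical examples, not recommendations.

```python
# Illustrative sketch: the migration strategy decisions above captured as a
# structured document for review and sign-off. All values are hypothetical.
migration_strategy = {
    "scope": {
        "historical_data_years": 7,
        "include_closed_accounts": False,
        "key_customer_data": ["accounts", "holdings", "transactions"],
    },
    # 'single_event' vs 'incremental' loads split by logical groupings
    "approach": "incremental",
    "increments": ["superannuation", "unit_trusts", "insurance"],
    # Agreed remediation paths for source data quality issues
    "remediation_paths": ["fix_in_source", "transform_in_target"],
    "test_phases": ["load_validation", "front_end", "uat"],
    "quality_policy": {
        "key_data_elements": ["member_id", "balance", "beneficiaries"],
        "success_criteria": {"financial_values": "100% match"},
        "reconciliation": "automated source-to-target comparison per trial run",
    },
}
```

Whether this lives in code, a wiki or a formal document matters less than having each decision recorded explicitly before the build starts.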

3) Secure access to experts

A mistake that many projects involving a data migration make is assigning too many data generalists to the migration stream. A successful data migration needs subject matter experts who understand the business logic of both the source and target systems. This expertise spans several areas: technical analysts with experience using the migration tools, architects who understand the data structures, and business analysts and product specialists who understand how the system is configured and used across the business. Project managers should secure the right resources early on and ensure access to the decision-makers.

4) Plan for multiple test phases

Testing migrated data may seem like an obvious project task; however, without a clear strategy that defines the test phases, number of cycles, test scope and key success criteria, validation of the data can become an incidental outcome.

Testing of migrated data has three elements to it:

(i) First, basic system reconciliation testing ensures the expected number of records has loaded and that they appear on screens and the web. Load failures, missing data and duplicates are uncovered at this stage, with minimal validation of the accuracy of the data.

(ii) Function tests ensure that the loaded data meets the business requirements. These tests are defined in advance and cover key processes that have high business value and rely on the accuracy of the data, such as a fee process or an insurance claim.

(iii) User Acceptance Testing (UAT) is usually conducted when nearing the end of the project. Some argue that UAT should not be conducted on migrated data, as this phase represents the user’s acceptance of system functionality, and data issues could muddy or slow down the testing. However, in the absence of a testing cycle that covers end-to-end ‘business as usual’ functions, UAT is the perfect opportunity to validate the migrated data. Quality gates that must be passed before the data is handed off to UAT, together with publishing any known defects, allow testers to work around these issues.
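The basic system reconciliation in (i) can be automated very simply: compare source and target extracts by record key to surface load failures, missing records and duplicates. The sketch below is a minimal illustration; the record keys and data are hypothetical.

```python
# Minimal sketch of post-load record reconciliation: compares source and
# target record keys to surface load failures, missing records and duplicates.
# Record keys here are hypothetical examples.
from collections import Counter

def reconcile_counts(source_keys, target_keys):
    """Return missing, unexpected and duplicated record keys."""
    src, tgt = Counter(source_keys), Counter(target_keys)
    missing = sorted(set(src) - set(tgt))        # in source but failed to load
    unexpected = sorted(set(tgt) - set(src))     # loaded but not in source
    duplicates = sorted(k for k, n in tgt.items() if n > 1)
    return {"missing": missing, "unexpected": unexpected, "duplicates": duplicates}

source = ["ACC001", "ACC002", "ACC003", "ACC004"]
target = ["ACC001", "ACC002", "ACC002", "ACC004"]  # ACC003 missing, ACC002 duplicated

result = reconcile_counts(source, target)
```

In practice the extracts would come from the source and target databases, but the principle is the same: every record accounted for, no strays, no duplicates, before any accuracy testing begins.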

5) Create the reconciliation process early on

The effort to define and build the reconciliation process can be substantial, and deferring the work until later in the project puts the stream’s readiness for ‘go-live’ at risk. The reconciliation strategy should be defined at project commencement and include objectives, scope, estimated effort and the tools to be used.

It may not be feasible to reconcile every data record, so stakeholders need to agree upfront on what data they are comfortable signing off. Success criteria should be defined and may vary by data type e.g. financial values may require a 100% match.

There is a common perception that reconciliation of source data to target happens only after the migration. While this is true for the final sign-off, having the tool available earlier, particularly for trial runs, assists in the early identification of data issues. There will be data discrepancies, so it is better to fail early than to find systemic issues at the end of the project.
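The idea of success criteria that vary by data type can be sketched as a field-level comparison with a per-field match threshold, where financial values require a 100% match but, say, address fields might tolerate minor formatting differences. The field names and thresholds below are illustrative assumptions, not prescribed values.

```python
# Sketch of field-level source-to-target reconciliation with success criteria
# that vary by data type. Field names and thresholds are illustrative only.
from decimal import Decimal

# Hypothetical criteria: minimum proportion of records that must match exactly.
CRITERIA = {"balance": 1.00, "address": 0.95}

def match_rate(source_rows, target_rows, field):
    """Proportion of record pairs whose value for `field` matches exactly."""
    matches = sum(1 for s, t in zip(source_rows, target_rows) if s[field] == t[field])
    return matches / len(source_rows)

def reconcile(source_rows, target_rows):
    """Return pass/fail per field against the agreed success criteria."""
    return {
        field: match_rate(source_rows, target_rows, field) >= threshold
        for field, threshold in CRITERIA.items()
    }

src = [{"balance": Decimal("100.00"), "address": "1 Smith St"},
       {"balance": Decimal("250.50"), "address": "2 Jones Rd"}]
tgt = [{"balance": Decimal("100.00"), "address": "1 Smith Street"},
       {"balance": Decimal("250.50"), "address": "2 Jones Rd"}]

outcome = reconcile(src, tgt)
```

Running the same comparison after every trial run, rather than only at final sign-off, is what lets systemic discrepancies surface early.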

Conclusion

Data migrations are complex, but with the right level of planning and execution strategies, carried out with a trusted partner, they are entirely manageable. The five essentials outlined in this guide provide a solid framework for success and, ultimately, play a critical role in delivering a fund a distinct competitive advantage in an increasingly data-driven world.

About the author

Roshan Ranasinghe

Head of Consulting, Australia

Roshan Ranasinghe is Head of Consulting – Australia, at Bravura Solutions. Based in our Sydney office, Roshan is responsible for providing strategic and technical implementation guidance for Sonata – Bravura’s market-leading next generation wealth administration platform for superannuation and pension products, unit trusts, wrap platforms and insurance products. With close to 20 years of experience working within the IT and financial services sector, Roshan has worked in numerous roles including senior consultancy, product architecture, and as business and technical analyst, across many financial services organisations. In that time, she has designed software, led implementation teams and managed client relationships at executive levels.
