Interclypse | Happenings

Interclypse Data Migration Framework

Written by Brian Walsh | Nov 20, 2024 2:30:00 PM

Following the lessons learned in our previous article, here we’ll cover our tested plan for achieving a successful data migration. For highly complex migrations, we use the following framework:

  • The process should be its own micro-service/application within the project's existing Source Control Management (SCM) system.

  • The Data Migration application should be treated like a production service.

  • Ensure the application can track and compare each execution per environment. The process should be repeatable, scalable, and require no manual steps.
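One way to make executions trackable and comparable across environments is to fingerprint each run. The sketch below is a minimal illustration of that idea; the `MigrationRun` record and its fields are hypothetical, not part of any specific Interclypse tooling.

```python
import hashlib
import json
from dataclasses import dataclass


@dataclass
class MigrationRun:
    """Record of one migration execution in one environment (hypothetical schema)."""
    environment: str   # e.g. "dev", "staging", "prod"
    version: str       # version of the Data Migration application
    record_count: int
    checksum: str      # hash of the migrated payload, for cross-environment comparison


def fingerprint(records: list[dict]) -> str:
    """Deterministic hash of a record set, so two runs can be compared byte-for-byte."""
    canonical = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()


records = [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]
run = MigrationRun("staging", "1.0.0", len(records), fingerprint(records))
```

Because the hash is computed over a canonical JSON serialization, a staging run and a production run that produce identical output will produce identical checksums, making drift between environments easy to detect.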

We suggest the following Layers within the Data Migration application:

Connectors (Input/Output): These subsystems connect to external systems to read or write information via a web service call, reading a file, etc. They encapsulate any complex logic to authenticate with the external systems.

  • Logging should be included to ensure global monitoring of discrete migration executions.
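A Connector can be sketched as a small abstract interface that hides authentication and I/O details from the rest of the pipeline. This is a minimal illustration, assuming a newline-delimited JSON file as the external source; the class and file names are hypothetical.

```python
import json
import logging
from abc import ABC, abstractmethod

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("migration.connector")


class Connector(ABC):
    """Encapsulates authentication and I/O against one external system."""

    @abstractmethod
    def read(self) -> list[dict]:
        ...


class FileConnector(Connector):
    """Reads newline-delimited JSON records from a local file (hypothetical source)."""

    def __init__(self, path: str):
        self.path = path

    def read(self) -> list[dict]:
        with open(self.path) as f:
            records = [json.loads(line) for line in f if line.strip()]
        # Log the read so discrete executions can be monitored globally.
        log.info("read %d records from %s", len(records), self.path)
        return records
```

A web-service connector would implement the same `read` interface but hold tokens or credentials internally, so downstream layers never handle authentication.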

Filters (Validation/Security): Filters are used to remove old, unused, or invalid data as part of the data migration.

  • Logging at this level should capture unknown cases as they are discovered (you won’t know they exist until a script execution surfaces them) and invalid or failed data (is this something to remove from the new system, or to fix before migration?).
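A filter layer along these lines might split records into those to keep and those to reject, logging each unknown case as it surfaces. The status values and field names below are hypothetical placeholders for a real legacy schema.

```python
import logging

log = logging.getLogger("migration.filter")

# Status values we know how to handle; anything else is an "unknown case"
# that only shows up once the script actually runs (hypothetical values).
KNOWN_STATUSES = {"active", "archived", "deleted"}


def filter_records(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into (keep, reject), logging unknown and invalid cases."""
    keep, reject = [], []
    for rec in records:
        status = rec.get("status")
        if status not in KNOWN_STATUSES:
            # Unknown case: log it so a human can decide whether to fix or drop.
            log.warning("unknown status %r in record %s", status, rec.get("id"))
            reject.append(rec)
        elif status == "deleted":
            reject.append(rec)  # old/unused data: exclude from the new system
        else:
            keep.append(rec)
    return keep, reject
```

Keeping the rejects (rather than silently discarding them) lets the team review whether each one should be fixed before migration or removed for good.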

Transformer: This system layer converts the old data into the new data format. This should maintain versions of separate executions to ensure consistency and progress toward the end goal. Additional quality checks should occur at this layer to ensure that old data is converted properly.

  • Logging should be included here as well, capturing the transformer version and any quality-check failures for each execution.
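A transformer in this spirit converts each legacy record, tags the output with a schema version so separate executions stay comparable, and runs an inline quality check. The field names and version constant are illustrative assumptions.

```python
SCHEMA_VERSION = 2  # bump whenever the target format changes, so runs are comparable


def transform(old: dict) -> dict:
    """Convert one legacy record into the new format (hypothetical fields),
    tagging the output with the transformer version for cross-run consistency."""
    new = {
        "id": old["id"],
        "full_name": f"{old['first_name']} {old['last_name']}".strip(),
        "_schema_version": SCHEMA_VERSION,
    }
    # Quality check: the converted record must preserve the legacy identity.
    assert new["id"] == old["id"], "identity lost during transform"
    return new
```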

Auditor: This layer provides the final authority prior to transfer to the new system and performs logging for metrics and status.
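The Auditor's final check can be as simple as reconciling counts and emitting a pass/fail status with metrics. This is a deliberately minimal sketch; a real auditor would also compare checksums and sample record contents.

```python
import logging

log = logging.getLogger("migration.auditor")


def audit(source_count: int, migrated: list[dict]) -> dict:
    """Final authority before transfer: compute metrics and a pass/fail status."""
    metrics = {
        "expected": source_count,
        "migrated": len(migrated),
        "missing": source_count - len(migrated),
    }
    metrics["status"] = "PASS" if metrics["missing"] == 0 else "FAIL"
    # Log the result for global metrics and status tracking.
    log.info("audit result: %s", metrics)
    return metrics
```

Only a `PASS` result should allow the transfer to the new system to proceed; a `FAIL` points back at the filter and transformer logs for the records that went missing.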


With this framework in place, we can ensure an accurate transfer of data to the new system with minimal errors. In our next article, we will dig into the Data Migration Process, including micro-services, API Keys, and what to do when your system fails.