
Having done a large data migration for medical records, I would be sweating bullets doing this with presumably fluctuating transactional bank data. Out of curiosity, does anyone have recommendations for their favorite data migration tool?


IMHO it is unreasonable to expect these sorts of things to be possible with a big-bang approach. The complexity exceeds our cognitive capabilities and disaster strikes. Therefore, it is better to see these projects not as data migrations but as functionality migrations: you move some customers to another set of functionality. Thus, my recommended tool is to use none.


100% - what I always do is start the new logic/system so it processes in parallel with the old one, and at the same time backfill the old data into the new system. When everything is proven working, use a feature flag to make the new system the main one while the old one keeps running. After a quarantine period, switch off the old system.
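A minimal sketch of that parallel-run-then-flip pattern. All names here (OldLedger, NewLedger, MigratingLedger) are hypothetical, just to illustrate the shape: write to both systems, read from whichever side the flag selects.

```python
# Sketch of a dual-write, feature-flagged cutover. Hypothetical names,
# not from any particular system.

class OldLedger:
    def post(self, txn):
        return f"old:{txn}"

class NewLedger:
    def post(self, txn):
        return f"new:{txn}"

class MigratingLedger:
    """Writes every transaction to both systems; the feature flag
    decides which system's result is authoritative."""
    def __init__(self, use_new=False):
        self.old, self.new = OldLedger(), NewLedger()
        self.use_new = use_new  # the feature flag

    def post(self, txn):
        old_result = self.old.post(txn)  # old system keeps running
        new_result = self.new.post(txn)  # new system runs in parallel
        return new_result if self.use_new else old_result

ledger = MigratingLedger(use_new=False)
print(ledger.post("t1"))   # old system is still primary
ledger.use_new = True      # flip the flag once the new side is proven
print(ledger.post("t2"))   # new system is now primary, old still runs
```

The old system stays fully operational either way, so flipping the flag back is a no-drama rollback.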

I don't understand the requirement for a "nuclear button" migration, except maybe as a shortsighted way of trying to save costs.


We do this too and I think it's a great approach. One thing I will say is that you should try to decommission the old system as soon as feasible - running both has additional testing and development costs, particularly if for some reason you need to add new features to both.

We generally aim to leave the old system toggled off for a release or two (allowing us to switch back in the case of a serious defect) and then rip all that old code out in the subsequent release.


This, they should have migrated test accounts or some employee accounts first and then slowly migrated other accounts.


Don't do the migration in one go. Do it gradually. Make sure you can run both systems at the same time. It will take longer than you estimated a one-shot migration would take, but less time than actually doing it in one go and then fixing all the problems.
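One common way to do the gradual part is a deterministic percentage rollout: hash each account into a stable bucket, then migrate by raising the percentage. A sketch under that assumption (the function names are mine, not from the thread):

```python
import hashlib

def cohort_bucket(account_id: str) -> int:
    """Deterministically map an account to a stable 0-99 bucket."""
    digest = hashlib.sha256(account_id.encode()).hexdigest()
    return int(digest, 16) % 100

def use_new_system(account_id: str, rollout_percent: int) -> bool:
    """True once the account's bucket falls under the rollout threshold."""
    return cohort_bucket(account_id) < rollout_percent

# Buckets are stable, so as rollout_percent only ever increases,
# an account that has moved to the new system never flips back.
for acct in ("acct-001", "acct-002", "acct-003"):
    print(acct, use_new_system(acct, 0), use_new_system(acct, 100))
```

At 0% nobody is migrated and at 100% everybody is; in between you move cohorts (test accounts first, then employees, then customers) without a big-bang cutover.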


It's a bit glib, because data varies so much, but my personal favourite data migration tool is Perl.

I've migrated a lot of "stuff" from one place to another, from filesystems, to databases, to backups, and there is usually some perl running in the background.

Of course the scale is very different. The biggest thing I've personally migrated was in the low terabytes of data. Certainly nothing as large or critical as banking data.


Depends significantly on the back-end data store. For some of our in-house systems we've had good experiences with GoldenGate [1] doing a continuous sync between systems to enable a rollback scenario, and there are equivalent packages available for lots of datastores. Some of the larger finance packages also include export and snapshot functionality, but this won't help with maintaining parallel systems.

The risk however is evaluating consistency between the systems before doing the rollback which in this case would probably require a more advanced testing capability than what was available...
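Evaluating that consistency usually boils down to comparing canonical digests of the same records on both sides. A minimal sketch (the helper names and flat-dict row shape are assumptions for illustration):

```python
import hashlib

def row_digest(row: dict) -> str:
    """Canonical, field-order-independent digest of one record."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def diverging_keys(source_rows, target_rows, key="id"):
    """Keys that are missing on one side or whose contents differ."""
    src = {r[key]: row_digest(r) for r in source_rows}
    tgt = {r[key]: row_digest(r) for r in target_rows}
    return sorted(k for k in src.keys() | tgt.keys()
                  if src.get(k) != tgt.get(k))

old = [{"id": 1, "balance": 100}, {"id": 2, "balance": 50}]
new = [{"id": 1, "balance": 100}, {"id": 2, "balance": 55}]
print(diverging_keys(old, new))
```

On fluctuating transactional data you'd additionally need to snapshot both sides at a consistent point (or compare only records quiesced before a cutoff), which is exactly the hard part the comment is pointing at.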

[1] - https://www.oracle.com/au/middleware/technologies/goldengate...


I got a graphical flow migration tool pushed down my throat by my manager, on the argument that others would be able to read and reuse it because it was standard. In reality it was used exactly once, I suffered all its quirks, and no version control was possible.


> does anyone have recommendations for their favorite data migration tool?

Backups!



