http://qs321.pair.com?node_id=1178252


in reply to Re: Perl denormalized-to-normalized schema translation, maybe with DBIx::Class (maybe)
in thread Perl denormalized-to-normalized schema translation, maybe with DBIx::Class (maybe)

Greetings ysth,

I'm sure you're right that this plan is a disaster in the making. Unfortunately, I don't know what other strategy I could use to fully refactor things iteratively over time. Everything in this galactic codebase uses the centralized database, with hacks upon hacks that all depend on the existing schema. I can't refactor even a small part of the schema without refactoring nearly all the code, which would be a monumental undertaking.

I could use the same denormalized data store for each new web service, but then each new service would still be tied to the old data store. And even after I'd finished refactoring all the old code, I'd still be left with one big project to refactor the data.

UPDATE: An earlier thought I had was to use a 3-step process instead of a 2-step process.

  1. Write a "database abstraction layer" that sits in front of the legacy database but initially does very little apart from exposing a simple API (even something as silly as SQL-in/JSON-out; a minimal sketch follows this list)
  2. Refactor blocks of CGIs into services, with each service calling methods built as needed on the database abstraction layer (see the usage example after this list)
  3. Finally, iteratively refactor the database schema, which would be safer at this point because I'd have a set of services that all had reasonable tests on them (a regression-test sketch follows as well)
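
For step 1, here's a minimal sketch of what I mean, assuming a hypothetical package name (LegacyDB::Gateway) and placeholder connection details, built on plain DBI with JSON::PP for output. The only goal is that callers stop talking to the database handle directly:

    package LegacyDB::Gateway;

    use strict;
    use warnings;
    use DBI;
    use JSON::PP qw(encode_json);

    sub new {
        my ($class, %args) = @_;

        # One shared handle to the legacy database; dsn/user/password
        # here are placeholders for whatever the real install uses.
        my $dbh = DBI->connect(
            $args{dsn}, $args{user}, $args{password},
            { RaiseError => 1, AutoCommit => 1 },
        );
        return bless { dbh => $dbh }, $class;
    }

    # The "silly" SQL-in/JSON-out API: run a read-only query and hand
    # back the rows as a JSON array of hashes keyed by column name.
    sub query_json {
        my ($self, $sql, @bind) = @_;
        my $rows = $self->{dbh}->selectall_arrayref(
            $sql, { Slice => {} }, @bind,
        );
        return encode_json($rows);
    }

    1;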
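
A hypothetical service from step 2 would then call it like this (the DSN, table, and column names are invented for illustration):

    use LegacyDB::Gateway;

    my $db = LegacyDB::Gateway->new(
        dsn      => 'dbi:mysql:database=legacy',    # placeholder DSN
        user     => 'app',
        password => 'secret',
    );

    # Raw SQL at first; once this call has grown into a named method
    # such as $db->orders_for_customer(42), step 3 can change the
    # schema behind that method without touching the service.
    print $db->query_json(
        'SELECT id, status FROM orders WHERE customer_id = ?', 42,
    );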
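
And the step 3 safety net would be regression tests pinned to each service's JSON contract rather than to the schema; a sketch with Test::More against an invented SQLite fixture:

    use strict;
    use warnings;
    use Test::More tests => 1;
    use JSON::PP qw(decode_json);
    use LegacyDB::Gateway;

    # Fixture path and expected row are invented; the point is that the
    # JSON shape must survive whatever the schema refactor does underneath.
    my $db   = LegacyDB::Gateway->new(dsn => 'dbi:SQLite:dbname=t/fixture.db');
    my $rows = decode_json(
        $db->query_json('SELECT id, status FROM orders WHERE customer_id = ?', 42),
    );

    is_deeply(
        $rows,
        [ { id => 7, status => 'shipped' } ],
        'order list for customer 42 unchanged by schema refactor',
    );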

It's just an idea. Not sure if it's the right approach, though.