Greetings all,

I have a strange situation, and I'm wondering if there's a DBIx::Class-ish option that might help me. We've got a very large, old, denormalized spaghetti Oracle database and a sea of CGIs with embedded, hardcoded SQL. Ultimately, I'd like to refactor the schema, normalize it, add FK constraints, and do all the usual good things, but I can't without considerable risk of destabilizing the CGIs.

The strategy at this point is to carve out chunks of CGI functionality and refactor each chunk into an isolated web service with its own data store. In doing this, though, we'll need to maintain two data stores for a while: the existing denormalized legacy store and the new, smaller, normalized store. These have to stay in sync until we've refactored enough to deprecate the old schema. So what we need is a "translation layer" behind each new web service that sees changes to the local data store and pushes them to the legacy database (until the legacy db can eventually be retired).

I'm envisioning that this "translation layer" would need to understand both schemas (the new local store and the legacy Oracle one) and how to map data between them. I understand a lot of that mapping will have to be hand-written at the detail level, but I'm wondering if anyone knows of a scaffolding/framework option I should consider before starting something custom from scratch. Something like the sketch below is what I'm picturing.
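Roughly this kind of thing, per entity (a minimal sketch; the schema class names, connection strings, and column mappings here are placeholders I've made up, not anything that exists yet):

    #!/usr/bin/env perl
    use strict;
    use warnings;

    # Hypothetical schema classes; the real names would come out of
    # Schema::Loader runs against each database.
    use New::Schema;
    use Legacy::Schema;

    my $new_schema    = New::Schema->connect('dbi:Pg:dbname=newstore', 'user', 'pass');
    my $legacy_schema = Legacy::Schema->connect('dbi:Oracle:LEGACY',   'user', 'pass');

    # One hand-written mapping per entity: take a row from the new store
    # and push the equivalent data into the legacy table(s), denormalizing
    # as needed along the way.
    sub push_customer_to_legacy {
        my ($customer) = @_;    # a New::Schema::Result::Customer row

        $legacy_schema->resultset('CustMaster')->update_or_create(
            {
                cust_id   => $customer->id,
                cust_name => $customer->name,
                # ... more column-by-column mapping here
            },
            { key => 'primary' },
        );
    }

    # Called after a write in the new web service:
    my $customer = $new_schema->resultset('Customer')->find(42);
    push_customer_to_legacy($customer) if $customer;

The question is whether there's a framework that would scaffold the boring parts of that, rather than writing every mapping sub by hand.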

The intent is to use DBIx::Class::Schema::Loader to auto-build the Perl schema classes from the legacy database schema, and we'll probably also use it for the new schema. But once we have these two sets of schema classes, I'd like a way to link them. Certainly, the details of that linking will have to be very custom.
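For reference, the Schema::Loader step I have in mind would look roughly like this (connection details and class names are placeholders):

    use DBIx::Class::Schema::Loader qw/ make_schema_at /;

    # Dump schema classes for the legacy Oracle database into ./lib.
    make_schema_at(
        'Legacy::Schema',
        {
            dump_directory => './lib',
            components     => ['InflateColumn::DateTime'],
        },
        [ 'dbi:Oracle:LEGACY', 'scott', 'tiger' ],
    );

    # Then repeat against the new store, dumping to e.g. New::Schema.

So the loader side is straightforward; it's the layer that ties the two resulting sets of classes together that I don't want to build from scratch if something already exists.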

Thanks.