Background
I write a lot of large reports for, and generally maintain, a large health records DB (writing SPs, functions, jobs, etc.). The original schema, and the software that uses it, came from a different vendor, so I can't change much about it structurally. There are many records that require tracking, such as labs, procedures, and vaccines, and they are scattered across dozens of tables, many of which are bloated and poorly indexed (I have been able to fix this somewhat).
The Problem
The issue is that because we have little control over the DB, and it can change with any given update or patch, writing and maintaining these reports is difficult and tedious, especially when there's a large amount of overlap. All it takes is one patch and I'm stuck rewriting large portions of a dozen reports. Additionally, the queries quickly become obfuscated and slow as the joins, nested selects, and applies pile up.
My "Solution"
My plan was to write all of these records to one "catch-all" table and write triggers on the original tables to keep the records in this aggregate table in sync. Of course I'd need to ensure my triggers were intact after updates, but this would be much easier from a maintainability standpoint, and referencing the data would be far simpler.
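After each patch I could verify that the triggers survived with a quick check against the catalog views. A rough sketch (the trigger names and the `trg_..._EventLog` naming convention are just placeholders I'd pick, not anything from the vendor schema):

```sql
-- Flag any expected audit triggers that are missing or disabled after a patch.
-- The names listed in the VALUES clause are hypothetical examples.
SELECT t.name, t.is_disabled
FROM sys.triggers AS t
WHERE t.name LIKE 'trg[_]%[_]EventLog'
  AND t.is_disabled = 1

UNION ALL

SELECT 'MISSING: ' + v.expected_name, NULL
FROM (VALUES ('trg_LabOrders_EventLog'),
             ('trg_Procedures_EventLog')) AS v(expected_name)
WHERE NOT EXISTS (SELECT 1 FROM sys.triggers WHERE name = v.expected_name);
```

Running this as a post-patch step (or a scheduled job that alerts on any rows returned) would at least catch dropped or disabled triggers before the reports silently go stale.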
The table would be thin and long, storing only the data required, something like this:
CREATE TABLE dbo.HCM_Event_Log (
    id                 INT IDENTITY,
    type_id            INT NULL,
    orig_id            VARCHAR(36) NULL,
    patient_id         UNIQUEIDENTIFIER NOT NULL,
    visit_id           UNIQUEIDENTIFIER NULL,
    lookup_id          VARCHAR(50) NULL,
    status             VARCHAR(15) NULL,
    ordered_datetime   DATETIME NULL,
    completed_datetime DATETIME NULL,
    CONSTRAINT PK_HCM_Event_Log PRIMARY KEY CLUSTERED (id)
);
Then I'd have various relational tables for things like the type_id and item groupings.
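To illustrate the kind of trigger I have in mind, here's a sketch against a hypothetical source table (`dbo.LabOrders` and all of its column names are made up; the real vendor tables will differ, and `type_id = 1` is an assumed lookup value for lab orders):

```sql
CREATE TRIGGER trg_LabOrders_EventLog
ON dbo.LabOrders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Update aggregate rows that already exist for the affected orders
    UPDATE e
    SET e.status             = i.order_status,
        e.completed_datetime = i.completed_on
    FROM dbo.HCM_Event_Log AS e
    JOIN inserted AS i
        ON  e.orig_id = CAST(i.order_id AS VARCHAR(36))
        AND e.type_id = 1;   -- 1 = lab order (assumed lookup value)

    -- Insert aggregate rows for orders not yet logged
    INSERT INTO dbo.HCM_Event_Log
        (type_id, orig_id, patient_id, visit_id, lookup_id,
         status, ordered_datetime, completed_datetime)
    SELECT 1, CAST(i.order_id AS VARCHAR(36)), i.patient_id, i.visit_id,
           i.order_code, i.order_status, i.ordered_on, i.completed_on
    FROM inserted AS i
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.HCM_Event_Log AS e
                      WHERE e.orig_id = CAST(i.order_id AS VARCHAR(36))
                        AND e.type_id = 1);
END;
```

Each source table would get its own variant of this, mapping its columns into the common shape.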
I'm starting to second-guess this idea, as several of these tables are written to quite a bit, and the SPs and reports I'd be writing would read the data heavily as well. So I'm concerned that this table would become a record-locking and performance nightmare with so much I/O.
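One mitigation I've been reading about (but haven't tested) is row versioning to keep report reads from blocking the trigger writes, plus narrow covering indexes so the common report queries don't have to touch the hot clustered index. A sketch, where `MyHealthDB` is a placeholder database name and the indexed columns are just a guess at the common report predicates:

```sql
-- Readers get a versioned snapshot instead of blocking on writers
-- (uses the tempdb version store; needs a moment of exclusive access to enable)
ALTER DATABASE MyHealthDB
    SET READ_COMMITTED_SNAPSHOT ON
    WITH ROLLBACK IMMEDIATE;

-- Narrow covering index for patient-centric report queries (columns assumed)
CREATE NONCLUSTERED INDEX IX_HCM_Event_Log_patient
    ON dbo.HCM_Event_Log (patient_id, type_id)
    INCLUDE (status, ordered_datetime, completed_datetime);
```

I'm not sure whether the extra tempdb and index-maintenance overhead is worth it on our hardware, which is part of why I'm asking.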
My Question
Is this a good or a bad idea? I realize every situation in SQL Server (2008 R2 Standard Edition, BTW) is different, and that the usual "it depends" caveat applies, but I'm really just looking for general advice.
I started to consider using Service Broker, but I'd only be performing simple updates/inserts (see the alternative to the accepted answer). The data in many cases needs to be real-time, so using a backup DB wouldn't really work. Performance is already somewhat of an issue for us, but most of that is hardware-related and will be resolved soon.