I'm trying to come up with a solution.
I have a DB of around 1 TB+ that receives batch updates/loads throughout the day.
This, of course, slows production to a crawl, since users on the system need to make their SUID changes. On top of that, the DB is poorly designed: the primary table has 40 columns and accounts for the majority of the data size.
So, the client's thinking is this:
1) A secondary DB would serve the users, where they can make their minimal SUID changes.
2) The primary DB would a) receive the batch updates/loads, b) reindex, and c) sync (pull the changes from the secondary).
3) Then, once a "good" copy of the master is complete, they want to push the primary out to the secondary, in essence replacing it (e.g., via backup and restore).
What do you think is the best-performing solution for this? Replication? SSIS?
How would we go about it, given the size of the DB? Backup/restore could only be accomplished with third-party compression software.
But with backup/restore, replication would have to be set up again afterwards.
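For reference, the push-out step in 3) above would look roughly like this. This is only a sketch: the database names, file paths, and logical file names are placeholders, and WITH COMPRESSION assumes native backup compression (SQL Server 2008+), which may remove the need for third-party compression tools:

```sql
-- On the primary server: take a compressed full backup once the "good" copy is ready.
-- COMPRESSION is the native option; paths below are placeholders.
BACKUP DATABASE PrimaryDB
TO DISK = N'\\share\backups\PrimaryDB.bak'
WITH COMPRESSION, INIT, STATS = 10;

-- On the secondary server: replace the user-facing copy.
-- REPLACE overwrites the existing SecondaryDB; users must be disconnected first.
-- The logical file names (PrimaryDB_Data, PrimaryDB_Log) are assumptions.
RESTORE DATABASE SecondaryDB
FROM DISK = N'\\share\backups\PrimaryDB.bak'
WITH REPLACE,
     MOVE N'PrimaryDB_Data' TO N'D:\Data\SecondaryDB.mdf',
     MOVE N'PrimaryDB_Log'  TO N'L:\Log\SecondaryDB.ldf',
     STATS = 10;
```

Note that a restore like this destroys any replication configuration on the secondary, which is exactly the re-setup problem mentioned above.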
My idea is to use merge replication, and just flow the updates from the user DB to the primary master DB - but how will the changes from the primary DB get to the secondary user DB? I'm not worried about merging changes from B to A, but A to B is a large data volume.
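To illustrate the B-to-A direction, setting up a merge publication on the user DB might look something like the sketch below. This uses the standard replication stored procedures, but the database, publication, and article names are all placeholders, and a real setup would also need a snapshot agent, subscription, and distributor configuration:

```sql
-- Run on the user (secondary) server: enable the database for merge publishing.
EXEC sp_replicationdboption
     @dbname  = N'UserDB',
     @optname = N'merge publish',
     @value   = N'true';

-- Create a merge publication for the user edits (name is a placeholder).
EXEC sp_addmergepublication
     @publication = N'UserEdits',
     @description = N'User SUID changes flowing back to the primary';

-- Add the big 40-column table as an article (table name is a placeholder).
EXEC sp_addmergearticle
     @publication   = N'UserEdits',
     @article       = N'BigTable',
     @source_object = N'BigTable';
```

This only covers the small B-to-A stream; the large A-to-B refresh is the part merge replication doesn't obviously solve, which is the heart of my question.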