I've just been handed a project that has only one table and one proc.
This table takes 300 inserts per second and grows to about 140 GB before they truncate it, which they do once a week.
They are talking about using ETL, and/or breaking it out into two tables: one for the inserts, one for the selects. Is this best done by re-writing the insert query to write to two identical tables? Or by writing an SSIS package to do it?
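For what it's worth, here's roughly what I picture by "writing to two identical tables" — the table and column names below are just made up for illustration, not the real schema:

```sql
-- Hypothetical sketch: the insert proc writes the same row to both
-- copies in one transaction. DiagLog_Write takes the insert load;
-- DiagLog_Read is what the searches would hit (and could carry the
-- extra indexes without slowing the writes on the other table).
BEGIN TRANSACTION;

INSERT INTO dbo.DiagLog_Write (UserId, TxnId, EventTime)
VALUES (@UserId, @TxnId, @EventTime);

INSERT INTO dbo.DiagLog_Read (UserId, TxnId, EventTime)
VALUES (@UserId, @TxnId, @EventTime);

COMMIT TRANSACTION;
```

Though I wonder if doubling every insert like this just doubles the write load instead of helping — that's part of what I'm asking.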
I don't know much about this yet (and don't know ETL), but I guess the problem is slowdowns when searching this huge table while it's constantly doing inserts.
Would it be best to simply write all the inserts out to a text file? If so, how would I search it?
I'm told this is diagnostic data: users, transactions, searches, that sort of thing. The table only has about 10 columns, fwiw.
They are querying it in such a way that the data needs to be available almost immediately — not necessarily within 1 second, but within 30 seconds of being inserted.
It's lots of different users from all over the place adding one row at a time.