Best way to handle massive inserts

Last Post 30 Jan 2009 08:41 AM by Krypto. 1 Reply.
New Member

29 Jan 2009 08:46 AM
I've just been handed a project that has only one table, and one proc.

This table takes about 300 inserts per second and grows to about 140 GB before they truncate it, which they do once a week.

They are talking about using ETL, and/or breaking it out into two tables: one for the inserts, one for the selects. Is this best done by rewriting the insert query to write to two identical tables? Or by writing an SSIS package to do it?
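To make the first option concrete, here is a rough sketch of what "rewriting the insert to write to two identical tables" means, done in Python with sqlite3 standing in for the real SQL Server proc. The table names (`diag_inserts`, `diag_reads`) and the 3-column schema are invented for the sketch; the real table has about 10 columns.

```python
import sqlite3

# Hypothetical stand-in for the real diagnostics table; names and columns
# are invented for this sketch, not taken from the actual project.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE diag_inserts (user_id INTEGER, action TEXT, logged_at TEXT);
CREATE TABLE diag_reads   (user_id INTEGER, action TEXT, logged_at TEXT);
""")

def insert_diag(conn, user_id, action, logged_at):
    """Write the same row to both identical tables in one transaction,
    mimicking an insert proc rewritten to dual-write."""
    with conn:  # commits on success, rolls back both writes on error
        for table in ("diag_inserts", "diag_reads"):
            conn.execute(
                f"INSERT INTO {table} (user_id, action, logged_at) "
                "VALUES (?, ?, ?)",
                (user_id, action, logged_at),
            )

insert_diag(conn, 42, "search", "2009-01-29T08:46:00")
print(conn.execute("SELECT COUNT(*) FROM diag_reads").fetchone()[0])  # 1
```

Note the trade-off: dual-writing doubles the write cost on every insert, whereas the SSIS/ETL route moves rows into the reporting copy asynchronously, so reads lag slightly behind — which may be fine given the freshness requirement described below.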

I don't know much about this yet (and I don't know ETL), but I guess the problem is the slowdown when searching this huge table while it's constantly doing inserts.

Would it be best to simply write all the inserts out to a text file? If so, how would I search it?
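For what it's worth, the text-file route would look roughly like the sketch below — appending one line per insert is very cheap, but "searching" means scanning every line ever written. This uses Python's csv module with an in-memory buffer standing in for the file; the row layout is invented.

```python
import csv, io

# In-memory stand-in for an append-only log file (format is invented).
log = io.StringIO()
writer = csv.writer(log)

def log_insert(user_id, action, logged_at):
    # Appending a row costs almost nothing -- no index maintenance at all.
    writer.writerow([user_id, action, logged_at])

def search(action):
    # "Searching" is a full linear scan of everything written so far,
    # which is exactly why this scales poorly as the data nears 140 GB.
    log.seek(0)
    return [row for row in csv.reader(log) if row[1] == action]

log_insert(42, "search", "2009-01-29T08:46:00")
log_insert(7, "login", "2009-01-29T08:47:00")
print(len(search("login")))  # 1
```

That linear scan is the answer to "how would I search it?": slowly. A database index turns the same lookup into a seek, which is hard to give up at this data volume.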

I'm told this is diagnostic data. Users, transactions, searches that sorta thing. The table only has about 10 columns tho, fwiw.

They are querying it in such a way that the data needs to be available almost immediately. It doesn't have to show up in less than 1 second, but it does have to show up in less than 30 seconds.

It's lots of different users from all over the place adding one row at a time.
New Member

30 Jan 2009 08:41 AM

As always, thanks for your reply.

I found out some more info on this project, and have created a new topic.

It gives better detail; it's the thread above this one.

Thanks again!
