Best way to handle massive inserts

Last Post 30 Jan 2009 07:41 AM by Krypto. 1 Reply.
Krypto
New Member

29 Jan 2009 07:46 AM
I've just been handed a project that has only one table, and one proc.

This table gets 300 inserts per second and grows to about 140 gig before they truncate it, which they do once a week.
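(Rough math, assuming the rate stays steady: 300 inserts/sec × 86,400 sec/day × 7 days is roughly 181 million rows a week, so 140 gig works out to something on the order of 800 bytes per row.)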

They are talking about using ETL and/or breaking it out into two tables: one for the inserts, one for the selects. Is this best done by rewriting the insert query to write to two identical tables, or by writing an SSIS package to do it?
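To be clear about what I mean by the two-table option, here's a rough sketch of the kind of proc they're describing (the table and column names are just placeholders I made up):

-- Rough sketch only: every insert gets written to both copies in one transaction.
-- dbo.DiagLog_Insert is the table the writes hit; dbo.DiagLog_Search is the one the searches hit.
CREATE PROCEDURE dbo.InsertDiagRow
    @UserId    INT,
    @EventType VARCHAR(50),
    @EventTime DATETIME
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRANSACTION;

    INSERT INTO dbo.DiagLog_Insert (UserId, EventType, EventTime)
    VALUES (@UserId, @EventType, @EventTime);

    INSERT INTO dbo.DiagLog_Search (UserId, EventType, EventTime)
    VALUES (@UserId, @EventType, @EventTime);

    COMMIT TRANSACTION;
END

The SSIS/ETL alternative, as I understand it, would be to keep the proc writing to just the insert table and have a package copy new rows over to the search table on a schedule.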

I don't know much about this yet (and I don't know ETL), but I gather the problem is that searches against this huge table slow down because it's constantly doing inserts.

Would it be best to simply write all the inserts out to a text file? If so, how would I search it?

I'm told this is diagnostic data: users, transactions, searches, that sort of thing. The table only has about 10 columns though, for what it's worth.

They are querying it in a way that needs the data to be available almost immediately: not within 1 second, but within 30 seconds.

It's lots of different users from all over the place adding one row at a time.
Krypto
New Member

30 Jan 2009 07:41 AM
Gunneyk,

As always, thanks for your reply.

I found out some more info on this project and have created a new topic.

It gives better detail; it's the thread above this one.

Thanks again!