ost Posted January 31, 2004

Hi, I have a text file with 1,000,000 lines and I need to INSERT these into a database. The keyword here is performance. I have tried reading a line and doing an INSERT for each one, but that means 1,000,000 ExecuteNonQuery calls, and the import takes about an hour. Does anyone have other ideas on how to do this?
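Before reaching for bulk-load tools, one cheap improvement is to wrap the whole import in a single transaction instead of letting every INSERT auto-commit. A minimal sketch of the idea, using Python's built-in sqlite3 purely as a self-contained illustration (the table name and data are made up; the same principle applies to SQL Server via an ADO.NET SqlTransaction):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE import_target (line TEXT)")

lines = [f"row {i}" for i in range(100_000)]  # stand-in for the text file

# One transaction around all inserts: commit once, not 100,000 times.
with conn:
    for line in lines:
        conn.execute("INSERT INTO import_target (line) VALUES (?)", (line,))

count = conn.execute("SELECT COUNT(*) FROM import_target").fetchone()[0]
print(count)  # 100000
```

The per-row statements are still issued one at a time, so this only softens the problem; the bulk approaches below remove the per-row round trips entirely.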
samsmithnz Posted January 31, 2004

BCP is probably the best bet here. It's a command-line SQL Server utility. Look it up.

Thanks, Sam
http://www.samsmith.co.nz
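For reference, a typical bcp invocation looks like the following. The database, table, file, and server names are hypothetical placeholders; it is sketched as a Python argument list so each switch is explicit, assuming the bcp utility that ships with SQL Server is on the PATH:

```python
import subprocess

# Hypothetical names: adjust database, table, file, and server to your setup.
cmd = [
    "bcp", "MyDb.dbo.ImportTarget",  # fully qualified target table
    "in", "data.txt",                # direction (load) and source file
    "-c",                            # character (plain text) mode
    "-t", "\t",                      # field terminator
    "-S", "localhost",               # server name
    "-T",                            # trusted (Windows) authentication
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment on a machine with bcp installed
```

Because bcp streams the file straight into the table, it avoids the per-row statement overhead entirely, which is why it tends to be the fastest option for a one-off import like this.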
iebidan Posted February 3, 2004

Try DTS or bulk copy; either will be faster. What does BCP mean? Bulk Copy Process or something like that? :D
Denaes Posted February 3, 2004

I'd have the program read all of the data from the file into a DataSet (or in batches if it's too much information) and then use the DataAdapter.Update method to update more than one record at a time. I don't know whether, behind the scenes, it's really issuing 1,000,000 INSERT statements; that may be the case. That's how I'd do it with OleDb, though. I'm not sure about MSDB.
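The batch idea above can be sketched without ADO.NET at all. Here is an illustrative version using Python's sqlite3 and executemany, which submits rows in chunks rather than one call per row; the table and batch size are made up, and DataAdapter.Update plays a loosely analogous role in .NET:

```python
import sqlite3

BATCH_SIZE = 10_000  # arbitrary; tune to available memory

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE import_target (line TEXT)")

lines = [f"row {i}" for i in range(25_000)]  # stand-in for the file contents

with conn:
    for start in range(0, len(lines), BATCH_SIZE):
        # Each executemany call pushes one whole batch of parameter tuples.
        batch = [(line,) for line in lines[start:start + BATCH_SIZE]]
        conn.executemany("INSERT INTO import_target (line) VALUES (?)", batch)

print(conn.execute("SELECT COUNT(*) FROM import_target").fetchone()[0])  # 25000
```

Batching keeps memory bounded (only one chunk of the file is held at a time) while cutting the number of database calls by a factor of the batch size.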
Derek Stone (*Gurus*) Posted February 3, 2004

Look for the terms "sql server bulk insert" on Google. Performing any more than a few consecutive inserts without doing a bulk insert is just asking for performance problems.
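The T-SQL statement that search turns up is BULK INSERT, which loads a whole file server-side in one statement. A hedged sketch of issuing it from client code follows; the table, path, and pyodbc connection are all assumptions, and note the file path must be visible to the SQL Server machine, not the client:

```python
# Hypothetical table and path; BULK INSERT reads the file on the server side.
sql = """
BULK INSERT dbo.ImportTarget
FROM 'C:\\data\\data.txt'
WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n')
"""

# With a real connection (pyodbc assumed installed, connection string assumed):
# import pyodbc
# conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
#                       "SERVER=localhost;Trusted_Connection=yes")
# conn.execute(sql)
# conn.commit()
print(sql.strip().splitlines()[0])  # BULK INSERT dbo.ImportTarget
```

This is the same engine-level path bcp uses, so its performance is in the same class; the difference is that it runs as an ordinary SQL statement from your application.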