Hello,

I have a SQL Server database with about 1,000,000 entries that need to be inserted before the database becomes operational. I am currently performing a separate insert query for each entry, since the entries have to be read from a file and it must be decided which table each one belongs in. Inserting all the entries this way will take a long time to complete. Can anyone suggest a faster way to do this?

Hi

You should do this with BULK INSERT. You have to choose an appropriate BATCHSIZE; it is probably a good idea to split the 1M records into smaller portions. Also, CHECK_CONSTRAINTS should NOT be used, to avoid endless checking of foreign key constraints and check clauses. If there is a possibility that you are inserting incorrect data, it is usually better to correct or remove that data only after the insertion is done, by running a few SQL statements.
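For example, a minimal sketch of such a BULK INSERT (the file path, table name, and terminators here are assumptions; adjust them to your actual data file):

    -- Load a delimited text file into the target table in batches of 50,000 rows.
    -- CHECK_CONSTRAINTS is deliberately omitted, so FOREIGN KEY and CHECK
    -- constraints are not evaluated during the load.
    BULK INSERT dbo.MyTable            -- hypothetical target table
    FROM 'C:\data\entries.txt'         -- hypothetical data file
    WITH (
        FIELDTERMINATOR = ',',         -- assumed column delimiter
        ROWTERMINATOR   = '\n',        -- assumed row delimiter
        BATCHSIZE       = 50000,       -- commit every 50,000 rows
        TABLOCK                        -- a table lock often speeds up bulk loads
    );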

You may also write a small program that converts your raw mass data into a file format suitable for BULK INSERT.

-- tesu

Thanks for the quick reply,

Can you tell me more about BULK INSERT? Also, I am not inserting any bad data. It's just that the insert query has to run 1,000,000 times in a loop, with different input data read from a file into the appropriate tables. I am doing a simple INSERT INTO TABLE query, but it's the sheer number of transactions with the database that's slowing down the process.

"I am doing a simple INSERT into TABLE query.."
Inserting 1M records that way is really not a good idea, because a simple INSERT always checks primary keys, foreign keys, NOT NULL constraints, etc., and this cannot be disabled. BULK INSERT is different (see: http://msdn.microsoft.com/en-us/library/ms188365.aspx and http://msdn.microsoft.com/en-us/library/ms188267.aspx): there, such checks are not performed unless you demand them explicitly.
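As mentioned above, any bad rows can then be cleaned up after the load with ordinary SQL. A minimal sketch (the table and column names are assumptions for illustration):

    -- Remove loaded rows whose foreign key value has no matching parent row.
    DELETE t
    FROM dbo.MyTable AS t
    LEFT JOIN dbo.ParentTable AS p
           ON p.ParentId = t.ParentId
    WHERE p.ParentId IS NULL;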

-- tesu
