I am running a script on our production database that references two tables: our users table (3,700 rows) and the table of quotes they have created (280,000 rows). A quote is the main object in our application, a very large object for which many data tables are created and filled. My goal is to purge the database of all quotes except those created by a small group of users.
I first create a temp table containing the ids of those users (it is also used elsewhere in the script), and then a cursor that runs through the main quotes table and, for each quote that was not created by that user group, performs the necessary cleanup.
I estimate that this script will take approximately 26 hours to execute, which strikes me as odd, since a full restore of the database takes only about 15 minutes, and I would expect that to be where the heaviest SQL work happens. The database, though, is over 100 GB.
Is there some part of the script that is terribly non-optimal, or do you have a suggestion for how this could be done with a much shorter execution time?
We are running SQL Server 2008 R2.
`-- Collect the ids of the users whose quotes should be kept
CREATE TABLE #UsersIdsToStay (user_id int)

INSERT INTO #UsersIdsToStay (user_id)
SELECT user_id
FROM users
WHERE user_name LIKE '%SOMESTRING '

-----

-- Walk every quote that does NOT belong to one of those users
DECLARE @QuoteId int
DECLARE @UserId int

DECLARE QuoteCursor CURSOR FOR
    SELECT DISTINCT QuoteId, UserId
    FROM QuotesTable
    WHERE UserId NOT IN
    (
        SELECT user_id FROM #UsersIdsToStay
    )

OPEN QuoteCursor
WHILE 1 = 1
BEGIN
    FETCH QuoteCursor INTO @QuoteId, @UserId
    IF @@FETCH_STATUS != 0 BREAK

    -- all the deletions from related tables are executed here using @QuoteId and @UserId
END
CLOSE QuoteCursor
DEALLOCATE QuoteCursor`
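For context, the body of the loop is essentially a series of deletes against the tables that hang off a quote. The real schema has many more related tables and different names, but as a rough, hypothetical illustration (QuoteItems and QuoteAttachments are made-up names here), each iteration does something like this:

`-- Hypothetical illustration only: the real schema has many more related tables
DELETE FROM QuoteItems       WHERE QuoteId = @QuoteId
DELETE FROM QuoteAttachments WHERE QuoteId = @QuoteId
DELETE FROM QuotesTable      WHERE QuoteId = @QuoteId AND UserId = @UserId`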