Updating, inserting and deleting from large data sets with minimal locking
Use smaller transactions on systems that lock continually
When working with particularly large data sets, locking can become a real problem, but it doesn't need to be that way. Where you can, break the work out into smaller transactions.
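To illustrate the general pattern, here is a minimal sketch (assuming a hypothetical Orders table with a Processed flag, which is not part of the demo that follows) that updates rows in small explicit transactions rather than one huge one, so the locks taken by each batch are released as soon as that batch commits.
SQL
-- Hypothetical example: update in batches of 1,000 rows rather than in one large transaction.
-- Orders and Processed are placeholder names used in this sketch only.
DECLARE @Rows INT = 1;

WHILE @Rows > 0
BEGIN
    BEGIN TRANSACTION;

    UPDATE TOP (1000) Orders
    SET Processed = 1
    WHERE Processed = 0;

    SET @Rows = @@ROWCOUNT;  -- 0 once there is nothing left to update

    COMMIT TRANSACTION;      -- this batch's locks are released here
END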
Let's look at the code below. First, open SSMS with three query windows, plus Activity Monitor with its update period set to one second.
In the first window, we can create a table for use in testing (called LockTest throughout this example).
The full demo will take a while to run and needs those extra windows open throughout.
Create test table
SQL
CREATE TABLE LockTest
(
    ID   BIGINT IDENTITY(1,1) CONSTRAINT PK_ID PRIMARY KEY,
    GUID UNIQUEIDENTIFIER
);

CREATE INDEX IDX_GUID ON LockTest (GUID);
Create some records
Execute that code, then replace it with the script below, which inserts 5 million random GUIDs into the table one at a time; this simulates constant inserting and updating for us. You can push the count higher if you like. It also prints the number of records inserted every 1,000 rows, so you can keep track of progress.
SQL
SET NOCOUNT ON;

DECLARE @ID BIGINT = 1;

WHILE @ID < 5000000
BEGIN
    INSERT INTO LockTest (GUID)
    SELECT NEWID();

    -- Print progress every 1,000 rows
    IF @ID % 1000 = 0
        PRINT @ID;

    SET @ID = @ID + 1;
END
Select a few records
Open a second window and add the code below, which selects up to 10,000 records whose GUID begins with 20.
SQL
SELECT TOP 10000 *
FROM LockTest
WHERE GUID LIKE '20%';
Loop a delete
Now open a third window and run the code below, which loops through the table and deletes the GUIDs that begin with a 0, 1,000 rows at a time.
SQL
WHILE EXISTS (SELECT TOP 1 * FROM LockTest WHERE GUID LIKE '0%')
BEGIN
    -- Delete in small batches so locks are released frequently
    DELETE TOP (1000)
    FROM LockTest
    WHERE GUID LIKE '0%';
END
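If the delete loop still blocks readers for longer than you would like, one variation (a sketch, not part of the original demo) wraps each batch in an explicit transaction and pauses briefly between batches, giving the other sessions a chance to take their locks.
SQL
WHILE EXISTS (SELECT TOP 1 * FROM LockTest WHERE GUID LIKE '0%')
BEGIN
    BEGIN TRANSACTION;

    DELETE TOP (1000)
    FROM LockTest
    WHERE GUID LIKE '0%';

    COMMIT TRANSACTION;            -- release this batch's locks straight away

    WAITFOR DELAY '00:00:00.100';  -- short pause so other sessions can get in
END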
Testing the theory
Once you get above 500,000 records in your first window, run either or both of the statements in windows two and three, and watch Activity Monitor to see what happens.
You should see fewer locks, and the locks you do get should be held for shorter periods, so users can carry on with the work they need to do, although things may slow down to some extent.
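Activity Monitor gives a high-level view; if you want to see the individual lock requests as they happen, a query along the lines of the sketch below against sys.dm_tran_locks (run in a fourth window while the other scripts are going) lists the sessions that are currently waiting on locks and the kind of resource they are waiting for.
SQL
-- Show lock requests that are currently waiting (i.e. blocked), per session.
SELECT
    request_session_id,
    resource_type,
    request_mode,
    request_status
FROM sys.dm_tran_locks
WHERE request_status = 'WAIT'
ORDER BY request_session_id;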