Hi There
I have some T-SQL that basically does something like:
INSERT INTO TABLEA SELECT * FROM TABLEB
There are a lot of rows involved, and sp_lock tells me that a table lock is being taken on TABLEA during this insert.
This is a big problem for us. I have often used locking hints on SELECTs etc., but never on an INSERT. How can I get the insert to never use a table lock?
Thanx
You always have a lock on something during an insert, and you have no control over the escalation of locks. Hints are only a SUGGESTION of how the server should handle the query; the server determines the locks used based on the statement. In your case it is locking the table because it has no idea how many inserts will be done or where they will be placed in the physical table.
You can set the readers' isolation level to READ UNCOMMITTED, and that will allow people to read the table during the insert process.
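A minimal sketch of that suggestion, using the table name from this thread (this goes in the *reading* session, not the one doing the insert):

```sql
-- Run in the reading session (e.g. the ETL connection).
-- Readers then no longer block on the insert's locks, at the
-- cost of possibly seeing uncommitted ("dirty") rows.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- This now proceeds even while the insert holds its table lock.
SELECT COUNT(*) FROM TABLEA;
```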
|||
Hi Tom
All good suggestions we have thought of but not possible.
Here is the situation: we are converting new branch data into a central database, so we know that all inserted rows will be new data at the end of the clustered index.
The other process doing the reads is an ETL job that will never be interested in this new data. However, because of the way it works combined with slowly changing dimensions, we have to run it at READ COMMITTED or the ETL will not work properly. The inserts are essentially locking the entire table even though they never touch any of the rows the ETL is trying to read (there are no updates), so the table lock pretty much stops all ETL.
So basically we are not sure what to do?
All I can do so far is commit as often as possible, but this only slightly helps the situation. Bottom line: I wanted to know if there was any way to control locking with an insert, but you have answered that.
Thanx
|||With the additional information you provided, I would change your INSERT into a BCP or BULK INSERT. The INSERT ... SELECT * approach is pretty slow. If you export the data and use BCP or BULK INSERT, you will be 100 times (or more) faster and will lock the target table for a much shorter time. Also set the batch size to a number like 10,000 or 50,000, which will insert that many records as one transaction and then drop the lock. That might let the ETL lock the table and delay the import, but the ETL will finish.
This might fix your problem.
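A sketch of the batched BULK INSERT Tom describes. The file path is a placeholder; TABLEA is the target table from this thread, and the data is assumed to have already been exported to a flat file:

```sql
-- BATCHSIZE commits every 50,000 rows as its own transaction,
-- so any lock taken for the load is released between batches
-- and readers get a chance to get in.
BULK INSERT TABLEA
FROM 'C:\export\tableb.dat'   -- hypothetical export file
WITH (
    BATCHSIZE = 50000
);
```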
|||
Hi Tom
Thanx i will try that.
However, the process must be automated, so I guess I will have to write, say, an SSIS package that first exports the data to a file and then uses BCP to import it back into the other tables?
Thanx
|||Hi, Tom.
Is it possible to perform a BULK INSERT from one table to another in the same SS05 database, where both tables have the same structure, and one has been TRUNCATED to accept the replacement data from the other?
I have been studying the T-SQL documentation and I keep noticing "glimmers of hope" that make it appear as if it is possible (without having to export the data and then import it), but I am unable to discern the syntax for the INSERT statement.
Does this look anywhere close to being correct?
INSERT INTO table_a (IGNORE_CONSTRAINTS)
SELECT *
FROM OPENROWSET(BULK
(And that's where I get stuck, since BULK seems to want a reference to an external data file, NOT a table within the same database on the same server, with only a different table name.)
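For reference, the file-based form of that syntax does exist; OPENROWSET(BULK ...) just always reads from a data file, never from another table. A hedged sketch (the file and format-file paths are hypothetical):

```sql
-- OPENROWSET(BULK ...) requires a data file plus a format file
-- (or a single-column option such as SINGLE_BLOB); it cannot
-- point at another table in the same database.
INSERT INTO table_a WITH (IGNORE_CONSTRAINTS)
SELECT *
FROM OPENROWSET(
        BULK 'C:\export\table_b.dat',       -- hypothetical path
        FORMATFILE = 'C:\export\table_b.fmt' -- hypothetical path
     ) AS src;
```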
Dan
P.S. Maybe it must be done outside the T-SQL environment, such as shown in VB and C# examples at http://msdn2.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx ? I was hoping there was a T-SQL solution.
(I just noticed an interesting comment in the C# example:
// Open the destination connection. In the real world you would
// not use SqlBulkCopy to move data from one table to the other
// in the same database. This is for demonstration purposes only.
So what is the "real world" solution for bulk copying data from one table to another in the same database?)
|||Tom
Any ideas ?
To use BULK INSERT I would first require a process to export the data to a file and then bring it back in again; the time gained on the bulk insert would perhaps be lost on exporting the data to file and re-importing it with BULK INSERT, using SSIS or something?
Is there a way to bulk insert data from one table to another using just T-SQL, with no data exported to file?
Thanx
|||It is ALWAYS faster to export the file and bulk insert it again, even with the added overhead of exporting to file. The other methods require tons of writes to tempdb, the log files, etc. You should really try it as a bulk insert; you will be much happier.
The "real world" solution, as you can see from the MS copy-data options in DTS and SSIS, is exporting the data and using BCP/BULK INSERT to reinsert it, because it is much faster.
|||
Tom,
Thanks.
Dan
|||Hi Tom
Thanx, I was hoping to be able to do it in T-SQL and avoid SSIS.
By the way, is there no way to bulk insert between tables? Must BULK INSERT have a flat-file source? It seems that way from BOL.
Cheers
|||Look at the bcp utility's "out" mode. You can run it from T-SQL with EXEC xp_cmdshell.
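A hedged sketch of that round trip, done entirely from T-SQL. The server, database, and path names are placeholders, and xp_cmdshell must be enabled on the instance:

```sql
-- Export TABLEB to a native-format file with bcp, then bulk
-- insert it into TABLEA. -n = native format, -T = trusted
-- connection; MyDB, MyServer, and the path are hypothetical.
EXEC xp_cmdshell
    'bcp MyDB.dbo.TABLEB out C:\export\tableb.dat -n -T -S MyServer';

BULK INSERT TABLEA
FROM 'C:\export\tableb.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000);
```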