Bulk inserting a million rows into SQL Server
During every test, I loaded a million rows into a table.
The target table was a heap with seven columns and no indexes or constraints. I created a procedure that looped over the staging data, parsing each row and inserting it into the target table:

```sql
begin
    -- get parsed data into variables @c1, @c2 … @c7
    insert into TargetTable (c1, c2, .. c7)
    values (@c1, @c2 … @c7);
    -- for explicit transaction control, each 10,000 inserts
    -- were wrapped into a transaction
end
```

This first approach doubled the speed of the prototype powered by Informatica, but we were itching to make it faster, so we set up parallel loading. The plan was to split the staging data into two parts and process each part in parallel. We had enough server resources for two parallel processes, so we only needed to add two parameters to the procedure, the starting row and the ending row of the staging table, and implement parallel calls of the procedure in the Informatica workflow.

To investigate the performance difference between single-row and multi-row inserts, I wrote a program that dynamically formed a multi-row insert statement; it loaded one million rows in 9 seconds, whereas single-row inserts took 57 seconds to load the same million rows. To investigate the influence the number of columns has on performance, I ran an additional set of tests against tables with seven, ten, and 23 columns.
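The parallel split described above can be sketched as a parameterized procedure. This is a simplified, set-based sketch under my own assumptions: the names (`usp_LoadTarget`, `StagingTable`, `RowId`) are hypothetical, and it assumes the staging table carries a sequential row number, whereas the article's actual procedure parsed and inserted row by row.

```sql
-- Hypothetical sketch: a load procedure that processes only the
-- staging rows between @StartRow and @EndRow, so two copies can
-- run in parallel over disjoint halves of the staging table.
CREATE PROCEDURE usp_LoadTarget
    @StartRow INT,
    @EndRow   INT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO TargetTable (c1, c2, c3, c4, c5, c6, c7)
    SELECT c1, c2, c3, c4, c5, c6, c7
    FROM StagingTable
    WHERE RowId BETWEEN @StartRow AND @EndRow;
END
```

The Informatica workflow would then invoke, for example, `EXEC usp_LoadTarget 1, 500000` and `EXEC usp_LoadTarget 500001, 1000000` concurrently.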
I ran tests wrapping 1, 2, 5, 10, 25, 50, and 100 rows in a single insert statement (the five-row tests were later abandoned).
How do I pass the parameters to the stored procedure? In this article I'll describe how I experimented and adjusted my methods to find the best way to insert a million rows of data into SQL Server.
The more data you’re dealing with, the more important it is to find the quickest way to import large quantities of data.
Before implementing parallel loading we wanted to find a simple way to speed up the procedure, something small that made a big difference: the multi-row insert.
Instead of calling a single-row insert, I sent five rows in one statement.
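A five-row batch is a single INSERT with multiple VALUES groups; table and column names follow the pseudocode earlier, and the suffixed variables here are placeholders I've invented for illustration. SQL Server accepts up to 1,000 row value expressions per INSERT this way.

```sql
-- One round trip inserts five rows instead of one.
INSERT INTO TargetTable (c1, c2, c3, c4, c5, c6, c7)
VALUES
    (@c1_1, @c2_1, @c3_1, @c4_1, @c5_1, @c6_1, @c7_1),
    (@c1_2, @c2_2, @c3_2, @c4_2, @c5_2, @c6_2, @c7_2),
    (@c1_3, @c2_3, @c3_3, @c4_3, @c5_3, @c6_3, @c7_3),
    (@c1_4, @c2_4, @c3_4, @c4_4, @c5_4, @c6_4, @c7_4),
    (@c1_5, @c2_5, @c3_5, @c4_5, @c5_5, @c6_5, @c7_5);
```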
We wanted to reduce the time spent on the loading process, which stood at 16 hours, and by using stored procedures we could now gather some statistics by incrementing counters inside the loop.
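A minimal sketch of that loop shape, combining the counter with the commit-every-10,000-inserts pattern mentioned earlier; the variable names and the total-row bound are hypothetical, and the parsing step is elided as in the original pseudocode.

```sql
-- Hypothetical loop skeleton: count rows as they are inserted and
-- commit every 10,000 rows for explicit transaction control.
DECLARE @RowsLoaded INT = 0;
DECLARE @TotalRows  INT = 1000000;

BEGIN TRANSACTION;
WHILE @RowsLoaded < @TotalRows
BEGIN
    -- parse the next staging row into @c1 … @c7, then:
    INSERT INTO TargetTable (c1, c2, c3, c4, c5, c6, c7)
    VALUES (@c1, @c2, @c3, @c4, @c5, @c6, @c7);

    SET @RowsLoaded = @RowsLoaded + 1;

    -- the @RowsLoaded counter doubles as a statistic and as the
    -- trigger for the periodic commit
    IF @RowsLoaded % 10000 = 0
    BEGIN
        COMMIT TRANSACTION;
        BEGIN TRANSACTION;
    END
END
COMMIT TRANSACTION;
```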