bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-07 : 17:11:30
Hi everyone, I used a flat file connection to read a file and a script component to insert into the DB. But that creates 3000 insert statements in the DB (if the file has 3000 rows), because it reads row by row and, I assume, inserts row by row. Is there a way to read the file and create one single insert statement to load all 3000 rows? I am not sure it can be done in a script task either, because I think it would still be 3000 insert statements. Let me know if I am wrong. Thanks!
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2009-02-08 : 01:21:29
Why should you use a script task? Wouldn't a data flow task with a flat file source and an OLE DB destination be sufficient? You can also use the fast load option to speed up the transfer.
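For comparison, a minimal T-SQL sketch of the kind of single set-based load the fast load option performs for you (table name and file path are hypothetical; adjust the terminators to match the tab-delimited file):

-- Hypothetical staging table and path.
BULK INSERT dbo.FileStaging
FROM 'C:\Data\input.txt'
WITH (
    FIELDTERMINATOR = '\t',  -- tab-delimited columns
    ROWTERMINATOR   = '\n',  -- one row per line
    TABLOCK                  -- one bulk operation instead of 3000 single-row inserts
);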
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-08 : 06:39:13
I need the last identity value from the DB from the previous data flow task, and then I need to insert that value into the next table. I saved it in a global variable, and that's why I used a script component; I'm not sure how to use an OLE DB destination with variables. The file is a normal tab-delimited file, but I read it row by row and insert row by row. That's my problem: I want to avoid the row-by-row insert!
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2009-02-08 : 21:23:36
Nope, fast load uses a bulk insert in the OLE DB destination. To get the last identity value inserted, isn't it enough to include an Execute SQL Task before the data flow and store the value in a variable you create? Then you can use that variable wherever you need the value. If you want to use it in the OLE DB destination, choose SQL command from variable and build the query in another string variable using the ID variable.
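A minimal sketch of the query such an Execute SQL Task could run (table, column, and variable names are hypothetical); set the task's ResultSet to Single row and map the returned column to an SSIS variable such as User::PrsID:

-- Last identity value generated for the table, regardless of session:
SELECT IDENT_CURRENT('dbo.ParentTable') AS LastId;
-- a simple alternative if the identity column is known:
-- SELECT MAX(PrsID) AS LastId FROM dbo.ParentTable;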
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-09 : 11:42:55
That looks like a good idea; I will try it. But from the flat file source I insert into two tables: one insert uses the identity value, and the other columns go into a temp table. Can I configure the OLE DB destination to bulk insert into two tables at the same time?
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-12 : 16:25:32
Can I configure the OLE DB destination to bulk insert into two tables at the same time?
sakets_2000
Master Smack Fu Yak Hacker
1472 Posts
Posted - 2009-02-12 : 16:46:27
quote: Originally posted by bondwiththebest
Can I configure the OLE DB destination to bulk insert into two tables at the same time?

Don't think so.
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-12 : 17:00:20
Hi all, this brings me back to square one. I am unable to do a bulk insert into the DB. I read the data from the file using a flat file connection, and I can get the identity value and store it in a variable. But how do I get the input columns from the FILE and configure them in a variable? In the OLE DB destination there is no SQL command from variable option; it is only in the OLE DB source...
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2009-02-13 : 10:02:52
quote: Originally posted by bondwiththebest
Hi all, this brings me back to square one. I am unable to do a bulk insert into the DB. I read the data from the file using a flat file connection, and I can get the identity value and store it in a variable. But how do I get the input columns from the FILE and configure them in a variable? In the OLE DB destination there is no SQL command from variable option; it is only in the OLE DB source...

If you need to insert into two tables, why not use a stored procedure that uses the OUTPUT clause to capture the generated identity values before loading the child table, and then call the procedure from an Execute SQL Task?
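A minimal sketch of such a procedure, with hypothetical table and column names; the OUTPUT clause captures the identity values generated by the parent insert so the child rows can be written in a single set-based statement:

CREATE PROCEDURE dbo.usp_LoadParentAndChild
AS
BEGIN
    DECLARE @NewRows TABLE (ParentId INT, Col1 DECIMAL(10,2));

    -- Insert the staged file rows into the parent table and capture the new identity values.
    INSERT INTO dbo.ParentTable (Col1)
    OUTPUT inserted.ParentId, inserted.Col1 INTO @NewRows (ParentId, Col1)
    SELECT Col1
    FROM dbo.FileStaging;

    -- Use the captured identities to load the child (temp) table in one insert.
    INSERT INTO dbo.ChildTable (ParentId, Col1)
    SELECT ParentId, Col1
    FROM @NewRows;
END;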
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-13 : 14:50:56
Hi visakh16, now I just need to read the columns from the flat file connection and insert them into one single table in the DB. I have over 45 columns in the file that I need to read and insert into the table. I am not sure how to configure the OLE DB destination to read values from variables from the flat file connection and then insert them into the table. Can you please help me or give me an idea? Like I said, the way I was doing it earlier was with a script component, and I want to avoid the row-by-row insert... Is there any other way?
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2009-02-14 : 02:04:39
quote: Originally posted by bondwiththebest
Hi visakh16, now I just need to read the columns from the flat file connection and insert them into one single table in the DB. I have over 45 columns in the file that I need to read and insert into the table. I am not sure how to configure the OLE DB destination to read values from variables from the flat file connection and then insert them into the table. Can you please help me or give me an idea? Like I said, the way I was doing it earlier was with a script component, and I want to avoid the row-by-row insert... Is there any other way?

Read values from variables in the flat file? What does that mean? Do you mean your file name is dynamic? In that case you don't need to do anything at the OLE DB destination end. What you need is to map the variable to the connection property of the flat file source; this lets it pick up the correct source file from the variable value and then transfer the data from the file to the table. But make sure your flat files always have a consistent structure (columns), since you can't alter the metadata of a flat file at runtime in a data flow.
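A small sketch of that mapping, assuming the SSIS variable holding the file path is named User::FileName: on the Flat File connection manager, open Expressions and point the ConnectionString property at the variable.

-- Flat File connection manager -> Properties -> Expressions
-- Property:   ConnectionString
-- Expression: @[User::FileName]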
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-17 : 08:33:36
Hi visakh16, I have looked at the links you sent me, but I think I have confused you even more. I have a flat file connection that reads the file name dynamically, and then I read 45 of the 100 columns in that file. Below that I have a script component configured to read all 45 columns, but it does a row-by-row insert as it reads all 3000 lines from the file (let's say the file has 3000 rows), right? Now I want to avoid that. Is there a way to do this as a bulk insert, basically to avoid 3000 insert statements? I do need the flat file connection, though. I tried using an OLE DB destination below it as well, but it didn't work. Let me know if you have any questions.
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2009-02-17 : 08:47:27
For bulk insertion, use the OLE DB destination with the fast load option. I can't understand what problem you're facing using it; maybe you can give more info on that. Unless the columns you copy vary (metadata), it should work fine.
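For reference, a typical fast-load setup in the OLE DB Destination editor looks roughly like this (option names as they appear in the editor; the values shown are a common starting point, not a rule):

-- Data access mode           : Table or view - fast load
-- Table lock                 : checked (allows a bulk, minimally logged load)
-- Rows per batch             : blank
-- Maximum insert commit size : 0 (commit the whole load as a single batch)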
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-17 : 10:30:39
Hi, I need to add another identity value from a different table in the OLE DB destination; I don't want to insert from the file directly. I have that identity value in a global variable, and I am trying to figure out how to include it along with the flat file source. Here is how my final table should look after inserting from a single file with 3000 rows:

ID    PrsID  Col1  Col2  ...  Col45
1     25     2.5   3.2   ...  10.5
2     25     2.7   3.3   ...  11.5
3     25     2.8   3.3   ...  11.4
...
3000  25     2.5   3.8   ...  12.5

Here ID is the identity column and PrsID comes from another table; I have its value in a global variable. The values in Col1 to Col45 come from the flat file connection. I am trying to use Merge or Merge Join but am unable to do so. How can I configure the OLE DB destination with some inputs from the global variable and some inputs from the file?
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2009-02-17 : 11:28:07
Can't you add a Derived Column task to add the variable value to the pipeline before the OLE DB destination?
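A minimal sketch of that Derived Column configuration, assuming the global variable is named User::PrsID:

-- Derived Column transformation (placed between the Flat File source and the OLE DB destination)
-- Derived Column Name : PrsID
-- Derived Column      : <add as new column>
-- Expression          : @[User::PrsID]
-- Then map the new PrsID column, along with Col1..Col45 from the file, in the OLE DB destination.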
bondwiththebest
Starting Member
46 Posts
Posted - 2009-02-17 : 12:42:22
Hey visakh16, that's all I need. Thanks for your help, I appreciate it. Thank you!