dhani
Posting Yak Master
132 Posts |
Posted - 2009-01-07 : 15:53:47
Hello all, I have a simple package which reads data from a text file and loads it into a database table, e.g. a customer text file into a customer table (custID, custName, address). The problem is that the customer table has a primary key (custID). How can I check the text file data against the data already in the customer table, so that if any custID is a duplicate it is removed from the file, and otherwise the row is loaded into the table? Sometimes we get text files in which some records are duplicated, and other times we may get two files containing the same data, so the package is failing (due to the primary key on the table). Please suggest the best possible ways.
SwePeso
Patron Saint of Lost Yaks
30421 Posts |
Posted - 2009-01-07 : 16:29:53
|
Use a staging area to import all records first. Then, after cleaning, insert them into the live tables. E 12°55'05.63"N 56°04'39.26"
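A minimal T-SQL sketch of the staging approach described above. The table and column names come from the original post, but the staging table name, file path, and file format are assumptions for illustration:

```sql
-- Staging table with no primary key, so duplicate rows load without error
CREATE TABLE dbo.Customer_Staging
(
    custID   INT          NOT NULL,
    custName VARCHAR(100) NULL,
    address  VARCHAR(200) NULL
);

-- 1. Bulk load the text file into the staging table
--    (path and delimiters are hypothetical)
BULK INSERT dbo.Customer_Staging
FROM 'C:\files\customer.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2. Insert only custIDs not already in the live table,
--    collapsing any duplicates within the file itself
INSERT INTO dbo.Customer (custID, custName, address)
SELECT s.custID, MIN(s.custName), MIN(s.address)
FROM dbo.Customer_Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Customer AS c WHERE c.custID = s.custID)
GROUP BY s.custID;

-- 3. Clear the staging table for the next file
TRUNCATE TABLE dbo.Customer_Staging;
```

Because the staging table has no primary key, re-running the package with a duplicate file never fails; the NOT EXISTS filter simply loads zero new rows.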
Nagaraj
Starting Member
14 Posts |
Posted - 2009-01-16 : 02:09:56
|
Hi, you can use the Aggregate or Sort transformation in order to remove duplicates. Nagaraj.
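If you would rather dedupe in T-SQL than with the SSIS Sort transformation (which has a "remove rows with duplicate sort values" option), a ROW_NUMBER() sketch against a staging table achieves the same effect. The staging table name is an assumption carried over from the staging suggestion above:

```sql
-- Keep only the first row per custID from the staged file,
-- then skip IDs that already exist in the live table
;WITH Ranked AS
(
    SELECT custID, custName, address,
           ROW_NUMBER() OVER (PARTITION BY custID ORDER BY custID) AS rn
    FROM dbo.Customer_Staging
)
INSERT INTO dbo.Customer (custID, custName, address)
SELECT r.custID, r.custName, r.address
FROM Ranked AS r
WHERE r.rn = 1
  AND NOT EXISTS (SELECT 1 FROM dbo.Customer AS c WHERE c.custID = r.custID);
```

ROW_NUMBER() handles duplicates within a single file; the NOT EXISTS clause handles the case of receiving two files with the same data.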