wolfenstein4u
Starting Member
6 Posts |
Posted - 2008-10-24 : 09:38:34
Hi, I have a requirement to import data from Informix 2000 into SQL Server 2005. First, I have read-only rights on the Informix DB, so I can only fetch data from it. The intention was to set up replication between Informix and SQL Server 2005, but that is not available to us. After searching, the only approach I found was SSIS. I have developed a package that:

1. Fetches the current day's data from Informix 2000 and stores it in XML files.
2. Deletes the current day's data from SQL Server.
3. Bulk inserts the data from the XML file into SQL Server 2005.

This works fine, but it has some issues:

1. Because of the repetitive DELETEs, the log is full most of the time, since the job runs every 5 minutes.
2. While the process is running, users cannot view the data, because it deletes the data first and then inserts it.
3. We delete the data because the source tables have no primary keys, so there is no way to tell which rows were updated or inserted; the only option left is to refresh the data completely.

The major drawback is that while this job runs, it blocks users from accessing the current day's data. I need some useful ideas and suggestions on how to optimize this process, or, if there is another workaround that achieves the same functionality with better results, please share it. I really need some professional guidance. Thanks!

Regards,
Mohsin Shahzad
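The delete-and-reload steps described above could be sketched in T-SQL roughly as follows. The table name, `LoadDate` column, and file paths are assumptions, not from the post, and SQL Server 2005 has no DATE type, so the day filter uses a half-open datetime range:

```sql
-- A minimal sketch of steps 2 and 3 above, run as one job step.
-- dbo.InformixData, LoadDate, and the file paths are all assumptions.
DECLARE @dayStart datetime, @dayEnd datetime;
SET @dayStart = CONVERT(datetime, CONVERT(varchar(8), GETDATE(), 112)); -- midnight today
SET @dayEnd   = DATEADD(day, 1, @dayStart);

BEGIN TRANSACTION;

-- Step 2: delete the current day's rows.
DELETE FROM dbo.InformixData
WHERE  LoadDate >= @dayStart
AND    LoadDate <  @dayEnd;

-- Step 3: reload the exported data. BULK INSERT expects row-oriented data,
-- so this assumes the SSIS package also writes a flat-file copy; shredding
-- the XML via OPENROWSET(BULK ... SINGLE_BLOB) would be the alternative.
BULK INSERT dbo.InformixData
FROM 'C:\exports\current_day.dat'
WITH (FORMATFILE = 'C:\exports\current_day.fmt');

COMMIT TRANSACTION;
```

Wrapping the delete and reload in one transaction at least prevents other sessions from seeing a half-empty table mid-refresh, though under the default READ COMMITTED isolation level readers will block until the commit, which matches the behaviour the post describes.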
wolfenstein4u
Starting Member
6 Posts |
Posted - 2008-10-24 : 15:35:11
Got 8 reads and 0 replies yet.

Regards,
Mohsin Shahzad
wolfenstein4u
Starting Member
6 Posts |
Posted - 2008-10-27 : 08:59:23
I'm really surprised that I still haven't received any response from this forum. Is there anyone who can give me a professional suggestion?

Regards,
Mohsin Shahzad
sodeep
Master Smack Fu Yak Hacker
7174 Posts |
Posted - 2008-10-27 : 09:31:18
Why is the job running every 5 minutes? You are causing locking/blocking issues. Use an update/insert technique instead. When you delete a large number of records, make sure you delete in small batches so your log won't grow, or switch to the simple recovery model if you don't care about point-in-time recovery. The best approach is to load the data into a staging table; from there you can insert/update into the destination table.
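The batched delete and staging-table load that sodeep suggests could look roughly like this. All object names, file paths, and the 5000-row batch size are assumptions for illustration:

```sql
-- Batched delete: each batch is its own short transaction, so the log can be
-- reused (simple recovery) or backed up (full recovery) between batches.
DECLARE @dayStart datetime, @dayEnd datetime;
SET @dayStart = CONVERT(datetime, CONVERT(varchar(8), GETDATE(), 112));
SET @dayEnd   = DATEADD(day, 1, @dayStart);

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.InformixData  -- DELETE TOP (n) is valid in 2005
    WHERE  LoadDate >= @dayStart AND LoadDate < @dayEnd;
    IF @@ROWCOUNT = 0 BREAK;
END

-- Staging-table load: bulk insert into staging first, then move the rows
-- across in one short transaction, so the destination table is only
-- unavailable to readers for the duration of that final insert.
TRUNCATE TABLE staging.InformixData;

BULK INSERT staging.InformixData
FROM 'C:\exports\current_day.dat'
WITH (FORMATFILE = 'C:\exports\current_day.fmt');

BEGIN TRANSACTION;
INSERT INTO dbo.InformixData
SELECT * FROM staging.InformixData;
COMMIT TRANSACTION;
```

The point of the staging table is that the slow part of the refresh (the bulk load) happens against a table nobody queries; the visible table is touched only in the final, short insert transaction.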
wolfenstein4u
Starting Member
6 Posts |
Posted - 2008-10-27 : 14:32:45
We don't have unique key information for the source tables, so we don't know which records should be updated. That's why I have to delete the data first and then insert the new data again, i.e. refresh it. Can you please explain the staging table strategy further?

Regards,
Mohsin Shahzad