Thursday, March 29, 2012

DTS re-uses old text file input

Hi All,

A simple DTS job I have is giving me fits. It is a straight Copy Column job from a pipe-delimited text file into a table. The input file sits on a mapped drive pointing to a shared filesystem on a Sun Solaris box.

The typical scenario: I run the DTS job to load 8000 rows from the input text file. The job succeeds.

A week later, the text file is replaced with 9000 new rows. I run the DTS job with no changes, and it loads the 8000 old rows from last week.

I reboot my Windows XP PC and run DTS again. It now loads the 9000 new rows.

I tried pointing the connection at the UNC path instead of the mapped drive, to no avail.
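The next thing I plan to try is dropping and re-creating the mapping right before the job runs, on the theory that whatever is stale lives in the redirector's connection to the share (just a guess on my part):

    rem drop the stale mapping, then re-create it before the DTS run
    net use G: /delete
    net use G: \\sun001\data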

Is it buffering the old file somewhere? I need help.
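One check I can run next week when the new file arrives: compare the size and last-modified time the mapped drive reports against what ls -l shows on the Solaris side. If the mapped drive still reports last week's numbers, the problem is below DTS. A rough sketch of the client-side half (Python, purely for illustration; the path is the real one, the rest is my assumption):

    import os, time

    # Ask the redirector what it currently believes about the input file.
    # If these numbers match last week's file while the Solaris box shows
    # the new one, the staleness is in the SMB layer, not in DTS.
    st = os.stat(r"G:\weekly_updates.txt")
    print("size :", st.st_size, "bytes")
    print("mtime:", time.ctime(st.st_mtime))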

current environment:
SQL Server 2000 with all the latest SPs and patches
Windows 2000 Server with all the latest SPs and patches
Drive 'G' mapped to a shared filesystem on Sun Solaris via Samba

I don't think DTS uses that kind of cache and reads old settings. Ensure the source file path is defined correctly, or on the next execution it may be reading old settings.

Thanks Satya,

I'm not sure what you mean by "old settings." Here's what I do know.

Drive 'G' is mapped to \\sun001\data

The input file name is weekly_updates.txt and does not change.

My DTS Connection 1 properties use G:\weekly_updates.txt in the File Name box.
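One workaround I'm considering, for what it's worth: add a step ahead of the load that copies the file to a local staging path and point the DTS connection at the copy. I honestly don't know whether a plain copy would dodge the cache or just copy the same stale bytes, but it would at least make the behavior easier to observe. A sketch (Python for illustration; C:\staging is a made-up local path):

    import os, shutil

    # Hypothetical local staging directory; the source path is the real one.
    os.makedirs(r"C:\staging", exist_ok=True)
    # Pull the weekly file off the Samba share before the load runs.
    shutil.copy2(r"G:\weekly_updates.txt", r"C:\staging\weekly_updates.txt")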

It always works correctly after a reboot, and more specifically, from any PC, not just mine. Any ideas?

Might it be somehow related to Enterprise Manager? I wonder if there is a way to automatically force everything in EM to refresh. I dislike having to submit a job and then keep hitting Refresh to see when it completes. I'm wondering if the "old" information stored in EM is related to my DTS issue?
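Another angle I want to ask our Unix admin about is the Samba side. If opportunistic locking is letting the Windows client cache the file, turning oplocks off for that share might at least rule it out. My understanding of what the share section in smb.conf could look like (the share name and path here are my guesses; only the two oplock lines matter):

    [data]
        # hypothetical path; the share name should match \\sun001\data
        path = /export/data
        # stop granting opportunistic locks so clients can't cache the file
        oplocks = no
        level2 oplocks = no

From what I've read, disabling oplocks forces clients to go back to the server for reads instead of trusting their local cache, at the cost of some performance.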
