Friday, February 17, 2012

DTS import works directly, fails when scheduled

I'm new to DTS packages, but managed to create one that successfully empties a table in SQL Server, then imports data from a FoxPro file on another server into it. It runs fine if I execute it from DTS, but fails if I schedule it to run in SQL Server Agent (using the "Schedule Package" option in DTS). I think the relevant portion of the error returned when the job fails is:

Error string: [Microsoft][ODBC Visual FoxPro Driver]File 'hrpersnl.dbf' does not exist.

The file does exist. I also tried to execute it from a stored procedure, but got a similar error. Any thoughts on why it runs one way but not the other?

TIA
Where do you keep the dbf file? Are you sure the connections are OK? I suspect that when the scheduled task runs, your other server is somehow down. That may be the cause. Please check that...|||When you run a DTS package directly (from the console), the job runs under whatever login/authorization you use to connect to the SQL Server. This is typically your Windows login.

When you schedule a DTS package to run as a SQL Agent Job, you can choose what credentials the job will use by default (i.e. the job owner), and also for each job step (on the step's "Advanced" tab, near the Ok button). The default is almost never your own Windows Login.

The difference in the credentials used often leads to problems with permissions.
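
If you'd rather check this outside of Enterprise Manager, the msdb system procedures will show you the same thing from Query Analyzer. A quick sketch; 'Import HR Data' is just a stand-in for whatever your job is called:

    -- Show the job's owner (among other properties)
    EXEC msdb.dbo.sp_help_job @job_name = N'Import HR Data'

    -- Show the individual steps and how each one is set to run
    EXEC msdb.dbo.sp_help_jobstep @job_name = N'Import HR Data'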

-PatP|||rudra: I had already checked that. The server/file are there.

Pat: I wasn't thinking it was a permissions problem, because the job does start. It completes the first step of emptying the table, then errors trying to find the dbf file. Since that's found through ODBC, my assumption is that it wouldn't matter how the DTS package is run.

In Enterprise Manager, I can execute the package in the DTS section and it will run fine. If I execute the job from SQL Server Agent, it fails with that error about the dbf file.|||Just to humor me, could you change the job owner to be whatever your credentials are when you run the job manually, just to see what happens?
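
If it's quicker than clicking through the GUI, the owner can be switched from Query Analyzer as well. A sketch, assuming the same hypothetical job name; substitute your actual Windows login:

    -- Make the job run as your own Windows login
    EXEC msdb.dbo.sp_update_job
        @job_name = N'Import HR Data',
        @owner_login_name = N'YourDomain\YourLogin'  -- placeholder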

-PatP|||I apologize Pat, I wasn't trying to imply you were wrong. You know a lot more about this than I do. I'm probably not understanding exactly where you mean. Maybe it's a Friday afternoon brain cramp. Here's a picture of where I'm looking:|||Sorry, you're looking in the DTS designer. I meant for you to find the job, which is what is actually scheduled, and change the job's owner to be you (in other words, the job and the DTS package that it uses will then run as though you had started it manually).

-PatP|||Changing the job owner will not change the outcome.

Are you using a UNC path to point to the file location, or going through a mapped drive?|||I've gotta say, I'm feeling pretty dense right now. On the General tab of the Job's properties in SQL Server Agent (still in Enterprise Manager), the owner was listed as:

DomainName\Administrator

which is how I'm logged into the server. I tried sa, and it still doesn't work. I appreciate your persistence, and by now you must be thinking "how stupid is this guy?".

rdjabarov: I'm using a mapped drive, which is the only way I've ever been able to get the FoxPro ODBC to work.|||Very good call rdjabarov! I hadn't considered the use of a mapped drive, since I can't remember the last time I used one for a scheduled task.

The problem comes from the fact that SQL Server Agent runs as a service, outside of any interactive user session. A mapped drive letter is created by Windows Explorer as part of the user's session, so the letter is usable by any process launched by user action (such as running a DTS package from the console), but it is not available to a process started by SQL Agent.
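
You can see this for yourself. Assuming xp_cmdshell is available to you, compare these two from Query Analyzer; they run in the SQL Server service context, which is non-interactive like the Agent's (the drive letter and share name here are made up):

    -- The mapped letter doesn't exist in a service context...
    EXEC master.dbo.xp_cmdshell 'dir F:\'

    -- ...but the UNC to the same share works, provided the service account has rights
    EXEC master.dbo.xp_cmdshell 'dir \\otherserver\hrdata'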

In order to access the DBF file, it must exist in a place that the SQL Agent process can find it. The obvious choice would be to find a way for the DTS process to connect the ODBC driver via a UNC to the DBF file. The next would be to copy the DBF file to the machine running the DTS package so it can use a local drive (say via a command process).
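
If you go the copy route, an extra step ahead of the import can stage the file. A sketch, with made-up share and folder names (this could just as easily be a CmdExec job step):

    -- Pull a fresh copy of the DBF to a local folder before the import runs
    EXEC master.dbo.xp_cmdshell 'xcopy \\otherserver\hrdata\hrpersnl.dbf D:\ImportStage\ /Y'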

It is usually easy to make the ODBC connection to the DBF file, but there can be complications. You'll need to make sure that the job step that accesses the DBF file on the remote machine runs as a Windows Login that has access to the UNC.
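
For what it's worth, the FoxPro driver doesn't need a drive letter at all; its SourceDB setting can be the UNC of the folder holding the tables. Here's a DSN-less test you can run from Query Analyzer, with an assumed share name (a DSN for the DTS connection would carry the same Driver/SourceType/SourceDB values):

    -- Query the DBF over a UNC path through the Visual FoxPro ODBC driver
    SELECT *
    FROM OPENROWSET('MSDASQL',
        'Driver={Microsoft Visual FoxPro Driver};SourceType=DBF;SourceDB=\\otherserver\hrdata;Exclusive=No',
        'SELECT * FROM hrpersnl')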

-PatP|||I think you've nailed it. I created a quick test, and it worked. I'll try it on the real package on Monday.

Thank you both very much!

:beer:|||I did a little testing yesterday and then let the scheduled job run overnight last night. It ran fine, and I can run the package from a stored procedure now, which I couldn't do before.

Thank you both very much for your time and expertise.|||"I think you've nailed it. I created a quick test, and it worked. I'll try it on the real package on Monday." Ok, just being dreadfully curious at this point since you're already happy, but what exactly did you try, and how did that work in production?

-PatP|||Not sure what you're after in addition to post 12. I was able to create an ODBC connection with a UNC path. I did some testing with a test package on a test database Friday, and it worked. I modified the connection in the production package to use the new DSN. I let the scheduled job run last night, and it was successful. I was also able to execute the package from a stored procedure, which had failed before (same error).|||That's exactly what I was looking for...

I had presumed that you were using a UNC from the beginning, so I never thought to question that. I thought the problem you were having was due to the job starting as either a SQL login or a Windows login that didn't have permission to read from that UNC.

What I wanted (and you gave in post 14) was an explanation of what actually worked for you.

Thanks!

-PatP|||It's funny, because I almost always use UNC paths. For some reason years ago when we bought this program (a canned Payroll/HR package), the vendor set everything up with mapped drives and we stayed with it ("if it aint broke, don't fix it" type thinking).

I link to their data all the time, but when this problem cropped up, I couldn't figure out why it would work one way and not the other. Your explanation in post 10 was pure gold, and I understand now. It's nice to know the "why" in addition to the "what".

Big thanks to you and rdjabarov for digging me out of the hole!
