Thursday, March 29, 2012
DTS Scheduled job always fails
I created and saved a DTS package to "SQL Server". When I "Execute Package" under the "Data Transformation -> Local Packages" folder in Enterprise Mgr, it executes perfectly.
However, when I Schedule it, the scheduled job Fails. The View Job History shows "The job failed. The Job was invoked by User sa. The last step to run was step 1 (ImportTest3)."
Could someone please help me with this? I don't understand why the manual execution works, but the scheduled job doesn't. Thanks for the help!

Did you ever find a solution to this problem? I am having the same problem.
Originally posted by multiplex77
Hi,
I created and saved a DTS package to "SQL Server". When I "Execute Package" under the "Data Transformation -> Local Packages" folder in Enterprise Mgr, it executes perfectly.
However, when I Schedule it, the scheduled job Fails. The View Job History shows "The job failed. The Job was invoked by User sa. The last step to run was step 1 (ImportTest3)."
Could someone please help me with this? I don't understand why the manual execution works, but the scheduled job doesn't. Thanks for the help!

You execute under different security contexts when executing a DTS package manually versus scheduling it as a job. What is your DTS script doing?

I set up a number of packages to process selected cubes in Analysis Services. The first job processes all dimensions and even that one fails.

What message is returned for the failure?

ptcooper - In your case, you can execute the package and it succeeds - but when you schedule it as a job, it fails - is this true?

True, the problem is exactly as described by multiplex77. The only jobs that fail are the ones that are processing OLAP cubes/dimensions.
Originally posted by rnealejr
ptcooper - In your case, you can execute the package and it succeeds - but when you schedule it as a job, it fails - is this true?

Who is the owner of the job?

The owner is sa and the password was specified when the DTS packages were saved. SQL Server Agent is logged on as administrator.
Originally posted by rnealejr
Who is the owner of the job?

Log the package and post the message(s) returned in the log.

I don't see a way to log the package. I've never had to do that.
Originally posted by rnealejr
Log the package and post the message(s) returned in the log.

If the SQL Agent is running as the local administrator, is the local administrator a member of the local OLAP Administrators group? Not sure if it defaults to this.

The SQLSERVERAGENT service does not log on as the local administrator. It logs on as a system-wide administrator. We have been looking for the OLAP Administrators group in the 'security' tab and the 'member of' tab for the administrator, but don't find it listed in any of the drop-down lists. We have all the latest service packs installed.

The OLAP Administrators group is a local group created by the Analysis Services install. Anyone who is not a member of this group cannot use the SQL Server tools to manage, modify, or otherwise peek at the cubes on that machine. Remember this only applies to the MMC tool. The OLAP Administrators group will not show up in the domain-wide dropdown lists, since it is local only to one machine. Does that help?

Right-click "My Computer" -> Manage -> Local Users and Groups -> Groups -> OLAP Administrators, and add the account that is used to run SQL Server Agent.

UPDATE: We discovered that the OLAP Administrators group had somehow been deleted from Active Directory. We tried re-installing service pack 3 for Analysis Services, recreating the OLAP Administrators group and setting the members, properties etc. None of this worked. We just re-installed Analysis Services and service pack 3 on the server and now the jobs will run from the workstation in EM.

The OLAP Administrators group is not in AD. Never was. It is local to the Analysis Services machine.

Our Analysis Services server is a domain controller, and as such cannot use the Local Users and Groups snap-in. You get directed to Active Directory Users and Computers when you try opening it. (Also ref. MS KB article 829738, 'More Information'.)

Once you installed it as a domain controller in AD, the local users/groups do not transfer - which is why the OLAP Administrators group never appeared. I am a little curious that other problems did not appear before this one.
Thanks for posting the follow-up with the solution.
DTS Rounding problem
Data transformation rounds or truncates decimals!
I have an ODBC source in which there is a table with float numbers (the ODBC
driver publishes them as float).
If I build a query from Access or from Excel, or with Query Analyzer, I see all
decimal places, but when I try to insert data using DTS, the float numbers are
converted to their integer values.
A "select * from table", where table is the ODBC table, gives integer values when
run from DTS to transform data from ODBC to a MS SQL Server table, and
gives float values from Access or other tools.
Where can I fix my problem?
Thanks.
Giorgio

Put convert(float,field) around your fields in the insert. Example:
declare @i int
select @i = 1
select convert(float,@i), @i
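For example, applied to the transform's source query (the table and column names below are made up for illustration):

-- Hypothetical source query for the DTS transform task:
-- cast each affected column to float so the data pump keeps the decimal places
select id,
       convert(float, amount) as amount,
       convert(float, discount) as discount
from odbc_table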
http://sqlservercode.blogspot.com/
Sunday, March 11, 2012
DTS package execution time
Hi - I currently have a DTS package that takes raw data from a SQL
table and inserts records into several tables via a custom ActiveX
transformation. The package uses logic to determine which table to
insert into and then calls DTSlookups to perform the inserts. The #
of records I'm working with is small (3,000-10,000). If I perform a
"test, the rows are actually inserted, and rather quickly. If I
execute the package through normal methods, the execution takes several
hours and often times out?
Any ideas?

Running a DTS package within Enterprise Manager executes it locally on
the machine where EM is running. Running it using a schedule runs it
on the server where the agent service is running. Different machines
have different resources and are under different loads, and have
different distances from the data. Look for data traveling over the
network vs remaining local.
As you describe it, both the source data and the final destination of
the data are in SQL Server. Are they on the same SQL Server? If so,
did you consider using stored procedures? Keeping all the work
within SQL Server itself has some performance advantages.
Roy Harvey
Beacon Falls, CT
On 27 Jul 2006 10:37:26 -0700, clawdaddy@.gmail.com wrote:
>Hi - I currently have a DTS package that takes raw data from a SQL
>table and inserts records into several tables via a custom ActiveX
>transformation. The package uses logic to determine which table to
>insert into and then calls DTSlookups to perform the inserts. The #
>of records I'm working with is small (3,000-10,000). If I perform a
>"test, the rows are actually inserted, and rather quickly. If I
>execute the package through normal methods, the execution takes several
>hours and often times out?
>Any ideas?

clawdaddy@.gmail.com wrote:
> Hi - I currently have a DTS package that takes raw data from a SQL
> table and inserts records into several tables via a custom ActiveX
> transformation. The package uses logic to determine which table to
> insert into and then calls DTSlookups to perform the inserts. The #
> of records I'm working with is small (3,000-10,000). If I perform a
> "test, the rows are actually inserted, and rather quickly. If I
> execute the package through normal methods, the execution takes several
> hours and often times out?
> Any ideas?
>
There's really not enough info to come up with a cause, but if this is a
SQL-to-SQL process (reading from SQL/writing to SQL), I'd question why
you used DTS to do this. I think you'd get better performance, not to
mention easier debugging, by doing this as a series of INSERT INTO/SELECT
statements.
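For what it's worth, a minimal sketch of that INSERT INTO/SELECT approach, with made-up table and column names, where the routing logic moves into the WHERE clauses instead of an ActiveX transformation:

-- Hypothetical tables: dbo.RawData is the source, DestinationA/B are the targets
INSERT INTO dbo.DestinationA (CustomerId, Amount)
SELECT CustomerId, Amount
FROM dbo.RawData
WHERE RecordType = 'A'    -- the routing rule the ActiveX script used to apply

INSERT INTO dbo.DestinationB (CustomerId, Amount)
SELECT CustomerId, Amount
FROM dbo.RawData
WHERE RecordType = 'B'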
Tracy McKibben
MCDBA
http://www.realsqlguy.com
DTS Package Error in the Transactional Replication
We have a DTS package doing some transformation in Transactional Replication.
Now, I want to add a new table as part of the replication without shutting
down replication. I can add it through sp_addarticle and sp_addsubscription.
However, the DTS package can't be modified, as one of the components in the
package is not available in the DTS designer.
This is the package that SQL Server creates automatically for you when you
specify DTS as a medium for data transformation. Since I don't have this
table's data driven task in DTS, I am getting the following error in the
Distribution DB.
Cannot find the Data Driven Query task in the DTS package that matches
command.
Thank you very much.
Create a new publication for this article and add it.
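For example, something along these lines using the replication stored procedures, run at the publisher. Every name below is a placeholder, and the extra setup steps (snapshot agent, push/pull subscription agents) are omitted:

-- Create a separate publication for the new table so the existing
-- DTS-transformed publication is left untouched (placeholder names)
EXEC sp_addpublication @publication = N'NewTablePub', @status = N'active'

EXEC sp_addarticle @publication = N'NewTablePub', @article = N'NewTable',
     @source_owner = N'dbo', @source_object = N'NewTable'

EXEC sp_addsubscription @publication = N'NewTablePub',
     @subscriber = N'SubscriberServer', @destination_db = N'SubscriberDB'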
"Mark" <Mark@.discussions.microsoft.com> wrote in message
news:F9F5749E-7B78-4398-A23F-DEE4FB9E74C2@.microsoft.com...
> We have a DTS package doing some transformation in Trasactional
> Replication.
> Now, I want to add a new table as a part of the replication without
> shutting
> down replication. I can add it through sp_addaarticle and
> sp_addsubscription.
> However the DTS can't be modified as one of the component in DTS is not
> available in DTS designer.
> This is the package that SQL Server creates automatically for you when you
> specify DTS as a media for data transformation. Since, I dont have this
> table's data driven task in DTS, I am getting following error in
> Distribution
> DB.
> Cannot find the Data Driven Query task in the DTS package that matches
> command.
> Thank you very much.
>
DTS Package Error
I am getting an error:
Microsoft Data Transformation Services (DTS) Data Pump
The number of failing rows exceeds the maximum specified.
TransformCopy 'DTS Transformation_95' Conversion error:
Destination overflowed on column pair 1 (source column 'Col95'(DBTYPE_STR), destination column 'WGT01' (DBTYPE_NUMERIC)).
Please help, I am new to DTS packages.

Check the collation settings for this table and the target table and see if there is any difference. Check the job steps in the job history.
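The error itself says a string source column ('Col95') is overflowing a numeric destination column ('WGT01'). If the source is a SQL table, a query along these lines can help find the offending values before the pump runs (the table name is made up, and the length check needs adjusting to match WGT01's actual precision):

-- Hypothetical source table; Col95 and WGT01 come from the error message
SELECT Col95, LEN(Col95) AS value_length
FROM SourceTable
WHERE ISNUMERIC(Col95) = 0     -- not convertible to a number at all
   OR LEN(Col95) > 10          -- longer than the destination column allows
ORDER BY LEN(Col95) DESC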
Sunday, February 26, 2012
DTS migration from ss2000 --> ss2005
Right Click on Data Transformation Services and choose Migration Wizard
"Pablo Cognito" <pcognito@.gmail.com> wrote in message
news:uXs7OOJDHHA.3660@.TK2MSFTNGP02.phx.gbl...
> hi,
> I'm planning to move my existing DTS that I've made in ss2000 to ss2005.
> At this moment I'm not planning to migrate existing DTS to SSIS, I just
> want to use backward compatibility for now.
> What's the "best practice" technique for moving ss2000 DTS --> ss2005?
> Thanks
Sorry - firstly, save the DTS package as a structured storage file;
secondly, right-click on Data Transformation Services and choose the Import
Package File option, then choose the Import option.
"Uri Dimant" <urid@.iscar.co.il> wrote in message
news:OMMOegJDHHA.3492@.TK2MSFTNGP06.phx.gbl...
> Pablo
> Right Click on Data Transformation Services and choose Migration Wizard
>
> "Pablo Cognito" <pcognito@.gmail.com> wrote in message
> news:uXs7OOJDHHA.3660@.TK2MSFTNGP02.phx.gbl...
>
DTS load to multiple tables
I have SQL Server 2003 Standard and am attempting to use DTS for a data load/transformation, and I'm not sure if I am using the right tool for the job. I have a somewhat denormalized Access database that has to be loaded into a normalized SQL Server database. Values from one row in any of the source tables generally need to be separated and inserted into several destination (SQL Server) tables. There are no unique ids in the source data since it is coming from a 3rd party, and the tables are not related to others.

I've created a DTS Package and have the beginnings of several Transform Data Tasks. Each destination table has an Identity id column, which is calculated automatically. The roadblock I've run into is that I can't figure out how to take each newly created ID and insert it into a new row in another table as a foreign key. Basically, I have to move data from a single input row to new rows in multiple destination tables and create ids (PK, FK) that tie these tables together. The Transform Data Task only allows me to reference one source and one destination, not multiple destinations. I hope this makes sense. Any suggestions would be appreciated.

Here's a simple example that may help illustrate the problem. My database is much more complex.

Input table is called Parcels and each row has three columns: Address, Owner, and Legal_Description.

Output database has two tables that will receive this data: the Parcel table will have the Address, Owner and Parcel_ID (auto calculated); the Legal table will have Legal_Description, Legal_Desc_ID (auto calculated), and Parcel_ID.

When the row is inserted into the Parcel table, the newly auto calculated Parcel_ID has to be captured. Next, create a row in the Legal table and insert the Parcel_ID so the two rows are related.

How can I do this through DTS? Thanks for any suggestions, code snippets, or references.
Due to the complexity of this task, I would suggest maybe doing this in a .NET winforms application. Set up ODBC connections to the two databases. Now write some queries in the Access database to divide the data up appropriately. Next, write insert procedures in the SQL Server database which return an output parameter which is the id field. To get this, in the insert proc, use the @@IDENTITY or SCOPE_IDENTITY() calls to get the id value of the inserted row. Capture this in the .NET application, and pass it in to the insert proc for the related table.

Alternatively, keep a cache of the data mappings, maybe in a temporary table or a dataset, and do the inserts as bulk inserts. Then run an update procedure which sets the foreign key based on the data mappings in the original database. So, for example, based on the Legal_Description in the Legal table, update the Parcel_ID in the Legal table using the owner and address values that used to share a row with the Legal_Description. Either approach should work.

Let me know if you need more guidance here. Some of the SQL database engine people, or the SSIS people, may be able to point you to a DTS solution that can do this. Alternatively, you could create a DTS solution that inserts into one table, inserts into the other, then calls the update procedure as described above. There is a separate SSIS forum (the new DTS); you may be better off posting the question there.
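For example, a minimal sketch of the insert-procedure idea using the Parcel/Legal tables from the question (column types and sizes are guesses):

-- Insert the parent row and hand back the generated Parcel_ID
CREATE PROCEDURE dbo.InsertParcel
    @Address varchar(100),
    @Owner varchar(100),
    @ParcelId int OUTPUT
AS
BEGIN
    INSERT INTO dbo.Parcel (Address, Owner)
    VALUES (@Address, @Owner)

    SET @ParcelId = SCOPE_IDENTITY()   -- identity value generated by the insert above
END
GO

-- Caller: capture the new id, then insert the related child row
DECLARE @ParcelId int
EXEC dbo.InsertParcel @Address = '123 Main St', @Owner = 'Smith', @ParcelId = @ParcelId OUTPUT

INSERT INTO dbo.Legal (Legal_Description, Parcel_ID)
VALUES ('Lot 1, Block 2', @ParcelId)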
HTH
For more T-SQL tips, check out my blog:
Friday, February 24, 2012
DTS Issues
I am using Data Transformation Services to transform my data from one
data model to another.
My issue is that executing some of the steps takes a very long time (hours). I
am doing some ActiveX processing, but nothing too funky; I end up having to
stop the server, with no errors logged.
Any clues or ideas?
Thanks
Jack

How much data are you transferring?
Is it a straight copy of data from one place to another?
You may wish to describe each step in your DTS process, including any SQL and ActiveX processing that is used - otherwise we can't give you much help!
It may also be worth telling us what you want your DTS to actually be doing, as there may be more efficient ways to do this in the first place :)
Kind regards,
George

There is about 200 MB of data in 40 tables; all the data is being transformed from one data model to another.
The tasks balk on 4 tables:
Table a - 90,000+ records, lots of data (this would be 30% of the db), 3 tables joining into 1
Table b - 1,228 records, nothing much, 3 tables joining into 1
Table c - 40,000+ records, nothing much as well, 2 tables joining into 1
Table d - 25,000+ records, nothing much, this table is the same as table b, 3 tables merging into 1
Although the one table with the bulk of the data (40%, 225,000+ records) transforms properly with no issues.
I am testing for null on every field that could be null.
The data is being transformed from MSSQL to MSSQL.
All the transformations test correctly and parse correctly.
I have created separate packages for the remaining tables, but still with the same problems.
Let me know if you need other details
Jack

What version of SQL Server?
You need to identify which steps are taking so much time, via
package logging, or logging from your ActiveX, or profiling.
Usually this is caused by SQL inefficiency, locking, and/or transactional recording. Are you using NOLOCK hints where possible?
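For reference, a NOLOCK hint goes on each table reference in the source query, e.g. (the table name is arbitrary):

SELECT col1, col2
FROM dbo.source_table WITH (NOLOCK)   -- reads without taking shared locks; may see uncommitted data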
Your db logging level can slow things down too - Recovery Model 'full' vs. 'simple' adds a lot of weight. You can use bcp to avoid this.

I am using MSSQL 2000 SP3.
I am using the package logging and it shows nothing other than "started at ???"; it never completes.
I will add the ActiveX logging. Not sure what you mean by profiling.
Can I transform the data with bcp?
Thanks
Jack

Right - the DTS logging doesn't seem to checkpoint until completion, so it's often unreliable.
So try this (it's probably easier than ActiveX logging): if you're still trying to identify the step, then log to the database - i.e. create an Audit table, and before each step log the current time and step # using a straight insert/update. This way you can control the logging.
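Something along these lines, for example (the names are arbitrary):

-- One row per step so you can see exactly where the package stalls
CREATE TABLE dbo.DTS_Audit (
    StepName varchar(100) NOT NULL,
    LoggedAt datetime NOT NULL DEFAULT (getdate())
)

-- Put an Execute SQL task like this in front of each step of the package
INSERT INTO dbo.DTS_Audit (StepName) VALUES ('Before table_a transform')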
It also sounds like you haven't looked at the running processes while your DTS is running - see Ent Mgr under Management -> Current Activity -> Locks/Object and Locks/Process ID. That will probably show you some locking you didn't expect and tell you exactly what your problem is.
You can also use the SQL Profiler tool to watch batch-by-batch what the db is doing, when all else fails; but the previous steps should be enough.
You're into the hard part - getting it to work well! Good luck -

I will give those a try. The ActiveX logging seems to be reliable; I am testing on the first step and the last step, and the log file says it's progressing.
I may just be a little impatient - I've been working on this for 5 straight days and I'm getting tired, 90% of the way through. I appreciate the tips.
Thanks
Jack

When you check for null, what do you do - exclude it from your export, or put a dummy value in (e.g. "n/a")?
I bet this will be causing performance issues either way if you are implementing the check using ActiveX.
The good news is that this can be easily (not to mention efficiently) remedied in SQL. The bad news is that it may require rewriting some of your DTS.

After all was said and done, it was a lookup query that was bogging down the package.
Put the lookup into its own package to run afterwards.
Everything works great.
Thanks for the assistance
Jack

Right before I am due to deploy the application, the DTS is crapping out.
I have some rogue characters that I need to remove before the data gets transformed.
The problem is I have lost the field.
The field is DTSSource("l_text") and it is empty when I execute. I am doing the ActiveX file logging and the log file is empty as well. But when I test the transformation, everything works great.
Any assistance would be appreciated.
Jack
Tuesday, February 14, 2012
DTS help needed!
Question: Is there a way to change the default option to "Delete rows in destination table", so I wouldn't have to go through the transformation of dozens of tables every time I move databases around?
P.S. I know that I can save a DTS package with my settings and use it later. The problem is that every time it might be a different database, or the objects might change. So it is not a solution for me.
Thanks

Precede the step with either a DELETE or TRUNCATE script that you can easily generate yourself.
create view dbo.vw_DELETE_ALL_TABLES as
select 'DELETE ' + db_name() + '.dbo.[' + name + '] ' + char(13) + char(10) + 'go'
from dbo.sysobjects where objectproperty(id, 'IsTable')=1
and objectproperty(id, 'IsMSShipped')=0
Below is the OS step that you will need to create:
bcp <your_db>.dbo.vw_DELETE_ALL_TABLES out DELETE_ALL_TABLES.SQL -S <your_server> -T -c
osql -S <your_server> -E -i DELETE_ALL_TABLES.SQL

Thanks rdjabarov,
It might work if you are dealing with a whole database.
But what if a user asks you to refresh a development database with the production data except for a few tables? In this case you will need to modify your script, or ...?
That is why I was wondering if I could modify the default setting for DTS to "Delete rows in destination tables".
Thanks again

Well, then have an exception table and populate it with the tables that don't need to be refreshed. Then modify the view to return only records that are not found in the exception table. Easy, right? ;)

It might be easy if you don't have 230 databases to support.
In my situation, maintaining an "exceptions" table in each database could be an entire new project. So I guess I will do it the old way - open and click to transform each table.
Thank you for your help.

Well, what I would do is create a quick web app that will pass the parameters - such as what tables to transform and whether to delete the data or not - to the DTS package and then execute it.

It might be easy if you don't have 230 databases to support... I used to support 1500 databases, now I support over 3500. If you think you can't afford to invest 1.5 hours in designing the structure and the process that would take care of it... I rest my case ;)

1500? How many servers?
And that sounds ludicrous... DTS...
I would dump and RESTORE...

Calm down, guys.
You forgot my initial question: is there a way to make the default for one of the DTS options in the "Column mappings and transformations" window "Delete rows in destination table" instead of "Append..."? That's it.
I'm pretty sure there are a dozen ways to transfer databases/data between servers, as well as a dozen reasons why you might prefer one way or another.
Thanks a lot for your help