I have a package that has 12 data pump tasks all executing in parallel.
It is transferring raw data from an AS400 DW to a MSSQLSvr Staging area.
On completion, each pump task assigns values to a set of global variables and then passes them as parameters to a stored procedure (sproc) that inserts them into a table.
This works for 4 or 5 of the pump tasks, but the rest of the rows in the table are all the same, because the remaining pump tasks all execute before their sprocs run.
Is there a way to make sure that an entire set of job steps completes before another set starts, while still keeping the sets running in parallel?
I had wondered whether I could use the PumpComplete phase of each pump step to fire off the sproc, but I can't see how to execute the step from there.
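The symptom described above is a classic shared-state race, and can be reproduced with a minimal sketch. This is not DTS code: plain Python threads stand in for the parallel pump tasks, a shared dict stands in for the package's global variables, and all names are hypothetical. Every task overwrites the same "global" before the slower follow-up step reads it, so the late readers all see the last writer's value.

```python
import threading
import time

shared = {"value": None}  # stands in for the DTS global variables
rows = []                 # stands in for the table the sproc inserts into
lock = threading.Lock()

def pump_task(task_id):
    # Each "pump task" writes its result into the SAME shared global...
    with lock:
        shared["value"] = task_id
    # ...but the "sproc" step runs later, after other pumps have also written.
    time.sleep(0.05)
    with lock:
        rows.append(shared["value"])

threads = [threading.Thread(target=pump_task, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Likely prints the same task id five times: all writes happen before
# any read, so every row records the last writer's value.
print(rows)
```

The fix is the same in any environment: either scope the variables per task instead of sharing them, or serialize each write/read pair so a pump's sproc runs before the next pump overwrites the globals.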
Any ideas would be much appreciated.

Darren Green suggested:
Why not take a simpler approach and only populate your progress list as
tasks start executing? You could drive this quite happily off events.
To determine the order of execution you would need to enumerate all steps, as
constraints are held by the task they go to, not the one they come from.
For each step, enumerate the PrecedenceConstraints collection to get
the PrecedenceConstraint objects. The StepName is the preceding step, so
if a step has no PrecedenceConstraints it is the start step. I'm not sure
this is guaranteed to be 100% accurate either, as in theory you can change
the basis and result to in effect be an "On Preceding Step Not Run", and
have a circular reference, but I suspect DTS itself would have the same
problem as you in that case, so it's probably not worth worrying about for
the start step, though it is perfectly valid elsewhere.
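The enumeration Darren describes can be sketched generically. This is not the actual DTS object model: each step is modeled as a name mapped to the list of preceding StepNames held in its PrecedenceConstraints collection, and all step names here are hypothetical. Start steps are those with an empty constraint list, and a topological sort recovers a valid execution order while flagging the circular-reference case he warns about.

```python
# Hypothetical package: each step maps to the StepNames that must
# complete before it (i.e. the constraints held BY that step).
steps = {
    "PumpTask1": [],                # no constraints -> a start step
    "PumpTask2": [],
    "InsertSproc1": ["PumpTask1"],  # runs after PumpTask1
    "InsertSproc2": ["PumpTask2"],
}

def start_steps(steps):
    """Steps whose PrecedenceConstraints collection is empty."""
    return [name for name, preceding in steps.items() if not preceding]

def execution_order(steps):
    """Topological sort over the constraints; raises on a circular reference."""
    order, done = [], set()
    remaining = dict(steps)
    while remaining:
        # A step is ready once all its preceding steps have completed.
        ready = [n for n, pre in remaining.items() if set(pre) <= done]
        if not ready:
            raise ValueError("circular precedence constraints")
        for n in sorted(ready):
            order.append(n)
            done.add(n)
            del remaining[n]
    return order

print(start_steps(steps))       # ['PumpTask1', 'PumpTask2']
print(execution_order(steps))   # pumps first, then their sprocs
```

In real DTS code the `steps` dict would be built by walking `Package.Steps` and each step's `PrecedenceConstraints` collection, reading `StepName` off each constraint, exactly as the reply describes.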