I'm trying to create an SSIS package to replace the DTS process we already have in place.
The DTS package copies the content of four tables from one server to another. I have created a simple SSIS package to do the same, but the process is a lot slower than DTS!
I ran the SSIS package with Ctrl+F5 and also from the command prompt, but it is still quite slow.
The SSIS package uses SMO to access the server, and both servers are running SQL Server 2005.
Thanks

|||

Use the "Table or view - fast load" option in your OLE DB Destinations.

|||

Sorry for my ignorance, but is this an SSIS property? If yes, where can I find it?
Thanks

|||

In the OLE DB Destination, it is a drop-down option titled "data access mode". Double-click the OLE DB Destination and it's on the main page.
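For context on why fast load matters: in the default data access mode the destination issues one INSERT per row, whereas fast load pushes rows through the bulk-load interface in batches. A rough T-SQL analogy only, using hypothetical tables:

    -- Rough analogy; the destination component actually does this through the OLE DB API.
    -- Default mode: one INSERT (and one round trip) per row.
    INSERT INTO dbo.Customers (Id, Name) VALUES (1, N'Alice');
    INSERT INTO dbo.Customers (Id, Name) VALUES (2, N'Bob');

    -- Fast load: rows go through the bulk-load interface in batches,
    -- closer in spirit to a single set-based statement:
    INSERT INTO dbo.Customers (Id, Name)
    SELECT Id, Name
    FROM dbo.Customers_Staging;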
|||

The problem is that I am using the Transfer SQL Server Objects task, which uses SMO by default, not OLE DB, unless there is another way to copy the content of a table.

P.S. The tables do not exist on the remote server; the Transfer SQL Server Objects task creates the tables as well as copying the content.
cheers
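As an aside, if a linked server exists between the two machines, a single SELECT ... INTO statement also creates the destination table and copies the rows, much like the Transfer SQL Server Objects task. A minimal sketch, assuming a hypothetical linked server named SourceServer and database SalesDb:

    -- Run on the destination server; SourceServer is a hypothetical linked server.
    -- SELECT ... INTO creates dbo.Customers here and copies the rows in one pass.
    -- Note: it copies column definitions only, not indexes, keys, or constraints.
    SELECT *
    INTO   dbo.Customers
    FROM   SourceServer.SalesDb.dbo.Customers;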
|||

Kolf wrote:

    The problem is that I am using the Transfer SQL Server Objects task, which uses SMO by default, not OLE DB, unless there is another way to copy the content of a table.

    P.S. The tables do not exist on the remote server; the Transfer SQL Server Objects task creates the tables as well as copying the content.

    cheers
Remember that SSIS is a data manipulation tool - not a schema manipulation tool. Hence, there isn't THAT much support for moving schema objects about.
If I were you I would create the tables using conventional methods (i.e. CREATE TABLE scripts run from an Execute SQL Task) and then use data flows to pump data between them. This will not be slower than DTS. It will be a lot more maintainable as well.
-Jamie
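A minimal sketch of the pattern Jamie describes, using a hypothetical dbo.Customers table: the script below would go in an Execute SQL Task pointed at the destination server, and a data flow (OLE DB Source to OLE DB Destination with fast load) would then move the rows.

    -- Hypothetical script for the Execute SQL Task, run against the destination server.
    -- The OBJECT_ID guard makes the package rerunnable.
    IF OBJECT_ID('dbo.Customers', 'U') IS NULL
    BEGIN
        CREATE TABLE dbo.Customers
        (
            Id   int           NOT NULL PRIMARY KEY,
            Name nvarchar(100) NOT NULL
        );
    END;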
|||

Thanks for your advice.

The problem is that the table schema changes each time, as the selected tables will be different. Is there a task that can extract the schema of the source table, so I can apply it to the destination server and then maybe use the data flow to push the data across?
Thanks again

|||

Data flows don't handle changing metadata, unless you are building them dynamically.

|||

Thanks, but even if I build the metadata for the tables in the destination in the control flow (which is not a problem), then I should be able to feed the data with the data flow. The problem is that I have different tables and they change each time, so I need to be able to write one generic data flow and change the parameters with a script each time it runs.

cheers
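One way to extract the source schema without the Transfer SQL Server Objects task is to read the column definitions from INFORMATION_SCHEMA.COLUMNS on the source server and build the CREATE TABLE statement from them. A simplified sketch, assuming a hypothetical dbo.Customers table (it ignores nullability, precision, and max-length types):

    -- Build a column list for dbo.Customers from the source catalog;
    -- the result can be wrapped in a CREATE TABLE and run on the destination.
    DECLARE @cols nvarchar(max);

    SELECT @cols = COALESCE(@cols + ', ', '')
                 + QUOTENAME(COLUMN_NAME) + ' ' + DATA_TYPE
                 + CASE WHEN CHARACTER_MAXIMUM_LENGTH > 0
                        THEN '(' + CAST(CHARACTER_MAXIMUM_LENGTH AS varchar(10)) + ')'
                        ELSE '' END
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo'
      AND TABLE_NAME   = 'Customers'
    ORDER BY ORDINAL_POSITION;

    SELECT N'CREATE TABLE dbo.Customers (' + @cols + N')' AS create_table_stmt;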
|||

Kolf wrote:

    Thanks, but even if I build the metadata for the tables in the destination in the control flow (which is not a problem), then I should be able to feed the data with the data flow. The problem is that I have different tables and they change each time, so I need to be able to write one generic data flow and change the parameters with a script each time it runs.

    cheers
As I think we have discussed on other threads - you can do this.
Apologies if I've missed any of your replies. The alerting functionality of these forums is broken (for me anyway).
-Jamie
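For completeness, one generic, parameter-driven way to do the per-table copy without rebuilding a data flow is dynamic SQL run from an Execute SQL Task, combining the earlier SELECT ... INTO idea with a table name supplied at run time. A minimal sketch; SourceServer and SalesDb are hypothetical names:

    DECLARE @table sysname;
    DECLARE @sql   nvarchar(max);

    SET @table = N'Customers';  -- in the package this would come from a variable

    -- Drop any copy left from a previous run, then recreate and load in one statement.
    -- QUOTENAME guards the identifier; a real package would also validate @table.
    SET @sql = N'IF OBJECT_ID(''dbo.' + @table + N''', ''U'') IS NOT NULL '
             + N'DROP TABLE dbo.' + QUOTENAME(@table) + N'; '
             + N'SELECT * INTO dbo.' + QUOTENAME(@table)
             + N' FROM SourceServer.SalesDb.dbo.' + QUOTENAME(@table) + N';';

    EXEC sp_executesql @sql;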