
Tuesday, March 20, 2012

Copying SPs from one DB to another

I'm working on a system that will create a new DB that copies the SPs from
an older DB.
The purpose of this project is to create smaller, yearly DBs that will
require fewer table scans (since most searches are done on only the current
year). I would like to mimic the old DB year by year, copying all table
structure and a few tables' worth of data (so far so good on those queries), but
I'm unsure how to copy SPs from one DB to another.
Any thoughts?
Chad Edge
Seattle, WA
You can create the procedures in model, so when you create the new DB, they
get created automatically.
You can also try SQL Compare from www.red-gate.com, which will help you
generate scripts to synchronize two databases.
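The model-database approach might look like the sketch below; the procedure itself is a hypothetical placeholder. Anything created in model is inherited by every database created afterwards.

```sql
USE model
GO
-- every database created after this point gets its own copy of this procedure
CREATE PROCEDURE dbo.usp_Ping
AS
    SELECT DB_NAME() AS CurrentDb, GETDATE() AS CheckedAt
GO
```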
"Chad Edge" <xcasex@.hotmail.com> wrote in message
news:D2746C57-08D3-4A58-B0C0-6D7725D27C73@.microsoft.com...
> I'm working on a system that will create a new DB that copies the SPs from
> an older DB.
> The purpose of this project is to create smaller, yearly DBs that will
> require fewer table scans (since most searches are done on only the
> current year). I would like to mimic the old DB year by year, copying all
> table structure, a few tables worth of data (so far so good on those
> queries), but I'm unsure on how to copy SPs from one DB to another.
> Any thoughts?
> Chad Edge
> Seattle, WA
>|||Aaron,
Thanks for your help on that. I'll look up the link and work on the scripts.
I'm actually wondering now if we even need to copy the SP's. I guess it will
depend on how much weight we want to put on the interface level.
What we're trying to do is make smaller seasonal db's without changing much
of the interfaces' access (there's an internal edit tool, and an external
read-only view tool). If we keep our current, 'master' database in place w/
the SP's, then we could adjust those SP's to choose which smaller, seasonal
db to get data from. If we do things that way, then we won't need to manage
all the SP's on multiple db's.
Chad
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:eHxOBWAlGHA.1936@.TK2MSFTNGP04.phx.gbl...
> You can create the procedures in model, so when you create the new DB,
> they get created automatically.
> You can also try SQL Compare from www.red-gate.com, which will help you
> generate scripts to synchronize two databases.
>
>
> "Chad Edge" <xcasex@.hotmail.com> wrote in message
> news:D2746C57-08D3-4A58-B0C0-6D7725D27C73@.microsoft.com...
>|||> What we're trying to do is make smaller seasonal db's without changing
> much of the interfaces access (there's an internal edit tool, and an
> external read-only view tool). If we keep our current, 'master' database
> in-place w/ the SP's, then we could adjust those SP's to choose which
> smaller, seasonal db to get data from. If we do things that way, then we
> won't need to manage all the SP's on multiple db's.
I recommend having the schema spread across the dbs, which will allow for
the app to simply change its connection string, instead of your main control
database needing to generate a dynamic SQL string EXEC('EXEC ' + @dbname +
'.dbo.ProcedureName');|||Aaron,
My worry about doing that would be having to manage the SP's from season to
season.
So just to clarify, you're saying this is not recommended:
1. Master db, currently the only db, is called MAINEVENTS1. This db has all
SP's, and stores about 10 years worth of information (events)
2. We'd like to create new, seasonal db's called 'EVENTS_YYxYY' (eg:
EVENTS_06x07)
I'm considering this approach since it would shrink the amount of table
data that would need to be scanned, because most event results are polled for
the current 'season'
3. When an administrator logs in to our admin interface (currently CFM, on
our intranet), they would have a new dropdown "Select Season:" which
would set their session for a specific season
4. When visitors come to our Web site (again, using CFM), they would by default
get the current season set as their session, but would have text-links to
previous seasons.
5. Passing the session variable of which season is selected to the query
string would fire the correct SP from MAINEVENTS1, which would then retrieve
the database information from the correct DB.
In our first year, that would mean that there'd be two DB's :
MAINEVENTS1 with the last 10 years worth of data, and EVENTS_06x07.
6. This fall, we'd be adding new event types (concerts, speaking
engagements, etc) that would only be added to the newer DB's in the seasons
that they happened (no need to add Public Speaking to the MAINEVENTS1 since
it never happened).
7. The next round of updates would allow us to split MAINEVENTS1 into
EVENTS_01x02, EVENTS_03x04, etc., further shrinking our query source
8. The MAINEVENTS1 would then become a very small DB, just a gateway of SPs
that query other dbs and format the data.
The reason I've been approaching the project this way is so the Online
interface only needs minor changes, the SP's only need minor changes, and we
don't have to manage SP's from season to season (say we change the way we'd
like data formatted; we'd have to track the changes down and verify control
over each database - bad form in my opinion).
The main reasons for approaching the project this way:
1. Shrink the databases, since there are thousands of rows in our ten-year db.
2. Lock past events so they don't get overwritten (we've had issues where a
venue will change its name; if we update our database the way it's
organized now, all events in the past are then related to the new venue
name, which makes it a nightmare to track where a past event
occurred, because the old venue name will no longer exist in our db).
I know #2 should be addressed, but we're not able to make such drastic
changes to the data at this time. Perhaps a round 3 we can restructure. For
now, we're getting about 2 million hits a month for our events and I'd like
to keep the impact small.
Hopefully that's not boring you to tears; I just needed to dump my brain
onto the screen to make sure we're speaking the same language.
Thanks again for your suggestions. I'll look at the structure and the SP's a
little more and see what work is ahead. I've got to make a decision by the
end of this week so I can roll out the replication by EOM June.
Chad
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:u3DsmsIlGHA.2304@.TK2MSFTNGP02.phx.gbl...
> I recommend having the schema spread across the dbs, which will allow for
> the app to simply change its connection string, instead of your main
> control database needing to generate a dynamic SQL string EXEC('EXEC ' +
> @.dbname + ' .dbo.ProcedureName');
>|||> My worry about doing that would be having to manage the SP's from season
> to season.
What is there to manage? If you store the SPs in model, then every new
database you create will have the SPs automatically.
When you have a changed SP to deploy, you run a loop to create a series of
CREATE or ALTER procedure statements, to generate a script you can run,
deploying to all databases at once.
We have two production systems that use this model, one has over 3000
identical databases spread across three machines, and the other has over 750
identical databases across three different environments (dev, qa,
production).
I use red-gate's SQL compare to synchronize model between environments.
And by naming the databases with a common naming scheme, I was able to write
a stored procedure in master that gets all the databases from sys.databases
matching that naming scheme.
I store the create and/or alter scripts in a file called change.sql, then
run a C# command line program that calls the stored procedure, and creates
an output file for each database.
Deploying changes really couldn't be simpler (except when there are
exceptions between environments). Everything configurable (such as network
share paths, SMTP servers, FTP passwords, etc.) is stored in properties
tables, nothing is hard-coded, and that makes it really easy to move the one
customer who explodes in activity from one server to another. Just back up
the database, restore it on another server, and update a flag in the control
database that tells the app which server it can find that database on.
A|||> I store the create and/or alter scripts in a file called change.sql, then
> run a C# command line program that calls the stored procedure, and creates
> an output file for each database.
Sorry, let me explain that sentence a little better.
Let's say I have a fix to a single stored procedure that needs to be
broadcast to all databases in an environment.
I store a single ALTER PROCEDURE script in change.sql.
My C# app grabs the stored procedure from master, steps through the 750+
databases, and for each one, reads change.sql, adds a USE [dbname] in front
of it, and appends it to a file called deploy.sql. Now I can run a single
.sql script to deploy my changes to every single database.
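The loop described above could also be sketched in T-SQL itself (Aaron's actual tool is a C# program, and the EVENTS_% naming pattern here is hypothetical). It prints a USE header per matching database; the body of change.sql would follow each one to build deploy.sql:

```sql
-- step through every database matching the naming scheme (SQL 2000-era catalog)
DECLARE @name sysname;
DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM master.dbo.sysdatabases
    WHERE name LIKE N'EVENTS[_]%';
OPEN dbs;
FETCH NEXT FROM dbs INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT N'USE ' + QUOTENAME(@name);
    PRINT N'GO';
    -- append the contents of change.sql here for each database
    FETCH NEXT FROM dbs INTO @name;
END
CLOSE dbs;
DEALLOCATE dbs;
```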
Sure, there are pros and cons, and yes that many databases sounds like a lot
to manage, and it is. But "a lot" does not necessarily equate to "hard"...
there are many ways to make the multi-db scenario much easier to manage.
One of these days I'm going to write an article on it...
A|||Aaron,
I'm liking where you're going (excluding the fact that it's all new ground
to me; I've stayed as far from MS coding as I have been able in my career).
One thing I might not have mentioned: We're going to get into situations
where we'll need a *controller* sp that takes variables and chooses which
database to retrieve the data from. For example:
Right now (with MAINEVENTS1) we have a table of Artists (musicians, let's
say). If you want to retrieve information about a specific Artist (say,
their instrument, the places they've played, tickets sold, etc) you run a
query using their ArtistID (just an example). That works in the MAINEVENTS1
db.
Now let's say we create EVENTS_06x07 that adds additional columns of data
that weren't available in the past (such as, height and weight, which
changes as the year goes on - just to make it dynamic). We won't have that
information for anything in the past, so we'd like to not make any changes
to MAINEVENTS1. However, we would now need to make the query (SP) understand
where to get its data, and how much detail (columns) to return, based on
the year passed (pre-06 or post-06).
I picture (in my ideal candy-land imagination) a controller SP that would
take two variables (the ArtistID, and the Year), determine which database to
run the SP (or better yet, add the correct formatting to the SP itself), run
the SP and return the proper column data.
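Such a controller could be sketched roughly as below; every object name here is invented for illustration, and this is only one way to dispatch. It runs the local procedure for pre-2006 seasons and builds the seasonal database name dynamically otherwise:

```sql
-- hypothetical controller: picks the database from the season, then runs
-- that database's copy of the procedure
CREATE PROCEDURE dbo.usp_GetArtist
    @ArtistID int,
    @Year     int
AS
BEGIN
    IF @Year < 2006
        EXEC dbo.usp_GetArtistHistory @ArtistID   -- local 10-year data
    ELSE
    BEGIN
        DECLARE @db  sysname;
        DECLARE @sql nvarchar(4000);
        -- e.g. @Year = 2006 -> EVENTS_06x07
        SET @db  = N'EVENTS_' + RIGHT(CONVERT(char(4), @Year), 2)
                 + N'x'       + RIGHT(CONVERT(char(4), @Year + 1), 2);
        SET @sql = N'EXEC ' + QUOTENAME(@db) + N'.dbo.usp_GetArtist @ArtistID';
        EXEC sp_executesql @sql, N'@ArtistID int', @ArtistID;
    END
END
```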
This is where replicating SPs gets tricky: one DB may have changes that a
previous year's DB won't have.
Also, leaving SP's in model would not automatically roll out the changes
until a new DB was created, or when we create an entirely separate
application (such as something in C#).
My example is very elementary; there are trickier situations (statistics about
artists, about venues, about groups, etc) that will change from year to
year. That's where centrally managing SP's and having the MAINEVENTS1 manage
the connections and returns makes so much sense to me.
Again, thanks for your insight. You're making a lot of sense and I've
already started working with the Red-Gate software today (the dependency
tracker is flipping me out).
Chad
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:%23ESReGJlGHA.1320@.TK2MSFTNGP04.phx.gbl...
> Sorry, let me explain that sentence a little better.
> Let's say I have a fix to a single stored procedure that needs to be
> broadcast to all databases in an environment.
> I store a single ALTER PROCEDURE script in change.sql.
> My C# app grabs the stored procedure from master, steps through the 750+
> databases, and for each one, reads change.sql, adds a USE [dbname] in
> front of it, and appends it to a file called deploy.sql. Now I can run a
> single .sql script to deploy my changes to every single database.
> Sure, there are pros and cons, and yes that many databases sounds like a
> lot to manage, and it is. But "a lot" does not necessarily equate to
> "hard"... there are many ways to make the multi-db scenario much easier to
> manage. One of these days I'm going to write an article on it...
> A
>|||> Now lets say we create EVENTS_06x07 that adds additional columns of data
> that weren't available in the past (such as, height and weight, which
> changes as the year goes on - just to make it dynamic).
So, in every db, add weight and height to the table(s) or view(s), adjust
procedure(s) that exist there, and only update those customers that have
that data (otherwise it is NULL).
Displaying these columns is something the app can decide based on the
settings in the control db, not based on the data itself.
Adding columns to a stored procedure should not change the behavior of the
app... but the app *can* use those new columns if it deems it should.
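In T-SQL terms, that suggestion amounts to something like the following (table and column names are hypothetical); adding the columns as NULLable leaves existing rows untouched:

```sql
-- old rows simply report NULL for the new columns
ALTER TABLE dbo.Artists ADD
    Height decimal(5,2) NULL,
    Weight decimal(5,2) NULL
```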
A

Copying one SQL Server Database into another

Hi, I'm trying to copy one SQL Server database into another SQL Server DB within the same server. I've learnt a bit about this copying: it copies only the database structure, the tables, constraints, views, stored procedures, etc. Is there a tutorial where I can get clear instructions to do this? I just need the table structure, without any data, to be copied. I have even tried exploring creating and executing DTS packages but couldn't get much help with this. Any help would be really appreciated.

Thanks
You can use Enterprise Manager to create the script. Once you have this, all you have to do is run it to create another database. The GUI interface allows you to select many options about what exactly you want to include in the script.

Select the database in Enterprise Manager, then All Tasks, then Generate SQL Script

hope this helps,
sivilian|||Another easy way to do this is to create a SQL .bak and then just restore it as a new DB name. The reason this is easier is that when you create the script, you can create the DB and ALL objects, but no data. So after you apply the script, you then have to DTS the data if you need it. The only reason I do scripts is if I do not have access to get the .bak file onto a different server. When you create and restore backups, you can only do it from that local machine. So if you are moving to a different server and do not want to, or cannot, get the .bak file onto that new server, then using Generate Script and applying it to the new DB is great. But if you are working on the same machine, just try creating a SQL .bak file and then restoring it as the new DB name. This is really easy and works great!!
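The backup/restore approach can be sketched as below; database names, logical file names, and paths are all hypothetical (the logical names can be checked with RESTORE FILELISTONLY before restoring):

```sql
-- back up the source, then restore it under a new name on the same server
BACKUP DATABASE SourceDb TO DISK = N'C:\Backups\SourceDb.bak'

RESTORE DATABASE CopyOfSourceDb
FROM DISK = N'C:\Backups\SourceDb.bak'
WITH MOVE N'SourceDb_Data' TO N'C:\Data\CopyOfSourceDb.mdf',
     MOVE N'SourceDb_Log'  TO N'C:\Data\CopyOfSourceDb_log.ldf'
```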

Rory|||I have a similar but in a way opposite problem:
I want to export a local SQL Server database into a .sql file and include all data. It's quite simple to do using "Generate SQL Script," but there's no way (at least that I've noticed) to export the current data as well.
I've been trying everything, it seems, but no luck.

Any clue would be appreciated.

Alex.|||So, here's answering my own question.
The solution I've found (there are simpler ones, I'm sure) is to use the Microsoft SQL Web Administration Tool. I downloaded it a while ago, and have now used it to export a whole database and its data into a *.sql script file.

Hope that helps others.

Alex.

Copying of database

I tried to write a batch script which copies a database by making full
backup, then copying the backup file to another location and restoring it.
The backup schedule of my database is this:
full backup on sunday at 23 h stored in file1
differential backup every day except sunday at 23h stored in file2
Now I'm not sure what will happen with my scheduled differential backups
when I delete the temporary backup file that is made only for copying during
the week. It's stored in another file (let's call it file3). I see that SQL
Server remembers every full backup; is there a way to tell it not to write
to the backup history tables?
I'm thinking of other ways of copying the database. One is to detach the
database or to stop the server, but I think that's not good. Another way is a DTS
package. Can you tell me if it can do the trick? Is there another way?
Thank you in advance
Georgi Peshterski
Ways of copying a database:
1. Certainly backup and restore is an option, but not the first I would take.
2. Detach, copy and attach also works, but it is kludgy.
3. Snapshot replication works well if you desire a snapshot to copy at any
one time (ie, the update is the entire database and instant update of data is
not important).
4. Transaction replication works well if you need real time updates
5. DTS works, as well, but can end up with unnecessary locking on the
original database while you are migrating. You have some control over this,
of course, but it gets more advanced.
I would aim for replication or DTS long before using a backup/restore or
detach/copy/attach scenario.
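For completeness, option 2 above might look like this sketch (database name and file paths are hypothetical):

```sql
-- detach, copy the files at the OS level, then attach under a new name
EXEC sp_detach_db @dbname = N'EventsDb'
-- copy EventsDb.mdf and EventsDb_log.ldf to the new location, then:
EXEC sp_attach_db @dbname    = N'EventsDbCopy',
                  @filename1 = N'C:\Data\EventsDbCopy.mdf',
                  @filename2 = N'C:\Data\EventsDbCopy_log.ldf'
```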
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA
***************************
Think Outside the Box!
***************************
"George Peshterski" wrote:

> I tried to write a batch script which copies a database by making full
> backup, then copying the backup file to another location and restoring it.
> The backup schedule of the my database is this:
> full backup on sunday at 23 h stored in file1
> differential backup every day except sunday at 23h stored in file2
> Now i'm not sure what will happen with my differential scheduled backup
s
> when i delete the temporary backup file that is made only for copying duri
ng
> the w. It's stored in another file (let's call it file3). I see that SQ
L
> Server remembers every full backup, is there a way to tell it not to write
> to backup history tables?
> I'm thinking of another ways of copying database. One is to detach the
> database or to stop the server but i think it's not good. Another way is D
TS
> package. Can you tell me if it can do the trick? Is there another way?
> Thank you in advance
> Georgi Peshterski
>
Thank you, I'm working to do it with DTS
"Cowboy (Gregory A. Beamer) - MVP" <NoSpamMgbworld@.comcast.netNoSpamM> wrote
in message news:F34C25E5-AD13-4255-B1E0-38CBD3481C5F@.microsoft.com...
> Ways of copying a database:
> 1. Certainly backup and restore is an option, but not the first I would
> take.
> 2. Detach, copy and attach also works, but it is kludgy.
> 3. Snapshot replication works well if you desire a snapshot to copy at any
> one time (ie, the update is the entire database and instant update of data is
> not important).
> 4. Transaction replication works well if you need real time updates
> 5. DTS works, as well, but can end up with unnecessary locking on the
> original database while you are migrating. You have some control over this,
> of course, but it gets more advanced.
> of course, but it gets more advanced.
> I would aim for replication or DTS long before using a backup/restore or
> detach/copy/attach scenario.
>
> --
> Gregory A. Beamer
> MVP; MCP: +I, SE, SD, DBA
> ***************************
> Think Outside the Box!
> ***************************
> "George Peshterski" wrote:
>

Wednesday, March 7, 2012

Copying BAK file

When I copy a BACKUP using a .cmd file from a command-box in Win2K, like
this:
copy DBF_db_20040602.BAK .
it copies the entire file to the current directory where the cmd-file is
started.
So far so good...
But, when I change the cmd-file to do this:
copy DBF_db_20040602.BAK DBF_newfile.BAK
it only creates a file of only 6k in size. The original file is about 2 Gb
in size.
Anyone knows why this happens?
.Nico
hi
Have you tried using xcopy?
J
"Nico" <duiken@.nospam.nl> wrote in message
news:40bdeb35$0$136$18b6e80@.news.wanadoo.nl...
> When I copy a BACKUP using a .cmd file from a command-box in Win2K, like
> this:
> copy DBF_db_20040602.BAK .
> it copies the entire file to the current directory where the cmd-file is
> started.
> So far so good...
> But, when I change the cmd-file to do this:
> copy DBF_db_20040602.BAK DBF_newfile.BAK
> is only creates a file of only 6k in size. The original file is about 2 Gb
> in size.
> Anyone knows why this happens?
> .Nico
>
|||Nico,
Sounds strange. I suggest you post this to a Windows forum, as the likelihood of finding Windows experts should be
higher there... :-)
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Nico" <duiken@.nospam.nl> wrote in message news:40bdeb35$0$136$18b6e80@.news.wanadoo.nl...
> When I copy a BACKUP using a .cmd file from a command-box in Win2K, like
> this:
> copy DBF_db_20040602.BAK .
> it copies the entire file to the current directory where the cmd-file is
> started.
> So far so good...
> But, when I change the cmd-file to do this:
> copy DBF_db_20040602.BAK DBF_newfile.BAK
> is only creates a file of only 6k in size. The original file is about 2 Gb
> in size.
> Anyone knows why this happens?
> .Nico
>
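One plausible explanation, for the record: if the copy ends up running in ASCII mode, cmd.exe stops copying at the first end-of-file byte (0x1A), which would truncate a binary .BAK file to a few kilobytes. Forcing binary mode with the /B switch is worth trying:

```
rem /B forces a binary copy, so embedded 0x1A bytes are not treated as EOF
copy /B DBF_db_20040602.BAK DBF_newfile.BAK
```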


Friday, February 24, 2012

Copy without Locks

I have a stored procedure which copies data from a view into a
temporary table (x2) and then from the temporary table into a table
which the users use. It takes 1 minute to get the data into the temp
table and seconds to update into the final one (hence the two stages).

When I do the initial copy from the view, it locks the various tables
used in the view and potentially blocks the users. It's a complex view
and uses plenty of other tables. We get massive performance issues
'generating' the data into a table as opposed to the view.

What I want to do is take all the data without locking it. I don't
want to modify the data, just read it and stick the data into a table.

Thanks

Ryan

SQL as follows :

/*Drop into temp tables first and then proper ones later as this
works out a lot less time when no data will be available*/

TRUNCATE TABLE MISGENERATE.dbo.CBFA_MISDATATemp -- Temp Table
TRUNCATE TABLE MISGENERATE.dbo.CBFA_MISPIPDATATemp -- Temp Table

INSERT INTO MISGENERATE.dbo.CBFA_MISDATATemp
SELECT * FROM MIS.dbo.CBFA_MISDATA -- View

INSERT INTO MISGENERATE.dbo.CBFA_MISPIPDATATemp
SELECT * FROM MIS.dbo.CBFA_MISPIPDATA -- View

/*Now drop this into full MIS tables for speed*/

TRUNCATE TABLE MISGENERATE.dbo.CBFA_MISDATA
TRUNCATE TABLE MISGENERATE.dbo.CBFA_MISPIPDATA

INSERT INTO MISGENERATE.dbo.CBFA_MISDATA -- Final Table
SELECT * FROM MISGENERATE.dbo.CBFA_MISDATATemp

INSERT INTO MISGENERATE.dbo.CBFA_MISPIPDATA -- Final Table
SELECT * FROM MISGENERATE.dbo.CBFA_MISPIPDATATemp
Ryan (ryanofford@.hotmail.com) writes:
> What I want to do is take all the data without locking it. I don't
> want to modify the data, just read it and stick the data into a table.

You can say things like:

SELECT * FROM tbl WITH (NOLOCK)

although, I am uncertain how this works with a view.

You should be very careful with NOLOCK. Using NOLOCK may save you from
users screaming because they are blocked, but since you are reading
uncommitted data, you may produce incorrect or incoherent results. The
users may not scream about this - they will just make incorrect decisions
because of bad input.
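Applied to the procedure in the question, the hint goes on the view reference; in SQL Server a table hint placed on a view is generally propagated down to its base tables, though the caveats about reading uncommitted data still apply:

```sql
-- NOLOCK on the view propagates to the underlying tables; results may
-- include uncommitted data
INSERT INTO MISGENERATE.dbo.CBFA_MISDATATemp
SELECT * FROM MIS.dbo.CBFA_MISDATA WITH (NOLOCK)

INSERT INTO MISGENERATE.dbo.CBFA_MISPIPDATATemp
SELECT * FROM MIS.dbo.CBFA_MISPIPDATA WITH (NOLOCK)
```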

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

copy tuples from table tsql?

Hi all,
I'm trying to write a TSQL procedure that takes as an argument a
tablename in a database and then it copies the first 100 tuples of the
given table in the same table (it actually duplicates them). The table
doesn't have a standard schema and the query should be able to deal with
different kinds of schemas.
Any help on this?
Thanks in advance,
Peter
Where to start! Why would you ever want to copy a row* in the same table?
That's undesirable and unnecessary in SQL. Every table should have a key
that prevents duplicate rows. Maybe you mean you want to copy rows to
another table? But then you go on to say you want to copy the "first 100
tuples". Tables in SQL have no inherent logical order. What do you mean by
"first 100"?
Finally, parameterizing table names is generally a very bad idea in a
production system. Sometimes DBAs need to do that for admin tasks and the
way to do it is to use dynamic SQL. The following article has the gory
details and all the caveats that go with dynamic code:
http://www.sommarskog.se/dynamic_sql.html
I do recommend you study some relational database theory and practice and
then reconsider your requirements.
[ * In relational theory tuples by definition are not duplicated in a
relation. Rows are not tuples and your use of that term is incorrect. ]
David Portas
SQL Server MVP
--
"pnp" <pnpNOSPAM@.softlab.ntua.gr> wrote in message
news:OcHFgFUxFHA.2232@.TK2MSFTNGP11.phx.gbl...
> Hi all,
> I'm trying to write a TSQL procedure that takes as an argument a tablename
> in a database and then it copies the first 100 tuples of the given table
> in the same table (it actually duplicates them). The table doesn't have a
> standard schema and the query should be able to deal with different kinds
> of schemas.
> Any help on this?
> Thanks in advance,
> Peter

Sunday, February 19, 2012

Copy tables on remote servers

Hi all, I'm trying to create a script / stored proc that copies one table from a remote server to another table on a different remote server.

The approach I wanted to take was...

create a stored proc that can be called remotely

the stored proc tests to see if the table exists, if it does it is dropped, if not it is created

new table structure and data copied from remote server A to remote server B

...I feel that this should be quite easy, but I must admit I'm struggling. Any help would be really appreciated.

Cheers, Jon

Did you try to setup a linked server and then do a SELECT INTO?

How about using DTS instead?

|||

Thanks for your reply William.

I have tried SELECT INTO...

select * into remoteServer.intranetcms.dbo.woodford_bridge
from localServer.intranetcms.dbo.woodford_bridge

... but I get the error...

The object name 'remoteServer.intranetcms.dbo.' contains more than the maximum number of prefixes. The maximum is 2.

...I am familiar with DTS but what I would like to do is create a stored proc that can be executed inside an on_click event from a web page.

- Jon
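For reference, the error arises because the target of a SELECT INTO cannot be a four-part name. One workaround (a sketch; 'sourceServer' is a hypothetical linked server defined on the destination) is to run the statement on the destination server and pull the data across:

```sql
-- run on the destination server: drop any old copy, then recreate it
IF OBJECT_ID(N'dbo.woodford_bridge') IS NOT NULL
    DROP TABLE dbo.woodford_bridge

SELECT *
INTO dbo.woodford_bridge
FROM sourceServer.intranetcms.dbo.woodford_bridge
```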

|||

In ADO.NET, there is a pretty good method for doing this. The API name is SqlBulkCopy.

Let me know if you want to know more about the api.

Thanks

Bei

|||

Thanks Bei, unfortunately we're not using .Net 2 so, as I understand it, can't use SQLBulkCopy. Any other ideas?

- Jon

|||

Hi, maybe very late, but I ran into the same problem and solved it by doing this:

Ex:

SELECT t.Bilagnr

FROM [172.18.165.25\ASNV].[AS_14830].[dbo].[tblSalesDetails] as t WHERE Pkey > 1

Hope this helps!

- Per S.

|||

I'd recommend the use of DTS. Then

create an SP that triggers the DTS package through xp_cmdshell using dtsrun.

Call the SP from ASP.NET.
