Sunday, March 25, 2012
Copying the backup to another drive
I have set up our new SS2k install to use the Bulk-Logged backup method,
backing up TRNs every night and the whole DB every Sunday using the Agent and
a fairly out-of-the-box Maint plan.
What I would like to do is copy the latest file to another machine on our
network after every backup run. That machine is attached to an offsite SAN.
I tried simply pointing the backup directory at that machine, but people
noted this is a bad idea and it's not all that easy to do anyway.
I think the easy way to do this is to edit the backup jobs and add a copy.
However, I don't see a way to name the file to copy, which changes every
day. Also, whenever I've made changes like this in the past, the Maint Plan no
longer works, and keeping it working is very important for our install (so
other people can edit it).
Any advice?

|||

"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:70154059-0A56-41A1-B07B-9DCD47C884BD@.microsoft.com...
>I have set up our new SS2k install to use the Bulk-Logged backup method,
> backup up TRN's every night and the whole DB every Sunday using the Agent
> and
> a fairly out-of-the-box Maint plan.
> What I would like to do is copy the latest file to another machine on our
> network after every backup run. That machine is attached to an offsight
> SAN.
> I tried simply pointing the backup directory at that machine, but people
> noted this is a bad idea and it's not all that easy to do anyway.
> I think the easy way to do this is to edit the backup jobs and add a copy.
> However I don't see a way to name the file to copy, which will change
> every
> day. Also whenever I've made changes like this in the past, the maint plan
> no
> longer works, yet this is very important for our install (so other people
> can
> edit it).
> Any advice?
The most easily maintainable method (in my opinion, for your given scenario)
is to create a DTS job that performs the backups and does the copying. The
ActiveX scripting (VBScript) in DTS is conducive to building up filenames,
etc., and is fairly straightforward.
Then you simply schedule a job to fire off the DTS package.
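The filename logic can also be sketched directly in T-SQL instead of VBScript; the path and database name below are placeholders, not from the original post:

```sql
-- Build a dated backup filename and run the backup in one job step.
-- D:\Backups and MyDB are illustrative placeholders.
DECLARE @file varchar(255)
SET @file = 'D:\Backups\MyDB_' + CONVERT(varchar(8), GETDATE(), 112) + '.trn'
BACKUP LOG MyDB TO DISK = @file
```

CONVERT style 112 produces a yyyymmdd stamp, so each night's file gets a unique, sortable name that a copy step can target.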
HTH
Rick Sawtell
MCT, MCSD, MCDBA

|||

Actually, backing up to a remote file share is pretty easy and is, IMHO, a
best practice.
HowTo: Backup to UNC name using Database Maintenance Wizard
http://support.microsoft.com/?kbid=555128
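For reference, backing up to a UNC path is plain T-SQL; the server and share names here are made up, and the SQL Server service account needs write permission on the share:

```sql
-- Back up straight to a network share via its UNC path.
BACKUP DATABASE MyDB TO DISK = '\\backupserver\sqlbackups\MyDB_full.bak'
```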
--
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:70154059-0A56-41A1-B07B-9DCD47C884BD@.microsoft.com...
> I have set up our new SS2k install to use the Bulk-Logged backup method,
> backup up TRN's every night and the whole DB every Sunday using the Agent
and
> a fairly out-of-the-box Maint plan.
> What I would like to do is copy the latest file to another machine on our
> network after every backup run. That machine is attached to an offsight
SAN.
> I tried simply pointing the backup directory at that machine, but people
> noted this is a bad idea and it's not all that easy to do anyway.
> I think the easy way to do this is to edit the backup jobs and add a copy.
> However I don't see a way to name the file to copy, which will change
every
> day. Also whenever I've made changes like this in the past, the maint plan
no
> longer works, yet this is very important for our install (so other people
can
> edit it).
> Any advice?

|||

Geoff N. Hiten wrote:
> Actually, backing up to a remote file share is pretty easy and is
> IMHO, a best practice.
> HowTo: Backup to UNC name using Database Maintenance Wizard
> http://support.microsoft.com/?kbid=555128
>
I don't agree. You are completely throttling the backup based on your
network bandwidth. For small databases this may be fine, but everything
else will likely suffer in terms of backup speed. Given that a native
backup writes out a backup file the size of the database, using a
network share will likely slow it down. That applies as well to backup
products like LiteSpeed, which thrive on available disk bandwidth. Plus,
do you really want to flood the network pipe on the server with backup
data, which could also cause network performance problems for all
connected users?
My experience has been to back up locally (SAN, NAS, local drive) and
then move off to the final location (tape, network share, etc.).
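A minimal sketch of that two-step approach, assuming xp_cmdshell is available (the paths and names are placeholders; xp_cmdshell runs under the SQL Server service account, so that account needs rights on the share):

```sql
-- Step 1: back up to a local drive at full disk speed.
BACKUP DATABASE MyDB TO DISK = 'D:\Backups\MyDB_full.bak'
-- Step 2: move the file off-box without slowing the backup itself.
EXEC master..xp_cmdshell 'copy D:\Backups\MyDB_full.bak \\backupserver\sqlbackups\'
```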
--
David Gugick
Imceda Software
www.imceda.com

|||

I did include a warning in the article about possible network saturation.
In practice I use a dedicated backup network and compression software to
limit the impact on production systems. Yes, it is more expensive in terms of
disk storage and network capacity, but I believe it is the best approach
for database recoverability and backup management. I combine the backup to
disk with a tape archive rotation to give me some backup history. Since I
am truly paranoid about my backups, I even have a separate restore folder so
my tape process can continue uninterrupted even if I have to restore from
tape.
--
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:%23reYGbk1EHA.3468@.TK2MSFTNGP14.phx.gbl...
> Geoff N. Hiten wrote:
> > Actually, backing up to a remote file share is pretty easy and is
> > IMHO, a best practice.
> >
> > HowTo: Backup to UNC name using Database Maintenance Wizard
> > http://support.microsoft.com/?kbid=555128
> >
> I don't agree. You are completely throttling the backup based on your
> network bandwidth. For small databases this may be fine, but everything
> else will likely suffer in terms of backup speed. Given that a native
> backup writes out a backup file the size of the database, using a
> network share will likely slow it down. That applies as well to backup
> products like LiteSpeed which thrive on available disk bandwidth. Plus,
> do you really want to flood the network pipe on the server with backup
> data which could also cause network performance problems for all
> connected users.
> My experience has been to back up locally (SAN, NAS, local drive) and
> then move off to the final location (tape, network share, etc.).
> --
> David Gugick
> Imceda Software
> www.imceda.com
|||

Geoff N. Hiten wrote:
> I did include a warning in the article about possible network
> saturation.
> In practice I use a dedicated backup network and compression software
> to limit impact on production systems. Yes, it is more expensive in
> terms of disk storage and network capacity, however I believe it is
> the best approach for database recoverability and backup management.
> I combine the backup to disk woth a tape archive rotation to give me
> some backup history. Since I am truly paranoid about my backups, I
> even have a separate restore folder so my tape process can continue
> uninterrupted eve if I have to restore from tape.
>
I agree. For safety and faster recovery, it's good to have some backups
available on the network separate from the server drives as well as on
tape stored off-site. Compression is key here. But if you're using a
backup product that adds compression, like LiteSpeed, you'll still have
better performance backing up locally and then copying the files over
the network after the backup is complete. You could use LiteSpeed to
dump to a UNC, but you will throttle backup speed a lot and many
companies are very interested in keeping backup times to a minimum.
If you were not network or time constrained, you could back up directly
to a UNC, but I would still argue that using native backup in this
design would be silly because of the large backup sizes.
--
David Gugick
Imceda Software
www.imceda.com

|||

"David Gugick" wrote:
> I agree. For safety and faster recovery, it's good to have some backups
> available on the network separate from the server drives as well as on
> tape stored off-site.
This is my basic goal. I want the file written out to disk, then zipped;
the zipped file copied to the SAN drive; and finally the zip deleted.
The result would be an uncompressed original TRN on the DB's drive, which
I'll let SQL Server periodically delete.
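One way to sketch that zip/copy/delete flow as an extra Agent job step, assuming a command-line compressor is installed (the tool name, its flags, and all paths are placeholders):

```sql
-- Compress the night's TRN, push the archive to the SAN share, clean up.
-- The original .trn stays behind for SQL Server's retention cleanup.
EXEC master..xp_cmdshell 'zip D:\Backups\MyDB.zip D:\Backups\MyDB.trn'
EXEC master..xp_cmdshell 'copy D:\Backups\MyDB.zip \\sanserver\backups\'
EXEC master..xp_cmdshell 'del D:\Backups\MyDB.zip'
```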
So then back to the original question. When I add commands to the backup
steps, it seems to make the Maint Plan stop working. Yet it is fairly
important to me that it keep working so future admins can manage the system
easily. I actually find it very surprising that the Agent doesn't have
Compress and Copy To options, given how painfully obvious these features seem
to me.
So is there any way to do this? I believe the SQL Server service runs under a
non-domain account, which might complicate things. But even having the
backup compress after saving would be useful, and then I could "copy from"
the SAN machine.

|||

I do use LiteSpeed. It definitely helps backup performance and space
consumption. I find a properly configured UNC share can digest a backup
much faster than all but the very fastest multi-drive tape libraries and
certainly close to a local drive under most conditions. Of course,
'properly configured' to me means a RAID 1+0 Ultra320 SCSI array with 146GB
10K RPM drives and an Ultra320 RAID controller and enclosure. With this
setup, I can run backups during a small overnight window and spin them to
tape during the day. I also use a dedicated backup network to separate the
load. Finally, since I am running a cluster, attaching the tape drive
directly to the host unit is not an option. SAN backup options were also
not useful since I needed to keep multiple LUNs in sync and I didn't want
to restore an entire LUN to retrieve a single database file. All the
benefits of a multi-stage disk/tape backup at about half of the cost.
--
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:O96IJxl1EHA.164@.TK2MSFTNGP10.phx.gbl...
> Geoff N. Hiten wrote:
> > I did include a warning in the article about possible network
> > saturation.
> > In practice I use a dedicated backup network and compression software
> > to limit impact on production systems. Yes, it is more expensive in
> > terms of disk storage and network capacity, however I believe it is
> > the best approach for database recoverability and backup management.
> > I combine the backup to disk woth a tape archive rotation to give me
> > some backup history. Since I am truly paranoid about my backups, I
> > even have a separate restore folder so my tape process can continue
> > uninterrupted eve if I have to restore from tape.
> >
> >
> I agree. For safety and faster recovery, it's good to have some backups
> available on the network separate from the server drives as well as on
> tape stored off-site. Compression is key here. But if you're using a
> backup product that adds compression, like LiteSpeed, you'll still have
> better performance backing up locally and then copying the files over
> the network after the backup is complete. You could use LiteSpeed to
> dump to a UNC, but you will throttle backup speed a lot and many
> companies are very interested in keeping backup times to a minimum.
> If you were not network or time constrained, you could back up directly
> to a UNC, but I would still argue that using native backup in this
> design would be silly because of the large backup sizes.
> --
> David Gugick
> Imceda Software
> www.imceda.com
>
Copying the backup to another drive
backup up TRN's every night and the whole DB every Sunday using the Agent an
d
a fairly out-of-the-box Maint plan.
What I would like to do is copy the latest file to another machine on our
network after every backup run. That machine is attached to an offsight SAN.
I tried simply pointing the backup directory at that machine, but people
noted this is a bad idea and it's not all that easy to do anyway.
I think the easy way to do this is to edit the backup jobs and add a copy.
However I don't see a way to name the file to copy, which will change every
day. Also whenever I've made changes like this in the past, the maint plan n
o
longer works, yet this is very important for our install (so other people ca
n
edit it).
Any advice?"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:70154059-0A56-41A1-B07B-9DCD47C884BD@.microsoft.com...
>I have set up our new SS2k install to use the Bulk-Logged backup method,
> backup up TRN's every night and the whole DB every Sunday using the Agent
> and
> a fairly out-of-the-box Maint plan.
> What I would like to do is copy the latest file to another machine on our
> network after every backup run. That machine is attached to an offsight
> SAN.
> I tried simply pointing the backup directory at that machine, but people
> noted this is a bad idea and it's not all that easy to do anyway.
> I think the easy way to do this is to edit the backup jobs and add a copy.
> However I don't see a way to name the file to copy, which will change
> every
> day. Also whenever I've made changes like this in the past, the maint plan
> no
> longer works, yet this is very important for our install (so other people
> can
> edit it).
> Any advice?
The most easily maintainable method (in my opinion of your given scenario)
is to create a DTS job that will perform the backups and do the copying
process. The ActiveX scripting (VBScript) in DTS is conducive to making up
filenames, etc. and is fairly straightforward.
Then you simple schedule a job to fire off the DTS package.
HTH
Rick Sawtell
MCT, MCSD, MCDBA|||Actually, backing up to a remote file share is pretty easy and is IMHO, a
best practice.
HowTo: Backup to UNC name using Database Maintenance Wizard
http://support.microsoft.com/?kbid=555128
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:70154059-0A56-41A1-B07B-9DCD47C884BD@.microsoft.com...
> I have set up our new SS2k install to use the Bulk-Logged backup method,
> backup up TRN's every night and the whole DB every Sunday using the Agent
and
> a fairly out-of-the-box Maint plan.
> What I would like to do is copy the latest file to another machine on our
> network after every backup run. That machine is attached to an offsight
SAN.
> I tried simply pointing the backup directory at that machine, but people
> noted this is a bad idea and it's not all that easy to do anyway.
> I think the easy way to do this is to edit the backup jobs and add a copy.
> However I don't see a way to name the file to copy, which will change
every
> day. Also whenever I've made changes like this in the past, the maint plan
no
> longer works, yet this is very important for our install (so other people
can
> edit it).
> Any advice?|||Geoff N. Hiten wrote:
> Actually, backing up to a remote file share is pretty easy and is
> IMHO, a best practice.
> HowTo: Backup to UNC name using Database Maintenance Wizard
> http://support.microsoft.com/?kbid=555128
>
I don't agree. You are completely throttling the backup based on your
network bandwidth. For small databases this may be fine, but everything
else will likely suffer in terms of backup speed. Given that a native
backup writes out a backup file the size of the database, using a
network share will likely slow it down. That applies as well to backup
products like LiteSpeed which thrive on available disk bandwidth. Plus,
do you really want to flood the network pipe on the server with backup
data which could also cause network performance problems for all
connected users.
My experience has been to back up locally (SAN, NAS, local drive) and
then move off to the final location (tape, network share, etc.).
David Gugick
Imceda Software
www.imceda.com|||I did include a warning in the article about possible network saturation.
In practice I use a dedicated backup network and compression software to
limit impact on production systems. Yes, it is more expensive in terms of
disk storage and network capacity, however I believe it is the best approach
for database recoverability and backup management. I combine the backup to
disk woth a tape archive rotation to give me some backup history. Since I
am truly paranoid about my backups, I even have a separate restore folder so
my tape process can continue uninterrupted eve if I have to restore from
tape.
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:%23reYGbk1EHA.3468@.TK2MSFTNGP14.phx.gbl...
> Geoff N. Hiten wrote:
> I don't agree. You are completely throttling the backup based on your
> network bandwidth. For small databases this may be fine, but everything
> else will likely suffer in terms of backup speed. Given that a native
> backup writes out a backup file the size of the database, using a
> network share will likely slow it down. That applies as well to backup
> products like LiteSpeed which thrive on available disk bandwidth. Plus,
> do you really want to flood the network pipe on the server with backup
> data which could also cause network performance problems for all
> connected users.
> My experience has been to back up locally (SAN, NAS, local drive) and
> then move off to the final location (tape, network share, etc.).
> --
> David Gugick
> Imceda Software
> www.imceda.com
>|||Geoff N. Hiten wrote:
> I did include a warning in the article about possible network
> saturation.
> In practice I use a dedicated backup network and compression software
> to limit impact on production systems. Yes, it is more expensive in
> terms of disk storage and network capacity, however I believe it is
> the best approach for database recoverability and backup management.
> I combine the backup to disk woth a tape archive rotation to give me
> some backup history. Since I am truly paranoid about my backups, I
> even have a separate restore folder so my tape process can continue
> uninterrupted eve if I have to restore from tape.
>
I agree. For safety and faster recovery, it's good to have some backups
available on the network separate from the server drives as well as on
tape stored off-site. Compression is key here. But if you're using a
backup product that adds compression, like LiteSpeed, you'll still have
better performance backing up locally and then copying the files over
the network after the backup is complete. You could use LiteSpeed to
dump to a UNC, but you will throttle backup speed a lot and many
companies are very interested in keeping backup times to a minimum.
If you were not network or time constrained, you could back up directly
to a UNC, but I would still argue that using native backup in this
design would be silly because of the large backup sizes.
David Gugick
Imceda Software
www.imceda.com|||"David Gugick" wrote:
> I agree. For safety and faster recovery, it's good to have some backups
> available on the network separate from the server drives as well as on
> tape stored off-site.
This is my basic goal. I want to have the file write out to disk, then ZIP
it, copy the ZIPed file to the SAN'ed drive, and then finally delete the ZIP
.
The result would be an uncompressed original TRN on the DB's drive, which
I'll let SQL Server periodically delete.
So then back to the original question. When I add commands to the backup
steps it seems to make the Maint Plan stop working. Yet it is fairly
important to me to allow that to continue working so future admins can manag
e
the system easily. I actually find it very surprising the Agent doesn't have
Compress and Copy To options, given how painfully obvious these feature seem
to me.
So is there any way to do this? I believe the SQL Server account is logged
into a non-domain account, which might complicate things. But even having th
e
backup compress after saving would be useful, and then I could "copy from"
the SAN machine.|||I do use LiteSpeed. It definitely helps backup performance and space
consumption. I find a properly configured UNC share can digest a backup
much faster than all but the very fastest multi-drive tape libraries and
certainly close to a local drive under most conditions. Of course,
'properly configured' to me means a RAID 1+0 UltraSCSI-320 array with 146GB
10KRPM drives and a Ultra320 RAID controller and enclosure. With this
setup, I can run backups during a small overnight window and spin them to
tape during the day. I also use a dedicated backup network to separate the
load. Finally, since I am running a cluster, attaching the tape drive
directly to the host unit is not an option. SAN backup options were also
not useful since I needed to keep multiple LUNS in synch and I didn't want
to restore an entire LUN to retrieve a single database file. All the
benefits of a multi-stage disk/tape backup at about half of the cost.
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:O96IJxl1EHA.164@.TK2MSFTNGP10.phx.gbl...
> Geoff N. Hiten wrote:
> I agree. For safety and faster recovery, it's good to have some backups
> available on the network separate from the server drives as well as on
> tape stored off-site. Compression is key here. But if you're using a
> backup product that adds compression, like LiteSpeed, you'll still have
> better performance backing up locally and then copying the files over
> the network after the backup is complete. You could use LiteSpeed to
> dump to a UNC, but you will throttle backup speed a lot and many
> companies are very interested in keeping backup times to a minimum.
> If you were not network or time constrained, you could back up directly
> to a UNC, but I would still argue that using native backup in this
> design would be silly because of the large backup sizes.
> --
> David Gugick
> Imceda Software
> www.imceda.com
>
Copying the backup to another drive
backup up TRN's every night and the whole DB every Sunday using the Agent and
a fairly out-of-the-box Maint plan.
What I would like to do is copy the latest file to another machine on our
network after every backup run. That machine is attached to an offsight SAN.
I tried simply pointing the backup directory at that machine, but people
noted this is a bad idea and it's not all that easy to do anyway.
I think the easy way to do this is to edit the backup jobs and add a copy.
However I don't see a way to name the file to copy, which will change every
day. Also whenever I've made changes like this in the past, the maint plan no
longer works, yet this is very important for our install (so other people can
edit it).
Any advice?
"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:70154059-0A56-41A1-B07B-9DCD47C884BD@.microsoft.com...
>I have set up our new SS2k install to use the Bulk-Logged backup method,
> backup up TRN's every night and the whole DB every Sunday using the Agent
> and
> a fairly out-of-the-box Maint plan.
> What I would like to do is copy the latest file to another machine on our
> network after every backup run. That machine is attached to an offsight
> SAN.
> I tried simply pointing the backup directory at that machine, but people
> noted this is a bad idea and it's not all that easy to do anyway.
> I think the easy way to do this is to edit the backup jobs and add a copy.
> However I don't see a way to name the file to copy, which will change
> every
> day. Also whenever I've made changes like this in the past, the maint plan
> no
> longer works, yet this is very important for our install (so other people
> can
> edit it).
> Any advice?
The most easily maintainable method (in my opinion of your given scenario)
is to create a DTS job that will perform the backups and do the copying
process. The ActiveX scripting (VBScript) in DTS is conducive to making up
filenames, etc. and is fairly straightforward.
Then you simple schedule a job to fire off the DTS package.
HTH
Rick Sawtell
MCT, MCSD, MCDBA
|||Actually, backing up to a remote file share is pretty easy and is IMHO, a
best practice.
HowTo: Backup to UNC name using Database Maintenance Wizard
http://support.microsoft.com/?kbid=555128
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Maury Markowitz" <MauryMarkowitz@.discussions.microsoft.com> wrote in
message news:70154059-0A56-41A1-B07B-9DCD47C884BD@.microsoft.com...
> I have set up our new SS2k install to use the Bulk-Logged backup method,
> backup up TRN's every night and the whole DB every Sunday using the Agent
and
> a fairly out-of-the-box Maint plan.
> What I would like to do is copy the latest file to another machine on our
> network after every backup run. That machine is attached to an offsight
SAN.
> I tried simply pointing the backup directory at that machine, but people
> noted this is a bad idea and it's not all that easy to do anyway.
> I think the easy way to do this is to edit the backup jobs and add a copy.
> However I don't see a way to name the file to copy, which will change
every
> day. Also whenever I've made changes like this in the past, the maint plan
no
> longer works, yet this is very important for our install (so other people
can
> edit it).
> Any advice?
|||Geoff N. Hiten wrote:
> Actually, backing up to a remote file share is pretty easy and is
> IMHO, a best practice.
> HowTo: Backup to UNC name using Database Maintenance Wizard
> http://support.microsoft.com/?kbid=555128
>
I don't agree. You are completely throttling the backup based on your
network bandwidth. For small databases this may be fine, but everything
else will likely suffer in terms of backup speed. Given that a native
backup writes out a backup file the size of the database, using a
network share will likely slow it down. That applies as well to backup
products like LiteSpeed which thrive on available disk bandwidth. Plus,
do you really want to flood the network pipe on the server with backup
data which could also cause network performance problems for all
connected users.
My experience has been to back up locally (SAN, NAS, local drive) and
then move off to the final location (tape, network share, etc.).
David Gugick
Imceda Software
www.imceda.com
|||I did include a warning in the article about possible network saturation.
In practice I use a dedicated backup network and compression software to
limit impact on production systems. Yes, it is more expensive in terms of
disk storage and network capacity, however I believe it is the best approach
for database recoverability and backup management. I combine the backup to
disk woth a tape archive rotation to give me some backup history. Since I
am truly paranoid about my backups, I even have a separate restore folder so
my tape process can continue uninterrupted eve if I have to restore from
tape.
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:%23reYGbk1EHA.3468@.TK2MSFTNGP14.phx.gbl...
> Geoff N. Hiten wrote:
> I don't agree. You are completely throttling the backup based on your
> network bandwidth. For small databases this may be fine, but everything
> else will likely suffer in terms of backup speed. Given that a native
> backup writes out a backup file the size of the database, using a
> network share will likely slow it down. That applies as well to backup
> products like LiteSpeed which thrive on available disk bandwidth. Plus,
> do you really want to flood the network pipe on the server with backup
> data which could also cause network performance problems for all
> connected users.
> My experience has been to back up locally (SAN, NAS, local drive) and
> then move off to the final location (tape, network share, etc.).
> --
> David Gugick
> Imceda Software
> www.imceda.com
>
|||Geoff N. Hiten wrote:
> I did include a warning in the article about possible network
> saturation.
> In practice I use a dedicated backup network and compression software
> to limit impact on production systems. Yes, it is more expensive in
> terms of disk storage and network capacity, however I believe it is
> the best approach for database recoverability and backup management.
> I combine the backup to disk woth a tape archive rotation to give me
> some backup history. Since I am truly paranoid about my backups, I
> even have a separate restore folder so my tape process can continue
> uninterrupted eve if I have to restore from tape.
>
I agree. For safety and faster recovery, it's good to have some backups
available on the network separate from the server drives as well as on
tape stored off-site. Compression is key here. But if you're using a
backup product that adds compression, like LiteSpeed, you'll still have
better performance backing up locally and then copying the files over
the network after the backup is complete. You could use LiteSpeed to
dump to a UNC, but you will throttle backup speed a lot and many
companies are very interested in keeping backup times to a minimum.
If you were not network or time constrained, you could back up directly
to a UNC, but I would still argue that using native backup in this
design would be silly because of the large backup sizes.
David Gugick
Imceda Software
www.imceda.com
|||"David Gugick" wrote:
> I agree. For safety and faster recovery, it's good to have some backups
> available on the network separate from the server drives as well as on
> tape stored off-site.
This is my basic goal. I want to have the file write out to disk, then ZIP
it, copy the ZIPed file to the SAN'ed drive, and then finally delete the ZIP.
The result would be an uncompressed original TRN on the DB's drive, which
I'll let SQL Server periodically delete.
So then back to the original question. When I add commands to the backup
steps it seems to make the Maint Plan stop working. Yet it is fairly
important to me to allow that to continue working so future admins can manage
the system easily. I actually find it very surprising the Agent doesn't have
Compress and Copy To options, given how painfully obvious these feature seem
to me.
So is there any way to do this? I believe the SQL Server account is logged
into a non-domain account, which might complicate things. But even having the
backup compress after saving would be useful, and then I could "copy from"
the SAN machine.
|||I do use LiteSpeed. It definitely helps backup performance and space
consumption. I find a properly configured UNC share can digest a backup
much faster than all but the very fastest multi-drive tape libraries and
certainly close to a local drive under most conditions. Of course,
'properly configured' to me means a RAID 1+0 UltraSCSI-320 array with 146GB
10KRPM drives and a Ultra320 RAID controller and enclosure. With this
setup, I can run backups during a small overnight window and spin them to
tape during the day. I also use a dedicated backup network to separate the
load. Finally, since I am running a cluster, attaching the tape drive
directly to the host unit is not an option. SAN backup options were also
not useful since I needed to keep multiple LUNS in synch and I didn't want
to restore an entire LUN to retrieve a single database file. All the
benefits of a multi-stage disk/tape backup at about half of the cost.
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"David Gugick" <davidg-nospam@.imceda.com> wrote in message
news:O96IJxl1EHA.164@.TK2MSFTNGP10.phx.gbl...
> Geoff N. Hiten wrote:
> I agree. For safety and faster recovery, it's good to have some backups
> available on the network separate from the server drives as well as on
> tape stored off-site. Compression is key here. But if you're using a
> backup product that adds compression, like LiteSpeed, you'll still have
> better performance backing up locally and then copying the files over
> the network after the backup is complete. You could use LiteSpeed to
> dump to a UNC, but you will throttle backup speed a lot and many
> companies are very interested in keeping backup times to a minimum.
> If you were not network or time constrained, you could back up directly
> to a UNC, but I would still argue that using native backup in this
> design would be silly because of the large backup sizes.
> --
> David Gugick
> Imceda Software
> www.imceda.com
>
Saturday, February 25, 2012
Copying a database
I want to copy a database from my SQL Server system and install it on another. Can anyone suggest how I can copy it, along with the log, login info, permissions, etc., and install it on another system? Thanks!
Vik|||Hi,
Basically, you can't. Not in one fell swoop. What you can do, however, is to detach the database from one server, copy it--along with the log file--to the new server, and attach it. But you'll need to recreate the logins, because the internal SIDs--security IDs--on the new server won't match those on the old.
If permissions are assigned to database roles, those will be intact.
Don|||If you have Enterprise Manager, you can just create a SQL script, and script that database and its users.|||What donkiely says is true. However, I'd say the best techniques are...
1. Backup database on server 1, restore it on server 2, re-wire logins
2. Use SQL's DTS transformation wizards from Enterprise Manager...OK for small databases but network intensive. Best to sort out the security issue first for this one; that way the transform will do the bulk (ahem) of the work for you.
3. As previous posters say, script all the metadata, then copy the data. Basically the same issues as (2), but at least you've got some nice scripts to reapply if your db gets into trouble.|||KraGiE,
Just keep in mind that scripting the database won't copy the data. You'd then have to use DTS or some other means to copy it.
Don
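To make the detach/copy/attach sequence above concrete, here is a small sketch that just builds the two T-SQL commands (the database name and file paths are illustrative). The detach runs on the source server, both files are copied, and the attach runs on the destination; as noted, logins still have to be recreated there:

```python
def detach_attach_script(db, mdf_path, ldf_path):
    """Build the T-SQL for moving a database by detach/copy/attach.

    Returns (detach_sql, attach_sql): run the first on the source server,
    copy the .mdf and .ldf files, then run the second on the destination.
    """
    detach = f"EXEC sp_detach_db @dbname = N'{db}';"
    attach = (
        f"EXEC sp_attach_db @dbname = N'{db}',\n"
        f"    @filename1 = N'{mdf_path}',\n"
        f"    @filename2 = N'{ldf_path}';"
    )
    return detach, attach
```

Generating the commands from one place keeps the database name and file paths consistent between the two servers, which is the easiest mistake to make when doing this by hand.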
Monday, February 13, 2012
Copy SQL Users from a server to another
We are using a SQL 2000 Std Server on a Windows 2000
Server. I just installed a new server running Windows
2003 Standard and installed SQL 2000 Std on it.
When I go into Enterprise Manager, in the Security folder
under Logins, I can see all the users and which databases
they need access to, etc.
I want to copy all this security information to my new SQL
Server... how do I do that?
thanks!|||Hi,
Have a look at the link below.
http://www.databasejournal.com/feat...cle.php/2228611
Thanks
Hari
MCDBA
"GuillauG" <Guillaume.Genest@.qc.teknion.com> wrote in message
news:2b4501c42875$89e027b0$a601280a@.phx.gbl...
> Hi,
> We are using a SQL 2000 Std Server on a Windows 2000
> Server. I just install a new Server running with Windows
> 2003 Standart and i installed SQL 2000 Std on it.
> When i go into Enterprise Manager, On Security folder and
> Logins tabs, i saw all the users that needs access to wich
> DB etc.
> I want to copy all this security information to my new SQL
> Server... how did i do that?
> thanks!|||Hi,
Thanks, from your link I found this one:
http://www.databasejournal.com/feat...l/article.php/2228611
Much easier for me than using the sp_ functions.
But after I followed the instructions on that web site I got
an "Unspecified error" while running the task. But all my
users seem to have been copied...
Does anyone know why I get this error?
Thanks!
|||Hi Guillaume,
You can transfer logins and passwords between instances following the
suggestions mentioned below:
246133 HOW TO: Transfer Logins and Passwords Between Instances of SQL Server
http://support.microsoft.com/?id=246133
If there is anything unclear, please feel free to let me know.
Regards,
Michael Shao
Microsoft Online Partner Support
Get Secure! - www.microsoft.com/security
This posting is provided "as is" with no warranties and confers no rights.|||I already saw that article, but when I try to copy users
from one server to another I get an "Unspecified error"
while running the task. But all the users seem to have
been copied...
Any suggestions?
|||Hi Guillaume,
Please check the following article to see if it is able to fix your
problem.
305365 PRB: DTS Package Fails with "Unspecified Error" Error Message When
You
http://support.microsoft.com/?id=305365
If not, please provide the detailed repro steps so that I can reproduce
this problem on my side and perform further research.
Thank you,
Michael Shao
Microsoft Online Partner Support
Get Secure! - www.microsoft.com/security
This posting is provided "as is" with no warranties and confers no rights.|||Thanks a lot. This is exactly my problem...
How bizarre that they don't edit this error message to
give us something more specific :P