Showing posts with label project. Show all posts

Thursday, March 29, 2012

Correct Setup for SQL Server 2005 for web project!

I've been trying to find some kind of blog post, tutorial, etc. on how to set up SQL Server 2005 to serve data to a website.

1. How do I set up users?
a) Can I have 3 roles?
1a) Owner of the DB: can read/write

2a) Reader: can only read from the database

3a) Writer: can only write to the database

How would I set this up? How can I use each of these from ASP.NET, depending on what the user is currently doing on the website?

eg:

Just serving pages with content (reader)
Forms (writer)
admin (owner)

I also need SQL Server to keep session state (I've already run aspnet_regsql.exe and created all of that); I'm just unsure what type of user can access all this.

Any tutorials on how to set up a whole web application project, from the DB to VS 2005?

Thanks


Bump

No one knows how to correctly set up SQL Server?

|||

So no one knows how to set up SQL Server 2005 efficiently and securely?

*Sigh*

|||

a) Can I have 3 roles?
1a) Owner of the DB: can read/write

2a) Reader: can only read from the database

3a) Writer: can only write to the database

After you have installed SQL Server, you can create user accounts and assign them permissions as you like. In Management Studio, expand the server, expand the database, expand "Security", and then use "Users" or "Roles".
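If you prefer to script it, a minimal T-SQL sketch of that setup might look like the following (the login names, password and database name are placeholders; the built-in db_datareader/db_datawriter roles cover the read/write split):

-- Server-level logins (placeholder names and passwords)
CREATE LOGIN web_reader WITH PASSWORD = 'ChangeMe!1';
CREATE LOGIN web_writer WITH PASSWORD = 'ChangeMe!2';

USE MyWebDb;

-- Database users mapped to the logins
CREATE USER web_reader FOR LOGIN web_reader;
CREATE USER web_writer FOR LOGIN web_writer;

-- Built-in roles: the reader can only read, the writer can only write
EXEC sp_addrolemember 'db_datareader', 'web_reader';
EXEC sp_addrolemember 'db_datawriter', 'web_writer';

-- The owner/admin account would be a member of db_owner instead

In ASP.NET you would then point each part of the site (read-only pages, forms, admin) at a connection string that uses the matching login.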

sql

Tuesday, March 27, 2012

copying/ backing up a database

Hi. I'm shortly going to have to submit my project for uni, which I've created using SQL Server. How can I copy everything that I've made, so I can submit it all and it can be replicated if necessary? Do I use the backup database task in Enterprise Manager, or do I have to do that and export data, or...?

I've used tables, stored procedures and a diagram, btw.

Thanks for any advice.

Look up sp_detach_db in Books Online. Detach the database, copy it, and submit it. Then they can re-attach it using sp_attach_db.
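In T-SQL that sequence is roughly the following (the database name and file paths are placeholders):

-- On your machine: detach, then copy the .mdf/.ldf files onto your submission media
EXEC sp_detach_db 'UniProject';

-- On the marker's machine: re-attach the copied files
EXEC sp_attach_db @dbname = 'UniProject',
    @filename1 = 'C:\Data\UniProject.mdf',
    @filename2 = 'C:\Data\UniProject_log.ldf';

BACKUP DATABASE / RESTORE DATABASE would work just as well if you would rather not take the database offline while copying.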

sql

Thursday, March 22, 2012

copying stored procedures to C drive on local machine

I have a project with around 50 stored procedures. I can't seem to see the database when using Explorer, but I have access to it with Enterprise Manager and the Visual Studio .NET Server Explorer. Just wondering if there is an easy way to copy the stored procedures onto the workstation, possibly using Enterprise Manager or .NET Server Explorer? Thanks.
Paul G
Software engineer.
The stored procedures don't really have any meaning outside of the database,
and you should only see MDF/NDF/LDF files in windows explorer (they are not
separate files). What exactly are you trying to accomplish?
http://www.aspfaq.com/
"Paul" <Paul@.discussions.microsoft.com> wrote in message
news:E3D99B30-FBA9-41E8-9F87-413D02810F43@.microsoft.com...
> I have a project with around 50 stored procedures, can not seem to see the
> dbase when using explorer but have access to it with enterprize manager
and
> visual studio.net server explorer. Just wondering if there is an easy way
to
> copy the stored procedures onto the workstation possibly using enterprize
> manager or .net server explorer? thanks.
> --
> Paul G
> Software engineer.
|||You can generate a script of the stored procedures. In SQL Enterprise Manager, right-click your database name and go to SQL Scripting... It will walk you through the process.
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
I support the Professional Association of SQL Server (PASS) and its community of SQL Server professionals.
www.sqlpass.org
"Paul" <Paul@.discussions.microsoft.com> wrote in message
news:E3D99B30-FBA9-41E8-9F87-413D02810F43@.microsoft.com...
> I have a project with around 50 stored procedures, can not seem to see the
> dbase when using explorer but have access to it with enterprize manager
and
> visual studio.net server explorer. Just wondering if there is an easy way
to
> copy the stored procedures onto the workstation possibly using enterprize
> manager or .net server explorer? thanks.
> --
> Paul G
> Software engineer.
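If the scripting wizard is not handy, a rough T-SQL alternative for pulling the procedure text out of a SQL 2000 database (so it can be saved to a file on the workstation) is:

-- Definitions longer than 4000 characters span several syscomments rows (ordered by colid)
SELECT o.name, c.colid, c.text
FROM sysobjects o
JOIN syscomments c ON c.id = o.id
WHERE o.type = 'P'
ORDER BY o.name, c.colid;

-- Or, for a single procedure (placeholder name):
EXEC sp_helptext 'MyProcedureName';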
|||Hi thanks for the response. Just wanted to put the stored procedure code
onto a laptop for reference, so copying out to a client machine is sufficient.
"Aaron [SQL Server MVP]" wrote:

> The stored procedures don't really have any meaning outside of the database,
> and you should only see MDF/NDF/LDF files in windows explorer (they are not
> separate files). What exactly are you trying to accomplish?
> --
> http://www.aspfaq.com/
> (Reverse address to reply.)
>
>
> "Paul" <Paul@.discussions.microsoft.com> wrote in message
> news:E3D99B30-FBA9-41E8-9F87-413D02810F43@.microsoft.com...
> and
> to
>
>
|||OK, thanks, will give it a try. Hopefully it allows you to select all the stored procedures in some fashion, as it would take a bit of time to do each one individually.
"Wayne Snyder" wrote:

> You can generate a script of the stored procedures. In SQL Enterprise
> manager, right click your database name and go to SQL Scripting... It will
> walk you through the process..
> --
> Wayne Snyder, MCDBA, SQL Server MVP
> Mariner, Charlotte, NC
> www.mariner-usa.com
> (Please respond only to the newsgroups.)
> I support the Professional Association of SQL Server (PASS) and it's
> community of SQL Server professionals.
> www.sqlpass.org
> "Paul" <Paul@.discussions.microsoft.com> wrote in message
> news:E3D99B30-FBA9-41E8-9F87-413D02810F43@.microsoft.com...
> and
> to
>
>
|||Yes, it does, try it out...
http://www.aspfaq.com/

sql

Tuesday, March 20, 2012

Copying SPs from one DB to another

I'm working on a system that will create a new DB that copies the SPs from
an older DB.
The purpose of this project is to create smaller, yearly DBs that will
require fewer table scans (since most searches are done on only the current
year). I would like to mimic the old DB year by year, copying all table
structure, a few tables worth of data (so far so good on those queries), but
I'm unsure how to copy SPs from one DB to another.
Any thoughts?
Chad Edge
Seattle, WA

You can create the procedures in model, so when you create the new DB, they
get created automatically.
You can also try SQL Compare from www.red-gate.com, which will help you
generate scripts to synchronize two databases.
"Chad Edge" <xcasex@.hotmail.com> wrote in message
news:D2746C57-08D3-4A58-B0C0-6D7725D27C73@.microsoft.com...
> I'm working on a system that will create a new DB that copies the SPs from
> an older DB.
> The purpose of this project is to create smaller, yearly DBs that will
> require fewer table scans (since most searches are done on only the
> current year). I would like to mimic the old DB year by year, copying all
> table structure, a few tables worth of data (so far so good on those
> queries), but I'm unsure on how to copy SPs from one DB to another.
> Any thoughts?
> Chad Edge
> Seattle, WA
>|||Aaron,
Thanks for your help on that. I'll look up the link and work on the scripts.
I'm actually wondering now if we even need to copy the SP's. I guess it will
depend on what amount of weight we want to put on the interface-level.
What we're trying to do is make smaller seasonal db's without changing much
of the interface's access (there's an internal edit tool, and an external
read-only view tool). If we keep our current, 'master' database in-place w/
the SP's, then we could adjust those SP's to choose which smaller, seasonal
db to get data from. If we do things that way, then we won't need to manage
all the SP's on multiple db's.
Chad
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:eHxOBWAlGHA.1936@.TK2MSFTNGP04.phx.gbl...
> You can create the procedures in model, so when you create the new DB,
> they get created automatically.
> You can also try SQL Compare from www.red-gate.com, which will help you
> generate scripts to synchronize two databases.
>
>
> "Chad Edge" <xcasex@.hotmail.com> wrote in message
> news:D2746C57-08D3-4A58-B0C0-6D7725D27C73@.microsoft.com...
>|||> What we're trying to do is make smaller seasonal db's without changing
> much of the interfaces access (there's an internal edit tool, and an
> external read-only view tool). If we keep our current, 'master' database
> in-place w/ the SP's, then we could adjust those SP's to choose which
> smaller, seasonal db to get data from. If we do things that way, then we
> won't need to manage all the SP's on multiple db's.
I recommend having the schema spread across the dbs, which will allow for
the app to simply change its connection string, instead of your main control
database needing to generate a dynamic SQL string: EXEC('EXEC ' + @dbname + '.dbo.ProcedureName');

|||Aaron,
My worry about doing that would be having to manage the SP's from season to
season.
So just to clarify, you're saying this is not recommended:
1. Master db, currently the only db, is called MAINEVENTS1. This db has all
SP's, and stores about 10 years worth of information (events)
2. We'd like to create new, seasonal db's called 'EVENTS_YYxYY' (eg:
EVENTS_06x07)
I'm considering this approach since it would shrink the amount of table
data that would need to be scanned, since most event results are polled for
the current 'season'
3. When an administrator logs in using (currently CFM) our admin interface
(on our intranet), they would have a new dropdown "Select Season:" which
would set their session for a specific season
4. When visitors come to our Web site (again, using CFM), they would by default get the current season set as their session, but would have text links to previous seasons.
5. Passing the session variable of which season is selected to the query
string would fire the correct SP from MAINEVENTS1, which would then retrieve
the database information from the correct DB.
In our first year, that would mean that there'd be two DB's :
MAINEVENTS1 with the last 10 years worth of data, and EVENTS_06x07.
6. This fall, we'd be adding new event types (concerts, speaking
engagements, etc) that would only be added to the newer DB's in the seasons
that they happened (no need to add Public Speaking to the MAINEVENTS1 since
it never happened).
7. The next round of updates would allow us to split MAINEVENTS1 into
EVENTS_01x02, 03_04, etc. further shrinking our query source
8. The MAINEVENTS1 would then become a very small DB, just a gateway of SPs
that query other dbs and format the data.
The reason I've been approaching the project this way is so the Online
interface only needs minor changes, the SP's only need minor changes, and we
don't have to manage SP's from season to season (say we change the way we'd
like data formatted; we'd have to track the changes down and verify control
over each database - bad form in my opinion).
The main reasons for approaching the project this way:
1. Shrink the databases, since there are thousands of rows in our ten-year db.
2. Lock past events so they don't get overwritten (we've had issues where a
venue changes its name; if we update our database the way it's
organized now, all events in the past are then related to the new venue
name - this makes it a nightmare to try and track in the past where an event
occurred because the venue in the past will no longer exist in our db).
I know #2 should be addressed, but we're not able to make such drastic
changes to the data at this time. Perhaps a round 3 we can restructure. For
now, we're getting about 2 million hits a month for our events and I'd like
to keep the impact small.
Hopefully that's not boring you to tears, I just needed to dump my brain
onto the screen to make sure we're speaking the same language.
Thanks again for your suggestions. I'll look at the structure and the SP's a
little more and see what work is ahead. I've got to make a decision by the
end of this week so I can roll out the replication by EOM June.
Chad
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:u3DsmsIlGHA.2304@.TK2MSFTNGP02.phx.gbl...
> I recommend having the schema spread across the dbs, which will allow for
> the app to simply change its connection string, instead of your main
> control database needing to generate a dynamic SQL string EXEC('EXEC ' +
> @.dbname + ' .dbo.ProcedureName');
>|||> My worry about doing that would be having to manage the SP's from season
> to season.
What is there to manage? If you store the SPs in model, then every new
database you create will have the SPs automatically.
When you have a changed SP to deploy, you run a loop to create a series of
CREATE or ALTER procedure statements, to generate a script you can run,
deploying to all databases at once.
We have two production systems that use this model, one has over 3000
identical databases spread across three machines, and the other has over 750
identical databases across three different environments (dev, qa,
production).
I use red-gate's SQL compare to synchronize model between environments.
And by naming the databases with a common naming scheme, I was able to write
a stored procedure in master that gets all the databases from sys.databases
matching that naming scheme.
I store the create and/or alter scripts in a file called change.sql, then
run a C# command line program that calls the stored procedure, and creates
an output file for each database.
Deploying changes really couldn't be simpler (except when there are
exceptions between environments). Everything configurable (such as network share paths, SMTP servers, FTP passwords, etc.) is stored in properties tables, nothing is hard-coded, and it makes it really easy to move the one customer whose activity explodes from one server to another. Just back up the database, restore it on another server, and update a flag in the control database that tells the app which server that database lives on.
A
|||> I store the create and/or alter scripts in a file called change.sql, then
> run a C# command line program that calls the stored procedure, and creates
> an output file for each database.
Sorry, let me explain that sentence a little better.
Let's say I have a fix to a single stored procedure that needs to be
broadcast to all databases in an environment.
I store a single ALTER PROCEDURE script in change.sql.
My C# app grabs the stored procedure from master, steps through the 750+
databases, and for each one, reads change.sql, adds a USE [dbname] in front
of it, and appends it to a file called deploy.sql. Now I can run a single
.sql script to deploy my changes to every single database.
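A rough T-SQL version of that loop might look like this (the EVENTS_% naming pattern and the contents of @fix are illustrative assumptions, not the actual code):

DECLARE @db sysname, @fix nvarchar(max), @deploy nvarchar(max), @crlf char(2);
SET @crlf = CHAR(13) + CHAR(10);
SET @fix = N'ALTER PROCEDURE dbo.usp_Example AS ...';   -- the contents of change.sql
SET @deploy = N'';

DECLARE dbs CURSOR FOR
    SELECT name FROM sys.databases WHERE name LIKE N'EVENTS[_]%';   -- assumed naming scheme
OPEN dbs;
FETCH NEXT FROM dbs INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Prefix the fix with USE [dbname] and batch separators, then append it
    SET @deploy = @deploy + N'USE ' + QUOTENAME(@db) + @crlf + N'GO' + @crlf
                + @fix + @crlf + N'GO' + @crlf;
    FETCH NEXT FROM dbs INTO @db;
END
CLOSE dbs;
DEALLOCATE dbs;

SELECT @deploy;   -- save this output as deploy.sql and run it once against the server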
Sure, there are pros and cons, and yes that many databases sounds like a lot
to manage, and it is. But "a lot" does not necessarily equate to "hard"...
there are many ways to make the multi-db scenario much easier to manage.
One of these days I'm going to write an article on it...
A

|||Aaron,
I'm liking where you're going (excluding the fact that it's all new ground
to me; I've stayed as far from MS coding as I have been able in my career).
One thing I might not have mentioned: We're going to get into situations
where we'll need a *controller* sp that takes variables and chooses which
database to retrieve the data from. For example:
Right now (with MAINEVENTS1) we have a table of Artists (musicians, let's
say). If you want to retrieve information about a specific Artist (say,
their instrument, the places they've played, tickets sold, etc) you run a
query using their ArtistID (just an example). That works in the MAINEVENTS1
db.
Now lets say we create EVENTS_06x07 that adds additional columns of data
that weren't available in the past (such as, height and weight, which
changes as the year goes on - just to make it dynamic). We won't have that
information for anything in the past, so we'd like to not make any changes
to MAINEVENTS1. However, we would now need to make the query (SP) understand
where to get its data, and how much detail (columns) to return, based on
the year passed (pre-06 or post-06).
I picture (in my ideal candy-land imagination) a controller SP that would
take two variables (the ArtistID, and the Year), determine which database to
run the SP (or better yet, add the correct formatting to the SP itself), run
the SP and return the proper column data.
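Something along these lines, as a rough sketch (the procedure, column and database names are made up for illustration):

CREATE PROCEDURE dbo.usp_GetArtist_Controller
    @ArtistID int,
    @Year     int
AS
BEGIN
    DECLARE @db sysname, @sql nvarchar(500);

    -- Seasons before the split stay in the original database
    IF @Year < 2006
        SET @db = N'MAINEVENTS1';
    ELSE
        SET @db = N'EVENTS_' + RIGHT(CONVERT(char(4), @Year), 2) + N'x'
                + RIGHT(CONVERT(char(4), @Year + 1), 2);

    -- Run the season database's own copy of the procedure
    SET @sql = N'EXEC ' + QUOTENAME(@db) + N'.dbo.usp_GetArtist @ArtistID';
    EXEC sp_executesql @sql, N'@ArtistID int', @ArtistID = @ArtistID;
END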
This is where replicating SPs gets tricky: one DB may have changes that a
previous year's DB won't.
Also, leaving SP's in the model database would not automatically roll out the changes
until a new DB was created, or when we create an entirely separate
application (such as something in C#).
My example is very elementary; there are trickier situations (statistics about
artists, about venues, about groups, etc) that will change from year to
year. That's where centrally managing SP's and having the MAINEVENTS1 manage
the connections and returns makes so much sense to me.
Again, thanks for your insight. You're making a lot of sense and I've
already started working with the Red-Gate software today (the dependency
tracker is flipping me out).
Chad
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:%23ESReGJlGHA.1320@.TK2MSFTNGP04.phx.gbl...
> Sorry, let me explain that sentence a little better.
> Let's say I have a fix to a single stored procedure that needs to be
> broadcast to all databases in an environment.
> I store a single ALTER PROCEDURE script in change.sql.
> My C# app grabs the stored procedure from master, steps through the 750+
> databases, and for each one, reads change.sql, adds a USE [dbname] in
> front of it, and appends it to a file called deploy.sql. Now I can run a
> single .sql script to deploy my changes to every single database.
> Sure, there are pros and cons, and yes that many databases sounds like a
> lot to manage, and it is. But "a lot" does not necessarily equate to
> "hard"... there are many ways to make the multi-db scenario much easier to
> manage. One of these days I'm going to write an article on it...
> A
>|||> Now lets say we create EVENTS_06x07 that adds additional columns of data
> that weren't available in the past (such as, height and weight, which
> changes as the year goes on - just to make it dynamic).
So, in every db, add weight and height to the table(s) or view(s), adjust
procedure(s) that exist there, and only update those customers that have
that data (otherwise it is NULL).
Displaying these columns is something the app can decide based on the
settings in the control db, not based on the data itself.
Adding columns to a stored procedure should not change the behavior of the
app... but the app *can* use those new columns if it deems it should.
A

Monday, March 19, 2012

Copying Dimensions in BIDS

Is it possible to copy a dimension from one solution/project to another? I have a generic time dimension and a few other common dimensions that I would rather just copy from one project to another, rather than setting them up from scratch in each project.

I tried copying the XML file from one project to another, but that did not work because they had different DSV's, and thus different design-time unique names.

any suggestions?

-Clayton

Using SSMS, you should be able to script out the creation of the dimension and then execute it in the other database. If you are using different DSV (or DS) names, then you will have to modify the script before you execute it on the other system, but hopefully that will be easy. At least it is just editing a single file.

_-_-_ Dave

Wednesday, March 7, 2012

Copying a project. How?

We recently migrated to SQL 2005 and have moved all OLAP databases to the new server. I need to create new OLAP databases. In SQL 2000, just a copy would do it, but I am having problems in 2005. I tried to script the database as 'create', and even created an empty one and tried to change it by 'alter' from a valid DB, but no luck (the XMLA file was modified to have the new names).
Any pointers or suggestions would be appreciated. I know how to change the DataSource and DataView later; I just need to know how to copy one!
Also, please let me know what tool is used (Management Studio, Visual Studio, etc.).
Thanks in advance!

There is a special project in BIDS where you can reverse engineer an SSAS 2005 database.

If you have a BIDS project with a cube you can choose to save the whole project under a new name.

HTH

Thomas Ivarsson

|||I used BIDS, made a new 'Import Analysis Services 9.0 Database' from the template and imported a working database with everything. <-- Great.
Then made a new DataSource pointing to the right source and changed the DSV in XML to point to it <-- Great
The problem now is that it is throwing an error that the 'Role' with 'ID' = 'suchandsuch' doesn't exist in the collection when I go to build it. I saw this Role in the DataSource when viewed in XML and obviously can change it, but it will not compile even when changed to a more privileged role. Same error!
The other OLAP database runs fine.
Any comments or suggestions will be appreciated as always; I hope this info will help the next person.
noorbakhsh|||Got it. The above will fix the problem, except we had a removed user which I had to take out of the roles' users manually. Also, the database I copied was corrupted, so when I re-did it, it worked fine.
I hope this helps others!
noorbakhsh|||One more follow-up since the other stuff had errors and there was an easier way:

I used BIDS, made a new 'Import Analysis Services 9.0 Database' project from the template with the new database name, and imported a working database with everything. <-- Great.

Then I just right-clicked on the DataSource/View Designer and edited the connection string to point to the new database.

Right-clicked on the database properties and changed the deployment parameters.

Deploy - Done

I hope this helps!!! - It should :)

noorbakhsh

Sunday, February 19, 2012

Copy Table in same database

I'm using partitions in my project to age data that's older than some predetermined number of days. Normal deletion of the data takes too long, but the following sequence of operations seems to work well when testing with raw SQL queries:

    Split (to create a new partition for new data)

    Create (a new temporary table to hold the oldest partition's data)

    Switch (to move the oldest partition's data into the new temporary table)

    Merge (to combine the oldest two partitions, "removing" the oldest one in the process)

    Drop (to drop the temporary table and all the old data that we don't want anymore)
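For context, the raw T-SQL for that sequence looks roughly like this (the partition function, scheme, table names and boundary values are placeholders):

-- 1. Split: add a new boundary so fresh data gets its own partition
ALTER PARTITION SCHEME ps_EventsByDay NEXT USED [PRIMARY];
ALTER PARTITION FUNCTION pf_EventsByDay() SPLIT RANGE ('20120401');

-- 2. Create: a staging table with the same columns/indexes, on the same filegroup
--    (this is the step being automated with SMO below)

-- 3. Switch: move the oldest partition's rows into the staging table
ALTER TABLE dbo.Events SWITCH PARTITION 1 TO dbo.Events_Stage;

-- 4. Merge: collapse the now-empty oldest boundary
ALTER PARTITION FUNCTION pf_EventsByDay() MERGE RANGE ('20120101');

-- 5. Drop: throw away the aged-out rows
DROP TABLE dbo.Events_Stage;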

My current problem arises at step 2, "Create." The precise column/index/etc. layout of the data being aged is not known/hardcoded into my program. I've been trying to use SMO to create a copy of an existing table into the same database with a new name, but I've been failing miserably. It may include a varbinary(max), e.g., and the "max" part doesn't seem to be exposed by Smo.Column.DataType.

Is there a way to dynamically create a copy of a table that already exists inside the same database, changing only the name and without bringing along the table's data?

Or should I smack my DBA upside the head and tell him to put in some extra temporary tables that I can just truncate instead of making me create them dynamically?

One technique is to use a Select Into:

Select * Into myNewTableName From myOldTableName Where 1=2 -- usually returns no records ;)

|||Sure, I could SELECT INTO, but then what's the point of SMO? Isn't the goal, here, to avoid manually building my own query?|||

The point of SMO is that it provides an object library for you to write code to make your management of SQL Server easier for you. If something can be easily accomplished with Transact-SQL, then why not use it? It's not about platform evangelism, it's about making your job easier.

That being said, this code will bring in the HumanResources.Employee table from the AdventureWorks database and replicate it into a new table called HumanResources.NewEmp:

' Assumes references to the SMO assemblies (Microsoft.SqlServer.Smo, Microsoft.SqlServer.ConnectionInfo)
Imports Microsoft.SqlServer.Management.Smo
Imports Microsoft.SqlServer.Management.Common

Dim srv As Server
Dim srvConn As ServerConnection
srv = New Server("MyServer")
srvConn = srv.ConnectionContext
srvConn.LoginSecure = True

Dim db As Database
db = srv.Databases("AdventureWorks")

Dim tblExisting As Table
Dim tblNew As Table

tblExisting = db.Tables.Item("Employee", "HumanResources")
tblNew = New Table(db, "NewEmp", "HumanResources")
Dim colExistColumn As ColumnCollection
Dim clmExist As Column
colExistColumn = tblExisting.Columns
For Each clmExist In colExistColumn
Dim clmNew As Column
clmNew = New Column(tblNew, clmExist.Name)
clmNew.DataType = clmExist.DataType
clmNew.Nullable = clmExist.Nullable
tblNew.Columns.Add(clmNew)
Next
tblNew.Create()

Good luck.

|||

Thanks for the info, Allen, but I've already gone down that road. Don't forget an index loop that goes something like:

foreach (Index idx in tblOld.Indexes){
String idxName = idx.Name + "_newTable"; // Or whatever, not important here
tblNew.Indexes.Add(new Index(tblNew, idxName));
tblNew.Indexes[idxName].IsClustered = idx.IsClustered;
tblNew.Indexes[idxName].IndexKeyType = idx.IndexKeyType;
tblNew.Indexes[idxName].IgnoreDuplicateKeys = idx.IgnoreDuplicateKeys;
tblNew.Indexes[idxName].IsUnique = idx.IsUnique;
foreach (IndexedColumn col in idx.IndexedColumns){
tblNew.Indexes[idxName].IndexedColumns.Add(new IndexedColumn(tblNew.Indexes[idxName], col.Name));
}
}

Recall that the tables have to be sufficiently compatible for me to switch a partition from the original into the temporary table. Creating just a table with the same column types doesn't cut it. The code sample you provide (which is nearly identical to what I wrote in my own code as well) also won't deal properly with all column data types. Try that code on a table with a type of varbinary(max), e.g. The temporary table that's created will be of type varbinary(1).

Unhandled Exception: Microsoft.SqlServer.Management.Smo.FailedOperationException: Switch partition failed for Table 'dbo.tblOld'. > Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch. > System.Data.SqlClient.SqlException: ALTER TABLE SWITCH statement failed because column 'col' has data type varbinary(max) in source table 'DbName.dbo.tblOld' which is different from its type varbinary(1) in target table 'DbName.dbo.tblNew'.

Bang. I'm dead. Do I have to create more explicit special-cased logic to handle var* types? That'd be silly.

If I have to jump through that many hoops to perform a copy operation, it tells me that copying via this technique isn't supported by the SMO libraries. I'm essentially trying to perform a manual deep-copy via a public interface. Perhaps I need to subclass the table type? Create a class "CopyableTable"? I don't think the protected methods/members will provide sufficient support for what I want either, though.

If something can be easily accomplished with T-SQL, why not use it? Company policies that have been put into place to mitigate the possibility of SQL injection would be one reason. SQL strings in the source code raise big red flags, and SMO should be able to handle this use case.

|||

Sorry, Greg, I forgot to mention that in SMO you set the column to use varbinary(MAX) by setting the column.DataType.MaximumLength = -1

I'll be doing a demo of that in my presentation at PASS in two weeks. Your point on company policies is well taken, and I had a hunch the issue was something along those lines. I don't know why the MaximumLength property isn't carried forward to the new table - it's probably a bug, but knowing how the MAX datatype is internally coded (and I haven't found documentation about this, I just discovered it by poking around) can help resolve the problems you're facing.

Hope that helps.

|||

Thanks, Allen. I haven't tried that, so I'll give it a go and see what happens.

I agree, it's weird that MaximumLength isn't carried through.

|||

First try:

colNew.DataType.MaximumLength = colOld.DataType.MaximumLength;

Failed.

Second try:

if(colOld.DataType.MaximumLength == -1){
colNew.DataType.MaximumLength = -1;
}

Failed.

Third try:

if(colOld.DataType.MaximumLength == -1){
colNew.DataType.MaximumLength = 5;
}

Failed to set it to 5. The created table still has a varbinary length of 1. At this point I revisited the DataType interface, suspecting that these var* types must be special somehow, and I noticed some promising possibilities.

Fourth try:

if(colOld.DataType.Name == "varbinary"){
if(colOld.DataType.MaximumLength == -1)
colNew.DataType = DataType.VarBinaryMax;
else
colNew.DataType = DataType.VarBinary(colOld.DataType.MaximumLength);
}

No luck. Grrrr. I really expected that last one to work. If I replace VarBinaryMax with VarBinary(5), the tblNew.Create() method will kick out a varbinary(5). Which brings us back to your suggestion:

if(colOld.DataType.Name == "varbinary"){
if(colOld.DataType.MaximumLength == -1)
colNew.DataType = DataType.VarBinary(-1);
else
colNew.DataType = DataType.VarBinary(colOld.DataType.MaximumLength);
}

Which works. http://forums.microsoft.com/TechNet/ShowPost.aspx?PostID=662296&SiteID=17 had similar info.

Thanks again, Allen.

|||

Good technique.

I also need to copy the dependencies of a stored procedure.

Do you have any ideas?


Friday, February 17, 2012

Copy table and/or records from SQL Server 2000 to 2005 Express

How can I copy a table from my sql server 2000 db to my sql server 2005 express edition?

I have a project in VS.NET 2005 and I have a db in the App_Data folder. However, when I look into that folder, there is nothing visible. I now need to copy a table from my existing SQL Server 2000 database to my db located in my project's App_Data folder.

Any help would be appreciated..

Regards,

You can use DTS to do that.

Also, use SQL Server Management Studio (the equivalent of Enterprise Manager in SQL 2000) to manage SQL Server 2005:

http://www.microsoft.com/downloads/details.aspx?FamilyID=82afbd59-57a4-455e-a2d6-1d4c98d40f6e&DisplayLang=en
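If you would rather do it in T-SQL, one option (assuming a linked server, here called SQL2000SRV, has been set up on the 2005 Express instance) is:

-- Copies the columns and rows only; constraints, keys and identity are not carried over
SELECT *
INTO dbo.MyTable
FROM SQL2000SRV.SourceDb.dbo.MyTable;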

Regards

Copy Table and Data from one database to another

I am working on a migration project.

I'm doing the standard processing: taking the source data into a staging database, where I then create the new target tables, transform the source data into the new target table structure, and load the data.

However, having created the new target tables in my staging database, I cannot accurately migrate these tables into the new database.

An example table.

The following is the script created by SQL Management Studio if I right click and script the table as CREATE.

CREATE TABLE [dbo].[tbPRO_Package](
[PRO_PackageID] [int] IDENTITY(1,1) NOT NULL,
[CreatedDate] [datetime] NOT NULL CONSTRAINT [DF_tbPRO_Package_CreatedDate] DEFAULT (getdate()),
[CreatedBy] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_tbPRO_Package_CreatedBy] DEFAULT (suser_sname()),
[ModifiedDate] [datetime] NOT NULL CONSTRAINT [DF_tbPRO_Package_ModifiedDate] DEFAULT (getdate()),
[ModifiedBy] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_tbPRO_Package_ModifiedBy] DEFAULT (suser_sname()),
[GUID] [uniqueidentifier] NOT NULL CONSTRAINT [DF_tbPRO_Package_GUID] DEFAULT (newid()),
CONSTRAINT [PK_tbPackage_ID] PRIMARY KEY CLUSTERED
(
[PRO_PackageID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

Note that the field PRO_PackageID has an identity, and is a primary key.

Also note the constraints on several fields and the default values.

If I try to move this table and data using the SQL Server Import and Export Wizard, telling it to drop and create the table, and to keep identity, it does not correctly generate the table definition - it generates the following script

CREATE TABLE [Migration Staging].[dbo].[tbPRO_Package] (
[PRO_PackageID] int NOT NULL,
[PackageCode] varchar(8) NOT NULL,
[LKU_VatRateCode] nchar(10),
[CancellationCharge] decimal(18,2),
[CancellationMultiplier] decimal(18,2),
[CreatedDate] datetime NOT NULL,
[CreatedBy] varchar(50) NOT NULL,
[ModifiedDate] datetime NOT NULL,
[ModifiedBy] varchar(50) NOT NULL,
[GUID] uniqueidentifier NOT NULL
)

We've lost the identity, the PK, the constraints and the defaults.

If I use the SSIS Transfer SQL Server Objects Task, then the primary key is kept, but the identity is not, nor are the constraints or defaults.

Am I missing something? Is it actually POSSIBLE to accurately move tables between databases?

"If I use the SSIS Transfer SQL Server Objects Task, then the primary key is kept, but the identity is not, nor are the constraints or defaults."

This behavior is by design to maintain backward compatibility with the Copy SQL Server Objects task in SQL Server 2000. Only two constraints (Primary key and foreign key constraints) are expected to be copied by Transfer Objects task.

Sorry for the inconvenience.

If transferring the entire database (as opposed to transferring only the tables) is okay, you can use "Transfer Database" task.

If you want to transfer only tables and preserve the table definition, I can think of two ways it can be done using SMO. You can implement C# code using SMO API to copy the tables or you can implement the same (in VB) inside a script task in SSIS. Both of these approaches should be pretty easy.

Hope this helps.|||so - it is not possible to accurately move a table and its data from one database to another, without resorting to writing code?

DESPITE there being two SQL tools that say they do it?

What is the purpose of these SQL tools then? For those occasions where you want to move tables and data inaccurately?

You have stated this response before, in my post here.

Another poster asked in this thread:

MehtaR wrote:

Kaarthik,

I am not sure why you say "This behavior is by design

to maintain backward compatibility with the Copy SQL Server Objects

task in SQL Server 2000."
In SQL 2000 you can use DTS to transfer tables from one database to another preserving the primary as well as identity columns.
Rahul

edit: added smiley to ensure sarcasm detectors are activated |||I have two issues with this:

1) It's *not* transferring foreign key constraints. That is what it's complaining about.

2) The "transfer database" task is a joke becuase it doesn't transfer the entire database, just the bits and pieces it chooses to transfer.

All I need is a simple, exact, bit for bit transfer from a 2000 to a 2K5 DB. This appears to be an impossibility in SSIS.|||

You can use Redgate SQL Packager to do what you are doing, since I have recently worked on a similar project. You can download their trial version to play around with it, I think. SQL Packager will allow you to pick and choose the tables or any objects in your source db, and then package it. Then all you have to do is move the package (which is an exe file) to the server you want to migrate it to. When you execute the package, it will cleanly create the db with the chosen tables. If you are doing an upgrade, choose the destination db when preparing the package, and it will create an upgrade package that again moves data and/or tables (any that you choose). Thanks

MA

|||exactly!

Surely a means to do this is a fundamental requirement of a product such as SSIS?

I don't think this is too much to expect.|||I have several Redgate tools, such as Packager, SQL Compare and Data Compare, and they are fantastically useful - I honestly couldn't do my job without SQL Compare now.

But - we shouldn't need a third-party tool to do such a basic task.|||There is a "wizard" that supposedly does this. I've not yet tested it, and the way it works is quite bass-ackwards. It also only works for 2K/2K5 --> 2K5. What you have to do is go to your 2K5 instance, right-click on the desired DB, go to Tasks, and down at the very bottom you'll see "Copy Database". This theoretically creates an SSIS package which I am hoping I can then tweak if necessary.

What info I found on this is here:
http://blogs.msdn.com/euanga/archive/2006/07/18/668916.aspx
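For what it's worth, once the target table has been scripted out with its keys, defaults and identity (for example via Generate Scripts), the rows themselves can be moved while keeping the identity values with plain T-SQL. A sketch using the table above (the database names are placeholders):

SET IDENTITY_INSERT TargetDb.dbo.tbPRO_Package ON;

INSERT INTO TargetDb.dbo.tbPRO_Package
    (PRO_PackageID, CreatedDate, CreatedBy, ModifiedDate, ModifiedBy, [GUID])
SELECT PRO_PackageID, CreatedDate, CreatedBy, ModifiedDate, ModifiedBy, [GUID]
FROM StagingDb.dbo.tbPRO_Package;

SET IDENTITY_INSERT TargetDb.dbo.tbPRO_Package OFF;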
