Friday, February 24, 2012

How to copy tables from local machine to remote SQL server

We currently have a PPTP connection set up for our developers to
access our development SQL server through a VPN tunnel. When they
need to copy tables up to the dev SQL from their local machine they
simply do a DTS copy.

However, we are now moving to a thin client solution where they will
be working on a terminal server. They will have access to the
development SQL servers and SQL tools such as EM and QA. However,
they will not have access to their local SQL server and, therefore,
will not be able to directly perform DTS copies. We have explored
several possibilities, such as exporting tables to a .csv or .mdb file
and then importing them on the development SQL server, but this is not
ideal because things are lost in that process (primary keys, field
names, data types, etc.).

My question is this: Is there a way to export and then import SQL
tables without losing dependent objects such as primary keys and data
types in the process? If any of you are working with a similar
situation I would really like to hear how your remote users copy
objects from their remote location to your SQL servers. Thanks!

Ryan

|||

I'm not sure how your developers will get their files to the server
(FTP?), but in any case they can easily create object definition
scripts from Enterprise Manager or Query Analyzer.
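
For example, the kind of script EM generates looks something like this
(the table and column names here are made up):

    -- Hypothetical example of a generated object definition script;
    -- note that the primary key, column names and data types are all
    -- preserved, which addresses the losses you saw with .csv/.mdb.
    CREATE TABLE dbo.Customers (
        CustomerID   int          NOT NULL,
        CustomerName varchar(50)  NOT NULL,
        CreatedDate  datetime     NOT NULL DEFAULT (getdate()),
        CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
    )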

Assuming they're maintaining their SQL code in a source control system,
they will be creating these scripts anyway. So they could recreate
(or perhaps just ALTER) a table with a script, then reload the data
from a .csv file. Or perhaps make a copy of the original table, modify
it, insert the data from the copy, and then drop the copy.
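
A minimal sketch of that copy-and-reload pattern (all object names
here are hypothetical):

    -- 1. Park the existing data in a copy. SELECT INTO copies data
    --    and column types, but not keys or constraints, so the copy
    --    is only a holding area.
    SELECT * INTO dbo.MyTable_copy FROM dbo.MyTable

    -- 2. Drop and recreate the table from the checked-in script.
    DROP TABLE dbo.MyTable
    CREATE TABLE dbo.MyTable (
        ID          int          NOT NULL CONSTRAINT PK_MyTable PRIMARY KEY,
        Description varchar(100) NULL
    )

    -- 3. Reload the parked data, then drop the holding table.
    INSERT INTO dbo.MyTable (ID, Description)
        SELECT ID, Description FROM dbo.MyTable_copy
    DROP TABLE dbo.MyTable_copy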

If this doesn't help, you might want to clarify what the issue is:
problems creating scripts, running them, etc.

Simon

|||

"Simon Hayes" wrote:
> I'm not sure how your developers will get their files to the
> server (FTP?), but in any case they can easily create object
> definition scripts from Enterprise Manager or Query Analyzer.
<snip>

Thanks for your reply.

The developers can copy files from their local hard drives via their
terminal server session. The problem is that they cannot use DTS to
copy SQL objects because there are firewalls between their local
machines and the terminal server.

It would be possible for them to script the table in EM and then use
bcp to insert the data, but I was hoping for a simpler solution with
fewer steps involved. Some of the developers are not so SQL savvy and
I'm afraid that we will get stuck supporting them. The extra time
spent on these activities will also give them an excuse to bill us for
more hours (they are contractors).
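
To illustrate the kind of step I'd rather not have to support, the
server-side load after scripting the table would be something like
this (the table name and path are made up, and BULK INSERT needs the
appropriate server-side permissions):

    -- Assumes the .csv was copied somewhere the SQL Server service
    -- account can read, with comma field and newline row delimiters.
    BULK INSERT dbo.MyTable
    FROM 'D:\uploads\MyTable.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')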

I have been testing the export of data to .csv and .mdb files and it
is not always straightforward to do. If anyone has experience with
this, your advice is very welcome. Thanks!

|||

Here's a workaround solution if you'd like.

1. Create a backup of the database you want to send to the SQL Server.
Let's say your DBName is Performer.
2. Restore as another database...Performer1
3. Delete the tables/other objects you don't want to move up.
4. Do another backup of Performer1...maybe into Performer1.bak
5. Upload it to Production.
6. Restore as Performer1.
7. DTS the tables over, or if you simply need to move the data, DTS only
the data to the existing tables (a T-SQL sketch of the backup/restore
steps follows below).

...and 8. Good luck :o)
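
A rough T-SQL sketch of steps 1 through 5 (the paths and logical file
names are examples only; check the real logical names with RESTORE
FILELISTONLY):

    -- Steps 1 and 2: back up the source database and restore it
    -- under a new name, relocating the files with WITH MOVE.
    BACKUP DATABASE Performer TO DISK = 'D:\backups\Performer.bak'

    RESTORE DATABASE Performer1
    FROM DISK = 'D:\backups\Performer.bak'
    WITH MOVE 'Performer_Data' TO 'D:\data\Performer1.mdf',
         MOVE 'Performer_Log'  TO 'D:\data\Performer1_log.ldf'

    -- Step 3: drop whatever should not travel (example table name).
    DROP TABLE dbo.ScratchTable

    -- Step 4: back up the trimmed copy, ready to upload.
    BACKUP DATABASE Performer1 TO DISK = 'D:\backups\Performer1.bak'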

Hope this helps,

-Umar Farooq.

|||

1. DTS out to disk the table(s) you need (using DTS for this is much less
prone to error than BCP)
2. Upload files to production
3. DTS back in (once again, DTS is easier than BCP)

To avoid foreign key conflicts you need to DTS any parent data in before
child data (if you need to move new tables you will need to script them,
copy the script over, and recreate them at the other end before loading
the data).
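
If arranging the load order is awkward, a common alternative (plain
T-SQL rather than a DTS feature, and the table name here is made up)
is to switch off the FK checks on the child table around the load:

    -- NOCHECK skips FK validation during the load; the WITH CHECK
    -- CHECK form re-validates all the data afterwards.
    ALTER TABLE dbo.OrderLines NOCHECK CONSTRAINT ALL

    -- ...load the data here (DTS / BULK INSERT)...

    ALTER TABLE dbo.OrderLines WITH CHECK CHECK CONSTRAINT ALL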

You can make the task completely automatic by doing steps 1 and 2 in a DTS
package locally, then doing step 3 on production: poll for the files, only
load them when they are there, and then delete them (or move them) when the
import is complete.

This will mean that your contractors do nothing, or just initiate an
automated process.
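
As an illustration of the polling step (the path is made up, and
xp_fileexist is an undocumented extended proc, so treat this purely
as a sketch):

    -- Poll for an uploaded file before kicking off the import.
    CREATE TABLE #f (FileExists int, IsDirectory int, ParentDirExists int)
    INSERT #f EXEC master.dbo.xp_fileexist 'D:\uploads\MyTable.csv'

    IF EXISTS (SELECT * FROM #f WHERE FileExists = 1)
    BEGIN
        -- run the import here, then delete or archive the file
        PRINT 'File found - starting import'
    END
    DROP TABLE #f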

Hope this helps
Julian

"Umar Farooq" <UmarAlFarooq@.gmail.com> wrote in message
news:1112753008.047280.36140@.o13g2000cwo.googlegro ups.com...
> Here's a workaround solution if you'd like.
> 1. Create a backup of the database you want to send to the SQL Server.
> Let's say your DBName is Performer.
> 2. Restore as another database...Performer1
> 3. Delete the tables/other objects you don't want to move up.
> 4. Do another backup of Performer1...maybe into Performer1.bak
> 5. Upload it to Production.
> 6. Restore as Performer1.
> 7. DTS the tables over or if you simply need to move the data, DTS only
> the data to the existing tables.
> ...and 8. Good luck :o)
> Hope this helps,
> -Umar Farooq.
