I have to copy a large number (3000) of different tables from an Oracle machine to a
SQL Server machine.
I am able to do this using a (VB) script.
I currently use several methods:
1) INSERT INTO TABLE1 SELECT * FROM SID1..DB.TABLE1 (SID1 is a linked server)
2) INSERT INTO TABLE1 SELECT * FROM OPENQUERY(SID1,'SELECT * FROM DB.TABLE1')
3) The OPENROWSET method (similar to method 2)
For small tables this is fine, but for BIG tables (15M rows / 150 columns) the methods above are too slow.
If I compare the same copy action with a simple DTS package, the DTS is 3 times faster.
Also, DTS seems to bulk copy the data directly into the target database, while the
methods above first fill tempdb, then the transaction log of the target database, and
only then the target table (which requires a great deal of extra space on the filesystem).
The total size of data is about 300GB.
Can anyone supply me with a simple example of how to copy data from an Oracle table into a
SQL Server table in script (or SQL) that is as fast as the DTS and does not fill my log files?
I have read about bcp (which I use for importing/exporting files) and the BULK INSERT command, but
I do not understand how to use them for this problem.
|||Your best bet is going to be to dump the data out to delimited files from Oracle and bcp them into SQL Server; see the sketch below...
I would say you're going to have at least date conversion issues...
|||I'm sorry. Did you say three THOUSAND tables?
Might I suggest that now might be a good time to redesign your application? A crappy database is still a crappy database, whether it resides on Oracle or SQL Server.
|||...the "crappy application" is a quite well-known ERP system... (FYI, even M$ Navision has over 2000 tables for a single company.)
It is a merge of over 30 different companies that recently went bankrupt; part of it will continue to exist, while the rest must be kept alive for the time being on another platform for the company's administrators.
Indeed, a dump (master.dbo.xp_cmdshell 'bcp "dbx.dbo.table1" in "\\OLS1\ddumps\Od1.dmp" -q -k -N -U sa -Pxyz') as Brett proposed runs somewhat (30%) faster (unfortunately Unicode is needed, hence -N), but not the factor of 3 that I can reach with a DTS.
The difference in speed (12 hours vs. 4 hours) determines whether I can do an extraction every night (which is a must) or just once a week.
|||Well, at least we agree upon its crappiness.
|||No other solution(s) in mind? I am really desperate... Does anyone have experience with DTS scripting, etc.?
|||Set your SQL Server database's recovery model option to "Simple" to reduce logging. Use DTS to transfer the data directly from your Oracle source, or use BCP to transfer it from a file.
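A minimal sketch of the recovery-model change, reusing the dbx database name from the bcp command above:

    -- Simple recovery truncates the log at each checkpoint,
    -- so bulk loads do not keep growing the log file
    ALTER DATABASE dbx SET RECOVERY SIMPLE

Switch back afterwards if you depend on log backups.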
Yes, BCP syntax is esoteric, but it is still the fastest method of loading data.
It seems to me that you are able to transfer the data, but are balking at individually coding all 3000 tables. I don't see any way around that unless you write a batch to loop through BCPing each table; a sketch follows below.
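One way to script such a loop in T-SQL, reusing the xp_cmdshell/bcp call, path, and flags from the earlier post (adjust names and credentials to taste):

    -- loop over every user table and shell out to bcp for each one
    DECLARE @tbl sysname, @cmd varchar(1000)
    DECLARE tbls CURSOR FOR
        SELECT name FROM sysobjects WHERE type = 'U'
    OPEN tbls
    FETCH NEXT FROM tbls INTO @tbl
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- build the same bcp command the earlier post used, per table
        SET @cmd = 'bcp "dbx.dbo.' + @tbl + '" in "\\OLS1\ddumps\' + @tbl + '.dmp" -q -k -N -U sa -Pxyz'
        EXEC master.dbo.xp_cmdshell @cmd
        FETCH NEXT FROM tbls INTO @tbl
    END
    CLOSE tbls
    DEALLOCATE tbls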
Maybe there is a 3rd party utility that will do this for you.