I've had a really rough time lately getting data out of legacy systems based on MS SQL Server. In each case I got only the .bak backup file and had to do the rest myself. The whole thing is a bit easier if you run everything on Windows, but since I did the development on a Mac, I'll cover how to connect to MS SQL Server from macOS as well.
Restore database from backup
You will need a Windows machine with MS SQL Server running (of course :-). Luckily, you can download it for free from
here. Next, connect to the database server using the command-line utility osql (you will need to have it in your PATH):
cd C:\Program Files\Microsoft SQL Server\MSSQL\Binn
osql -E
You may need to create a new user:
use master
go
EXEC sp_addlogin 'peter', 'pass45'
go
Now let's have a look at the backup file (put it in some easy location so you don't have to type out the path):
RESTORE FILELISTONLY FROM DISK='c:\AMS.bak'
This query tells us the logical names of the data and log files, which are needed to restore the database to a new path. What you get is basically the original location of the data and log files. In my case I got:
AMS_Data D:\Program Files\Microsoft SQL Server 2000\MSSQL\data\AMS_Data.MDF
AMS_Log D:\Program Files\Microsoft SQL Server 2000\MSSQL\data\AMS_Log.LDF
This means the original installation was on drive D:\. As my temporary server is on C:, I have to change the file locations during the restore:
RESTORE DATABASE ams
FROM DISK='c:\AMS.bak'
WITH
MOVE 'AMS_Data' to 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\ams.mdf',
MOVE 'AMS_Log' to 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\ams_log.ldf'
go
The only important thing here is that the 'AMS_Data' and 'AMS_Log' names match the ones from the previous query. You should now see that your database has been restored. All that's left is to grant our user access to it:
use ams
go
EXEC sp_grantdbaccess 'peter'
go
EXEC sp_addrolemember 'db_owner', 'peter'
go
There's more info on how to use osql here.
Now we should be set to connect to the database from the development machine. That was quite a number of steps just to restore a database, compared to createdb tmp; psql tmp < dump.sql in Postgres.
Connecting to MS SQL from macOS
We will need to install and set up a couple of things:
sudo port install rb-dbi +dbd_odbc
sudo port install freetds
Then set up the connection to the database in
/opt/local/etc/freetds/freetds.conf
by adding your server to the end of the file:
[Ams]
host = 192.168.11.106
port = 1433
tds version = 8.0
where host is the IP of your Windows machine. After this step you should already be able to connect to MS SQL Server:
tsql -S Ams -U peter -P pass45
You should get the database server prompt (just like osql on the Windows machine). You can try some commands like:
use ams
go
select * from contacts
go
Now set up the ODBC connection. Go to Applications -> Utilities -> ODBC Administrator.
1) add a driver with the following settings:
Description: TDS
Driver File: /opt/local/lib/libtdsodbc.so
Setup File: /opt/local/lib/libtdsodbc.so
Define as: System
2) add a User DSN with Name: Ams and Description: old AMS database server.
Add two keyword/value placeholders. To edit one, click on it and press TAB. Set them to the following values:
ServerName: Ams (or whatever you set in freetds.conf)
Database: ams (or whatever your database name)
Now you should be able to test the connection with iodbctest. Note that I am using sudo for this, as without it the test fails with: Data source name not found and no default driver specified. Driver could not be loaded (0) SQLSTATE=IM002.
sudo iodbctest "dsn=Ams;uid=USERNAME;pwd=PASSWORD"
You should now be in an interactive terminal again.
Once all this works, connecting from Ruby is really easy:
require 'rubygems'
require 'dbi'
DBI.connect('DBI:ODBC:ams', 'USERNAME', 'PASSWORD')
Rescuing the data
This is just a short Ruby script I use to extract all the data from MS SQL and import it into PostgreSQL. It's not a proper database conversion - it just gets the data out into something that's easier to work with and doesn't require a running Windows machine. It may have issues with binary fields - it worked on most of them but choked on a couple. DBI actually provides more metadata (NOT NULL attributes, primary keys, etc.), so the script could generate a more precise copy of the original database, but this was enough for what I needed. You may run into an unknown data type - especially if you import into a different database engine - in which case all you need to do is update the update_type method to return the correct type mapping.
require 'rubygems'
require 'dbi'
require 'pg'

# Quote a value for use in a Postgres INSERT statement
def escape(text)
  return "NULL" if text.nil?
  "'#{PGconn.escape("#{text}")}'"
end
# Map MS SQL column types to their Postgres equivalents
def update_type(col)
  type = (col["type_name"] || 'timestamptz').downcase
  case type
  when 'clob'
    'varchar'
  when 'varbinary', 'long varbinary'
    'oid'
  when 'double'
    'double precision'
  when 'tinyint'
    'smallint'
  when 'char'
    "char(#{col['precision']})"
  else
    type
  end
end
dbh = DBI.connect("DBI:ODBC:ams", "peter", "pass45")

# Find all user tables
sth = dbh.prepare("select name from sysobjects where type = 'U' ORDER BY name;")
sth.execute
tables = sth.collect { |row| row[0] }

tables.each do |table|
  sth = dbh.prepare("select * from #{table}")
  sth.execute

  # Build the CREATE TABLE statement from the column metadata
  create = "CREATE TABLE #{table}(\n"
  create << sth.column_info.collect { |col| "\"#{col['name'].downcase}\" #{update_type(col)}" }.join(",\n")
  create << ");\n\n"
  puts create

  # Append an INSERT statement for every row
  sth.each do |row|
    columns = sth.column_info.collect { |col| "\"#{col['name'].downcase}\"" }.join(', ')
    values  = sth.column_info.collect { |col| escape(row[col['name']]) }.join(', ')
    create << "INSERT INTO #{table} (#{columns}) VALUES (#{values});\n"
  end
  create << "\n\n"

  File.open("data_ams.sql", "ab+") { |output| output.puts create }
end
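If you want to see the kind of SQL the script generates without a live MS SQL connection, here is a standalone sketch of the same idea. The column list is hand-built to stand in for DBI's column_info, and a simple quote-doubling escape replaces PGconn.escape, so it runs without any gems; the contacts table and its columns are made up for illustration:

```ruby
# Standalone sketch: same type-mapping idea as the script above,
# with a hand-built column list instead of DBI's column_info.

# Map a couple of MS SQL column types to Postgres types
def update_type(col)
  type = (col['type_name'] || 'timestamptz').downcase
  case type
  when 'tinyint' then 'smallint'
  when 'char'    then "char(#{col['precision']})"
  else type
  end
end

# pg-free stand-in for PGconn.escape: double any single quotes
def escape(text)
  return 'NULL' if text.nil?
  "'#{text.to_s.gsub("'", "''")}'"
end

# Hypothetical columns, shaped like DBI would report them
cols = [
  { 'name' => 'Id',   'type_name' => 'tinyint' },
  { 'name' => 'Name', 'type_name' => 'char', 'precision' => 40 },
]

create = "CREATE TABLE contacts(\n"
create << cols.collect { |col| "\"#{col['name'].downcase}\" #{update_type(col)}" }.join(",\n")
create << ");"
puts create
puts "INSERT INTO contacts (\"id\", \"name\") VALUES (#{escape(1)}, #{escape("O'Neil")});"
```

This prints a CREATE TABLE with "id" smallint and "name" char(40), followed by an INSERT with the apostrophe in O'Neil safely doubled.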
Importing the data into PostgreSQL is then as simple as:
createdb ams_backup
psql ams_backup < data_ams.sql