Saturday, 16 February 2013

Database Migration Automated

Part 1:-
  • Configure the Distributor and Publisher, and enable the database to synchronize from backup (so I would not need to generate a full snapshot to initialize; the business requirement was to do this task with the least overhead on the live systems)
  • Create an empty publication with no articles (again, I could have added all the articles, but I chose not to, so that I keep control over which articles to replicate across)
  • Back up the database to be migrated across
  • Verify that the backup taken above is valid
  • Copy the backup across the network to the destination server (scripted Robocopy with multi-threading (/MT:100); flows like a breeze on a dedicated network)
  • Restore the database on the destination server
  • Create a push subscription
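The backup, verify, and restore steps above can be sketched in T-SQL roughly as follows. The database name, backup paths, and logical file names are placeholders for illustration, not the ones used in the actual migration:

```sql
-- On the source server: back up with checksums so the verify step is meaningful.
BACKUP DATABASE SalesDB
    TO DISK = N'D:\Backups\SalesDB.bak'
    WITH CHECKSUM, COMPRESSION, INIT;

-- Verify the backup is valid before copying it across the network.
RESTORE VERIFYONLY
    FROM DISK = N'D:\Backups\SalesDB.bak'
    WITH CHECKSUM;

-- On the destination server, after the Robocopy step: restore with RECOVERY,
-- since the database must be online before the subscription is created.
RESTORE DATABASE SalesDB
    FROM DISK = N'E:\Staging\SalesDB.bak'
    WITH RECOVERY,
         MOVE N'SalesDB_Data' TO N'E:\Data\SalesDB.mdf',
         MOVE N'SalesDB_Log'  TO N'F:\Log\SalesDB.ldf';
```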
The script creates a log file for each task, organized in a folder structure: a folder per ServerName, a subfolder per DatabaseName, and a .txt log file with the results of each step.
The script then prompts the user to check the log file to make sure there are no errors, and to press any key to continue the migration.
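The replication side of Part 1 boils down to a handful of replication stored procedure calls. A minimal sketch, assuming a publisher database SalesDB, a publication SalesDB_Pub, and a subscriber DESTSERVER (all placeholder names):

```sql
-- On the publisher: enable the database for publishing and create an empty
-- publication that allows subscriptions to be initialized from a backup.
EXEC sp_replicationdboption
    @dbname = N'SalesDB', @optname = N'publish', @value = N'true';

EXEC sp_addpublication
    @publication = N'SalesDB_Pub',
    @allow_initialize_from_backup = N'true',
    @immediate_sync = N'true',      -- required for initialize-from-backup
    @repl_freq = N'continuous';

-- Articles are added selectively later with sp_addarticle;
-- nothing is replicated until then.

-- After the backup is restored on the subscriber, create the push
-- subscription, pointing at the backup file used to initialize it.
EXEC sp_addsubscription
    @publication = N'SalesDB_Pub',
    @subscriber = N'DESTSERVER',
    @destination_db = N'SalesDB',
    @subscription_type = N'push',
    @sync_type = N'initialize with backup',
    @backupdevicetype = N'disk',
    @backupdevicename = N'E:\Staging\SalesDB.bak';
```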

Part 2:-
  • Copy logins with their SIDs across to the destination server (I did this using an SSIS package; I could have used sp_help_revlogin instead)
  • Copy Agent jobs and SSIS packages across (again, done using an SSIS package)
  • Copy linked servers across
  • Automated log shipping setup
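For the logins step, a simplified alternative to sp_help_revlogin is to script SQL logins straight out of sys.sql_logins, preserving each login's SID and password hash so the database users in the restored copy are not orphaned. This sketch covers SQL logins only (Windows logins need no password, just the name):

```sql
-- Generate CREATE LOGIN statements on the source server; run the output
-- on the destination. Style 1 in CONVERT produces the 0x... hex literal.
SELECT 'CREATE LOGIN ' + QUOTENAME(name)
     + ' WITH PASSWORD = ' + CONVERT(varchar(514), password_hash, 1) + ' HASHED'
     + ', SID = ' + CONVERT(varchar(514), sid, 1)
     + ', DEFAULT_DATABASE = ' + QUOTENAME(default_database_name) + ';'
FROM sys.sql_logins
WHERE name NOT LIKE '##%';   -- skip internal certificate-based logins
```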
Issues faced:
I faced no unusual issues, other than what you would normally encounter when setting up transactional replication initialized from a backup. We could have avoided these issues had there been enough time to analyze the databases before actually working on the migration, but I was put on the task with no lead time to investigate.
A subset of the issues I faced:
  • Errors with timestamp columns when initializing from a backup. The business requirement was not to change any database schema, so converting the timestamp columns to varbinary was ruled out. I got lucky: the databases with this issue were all small, ranging up to 8 GB, so I could easily initialize them by delivering a full snapshot without any concerns.
  • Tables without primary keys; replication, of course, only works for tables with primary keys defined on them. I prepared a script that gave me a row count for each table in every user database on both the source and destination servers I was migrating. I pulled this data into Excel and prepared a pivot table, and there it was: very easy to see which tables were out of sync. Thanks to Redgate Data Compare and an SSIS package, I could sync these tables without any issues. The trick is to select all the columns for comparison.
  • Ever-changing business requirements
  • Application teams were busy working towards their release and were importing BLOB data into the databases I was replicating, in preparation for the migration cutover. I had to increase max text repl size to let the application import the BLOB data.
  • A couple of log-shipped databases on the source were included in the list for this pre-migration replication activity; it was later decided to set up log shipping separately for these two databases. Replication would not let me remove the subscription and publication because the subscriber was a log-shipped, read-only database, so I had to remove replication manually on the publisher and the subscriber individually.
  • SQL Server Evaluation Edition had been installed on the destination servers, so I had to upgrade it to Enterprise Edition, and had to manage the SQL Server active-passive Windows cluster.
  • I had to move the distribution database to another drive at some point during the pre-migration process, due to lack of disk space on the servers.
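The row-count comparison described above can be approximated from partition metadata, which avoids scanning the tables. Run a query like this against each user database on both servers, then pivot the combined output in Excel. This is a sketch of the approach, not the exact script I used:

```sql
-- Row count per table in the current database, from sys.partitions
-- (index_id 0 = heap, 1 = clustered index, so each table counts once).
SELECT
    DB_NAME()   AS database_name,
    s.name      AS schema_name,
    t.name      AS table_name,
    SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.schemas AS s
    ON s.schema_id = t.schema_id
JOIN sys.partitions AS p
    ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)
GROUP BY s.name, t.name
ORDER BY s.name, t.name;
```

Partition row counts are metadata and can drift very slightly from exact counts, which is fine for spotting out-of-sync tables; the final sync was verified with Redgate Data Compare anyway.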
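Two of the fixes above map to standard commands. A hedged sketch, using placeholder names (SalesDB, SalesDB_Pub, SRCSERVER); the exact cleanup needed can vary with the replication topology:

```sql
-- Raise the limit on replicated text/image/BLOB column data
-- (default is 65536 bytes; -1 means no limit).
EXEC sp_configure 'max text repl size', -1;
RECONFIGURE;

-- Manually dismantling replication when the normal drop path is blocked,
-- e.g. because the subscriber is a read-only log-shipped database.
-- On the publisher:
EXEC sp_dropsubscription
    @publication = N'SalesDB_Pub', @article = N'all', @subscriber = N'all';
EXEC sp_droppublication @publication = N'SalesDB_Pub';
EXEC sp_replicationdboption
    @dbname = N'SalesDB', @optname = N'publish', @value = N'false';

-- On the subscriber, once it is writable again, clean up the leftover
-- subscription metadata:
EXEC sp_subscription_cleanup
    @publisher = N'SRCSERVER',
    @publisher_db = N'SalesDB',
    @publication = N'SalesDB_Pub';
```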
