Showing posts with label AWS.

Sunday, November 5, 2017

SQL Server and Postgres Datatypes



SQL Server         Postgres
-----------------  ------------------------------
bigint             numeric(20,0)
binary             bytea
bit                numeric(1,0)
char               character(10)
date               date
datetime           timestamp without time zone
datetime2          timestamp(6) without time zone
datetimeoffset     timestamp(6) with time zone
decimal            numeric(18,0)
float              double precision
geography          character varying(8000)
geometry           character varying(8000)
hierarchyid        character varying(8000)
image              bytea
money              numeric(19,4)
nchar              character(10)
ntext              text
numeric            numeric(18,0)
nvarchar           character varying(10)
real               double precision
smalldatetime      timestamp without time zone
smallint           numeric(5,0)
smallmoney         numeric(10,4)
sql_variant        character varying(8000)
sysname            character varying(128)
text               text
time               time(6) without time zone
timestamp          character varying(8000)
tinyint            numeric(5,0)
uniqueidentifier   uuid
varbinary          bytea
varchar            character varying(10)
xml                xml
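For scripting a schema conversion, the table can be captured as a lookup dict. A minimal Python sketch (abridged to a few rows; note that lengths and precisions such as varchar(10) come from the sample schema above, not from a general rule):

```python
# Abridged lookup built from the mapping table above; extend as needed.
SQLSERVER_TO_POSTGRES = {
    "bigint": "numeric(20,0)",
    "datetime": "timestamp without time zone",
    "money": "numeric(19,4)",
    "nvarchar": "character varying(10)",  # length taken from the sample schema
    "uniqueidentifier": "uuid",
    "varbinary": "bytea",
    "xml": "xml",
}

def to_postgres(sqlserver_type):
    """Look up the Postgres equivalent of a SQL Server type name."""
    return SQLSERVER_TO_POSTGRES[sqlserver_type.lower()]
```

For example, to_postgres("MONEY") returns numeric(19,4); the lookup is case-insensitive on the SQL Server side.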

Monday, June 20, 2016

Migrate SQL Server Database to AWS (RDS) or Azure using SSIS

 

When backup and restore is not an option for migrating or copying a whole database, especially to managed database services such as AWS RDS or Azure, or to any environment that restricts restoring backups, you need an import/export mechanism to move the data.
The top two ways to migrate the data are SSIS (Export Wizard) and BCP. Choosing between them depends on whether you have direct access to the destination server (SSIS) or a disconnected server (BCP), where you copy the data dump and then insert the data at the destination.

Both techniques are familiar to most DBAs; here I am highlighting the problems I faced while working with them (SSIS/BCP).

SSIS

Export Wizard:
The Export Wizard has an option to save the SSIS package without executing it, which is a handy feature, but the saved package has some limitations that need manual intervention.

1. Identity Insert should be ON for every table with an identity property. It is a tedious manual process to identify which tables have this property and enable the option in each data flow.

2. Keep Nulls should be ON for tables with default values. For example, if a nullable date column has a default of getdate(), the export inserts dates instead of NULLs unless the data flow destination property FastLoadKeepNulls is checked.

3. The fast load commit batch size (the FastLoadMaxInsertCommitSize property) is not set by default; it has to be configured manually.

4. TABLOCK is not enabled by default.

5. ROWS_PER_BATCH is not configured by default
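The first three checks can be automated before touching the package by hand. A hypothetical Python sketch that scans a saved .dtsx (assuming the 2008-era layout of one property element per line) and reports fast-load properties still at their defaults; TABLOCK and ROWS_PER_BATCH live inside the FastLoadOptions string and would need a separate check:

```python
# Hypothetical pre-flight check for a saved Export Wizard package.
# Assumes the .dtsx XML puts each <property> element on its own line.
DEFAULTS = {
    "FastLoadKeepIdentity": ">false</property>",     # identity insert off
    "FastLoadKeepNulls": ">false</property>",        # defaults override NULLs
    "FastLoadMaxInsertCommitSize": ">0</property>",  # single commit at the end
}

def unset_fast_load_options(dtsx_xml):
    """Return names of fast-load properties left at their default values."""
    problems = []
    for line in dtsx_xml.splitlines():
        for name, default in DEFAULTS.items():
            if 'name="%s"' % name in line and line.rstrip().endswith(default):
                problems.append(name)
    return problems
```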

All five settings above have to be applied manually; without them the package will fail or perform slowly.
Think about a situation where you need to export 100 tables with millions of rows.
The VBScript below creates a new SSIS package with all of the above options applied.

How it works:
It replaces specific strings in the package source to enable the above configuration values.

Note: This has been tested on SQL Server 2008 EE. If it doesn't work in your environment, adjust the key strings for your release.
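For quick experimentation outside Windows Script Host, the same per-line string replacements can be sketched in Python (a hypothetical helper, again assuming the SQL Server 2008 layout of one property element per line; the 50000 values mirror the script below):

```python
def fix_fast_load_line(line):
    """Apply the same replacements as the VBScript to one line of dtsx XML."""
    if "FastLoadKeepIdentity" in line or "FastLoadKeepNulls" in line:
        # Turn identity insert / keep nulls on.
        return line.replace('expressionType="None">false</property>',
                            'expressionType="None">true</property>')
    if "FastLoadMaxInsertCommitSize" in line:
        # Commit every 50,000 rows instead of one commit at the end.
        return line.replace('expressionType="None">0</property>',
                            'expressionType="None">50000</property>')
    if "FastLoadOptions" in line:
        # Overwrite the options list with TABLOCK and ROWS_PER_BATCH,
        # discarding whatever options followed.
        new = 'expressionType="None">TABLOCK,ROWS_PER_BATCH = 50000</property>'
        head = line.split('expressionType="None">', 1)[0]
        return head + new
    return line
```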

Steps:

1. Run the Export Data wizard and save the package to C:\TEMP\northwind.dtsx

2. Copy the VBScript code below into C:\TEMP\SetSSIS.VBS

3. Edit the VBScript and set filePath to the path of the saved package

4. Run the script: cscript C:\TEMP\SetSSIS.VBS

5. It creates a new package next to the original, named <PKG_NAME>.dtsx.new.dtsx

6. Open the new package in BIDS

7. Run the package

[Screenshots: OLE DB destination properties before and after running the script (clip_image002, clip_image004)]

' Rewrites a package saved by the Export Wizard (.dtsx) so that identity
' insert, keep nulls, the commit batch size, TABLOCK and ROWS_PER_BATCH
' are all enabled. Writes the result to <package>.new.dtsx.
Const ForReading = 1
Const ForWriting = 2

Set objFSO = CreateObject("Scripting.FileSystemObject")
filePath = "C:\temp\northwind.dtsx"   ' path to the saved package

Dim NewrowsPerBatchString
NewrowsPerBatchString = "TABLOCK,ROWS_PER_BATCH = 50000</property>"

Set myFile = objFSO.OpenTextFile(filePath, ForReading, True)
Set myTemp = objFSO.OpenTextFile(filePath & ".new.dtsx", ForWriting, True)

' Key strings identifying the <property> lines to change. They must match
' the package XML verbatim (including the "destinaton" typo shipped in the
' product descriptions); these were taken from a SQL Server 2008 package.
Dim sIdentity
sIdentity =   "name=""FastLoadKeepIdentity"" dataType=""System.Boolean"" state=""default"" isArray=""false"" description=""Indicates whether the values supplied for identity columns will be copied to the destination. If false, values for identity columns will be auto-generated at the destination. Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">false</property>"
sKeepNulls = "name=""FastLoadKeepNulls"" dataType=""System.Boolean"" state=""default"" isArray=""false"" description=""Indicates whether the columns containing null will have null inserted in the destination. If false, columns containing null will have their default values inserted at the destinaton. Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">false</property>"
sFastLoadOptionsBatchCommit = "name=""FastLoadMaxInsertCommitSize"" dataType=""System.Int32"" state=""default"" isArray=""false"" description=""Specifies when commits are issued during data insertion.  A value of 0 specifies that one commit will be issued at the end of data insertion.  Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">0</property>"
sFastLoadOptions = "name=""FastLoadOptions"" dataType=""System.String"" state=""default"" isArray=""false"" description=""Specifies options to be used with fast load.  Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">"

Dim x
Do While Not myFile.AtEndOfStream
    myLine = myFile.ReadLine
    ' 1. Turn identity insert on.
    If InStr(myLine, sIdentity) Then
        myLine = Replace(myLine, "expressionType=""None"">false</property>", "expressionType=""None"">true</property>")
    End If
    ' 2. Keep NULLs instead of applying column defaults.
    If InStr(myLine, sKeepNulls) Then
        myLine = Replace(myLine, "expressionType=""None"">false</property>", "expressionType=""None"">true</property>")
    End If
    ' 3. Commit every 50000 rows instead of once at the end.
    If InStr(myLine, sFastLoadOptionsBatchCommit) Then
        myLine = Replace(myLine, "expressionType=""None"">0</property>", "expressionType=""None"">50000</property>")
    End If
    ' 4/5. Overwrite the fast load options with TABLOCK and ROWS_PER_BATCH,
    ' truncating whatever options followed.
    If InStr(myLine, sFastLoadOptions) Then
        myLine = Replace(myLine, "expressionType=""None"">", "expressionType=""None"">" & NewrowsPerBatchString)
        x = InStrRev(myLine, NewrowsPerBatchString)
        myLine = Left(myLine, x + (Len(NewrowsPerBatchString) - 1))
    End If
    myTemp.WriteLine myLine
Loop

myFile.Close
myTemp.Close
' Optionally replace the original package with the rewritten one:
'objFSO.DeleteFile(filePath)
'objFSO.MoveFile filePath & ".new.dtsx", filePath

BCP

Check the post below, which uses BCP to migrate the data.
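The linked post generates BCP scripts with a stored procedure. As a rough illustration of the idea only, a hypothetical Python sketch that emits an export/import command pair per table; the server names, dump directory, dbo schema, and batch-size/hint choices are placeholders:

```python
# Hypothetical generator of BCP command pairs for a disconnected migration:
# export each table to a native-format dump on the source, then import it
# on the destination with identity values kept (-E), batched commits (-b),
# and a table lock hint, mirroring the SSIS settings discussed above.
def bcp_commands(database, tables, source_server, dest_server,
                 dump_dir=r"C:\dump"):
    """Yield (export_cmd, import_cmd) pairs, one per table."""
    for table in tables:
        datafile = r"%s\%s.dat" % (dump_dir, table)
        out_cmd = 'bcp "%s.dbo.%s" out "%s" -n -S %s -T' % (
            database, table, datafile, source_server)
        in_cmd = ('bcp "%s.dbo.%s" in "%s" -n -S %s -T '
                  '-E -b 50000 -h "TABLOCK"') % (
            database, table, datafile, dest_server)
        yield out_cmd, in_cmd
```

The commands can be written to a .bat file on the source, and the data files copied to the destination before running the in commands there.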

Sunday, January 5, 2014


SQL Server AWS Migration BCP

Stored procedure to generate BCP scripts to migrate the SQL Server database. Developed this stored procedure on my labs to simulate t...
