Incremental Uploads:
We received a request to build an SSIS package that performs incremental uploads between two SQL Server instances hosted at two different locations.
Requirement:
There are two databases, each hosted on a different instance.
An SSIS package is required to accomplish the tasks below.
- Sync data for the table Employee from Source to Destination
- Compare records based on a key value
- If any new records are found, insert them into the destination
- Update all existing records from Source to destination
Now we will see how to design a package for an incremental load in SSIS, with an example.
Environment:
Source: SQL_Instance: XXXXXX; Database: Source
Destination: SQL_Instance: XXXXXX; Database: Destination
Table information:
CREATE TABLE Source_Employee (
ID INT IDENTITY NOT NULL PRIMARY KEY,
[First_Name] VARCHAR(50) NOT NULL,
[Last_Name] VARCHAR(50),
[SSL] VARCHAR(18),
[DLNO] VARCHAR(25),
[UpdatedOn] DATETIME NULL,
[CreatedOn] DATETIME NOT NULL DEFAULT(GETDATE())
)
GO
INSERT INTO Source_Employee([First_Name],[Last_Name],[SSL],[DLNO])
SELECT 'Jason','Mag','SA-MYk9989001','DL-SA0545678'
UNION
SELECT 'Carry','Uyon','WC-KAP9989001','DL-WC0545887'
UNION
SELECT 'Chrish','Lott','AT-LKU8788954','DL-AT059675'
UNION
SELECT 'Kourav','Mishra','NY-NYU5669877','DL-NY0073987'
GO
SELECT * FROM Source_Employee
Connect to Source Instance:
Connect to Destination Instance:
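The destination table is not scripted in the original post; a minimal sketch of “Dest_Employee”, assuming it simply mirrors the source schema (with ID as a plain primary key rather than an identity column, since the key values are copied over from the source), could look like this:
USE Destination
GO
-- Assumed definition: mirrors Source_Employee, ID is not an identity
CREATE TABLE Dest_Employee (
ID INT NOT NULL PRIMARY KEY,
[First_Name] VARCHAR(50) NOT NULL,
[Last_Name] VARCHAR(50),
[SSL] VARCHAR(18),
[DLNO] VARCHAR(25),
[UpdatedOn] DATETIME NULL,
[CreatedOn] DATETIME NOT NULL DEFAULT(GETDATE())
)
GO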
Now that the tables are ready, we need to start building the SSIS package.
- Open SQL Server 2008 R2 Business Intelligence Development Studio (BIDS)
- Create a new SSIS package and name it as “Incremental Uploads.dtsx” as below
- Add a new dataflow task and name it as “Data_Flow_Sync_SourceNDestination”
- Create two OLE DB connection managers, one for the source and one for the destination. Since these are test instances, I am using the SA account in both connection managers.
- Open the data flow task. Add an “OLE DB Source”, map it to the source connection manager, and select the table name “Source_Employee”.
- Now the data flow task looks like below:
- Add a “Lookup” transformation and connect the “OLE DB Source” output to it.
- Double click on the “Lookup” component. Choose the “Connection type” as “OLE DB connection manager” and set “Specify how to handle rows with no matching entries” to “Ignore failure”.
- Go to the next tab, “Connection”. Select the destination connection manager as the OLE DB connection manager and the table as “Dest_Employee”.
- Go to the next tab, “Columns”. Map the columns on which the lookup has to be performed; this acts as the join condition for the sync operation. Here we have to map “ID” from “Available Input Columns” to “Available Lookup Columns”, and check all columns under “Available Lookup Columns”.
- For clarity, update the “Output Alias” column names to “Out_ID”, “Out_First_Name”, “Out_Last_Name”, etc., as below.
- Go to the “Error Output” tab and choose “Ignore Failure” in the “Error” column
- Click on OK. Now the data flow task looks like below.
- Add a new “Conditional Split” transformation and connect it from the “Lookup” transformation.
- While connecting from “Lookup”, choose “Lookup Match Output”
- Now the data flow task looks like below
- Double click on the Conditional Split transformation, give the “Output Name” as “New Rows”, and assign the condition ISNULL(Out_ID). This means there is no corresponding Out_ID at the destination, which in turn marks the row as new.
- Now add a condition to find the modified rows. Give the output name as “Updated”. Compare the source and destination columns with the || (OR) operator; this filters the rows where any of these columns has changed: (([First_Name] != [Out_First_Name]) || ([Last_Name] != [Out_Last_Name]) || ([SSL] != [Out_SSL]) || ([DLNO] != [Out_DLNO]) || ([UpdatedOn] != [Out_UpdatedOn]) || ([CreatedOn] != [Out_CreatedOn]))
- Change the default output name to “Unchanged Rows”. Rows that do not fall into either of the two conditions above are unchanged records.
- Now click on “Configure Error Output” and set “Ignore Failure” for errors on the “Updated” output. If the source and destination tables are already in sync, with no new records and no updates required, the package should still execute without failing; in that case it neither inserts nor updates anything.
- Click on OK and now the data flow task looks like below.
- Add an “OLE DB Destination” and connect it from the Conditional Split transformation. While connecting, choose “New Rows” as the output.
- Click on OK. Now open the “OLE DB Destination”, map it to the destination connection manager, and select the table name “Dest_Employee”.
- Go to the Mappings tab and map the columns as shown in the figure below.
- Click on OK; now the data flow looks like below:
- Add an “OLE DB Command” transformation to the data flow and name it “OLE DB Command_Update Changed Rows”.
- Connect the “OLE DB Command” from the Conditional Split transformation. While connecting, choose “Updated” as the output (the “New Rows” output already feeds the OLE DB Destination).
- Click on OK. Now open the OLE DB Command and select the destination connection manager.
- Go to the Component Properties tab and give the SQL command below.
UPDATE dbo.Dest_Employee
SET
First_Name= ?
,Last_Name= ?
,SSL= ?
,DLNO= ?
,UpdatedOn= ?
,CreatedOn= ?
WHERE ID = ?
- Go to the next tab, “Column Mappings”, and map the columns according to the parameters in the UPDATE statement. In the statement, “ID” is the last parameter, hence we have to map it to the last “Available Destination Column”, as below.
- Click on OK. Now the package is ready and the data flow task looks like below.
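Before running it, it may help to see what the data flow does in set-based terms. Purely as an illustration (not part of the package), and assuming a hypothetical linked server named SRC_SERVER on the destination instance pointing to the source, the combined Lookup / Conditional Split / destination logic is roughly equivalent to the MERGE sketch below. The package additionally routes rows with no column changes to the “Unchanged Rows” output instead of re-updating them, which a plain WHEN MATCHED clause does not distinguish.
USE Destination
GO
-- Illustrative only: SRC_SERVER is an assumed linked server name pointing to the source instance
MERGE dbo.Dest_Employee AS D
USING SRC_SERVER.Source.dbo.Source_Employee AS S
ON D.ID = S.ID                              -- lookup on the key column
WHEN MATCHED THEN                           -- corresponds to the "Updated" path
UPDATE SET D.First_Name = S.First_Name,
           D.Last_Name  = S.Last_Name,
           D.SSL        = S.SSL,
           D.DLNO       = S.DLNO,
           D.UpdatedOn  = S.UpdatedOn,
           D.CreatedOn  = S.CreatedOn
WHEN NOT MATCHED BY TARGET THEN             -- corresponds to the "New Rows" path
INSERT (ID, First_Name, Last_Name, SSL, DLNO, UpdatedOn, CreatedOn)
VALUES (S.ID, S.First_Name, S.Last_Name, S.SSL, S.DLNO, S.UpdatedOn, S.CreatedOn);
GO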
- Now execute the package. Remember there are no rows at the destination yet, so initially it inserts all rows from the source into the destination.
- Now check the destination table
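A simple way to check it is to query the destination table directly (the table name Dest_Employee comes from the lookup configuration above):
USE Destination
GO
SELECT * FROM Dest_Employee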
- You can see that all rows have been loaded into destination from source.
- Now insert 3 rows and update two existing rows at source and then run the package again.
USE Source
GO
INSERT INTO Source_Employee (First_Name, Last_Name, SSL, DLNO)
SELECT 'Chan','Yano','CH-PP89977345','DL-CH0587332'
UNION
SELECT 'Krishnan','Gopal','ID-IN8854687','DL-IN994532'
UNION
SELECT 'Krish','Manlon','KD-KP8814356','DL-ASJ9KI0112'
GO

USE Source
GO
UPDATE Source_Employee
SET First_Name = First_Name + ' Updated',
    Last_Name = Last_Name + ' Updated',
    UpdatedOn = GETDATE()
WHERE ID IN (2,4)
GO

USE Source
GO
SELECT * FROM Source_Employee
- After inserting the new records and updating two rows, the source table looks like:
- Now execute the package:
- Now you can see that there are 3 new rows inserted and 2 rows updated.
- Go to the destination table and check the table to make sure both are in sync.
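If you want to verify the sync from T-SQL rather than by eyeballing the two tables, a sketch like the following can be run on the destination instance, again assuming a hypothetical linked server named SRC_SERVER pointing to the source; an empty result means every source row exists with identical values at the destination.
USE Destination
GO
-- Illustrative only: SRC_SERVER is an assumed linked server name pointing to the source instance
SELECT ID, First_Name, Last_Name, SSL, DLNO, UpdatedOn, CreatedOn
FROM SRC_SERVER.Source.dbo.Source_Employee
EXCEPT
SELECT ID, First_Name, Last_Name, SSL, DLNO, UpdatedOn, CreatedOn
FROM dbo.Dest_Employee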