# World of Whatever

A blog about SQL Server, SSIS, C# and whatever else I happen to be dealing with in my professional life.

## Wednesday, November 27, 2013

### SSIS Copy all tables

This could have also been titled "holy crap, Biml is powerful" but that wouldn't do much for SEO. Instead, this post demonstrates how easy, how trivial it is to generate an SSIS package for every table in your database and copy all the data to a new server. Think about that. How often do you need to refresh N tables in a development environment from production? No, you can't do a restore because there's work in progress there; people just need some subset of data refreshed. This is yet another hammer for you to solve that problem with.

I have been working with Biml for a few months now but I haven't had a use case for pulling the scripting aspect into it. Today was that day, courtesy of this SO question: ssis best practice to load N tables from Source to Target Server

The following BimlScript will generate an SSIS package for every table in the source database and write all the source data to the matching table in the destination. For my example, I scripted out AdventureWorksDW2012 into a new copy and ran the following Biml.

### What's it doing?

I create two connection strings to point to my source and destination. As noted over on StackOverflow, you will need to change these to your source and destination servers as well as verify the Provider matches your version.

I then have a query that pulls in all the tables and their schemas, skipping any MS-shipped ones and sysdiagrams. That query gets executed and dumped into a DataTable. That's all the setup, really.

It's standard Biml: create the Connections collection, define our Packages collection and then begin iterating through our DataTable.

Each Package will have 3 Variables: SchemaName, TableName and QualifiedTableSchema. The first two are set by values from our DataTable, the third is an expression built using the first two.

We add a Tasks collection and, inside of it, a Data Flow with an OLE DB Source routed to an OLE DB Destination. I drive both of these by our Variable, QualifiedTableSchema.

<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ import namespace="System.IO" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<!--
<#
string connectionStringSource = @"Server=localhost\dev2012;Initial Catalog=AdventureWorksDW2012;Integrated Security=SSPI;Provider=SQLNCLI11.1";
string connectionStringDestination = @"Server=localhost\dev2012;Initial Catalog=AdventureWorksDW2012_DEST;Integrated Security=SSPI;Provider=SQLNCLI11.1";

string SrcTableQuery = @"
SELECT
    SCHEMA_NAME(T.schema_id) AS schemaName
,   T.name AS tableName
FROM
    sys.tables AS T
WHERE
    T.is_ms_shipped = 0
    AND T.name <> 'sysdiagrams';
";

DataTable dt = null;
dt = ExternalDataAccess.GetDataTable(connectionStringSource, SrcTableQuery);
#>
-->
<Connections>
<OleDbConnection
Name="SRC"
CreateInProject="false"
ConnectionString="<#=connectionStringSource#>"
RetainSameConnection="false">
</OleDbConnection>
<OleDbConnection
Name="DST"
CreateInProject="false"
ConnectionString="<#=connectionStringDestination#>"
RetainSameConnection="false">
</OleDbConnection>
</Connections>

<Packages>
<# foreach (DataRow dr in dt.Rows) { #>
<Package ConstraintMode="Linear"
Name="<#=dr[1].ToString()#>"
>
<Variables>
<Variable Name="SchemaName" DataType="String"><#=dr[0].ToString()#></Variable>
<Variable Name="TableName" DataType="String"><#=dr[1].ToString()#></Variable>
<Variable Name="QualifiedTableSchema"
DataType="String"
EvaluateAsExpression="true">"[" +  @[User::SchemaName] + "].[" +  @[User::TableName] + "]"</Variable>
</Variables>
<Dataflow
Name="DFT"
>
<Transformations>
<OleDbSource
Name="OLE_SRC <#=dr[0].ToString()#>_<#=dr[1].ToString()#>"
ConnectionName="SRC"
>
<TableFromVariableInput VariableName="User.QualifiedTableSchema"/>
</OleDbSource>
<OleDbDestination
Name="OLE_DST <#=dr[0].ToString()#>_<#=dr[1].ToString()#>"
ConnectionName="DST"
KeepIdentity="true"
TableLock="true"
KeepNulls="true"
>
<TableFromVariableOutput VariableName="User.QualifiedTableSchema" />
</OleDbDestination>
</Transformations>
</Dataflow>

</Package>
<# } #>
</Packages>
</Biml>

## Usage

1. Add a new Biml file to an existing SSIS project
2. Disable Visual Studio's auto-fix for XML files. See Overcoming BimlScript Copy and Paste Issues in Visual Studio
3. Paste this code into that BimlScript.biml file
4. Fix your source and destination connection strings (lines 8 & 9) to point to the correct servers, and change the Provider type if SQLNCLI11.1 is not correct for your version of SQL Server
5. Right-click on the Biml file and select "Generate SSIS Packages"

## Friday, November 22, 2013

### BIML Active Directory SSIS Data Source

Since posting Active Directory SSIS Data Source I've received a few requests for the package. And I always felt bad because there wasn't a nice way to say "change these N things in my copy and it'll work for you," much less supply a copy for 2005, 2008, 2012 and now 2014. Egads.

Fortunately, there's a new tool in my toolbelt called Biml and it's awesomesauce.

1. Install BIDS Helper
2. Open Visual Studio and create/open an Integration Services project.
3. Right-click on the project and select "Add New Biml File"
4. In your project's Miscellaneous folder, double-click the BimlScript.biml file and paste the following content into it, replacing the file's content.
5. Save
6. Edit line 6. Replace HOME.BILLFELLOWS.NET with the name of your domain controller (DC).
• To determine your DC, from a command prompt, type set | find /i "userdnsdomain"
• Sample results
J:\Users\bfellows>set | find /i "userdnsdomain"
USERDNSDOMAIN=HOME.BILLFELLOWS.NET

7. Edit line 46. Replace LDAP://DC=home,DC=billfellows,DC=net with the path to your domain controller. Follow the pattern and you should be fine.
8. Right-click on the BimlScript.biml file and choose "Generate SSIS Packages"
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<!-- Connection name is arbitrary; point Data Source at your domain controller -->
<AdoNetConnection
Name="CM_AD"
ConnectionString="Data Source=HOME.BILLFELLOWS.NET;Provider=ADsDSOObject;"
Provider="System.Data.OleDb.OleDbConnection, System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
/>
</Connections>
<Packages>
<Package
Name="ActiveDirectorySource"
ConstraintMode="Linear"
DelayValidation="true"
>
<Annotations>
<Annotation AnnotationType="Description">$header$</Annotation>
</Annotations>
<Connections>
</Connections>
<Variables>
<Variable Name="RowCountSource" DataType="Int32" Namespace="User">0</Variable>
</Variables>
<Dataflow
Name="DFT AD Users"
DelayValidation="true"
>
<Transformations>
<AdoNetSource
Name="ADO_SRC Active Directory"
ConnectionName="CM_AD"
>
<DirectInput>
SELECT
distinguishedName
,   mail
,   samaccountname
,   Name
,   objectSid
,   userAccountControl
,   givenName
,   middleName
,   sn
FROM
'LDAP://DC=home,DC=billfellows,DC=net'
WHERE
sAMAccountType = 805306368
ORDER BY
sAMAccountName ASC
</DirectInput>
</AdoNetSource>

<RowCount
Name="RC Source"
VariableName="User.RowCountSource"
/>

<DataConversion
Name="DC NTEXT to DT_WSTR"
>
<Columns>
<Column DataType="String"
Length="500"
SourceColumn="distinguishedName"
TargetColumn="distinguishedName"
IsUsed="true"
/>
<Column DataType="String"
Length="500"
SourceColumn="mail"
TargetColumn="mail"
IsUsed="true"
/>
<Column DataType="String"
Length="500"
SourceColumn="samaccountname"
TargetColumn="samaccountname"
IsUsed="true"
/>
<Column DataType="String"
Length="500"
SourceColumn="Name"
TargetColumn="Name"
IsUsed="true"
/>
<Column DataType="Int32"

SourceColumn="userAccountControl"
TargetColumn="userAccountControl"
IsUsed="true"
/>
<Column DataType="String"
Length="500"
SourceColumn="givenName"
TargetColumn="givenName"
IsUsed="true"
/>
<Column DataType="String"
Length="500"
SourceColumn="middleName"
TargetColumn="middleName"
IsUsed="true"
/>
<Column DataType="String"
Length="500"
SourceColumn="sn"
TargetColumn="sn"
IsUsed="true"
/>
</Columns>
</DataConversion>

<DerivedColumns Name="DER Check Account Status">
<Columns>
<Column DataType="Boolean" Name="IsActive" >(([userAccountControl] &amp; 2) == 2) ? false : true</Column>
</Columns>
</DerivedColumns>

<ConditionalSplit Name="CSPL Filter Inactive Accounts">
<OutputPaths>
<OutputPath Name="ActiveAccounts">
<Expression>IsActive</Expression>
</OutputPath>
</OutputPaths>
<InputPath OutputPathName="DER Check Account Status.Output"></InputPath>
<DataflowOverrides>
<OutputPath OutputPathName="Default"
PathAnnotation="SourceName"
></OutputPath>
</DataflowOverrides>
</ConditionalSplit>
<DerivedColumns Name="bit bucket Active">
<InputPath OutputPathName="CSPL Filter Inactive Accounts.ActiveAccounts"></InputPath>
</DerivedColumns>
<DerivedColumns Name="bit bucket InActive">
<InputPath OutputPathName="CSPL Filter Inactive Accounts.Default"></InputPath>
</DerivedColumns>
</Transformations>
</Dataflow>
</Package>
</Packages>
</Biml>

If everything went well, you'll end up with a package like this (minus the data viewers)

### Caveats

The original limitations still exist. The next post in this series will include the Script Task to generate group membership. The final installment will use a .NET script source as an alternative to the limitations of the query route.

## Thursday, November 7, 2013

On StackOverflow today, someone asked whether you could upload an image from an on-premises SQL Server to Windows Azure Blob Storage with SSIS. That's just the sort of crazy stuff I need to see if I can do, so off I went.

I must give great credit to the Azure team and in particular, their tech writers. I was able to crib heavily from their how to and had a working demo in fairly short order. Tying it into SSIS was a bit more work just due to the eccentricities of the platform but hey, it works!

I'm reposting my answer here in case I need to remember how to do this.

What a fun question this was! I got to thread together a lot of pieces that I had never tried.

I first built out a simple console app based on the fine manual over on HOW TO: Blob Storage. Knowing that I had working code allowed me to adapt it for SSIS.

I created 3 SSIS Variables at the Package level: AccountName, AccountKey and ContainerName. They are all of data type String. These provide the credentials plus the folder where my uploaded data will reside.

## Data Flow

The general look of your data flow is rather simple: a data source into a Script Component that will act as a Destination. You will need two columns: one provides a unique name for the blob and the other is the binary bits.

My source is a trivial table. It has country names and their flags (stored as varbinary(max)), which you yourself can scrape from the CIA World Factbook if you're so inclined.
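A minimal sketch of such a source table; the table and column names here are assumptions, not the actual ones from the post, but the shape is what the data flow expects (one naming column, one varbinary column):

```sql
-- Hypothetical source table for the flag-upload demo
CREATE TABLE dbo.CountryFlags
(
    CountryName nvarchar(100) NOT NULL PRIMARY KEY
,   FlagImage   varbinary(max) NOT NULL
);
```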

The Destination will be a bit of C#. Add a Script Component of type Destination.

On the Script tab, I have 3 ReadOnly Variables listed: User::AccountKey, User::AccountName, User::ContainerName

On the Input Columns tab, I select CountryName and FlagImage.

The script itself follows. As noted in the How To, you will need to add a reference to the Microsoft.WindowsAzure.Storage assembly before you can use the last 3 namespaces there.

using System;
using System.Data;
using System.IO;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

// Must add reference to Microsoft.WindowsAzure.Storage for this to work
// http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

/// <summary>
/// Watch me load data to Azure from SSIS
/// </summary>
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{

/// <summary>
/// The storage account used
/// </summary>
private CloudStorageAccount storageAccount;

/// <summary>
/// An entity to work with the Blobs
/// </summary>
private CloudBlobClient blobClient;

/// <summary>
/// Blobs live in containers
/// </summary>
private CloudBlobContainer container;

/// <summary>
/// blockBlob instead of a pageBlob
/// </summary>
private CloudBlockBlob blockBlob;

/// <summary>
/// This method is called once, before rows begin to be processed in the data flow.
///
/// You can remove this method if you don't need to do anything here.
/// </summary>
public override void PreExecute()
{
base.PreExecute();
string cs = string.Empty;
string csTemplate = string.Empty;
string accountName = string.Empty;
string accountKey = string.Empty;
string containerName = string.Empty;

accountName = Variables.AccountName;
accountKey = Variables.AccountKey;
containerName = Variables.ContainerName;
csTemplate = "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}";
cs = string.Format(csTemplate, accountName, accountKey);

this.storageAccount = CloudStorageAccount.Parse(cs);
this.blobClient = this.storageAccount.CreateCloudBlobClient();
this.container = this.blobClient.GetContainerReference(containerName);
this.container.CreateIfNotExists();
this.container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });

}

/// <summary>
/// For each row passing through, upload to Azure
/// </summary>
/// <param name="Row">The row that is currently passing through the component</param>
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
using (MemoryStream memStream = new MemoryStream(Row.FlagImage.GetBlobData(0, (int)Row.FlagImage.Length)))
{
// Name the blob after the country and push the image bytes up to Azure
this.blockBlob = this.container.GetBlockBlobReference(Row.CountryName);
this.blockBlob.UploadFromStream(memStream);
}
}

}


## Global Assembly Cache (GAC)

Assemblies you wish to use within SSIS must reside in the GAC. Assemblies cannot go into the GAC unless they are signed. Fortunately, the Azure assemblies are signed, so from a Visual Studio Command Prompt, type gacutil -if "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.1\ref\Microsoft.WindowsAzure.Storage.dll", adjusting the path to wherever your version of that assembly exists.

And as proof, here's a shot from Azure Storage Explorer

# Integration Services Catalog

The 2012 release of SQL Server brought a new capability for SSIS package management and I wanted to put together a comprehensive post on the management side of things. Getting the nomenclature sorted out can be rather helpful, so let's start there. Within SQL Server Management Studio (SSMS), connect to a 2012 instance. At the bottom, there should be an "Integration Services Catalogs" node. If it expands when you click the plus sign, then someone has configured the server to run Integration Services.

### 1. SSISDB

Despite the "Catalogs" part of the above label, there is only one allowable catalog for the 2012 release: the SSISDB. That is a physical catalog/database on the instance that contains all the bits and bobs for managing Integration Services from within SQL Server for those using the new Project Deployment Model.

And lo, the people rejoiced as Microsoft has stopped stuffing everything into msdb...

#### Right click options

• Active Operations - Lists all currently running SSIS packages with an option to Stop the selected package.
• Create Folder - Creates a folder with an optional description
• Start PowerShell - Self explanatory
• Reports - A variety of native reports about package executions.
• Delete - This deletes the SSISDB catalog. Doing this will result in having to recreate the catalog and a loss of all your history.
• Refresh - Self explanatory
• Properties - Contains information about the default logging settings and the project versioning.

### 2. Folder

A folder contains Project(s) and, optionally, Environment(s). You must create a folder; the deployment wizard will not automagically create one for you. You may call it whatever you like. You may have all of your SSIS projects under a single folder, have a 1:1 mapping between projects and folders, or anything in between. Folders are only one level deep: no nesting.
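Folder creation can be scripted instead of clicked; a sketch against the SSISDB catalog stored procedures, with the folder name assumed:

```sql
DECLARE @folder_id BIGINT;

-- Create a folder to hold projects and (optionally) environments
EXEC SSISDB.catalog.create_folder
    @folder_name = N'MyFolder'
,   @folder_id = @folder_id OUTPUT;
```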

#### Right click options

• Start PowerShell - Self explanatory
• Reports - All Executions, All Validations
• Delete - This deletes the folder, assuming there are no Projects or Environments below it.
• Refresh - Self explanatory
• Properties - Contains the name, version and a description.

### 3. Projects

This node is automatically created when a folder is created.

#### Right click options

• Deploy Project - Launches the Integration Services Deployment Wizard
• Import Packages - Integration Services Project Conversion Wizard
• Start PowerShell - Self explanatory
• Reports - Only Custom Reports are available
• Refresh - Self explanatory

### 4. An instance of a project

In this example, I had an SSIS project created called SSIS2012. I also had one called SQLSaturdaySSIS. This is the name of the associated .dtproj in Visual Studio. If you don't like this name in the Integration Services catalog, you cannot change it here; you must update your original solution, recompile it and deploy it back into the catalog.

#### Right click options

• Configure - This allows for the specification of values for Parameters and/or Connection Managers.
• Validate - Attempt to validate the deployment. Ensures all the metadata is valid, connections work, etc, without actually "running" the packages.
• Move - Relocate the project from one folder to another.
• Versions - The catalog now keeps track of previous revisions of your deployed project. This is not proper version control. Phil Brammer has a clever writeup about extracting old versions of projects.
• Export - This allows you to pull the .ispac out of SQL Server to disk
• Start PowerShell - Self explanatory
• Reports - Two native report options
• All Executions - Provide a history of all package executions, where all is however far back you have the catalog configured to retain history
• All Validations - Provide a history of all package validations.
• Delete - Deletes the project
• Refresh - Self explanatory
• Properties - Provides the Name, Identifier, Description, Project Version and Deployment Date. Name and Description come from the .ispac's @Project.manifest file. Identifier, Version and Deployment Date are all internal to the catalog's management. The Deployment Date can be handy, though: if things "suddenly" start acting up, you can quickly determine whether there might have been a code change. Emphasis on might, because deploying the same .ispac three times in a row will result in three different versions in history with no actual change.

### 5. Packages

This folder is automatically created when you deploy an .ispac into a folder.

#### Right click options

• Start PowerShell - Self explanatory
• Reports - Only has the option for Custom reports
• Refresh - Self explanatory

### 6. A package

#### Right click options

• Configure - This allows for the specification of values for Parameters and/or Connection Managers
• Execute - Brings up the dialog to prepare a package for execution.
• Validate - Attempt to validate the deployment. Ensures all the metadata is valid, connections work, etc, without actually "running" the packages.
• Start PowerShell - Self explanatory
• Reports - Two native report options
• All Executions - Provide a history of all package executions, where all is however far back you have the catalog configured to retain history
• All Validations - Provide a history of all package validations.
• Refresh - Self explanatory
• Properties - Exposes the Name, Identifier, whether it's an Entry Point package and the Description.
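The Execute dialog, too, has a scripted equivalent; a sketch using the catalog stored procedures, assuming a folder MyFolder, project SSIS2012 and package Package.dtsx:

```sql
DECLARE @execution_id BIGINT;

-- Stage an execution for the package
EXEC SSISDB.catalog.create_execution
    @folder_name = N'MyFolder'
,   @project_name = N'SSIS2012'
,   @package_name = N'Package.dtsx'
,   @use32bitruntime = 0
,   @execution_id = @execution_id OUTPUT;

-- Optional: log at the Basic level (object_type 50 = system parameter)
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id
,   @object_type = 50
,   @parameter_name = N'LOGGING_LEVEL'
,   @parameter_value = 1;

-- Fire it off
EXEC SSISDB.catalog.start_execution @execution_id;
```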

### 7. Environments

Environments are the successor to Configuration resources (XML, table, OS environment variable, registry value).

#### Right click options

• Create Environment - Create a new Environment under this folder. Requires a Name and an optional Description.
• Start PowerShell - Self explanatory
• Reports - Only has the option for Custom reports
• Refresh - Self explanatory

### 8. An Environment

FileSystem and Database are examples of two Environments I have created. You can define as many Environments as you wish. I typically see people naming them for the server class they are in (dev, test, load, stage, prod), but I don't think that makes as much sense as Environments based on the type of thing they configure (FileSystem, Database, etc.). Future blog material, though. For now, know they exist as a dictionary of values. You can mark values as Sensitive so you can store secure materials.
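That dictionary of values can be built without the GUI by calling the SSISDB catalog stored procedures directly; a sketch, with the folder, environment and variable names assumed:

```sql
-- Assumes a folder named 'MyFolder' already exists in the catalog
EXEC SSISDB.catalog.create_environment
    @folder_name = N'MyFolder'
,   @environment_name = N'Database'
,   @environment_description = N'Connection-related configuration values';

-- Add a variable to the environment; @sensitive = 1 would mask the value
EXEC SSISDB.catalog.create_environment_variable
    @folder_name = N'MyFolder'
,   @environment_name = N'Database'
,   @variable_name = N'ConnectionString'
,   @data_type = N'String'
,   @sensitive = 0
,   @value = N'Data Source=localhost\dev2012;Initial Catalog=AdventureWorksDW2012;Integrated Security=SSPI;'
,   @description = N'Source connection string';
```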

#### Right click options

• Move - Relocate the Environment from one folder to another. Any configurations built on that Environment do not have a cascading update applied.
• Start PowerShell - Self explanatory
• Reports - Only has the option for Custom reports
• Rename - Changes the name of the Environment.
• Delete - Deletes the Environment
• Refresh - Self explanatory
• Properties - 3 tabs. The first is the Name, ID and Description. The second is a list of the Name, Description, Type, Value and Sensitivity of each variable. The third describes the permissions associated with the Environment.

## GUI isn't everything

Even though there's a GUI for all this management, don't hesitate to click the Script button on any of these interfaces and look at the generated scripts. I might use the GUI to build out my Environments and configure a project to use them on our Dev server, but you can be assured I've clicked the Script button and saved the output to a version-controlled file. When it's time to migrate up the chain, I update server names and file paths and hit F5. You're welcome to hand-key those values per environment and make mistakes; while you're doing that, I'll be off checking StackOverflow.

# BimlScript overview

This is a follow-up of sorts to my posts on creating SSIS packages with the EzAPI. In these posts, I'll show how to create the same SSIS packages using Biml.

## Prerequisites

The prerequisites for using Biml to create SSIS packages are simple: Visual Studio (BIDS/SSDT) and the free BIDS Helper add-in.

## Getting started with Biml

Biml, the Business Intelligence Markup Language, is a DSL for creating BI artifacts. It's an interesting proposition: learn a third language to speak a second language more efficiently. Note that Biml covers both SSIS packages and Analysis Services, but for now I'm only covering how to build SSIS packages. The more I've been working with this, the more I think this dog might hunt...

Open Visual Studio/BIDS/SSDT-BI and create a new project of type "Integration Services Project".

These are the SQL Server 2012 templates in Visual Studio 2012, but the beautiful thing about Biml is that it's going to work with whatever version of BIDS Helper you've installed.

You will now have a BimlScript.biml file in your Miscellaneous folder. You may rename it if you wish; the name of a .biml file has no bearing on the output it generates.

Double click it and Visual Studio will open it and you'll be greeted with

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
</Biml>

To quote Kitty: Wow, that's really something Lex. It's freaking gone with the wind.

You'd be correct, this script won't do anything. Let's describe the Hello World equivalent of an SSIS package. We must create a Packages collection and inside that, add a Package node. Every Package you define will translate into a physical .dtsx file. Each Package node requires two attributes: Name and ConstraintMode.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Packages>
<Package
Name="SimpleBimlPackage"
ConstraintMode="Linear"
/>
</Packages>
</Biml>

A beautiful thing you should notice is the IntelliSense. Perhaps it's just my age, but a good implementation of an autocomplete system is invaluable. What I really like, especially since I'm just fumbling my way through Biml, is that after defining an attribute, it no longer shows in the autocomplete list.

At this point, we still have no package, just a description of what the project as a whole and the package itself should be. Let's correct that. Right click on the BimlScript.biml file and select "Check Biml for Errors".

After a moment or two, BIDSHelper should report back "No errors or warnings were found."

Once more, right click on "BimlScript.biml" but this time chose the "Generate SSIS Packages" option. After a moment and a flicker, you will notice you now have a real SSIS package.

Examine the package from head to toe and you'll find it's devoid of anything. We now have a mirror of the two packages created with EzAPI and the straight API calls. Given the beauty of tab completion, I'm going to award Biml the fewest keystrokes to generate a package.

Yup, looks like a package. One way you can differentiate a package generated by EzAPI from one built in BIDS/SSDT, Biml or the straight .NET API is the Description property. With the .NET API or Biml, the Description is only populated if you provide an explicit description.

# I'm in!

I might be a little stoked, so please forgive this post, but I've had a submission accepted for the 2013 PASS Summit. In the "BI Platform Architecture, Development & Administration" track, I'll be delivering a talk titled "Type more, click less: Programmatically build SSIS packages."

### Gratuitous browser abuse

Marquee tag: I'm speaking!

Blink tag: I'm presenting at Summit 2013

## Type more, click less: Programmatically build SSIS packages.

Why should you come to my session? Because you would love to automate some of the repetitive tasks you perform with SSIS.

You did know you didn't have to fire up BIDS/SSDT to make all your changes, right? In fact, did you know you can build complete packages without ever firing up BIDS? I wouldn't go that route unless I had a strong business case for doing so, but a little bit of up-front work can yield an N% complete SSIS package.

Specific technologies I'll cover are BIMLScript, EzAPI and the base .NET library.

After attending this session, I want you to know what your options are and when it makes sense to use one approach instead of the other.

## Monday, April 22, 2013

### File in use by another process

There's nothing quite so delightful as getting error messages that don't provide enough information to resolve the base problem. Case in point, I went to run an SSIS package which in 2012 requires a rebuild of the .ispac and received the following error message.

System.IO.IOException: The process cannot access the file 'C:\Src\DataWarehouse\bin\Development\DataWarehouse.ispac' because it is being used by another process.

What is the other process? I've seen people resort to a reboot because they can't figure out what process has its grubby finger on the file, but that is overkill. All you really need is the free utility Process Explorer from Microsoft/Sysinternals.

Download it (or simply run it from the website, though you'll want to download it and keep it as part of your troubleshooting kit) and once it pops up, under the Find menu, select "Find Handle or DLL..." (Ctrl-F).

In the resulting window, simply search for the file that is being reported as locked.

Here, I searched for the .ispac extension and discovered that a DtsDebugHost.exe still had the .ispac file open. In this specific example, that tells me I should check my Visual Studio instances, as that's the process that launches DtsDebugHost. That was my instinct; however, I had no instances of VS/SSDT/BIDS running, so I chalked it up to a zombie process.

You can then simply double-click on the line with the process and the underlying process list will highlight the offending PID (9388). In the resulting window, right-click on the process line and select Kill Process.
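If you prefer the command line, Sysinternals also ships handle.exe, which does the same hunt without the GUI; a sketch, assuming the tool is on your PATH and using this post's file and PID:

```bat
REM Find which process holds the .ispac open
handle.exe DataWarehouse.ispac

REM Kill the offender by PID once identified (9388 in this example)
taskkill /PID 9388 /F
```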