How to Migrate from SQL 2005 to 2014

On April 12, 2016, Microsoft will no longer offer extended support for SQL Server 2005, which means no more security updates or hotfixes.  The bigger reason to upgrade, however, is the fantastic performance improvement offered in SQL 2014.  SQL Server 2014 has been shown to be 13X faster than 2005, and that is before taking advantage of in-memory OLTP.  In addition there are features such as AlwaysOn Availability Groups, updateable columnstore indexes, T-SQL IntelliSense and backups directly to the cloud, just to name a few.  For a more complete listing read What’s New in SQL Server 2014 since SQL Server 2005.

Preparation is key.  Microsoft has provided a number of tools to make migrating your databases easier.  You’ll want to use these tools to assess and plan your migration well ahead of time.  Support for Windows Server 2003 is also ending soon, which may impact your decision making.  Part of your assessment will be to ensure your environment is suitable for SQL Server 2014 before migrating your databases. SQL Server 2014, of course, has minimum hardware and software requirements.   Microsoft recommends a minimum of 4GB of memory, to be increased as database size increases, and processor speed of 2.0GHz or faster. The OS depends on which version of SQL Server you are installing. 

There are two ways to migrate from SQL Server 2005 to SQL Server 2014.  You can do a clean install and restore from backup, or you can do an in-place upgrade.  The clean install may sound appealing from a Windows registry and system stability standpoint.  However, it means restoring all of your databases, along with any custom settings and properties you may have changed.  Over 10 years a lot of tweaks may have been made, and it could be very time consuming to restore them all.  It may even mean purchasing another OS license.

You can do an in-place upgrade to SQL Server 2014 from any version as far back as SQL Server 2005.  You cannot do an in-place upgrade from versions prior to that; you would need to upgrade to 2005 or 2008 first and then upgrade again to 2014.  You also need to consider the edition you are upgrading from (Standard, Developer, Enterprise, etc.) to determine the supported upgrade path.  With the SQL Server 2014 in-place upgrade you will be able to upgrade SSIS packages, SSAS and SSRS.
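Before mapping out the upgrade path, it helps to confirm exactly which version, service pack and edition you are starting from. A minimal check you can run on the instance you plan to upgrade:

-- Confirm the version, service pack level and edition of the current instance
SELECT
    SERVERPROPERTY('ProductVersion') AS ProductVersion,  -- e.g. 9.00.5000.00 indicates SQL Server 2005 SP4
    SERVERPROPERTY('ProductLevel')   AS ProductLevel,    -- e.g. SP4
    SERVERPROPERTY('Edition')        AS Edition          -- e.g. Standard Edition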

I highly recommend you install and run the Upgrade Advisor prior to doing the in-place upgrade.  This tool will analyze components of your previous version and identify issues you may need to address before upgrading to SQL Server 2014.  Upgrade Advisor can be installed remotely to analyze all supported components of SQL Server, except SSRS. The remote computer where you are installing Upgrade Advisor must meet the Upgrade Advisor prerequisites.  If you want to analyze SSRS you must install Upgrade Advisor on the report server itself.

You can install the Upgrade Advisor from the SQL Server setup, or download it from Microsoft.


In addition to the Upgrade Advisor, I found this great resource, Upgrading to SQL Server 2014: A Dozen Things to Check, written by Thomas LaRock. It is not specific to migrating from SQL Server 2005, but it has some great tips for preparing for a smooth SQL upgrade.

Once you have addressed any pre-installation issues, you are ready to perform the upgrade.  I won’t run through the step-by-step as there are many articles documenting this, such as this one.

There are many great reasons to upgrade your SQL Server, and with the end of extended support in sight, now seems like a good time.  Preparation for your upgrade is key, and Microsoft has provided lots of tools to make it easier.

Moving Your SQL Databases to Azure – Things to Know

This blog is original content written for the Microsoft MVP Award Program Blog, posted here.

SQL Azure is a compelling platform for data storage. It’s cheap, reliable, and highly accessible. However, it’s not identical to SQL Server on-premise, and there are a number of things to be aware of when considering a move from one platform to the other, or simply adopting it. This article will walk through what the various SQL Azure options are, outline the processes involved in moving data from on-prem to Azure, and call out some "gotchas", or things to be aware of when operating a SQL Azure database.

COMPARISON WITH ON-PREM

Azure SQL is SQL Server behind the scenes, so most of the functionality is already there, including tables, views, stored procedures, triggers, functions, primary and foreign keys and clustered indexes. Of course, there is no Windows authentication; it currently uses SQL authentication only.

SQL Azure allows for logical rather than physical administration. There is no need to maintain, balance, upgrade or patch the server, as this is all done by Microsoft. You have no control over the physical database and log files and cannot get to them. Therefore, you cannot detach and attach databases, as you cannot get to the physical drives. You also can’t reboot the server, so if you end up with a runaway query you may have to open a support ticket. Killing the session doesn’t always work.

Most tools work with SQL Azure database, including:

· .NET Framework via ADO.NET

· C / C++ via ODBC

· Java via Microsoft JDBC provider

· PHP via Microsoft PHP provider

· Frameworks such as OData, Entity Framework, WCF Data Services and NHibernate

Your application talks to the primary database, but there are actually three copies kept in sync at all times for high availability and disaster recovery. If your primary database should go down, one of the secondary copies will be promoted to primary, and a new third copy will be created.

Tables in a SQL Azure database are required to have a clustered index. This is necessary to keep the three copies of the database in sync.
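In practice this means every table should declare a clustered index or a clustered primary key; as I understand the behavior, you can create a heap, but inserts into it will fail until a clustered index is added. A minimal sketch, using a hypothetical table:

-- A clustered primary key satisfies the clustered index requirement
CREATE TABLE dbo.Orders
(
    OrderID   int           NOT NULL,
    OrderDate datetime      NOT NULL,
    Amount    numeric(19,5) NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID)
)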

The maximum SQL Azure database size is currently 500GB, but you can get around this using SQL federations and partitioning your data across multiple nodes.

There are a number of partially supported and unsupported features. A few of the ones I run into regularly are:

· You cannot use the USE [databasename] SQL statement. You must switch between databases by connecting to each one directly from your application.

· Remove NOT FOR REPLICATION from your scripted indexes and constraints

· Remove the index options and filegroup specification from your scripted tables – WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] – as shown in the sketch below
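As an illustration, here is a hypothetical index scripted from an on-premise database, followed by the SQL Azure-friendly version with those clauses stripped out:

-- Scripted from on-premise SQL Server: includes clauses SQL Azure rejects
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate ON dbo.Orders (OrderDate)
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
      ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

-- SQL Azure version: drop the WITH options and the ON [PRIMARY] filegroup clause
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate ON dbo.Orders (OrderDate)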

You can review a full list of unsupported features here: https://azure.microsoft.com/en-us/documentation/articles/sql-database-transact-sql-information/

TOOLS

SQL Azure databases do use Transact-SQL. There are a number of ways to connect to the database, including the Windows Azure Management Portal, SSMS (2008 R2 or newer), or command line utilities such as SQLCMD and BCP. You can move data to and from your SQL Azure database using anything that can connect to the database, such as SSIS or SSMS. When it comes to migrating your database to SQL Azure there is a great article on what to consider, here: https://azure.microsoft.com/en-us/documentation/articles/sql-database-cloud-migrate/

When I migrate a database from SQL Server to SQL Azure, I typically follow this process using SSMS:

· Create a blank database on the SQL Azure database server

· Generate the scripts from the original database to create the database objects, excluding users

· Do a find and replace to remove any unsupported features such as the two mentioned above

· Run the create database object scripts against the new SQL Azure database

· Create the users and apply permissions for the new database

· Use SSMS or SSIS to copy the data over to the new database.

The SQL Database Management Portal is a web-based, scaled down version of SSMS. You can create objects, and run queries and execution plans. But there is no GUI for some of the security features, like creating users and logins. I find it a friendlier experience to create the database server in the portal, and do everything else using SSMS.

BACKUP

SQL Azure databases are protected by an automatic backup system. The length of time the backups are retained depends on what tier you buy – 7 days for Basic, 14 days for Standard and 35 days for Premium. Point-in-time restore is a self-service feature, and you get all of that protection at no additional cost; if you do use it, you simply pay regular rates for the new database that gets restored.

SECURITY

You are in complete control of the IP-specific access to SQL Azure Database, at both the server AND database level. No one has access by default. At the server level you go into the Management Portal and add your IP in order to be able to connect remotely to the database. Or, if you prefer a more granular level of security, you can add a firewall rule at the database level to allow specific IPs to connect. Yes, every time your IP changes, you have to update your firewall rules.
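You can also manage the server-level rules in T-SQL rather than the portal. A small sketch using sp_set_firewall_rule (run it against the master database; the rule name and IP range here are made up):

-- Run in master: allow a range of IPs at the server level
EXECUTE sp_set_firewall_rule N'OfficeNetwork', '100.100.1.1', '100.100.1.255'
-- Review the existing server-level rules
SELECT * FROM sys.firewall_rules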

SERVICE TIERS AND PERFORMANCE LEVELS

There are three tiers, with several levels of performance within them. I will summarize the Microsoft definitions.

· Basic: Best suited for a small database, typically supporting a single active operation at a given time.

· Standard: The go-to option for most cloud applications, supporting multiple concurrent queries.

· Premium: Designed for high transactional volume, supporting a large number of concurrent users and requiring the highest level of business continuity capabilities.

Within the tiers are various performance levels. The definition of a performance level includes something called a Database Throughput Unit (DTU). DTUs provide a way to describe the relative capacity of a performance level of Basic, Standard, and Premium databases. DTUs are based on a blended measure of CPU, memory, reads, and writes. As DTUs increase, the power offered by the performance level increases. For example, a performance level with 5 DTUs has five times more power than a performance level with 1 DTU. There is more involved, such as maximum concurrent requests, logins and sessions, but at a high level these are the performance levels from which you can choose:

[Image: performance levels available within each service tier]

You’ll want to monitor and play with the levels to find the optimum for your needs. There is a Monitor tab for each database in the Azure Portal where you can review the performance. You can easily change the performance level on the fly. If you are anticipating heavier use, you can up the performance level for a period of time, and then drop it back down when the heavy use period is over. You only pay for the performance while it is selected.
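Changing the level doesn’t require the portal, either; you can do it with a single T-SQL statement. A sketch (the database name and target level are just examples):

-- Scale an existing database to Standard S2; the change happens online
ALTER DATABASE MyDatabase MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2')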

COSTS

The pricing varies greatly depending on how many databases you have, how long they have been up, how big they are, what performance level you select, etc. Microsoft advertises no upfront cost, no termination fees, pay only for what you use, and pay by the hour.

Costs can range anywhere from $7 per month for the Basic tier, $19 – $183 per month for a 250GB database in the Standard tier, to $566 to $8500 per month in the Premium tier.

HOW TO PROVISION A SQL AZURE DATABASE

Create a Windows Azure Platform account

If you don’t already have one, you’ll need to create a Windows Azure Platform account. There is a one month free trial on offer. This will give you access to all the Azure services including VMs, SQL databases, websites, AD, etc. To sign up go to http://azure.microsoft.com/. Sign in to the Microsoft Azure Portal at https://manage.windowsazure.com.

Create a Database Server

When you are ready to create a SQL Azure database, you must first add a server to your subscription on which the database will reside in the cloud. Azure will provision it to the least busy server available in the cloud. The health of the physical servers is monitored and maintained by Microsoft; you don’t manage these servers yourself. You don’t need to wait for I.T. to provision a server for you, which could take weeks. Simply sign in, create a server and the admin login credentials, and start adding databases. You can have a server and databases set up in a matter of minutes. To automate server provisioning you can also use PowerShell or REST API calls directly.

Below is an image of the Microsoft Azure Portal. Notice that we are looking at the SQL Databases, Servers tab. You can add a new server by clicking the add button at the bottom of the screen.

[Image: Microsoft Azure Portal, SQL Databases Servers tab]

· Log in to Windows Azure Management Portal.

· Create a SQL database server.

· Create admin login credentials for that server. This will be used like the SA account.

· Choose the subscription to which you want to add the server, and choose the region of the data center where you would like the server to reside.

· You do not get to choose the name of the server. It’s randomly generated.

· Decide if you will "Allow Windows Azure Services"; the default is yes. If you were to create a Windows Azure website or cloud service, those services would then have access to the SQL Azure database server and databases. You can edit this setting on the server Configure tab if you change your mind later.

· Click OK. The server will be created in a matter of seconds.

Add Firewall Rules at Server Level

Once you have created the server, you will grab its name, and will likely want to connect to it using SSMS or some other tool. You will connect using the URL yourservername.database.windows.net. Your connection will fail until you add your IP address to the firewall rules.

To add firewall rules and enable service access:

· Click on the database server you created.

· Select the Configure tab.

· You will see your current IP address near the top of the screen.

· Add a rule to allow the IP address range that will be connecting to the database and give it a name.

[Image: server Configure tab showing the firewall rules]

Now you can connect via the automatically generated FQDN (xxx.database.windows.net), which initially contains only the master database.

Create a Database

When you create a new database there are a number of settings from which to choose, including the service tier and performance level.

Note: There is an option to select WEB or BUSINESS edition, but this will be retired in September 2015.

Edition: Web (select 1GB or 5GB) or Business (select 10GB to 150GB)

If you create a database in SSMS, it will automatically be created as a 1GB Web edition unless you specify otherwise. To create a database in SSMS you can run this SQL script:

CREATE DATABASE <yourdatabasename> (EDITION = 'Business', MAXSIZE = 150 GB)

· Key in your database name

· Choose which subscription you are adding this database to

· Choose the Service Tier

· Choose the Performance Level

· Choose the Collation

· Choose on which database server you want the database to reside

[Image: new SQL database settings]

Add Firewall Rules at Database Level:

In addition to the server level, you can add firewall rules at the database level using the stored procedure sp_set_database_firewall_rule.

Run this SQL command against your database:

EXEC sp_set_database_firewall_rule N'MyRule', '100.100.1.1','100.100.1.1'

To look up existing database firewall rules use this query:

SELECT * FROM sys.database_firewall_rules
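And to remove a rule you no longer need (using the rule name created above):

-- Drop a database-level firewall rule by name
EXEC sp_delete_database_firewall_rule N'MyRule'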

There are a number of differences and adjustments you may need to make to move to SQL Azure, from unsupported SQL features to managing firewall security. If these are not an issue for you, there is great opportunity to leverage the ease, scalability and reliability of SQL Azure databases. You pay only for what you use and someone else manages the server maintenance, upgrades and backups. It’s a sweet deal.

Creating a Proxy User to run an SSIS package in SQL Server Agent

There have been a number of times over the years when I have had to create a Proxy user in SQL Server in order to provide needed access to connections and locations being used in an SSIS package.  Sometimes the SQL Server Agent login account simply doesn’t and shouldn’t have the required permissions. 

Before using a Proxy account, do check to see if the permissions issue isn’t just that the SQL Server Agent login account has been set up as ‘Local System’.  If that is the case, see if you can change it to a domain account specifically created for SQL Server Agent purposes.  Check Administrative Tools -> Services on the server where the SSIS SQL Server resides to see what login account the SQL Server Agent is mapped to.

The user mapped to the SQL Server Agent service account will need read/write permissions.  If you do need to create a new domain login for the SQL Server Agent, in SSMS go to Server -> Security (not database security) -> Logins -> right-click New Login -> Search -> Locations button -> Entire Directory -> select main domain -> OK -> type the SQL Agent username -> Check Names button -> OK -> Server Roles -> sysadmin -> OK.
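If you prefer a script to a click path, here is a minimal T-SQL sketch of the same thing (the domain and login name are placeholders):

-- Create a Windows login for the SQL Server Agent service account
CREATE LOGIN [DOMAIN\SqlAgentSvc] FROM WINDOWS
GO
-- Add the login to the sysadmin server role
EXEC sp_addsrvrolemember @loginame = N'DOMAIN\SqlAgentSvc', @rolename = N'sysadmin'
GO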

If it turns out that you need to create an SSIS proxy user, edit this script to use the correct username and password and run it to create the proxy user.

USE master 
GO
-- Create a proxy credential for xp_cmdshell.
EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\username', 'password';
--SELECT * FROM [master].[sys].[credentials]
-- Grant execute permission on xp_cmdshell to the SQL Server login account. 
GRANT exec ON sys.xp_cmdshell TO [DOMAIN\username] 
GO

-- Create a credential containing the domain account DOMAIN\username and its password
CREATE CREDENTIAL MyCredential WITH IDENTITY = N'DOMAIN\username', SECRET = N'password'
GO
USE [msdb]
GO
-- Create a new proxy called MyProxy and assign the MyCredential credential to it
EXEC msdb.dbo.sp_add_proxy @proxy_name=N'MyProxy',@credential_name=N'MyCredential',@enabled=1
-- Grant MyProxy access to the "SSIS package execution" subsystem (subsystem_id 11)
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name=N'MyProxy', @subsystem_id=11
-- Grant the login DOMAIN\username permission to use MyProxy
EXEC msdb.dbo.sp_grant_login_to_proxy @login_name = N'DOMAIN\username', @proxy_name=N'MyProxy'
GO

You will be able to see the proxy user in SSMS under SQL Server Agent.  It is in the SSIS Package Execution section because we added it to the SSIS subsystem in our code.

[Image: the proxy listed under SQL Server Agent, Proxies, SSIS Package Execution]

Now when you create your SQL Server Agent job you can choose to run the SSIS package as your proxy user with the required permissions, rather than the SQL Server Agent account.

[Image: SQL Server Agent job step set to run as the proxy user]
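If you script your jobs instead of using the dialog, sp_add_jobstep accepts the proxy name directly. A sketch (the job name, step name and package path are placeholders):

USE msdb
GO
-- Add a job step that runs an SSIS package under the proxy instead of the Agent account
EXEC dbo.sp_add_jobstep
    @job_name   = N'MyNightlyLoad',
    @step_name  = N'Run SSIS package',
    @subsystem  = N'SSIS',
    @command    = N'/FILE "C:\Packages\MyPackage.dtsx"',
    @proxy_name = N'MyProxy'
GO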

Querying XML in SQL

I like to use my blog as a library for myself, to be able to easily find solutions I have used before.  Here is a simple example of querying an XML string using SQL. 

In this example I have inserted one XML string into a temporary table, with two items called pages, and two references of type ‘book’.  The resulting select statements will pull any and all pages from the XML string, and any ‘book’ references.  There can be multiple types of references with varying fields, which you can query in the same way, simply by altering the text in the .nodes in the FROM statement.

DECLARE @xml as XML
DECLARE @tblXML AS TABLE (Response xml)
INSERT INTO @tblXML
SELECT '<response>
  <pages>
    <page>
      <id>1376745</id>
      <name>First Page</name>
     </page>
    <page>
      <id>1376746</id>
      <name>Second Page</name>
     </page>
  </pages>
  <references>
    <reference>
      <type>book</type>
      <id>425285</id>
      <name>How I met your mother</name>
      <activestatus>false</activestatus>
    </reference>
    <reference>
      <type>book</type>
      <id>425286</id>
      <name>Covered in Bees</name>
      <activestatus>true</activestatus>
    </reference>
  </references>
</response>' as Response


SELECT @xml = Response from @tblXML

SELECT
    ID       = t.item.value('(id)[1]', 'int'),
    BookName = t.item.value('(name)[1]', 'nvarchar(50)'),
    IsActive = CASE WHEN t.item.value('(activestatus)[1]', 'nvarchar(10)') = 'false'
                    THEN CAST(0 AS bit) ELSE CAST(1 AS bit) END
FROM @xml.nodes('/response/references/reference[type/text() = "book"]') AS t(item)

SELECT
    ID       = t.item.value('(id)[1]', 'int'),
    PageName = t.item.value('(name)[1]', 'nvarchar(50)')
FROM @xml.nodes('/response/pages/page') AS t(item)

You can also query a table directly, rather than storing the XML in a variable, by using CROSS APPLY like this:

SELECT
    ID       = t.item.value('(id)[1]', 'int'),
    PageName = t.item.value('(name)[1]', 'nvarchar(50)')
FROM @tblXML x
CROSS APPLY x.Response.nodes('/response/pages/page') AS t(item)
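If you only need to test for the presence of a node rather than shred it into columns, the XML exist() method works in the same spirit. A small sketch against the same table:

-- Return only the rows whose XML contains at least one 'book' reference
SELECT x.Response
FROM @tblXML x
WHERE x.Response.exist('/response/references/reference[type/text() = "book"]') = 1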

I hope you find this useful.

Compare Data and Database Schemas

I was thrilled to discover that the functionality to compare database data and schemas is available in Microsoft Visual Studio.  It not only allows you to compare two databases, it will also update the target database for you, or write an update script so you can apply the changes yourself.

Most of this functionality has existed in Visual Studio Team System Database Edition since 2005.  Team System Database Edition is used by database developers and administrators and includes advanced tools for database change management and testing.  Since 2010, database schema comparison has also been available in Visual Studio Premium and Visual Studio Ultimate.  Unfortunately it’s not in Professional or Express.

Having the database schema compare functionality in Visual Studio is awesome for developers who straddle several worlds. It’s nice to be able to build an application that affects or relies on a database, and also be able to compare the database schemas among environments, even if building the database doesn’t fall within your purview, all within one tool. 

The comparison is really easy to do. Look in Tools -> SQL Server.

[Image: Tools menu, SQL Server submenu]

Select your source and target databases.

[Image: selecting the source and target databases]

Click Compare and it will compare the databases.  Here I have compared the schemas of two databases.

[Image: schema comparison results]

The tool highlights the differences. In this case a new field called [newfield] exists in the table [Threads] in the source database but not the destination database.

Click the script icon to generate a SQL script for the changes to the destination database.

[Image: the generated update script]

Or click the Update button to actually do the update of the destination database. 

How sweet is that? I know I’ll be using this regularly in all kinds of development.

Format row delimited values into comma delimited

If you just want a quick and dirty way to format row delimited data into comma delimited for consumption in a query, you can use Word. First you have to turn off the AutoCorrect option for smart quotes, otherwise you will get smart quotes instead of straight quotes in your query. Depending on your version of Word you will find AutoCorrect under Options in the Proofing section, or under Tools.

Then use Find and Replace to find any carriage returns (^p) and replace them with ','.

Your results will look like this:
Name1','Name2','Name3','

You then just need to add a single quote at the beginning and remove the last comma and quote.
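If you’d rather stay inside SQL Server, the classic FOR XML PATH trick builds the same comma-delimited, quoted string directly. A sketch; the dbo.MyNames table and its Name column are hypothetical:

-- Collapse a column of values into one comma-delimited, single-quoted string
-- (note FOR XML PATH will entitize characters such as & in the values)
SELECT STUFF(
    (SELECT ',''' + Name + '''' FROM dbo.MyNames FOR XML PATH('')),
    1, 1, '') AS DelimitedList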

GP Historical MultiCurrency Receivables Query

This is really specific, but I had to hunt around to find out how to report on historical receivables in Great Plains.  This will get the multicurrency receivables, but you can modify it to get all receivables.

I’ve written a stored procedure which calls into a cascade of other stored procedures created for Great Plains reporting.  I chose to use a stored procedure so I can pull the data as it is provided by the GP stored proc, using the parameters that stored proc requires, and then adjust the data to meet the needs of my report.

USE [yourdatabase]
GO

/****** Object:  StoredProcedure [dbo].[x_report_HistoricalReceivables_MultiCurrency]    Script Date: 5/12/2014 1:24:37 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:        <Author,,Name>
-- Create date: <Create Date,,>
-- Description:    <Description,,>
-- =============================================
CREATE PROCEDURE [dbo].[x_report_HistoricalReceivables_MultiCurrency]
@EndDate date
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

DECLARE @RC int
DECLARE @I_dAgingDate datetime
DECLARE @I_cStartCustomerNumber char(15)
DECLARE @I_cEndCustomerNumber char(15)
DECLARE @I_cStartCustomerName char(65)
DECLARE @I_cEndCustomerName char(65)
DECLARE @I_cStartClassID char(15)
DECLARE @I_cEndClassID char(15)
DECLARE @I_cStartSalesPersonID char(15)
DECLARE @I_cEndSalesPersonID char(15)
DECLARE @I_cStartSalesTerritory char(15)
DECLARE @I_cEndSalesTerritory char(15)
DECLARE @I_cStartShortName char(15)
DECLARE @I_cEndShortName char(15)
DECLARE @I_cStartState char(5)
DECLARE @I_cEndState char(5)
DECLARE @I_cStartZipCode char(11)
DECLARE @I_cEndZipCode char(11)
DECLARE @I_cStartPhoneNumber char(21)
DECLARE @I_cEndPhoneNumber char(21)
DECLARE @I_cStartUserDefined char(15)
DECLARE @I_cEndUserDefined char(15)
DECLARE @I_tUsingDocumentDate tinyint
DECLARE @I_dStartDate datetime
DECLARE @I_dEndDate datetime
DECLARE @I_sIncludeBalanceTypes smallint
DECLARE @I_tExcludeNoActivity tinyint
DECLARE @I_tExcludeMultiCurrency tinyint
DECLARE @I_tExcludeZeroBalanceCustomer tinyint
DECLARE @I_tExcludeFullyPaidTrxs tinyint
DECLARE @I_tExcludeCreditBalance tinyint
DECLARE @I_tExcludeUnpostedAppldCrDocs tinyint
DECLARE @I_tConsolidateNAActivity tinyint

--TODO: Set parameter values here.
Set @I_dAgingDate=  @EndDate
Set @I_cStartCustomerNumber=''
Set @I_cEndCustomerNumber='ZZZZZZZZ'
Set @I_cStartCustomerName=''
Set @I_cEndCustomerName='ZZZZZZZZ'
Set @I_cStartClassID=''
Set @I_cEndClassID='ZZZZZZZZ'
Set @I_cStartSalesPersonID=''
Set @I_cEndSalesPersonID='ZZZZZZZZ'
Set @I_cStartSalesTerritory=''
Set @I_cEndSalesTerritory='ZZZZZZZZ'
Set @I_cStartShortName=''
Set @I_cEndShortName='ZZZZZZZZ'
Set @I_cStartState=''
Set @I_cEndState='ZZZZZZZZ'
Set @I_cStartZipCode=''
Set @I_cEndZipCode='ZZZZZZZZ'
Set @I_cStartPhoneNumber=''
Set @I_cEndPhoneNumber='ZZZZZZZZ'
Set @I_cStartUserDefined=''
Set @I_cEndUserDefined='ZZZZZZZZ'
Set @I_tUsingDocumentDate=0
Set @I_dStartDate='1/1/1900'
Set @I_dEndDate=  @EndDate
Set @I_sIncludeBalanceTypes=0
Set @I_tExcludeNoActivity=1
Set @I_tExcludeMultiCurrency=0
Set @I_tExcludeZeroBalanceCustomer=1
Set @I_tExcludeFullyPaidTrxs=1
Set @I_tExcludeCreditBalance=0
Set @I_tExcludeUnpostedAppldCrDocs=1
Set @I_tConsolidateNAActivity=0

CREATE TABLE #AgedReceivables
(
APPLY_AMOUNT NUMERIC(19,5),
AGING_AMOUNT NUMERIC(19,5),
CUSTNMBR char(15),
CUSTNAME char(100),
BALNCTYP int,
USERDEF1 char(50),
CNTCPRSN char(50),
PHONE1 char(50),
SLPRSNID char(50),
SALSTERR char(50),
PYMTRMID char(50),
CRLMTAMT NUMERIC(19,5),
CRLMTPER NUMERIC(19,5),
CRLMTPAM NUMERIC(19,5),
CRLMTTYP int,
CUSTCLAS char(50),
SHRTNAME char(50),
ZIP char(50),
STATE char(50),
CUDSCRIPTN char(50),
AGNGDATE datetime,
CHCUMNUM char(50),
DOCNUMBR char(50),
RMDTYPAL char(50),
DSCRIPTN char(50),
DCURNCYID char(50),
ORTRXAMT NUMERIC(19,5),
CURTRXAM NUMERIC(19,5),
AGNGBUKT char(50),
CASHAMNT NUMERIC(19,5),
COMDLRAM char(50),
SLSAMNT NUMERIC(19,5),
COSTAMNT NUMERIC(19,5),
FRTAMNT NUMERIC(19,5),
MISCAMNT NUMERIC(19,5),
TAXAMNT NUMERIC(19,5),
DISAVAMT NUMERIC(19,5),
DDISTKNAM NUMERIC(19,5),
DWROFAMNT NUMERIC(19,5),
TRXDSCRN char(50),
DOCABREV char(50),
CHEKNMBR char(50),
DOCDATE datetime,
DUEDATE datetime,
GLPOSTDT datetime,
DISCDATE datetime,
POSTDATE datetime,
DINVPDOF datetime,
DCURRNIDX char(50),
DXCHGRATE NUMERIC(19,5),
ORCASAMT NUMERIC(19,5),
ORSLSAMT NUMERIC(19,5),
ORCSTAMT NUMERIC(19,5),
ORDAVAMT NUMERIC(19,5),
ORFRTAMT NUMERIC(19,5),
ORMISCAMT NUMERIC(19,5),
ORTAXAMT NUMERIC(19,5),
ORCTRXAM NUMERIC(19,5),
ORORGTRX NUMERIC(19,5),
DORDISTKN NUMERIC(19,5),
DORWROFAM NUMERIC(19,5),
DDENXRATE NUMERIC(19,5),
DMCTRXSTT NUMERIC(19,5),
Aging_Period_Amount NUMERIC(19,5),
APFRDCNM char(50),
APFRDCTY char(50),
FROMCURR char(50),
APTODCNM char(50),
APTODCTY char(50),
APPTOAMT NUMERIC(19,5),
ACURNCYID char(50),
DATE1 datetime,
POSTED  char(50),
ADISTKNAM  NUMERIC(19,5),
AWROFAMNT NUMERIC(19,5),
PPSAMDED NUMERIC(19,5),
GSTDSAMT NUMERIC(19,5),
ACURRNIDX char(50),
AXCHGRATE  NUMERIC(19,5),
RLGANLOS NUMERIC(19,5),
ORAPTOAM NUMERIC(19,5),
AORDISTKN NUMERIC(19,5),
AORWROFAM NUMERIC(19,5),
ADENXRATE NUMERIC(19,5),
AMCTRXSTT NUMERIC(19,5),
)

INSERT INTO #AgedReceivables
EXECUTE @RC = [dbo].[seermHATBSRSWrapper] 
   @I_dAgingDate
  ,@I_cStartCustomerNumber
  ,@I_cEndCustomerNumber
  ,@I_cStartCustomerName
  ,@I_cEndCustomerName
  ,@I_cStartClassID
  ,@I_cEndClassID
  ,@I_cStartSalesPersonID
  ,@I_cEndSalesPersonID
  ,@I_cStartSalesTerritory
  ,@I_cEndSalesTerritory
  ,@I_cStartShortName
  ,@I_cEndShortName
  ,@I_cStartState
  ,@I_cEndState
  ,@I_cStartZipCode
  ,@I_cEndZipCode
  ,@I_cStartPhoneNumber
  ,@I_cEndPhoneNumber
  ,@I_cStartUserDefined
  ,@I_cEndUserDefined
  ,@I_tUsingDocumentDate
  ,@I_dStartDate
  ,@I_dEndDate
  ,@I_sIncludeBalanceTypes
  ,@I_tExcludeNoActivity
  ,@I_tExcludeMultiCurrency
  ,@I_tExcludeZeroBalanceCustomer
  ,@I_tExcludeFullyPaidTrxs
  ,@I_tExcludeCreditBalance
  ,@I_tExcludeUnpostedAppldCrDocs
  ,@I_tConsolidateNAActivity

SELECT CUSTNMBR, DCURNCYID as CURNCYID, SUM(ORCTRXAM) as ACCTBAL, SUM(CURTRXAM) AS CADEQUIV
FROM #AgedReceivables 
WHERE 
CURTRXAM <> 0 AND
CASE WHEN DCURNCYID = '' THEN 'CAD' ELSE DCURNCYID END <> 'CAD' 
GROUP BY CUSTNMBR, DCURNCYID

DROP TABLE  #AgedReceivables
END

GO
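Once the procedure is created, the report is a single call. For example (the date is just a sample):

-- Multicurrency receivables outstanding as at the end of April 2014
EXEC [dbo].[x_report_HistoricalReceivables_MultiCurrency] @EndDate = '2014-04-30'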

Thanks to Mark Polino for pointing the way in his blog post –  http://mpolino.com/gp/weekly-dynamic-ar-hatb-via-SQL/

Generate a Series of Months (or Numbers) in SQL

This little trick has helped me keep SQL code clean when generating data.  There are many times where I need to generate a series of months in a SQL query.  Very often it is needed in order to join two incomplete sets of data by date, or to cross join a set of data to spread it across a series of months.  This is a very simple solution which can be adapted for many other needs.

You can start by using a WITH statement (a recursive common table expression) to generate the values you need.  In this example I will generate the numbers 1 to 12 to represent 12 months.

;with Months AS
(SELECT 1 AS MonthNum
UNION ALL
SELECT MonthNum + 1 as MonthNum FROM Months WHERE MonthNum <12)
SELECT * FROM Months

The result set is a single MonthNum column containing the values 1 through 12.

Let’s say you have a rates table with one record per year for each Hour Type ID, and you want to duplicate that yearly information across months.  Cross joining the yearly rates to the Months CTE turns each yearly record into twelve monthly records, as in the sketch below.
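A minimal sketch of that cross join; the dbo.Rates table and its columns are hypothetical stand-ins for the table in my example:

-- Spread each yearly rate record across all 12 months
;with Months AS
(SELECT 1 AS MonthNum
UNION ALL
SELECT MonthNum + 1 as MonthNum FROM Months WHERE MonthNum <12)
SELECT r.HourTypeID, r.RateYear, m.MonthNum, r.Rate
FROM dbo.Rates r
CROSS JOIN Months m
ORDER BY r.HourTypeID, r.RateYear, m.MonthNum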

You can do this with dates as well.  For example:

DECLARE @StartDate datetime
DECLARE @EndDate datetime
SET @StartDate=DATEADD(month,-6, CONVERT(CHAR(10),GETDATE(),121))
SET @EndDate = GETDATE()

;with AllDates AS
(
SELECT @StartDate AS Dates
UNION ALL
SELECT DateAdd(month,1,Dates)
FROM AllDates
WHERE 
Dates<=@EndDate
)

SELECT * FROM AllDates

will result in one row per month, starting six months ago and ending with the current month.
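One caution I would add: a recursive CTE stops with an error after 100 recursions by default, so if you generate a longer series (daily rows for a couple of years, say) append a MAXRECURSION hint to the outer query:

SELECT * FROM AllDates OPTION (MAXRECURSION 0) -- 0 removes the recursion limit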

I hope this helps you.

Log not truncating on Checkpoint

I have a client who was running out of disk space due to a database log which kept growing out of control.  The database was set to Simple recovery, so the log should have been truncating on checkpoint.  Instead it was growing hugely and had to be manually shrunk every few days.

To find out the reason for the log not truncating I ran this query.

SELECT name, log_reuse_wait, log_reuse_wait_desc FROM sys.databases

I found out that the log was not truncating due to an Active Transaction. This can be caused by a long running transaction or by a deferred transaction.  http://technet.microsoft.com/en-us/library/ms345414(v=sql.105)
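For a quick point-in-time check, DBCC OPENTRAN will report the oldest active transaction in a database (it only helps if the transaction is open at the moment you run it, which is exactly the limitation I ran into below):

-- Report the oldest active transaction in the named database
DBCC OPENTRAN ('YourDatabase')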

In the meantime the database went into Recovery mode and a number of things happened of which I am not aware.  Somehow the active transaction got cleared and the client manually shrunk the log file. 

We changed a few settings to minimize the damage should this happen again. The Autogrowth was set to 20% with no maximum.  We changed the Autogrow settings to something more reasonable, with an initial size of 5GB and a maximum of 10GB, to at least protect the other files from getting stalled due to no space on the drive.
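A sketch of that change in T-SQL (the database and logical log file names are placeholders):

-- Cap the log at 10GB and grow in fixed increments instead of 20%
ALTER DATABASE YourDatabase
MODIFY FILE (NAME = YourDatabase_log, SIZE = 5GB, MAXSIZE = 10GB, FILEGROWTH = 512MB)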

Then I looked at capturing the Active Transaction information.  Unfortunately, unless the Active Transaction was running, I was unable to find out what the transaction was.  I built a very simple SSIS package which I ran every 5 minutes to discover if an Active Transaction was holding up the log file, and to capture some information about what transaction was running and who was running it.

[Image: SSIS package with a Data Flow task followed by a Send Mail task]

This package populated a new table in the data warehouse with the attributes of a transaction.  To avoid filling the table with information I didn’t need, I wrote the query so it would only populate the table if there were any transactions holding the log open.  Here is the source query for the Data Flow.  Should the IF EXISTS be negative, the Data Flow task would fail, causing the Send Mail task not to be triggered.

IF EXISTS

--IF log is held up due to Active Transaction
(SELECT name, database_id, log_reuse_wait, log_reuse_wait_desc
FROM sys.databases
WHERE log_reuse_wait_desc = 'ACTIVE_TRANSACTION' AND database_id = <your database id>)

--Find out the user and which query is holding it open
SELECT s.spid, s.open_tran, a.[text], s.hostname, s.nt_domain, s.nt_username, s.net_address, s.loginame,
s.login_time, s.last_batch, s.status, s.sid, s.hostname AS [Host_Name],
s.program_name, s.cmd, s.net_library, GETDATE() AS InputDate
FROM sys.sysprocesses s CROSS APPLY sys.dm_exec_sql_text(s.sql_handle) AS a
WHERE s.open_tran <> 0

The really nice thing about this query is that it returns the text of the actual query which is holding the log open, as well as the program_name which is running it and the Host_Name.

[Image: captured results showing the query and program holding the log open]

I set up a SQL Server Agent job to run the package every 5 minutes, since I only wanted to capture items which were long running. 

Using this package I was able to very quickly find out that there was an automated process on SharePoint which was running every hour but for whatever reason was never able to complete (that’s another story). Because the transaction never successfully completed it remained flagged as active, the log checkpoint could not clear it, and the log just kept growing and was never truncated.  Once the issue with the SharePoint job was resolved, the log file was able to resume normal behaviour and truncate on checkpoint as expected.