Granting SSIS Package Administration Access to Developers

In many instances there is a development team, or even a warehouse team, that needs to run SSIS packages, and you want to store those packages in SQL Server and run them through the SQL Server Agent service.  You may not, however, want to grant those users/groups sysadmin rights on the machine.  The following describes my experience in setting this up.

First, you must add the user/group to the DCOM configuration for “Microsoft SQL Server Integration Services 11.0” (SQL 2012), which can be found in the Component Services MMC.  Right-click the entry and choose Properties.


You can now add the groups to two security sections–“Launch and Activation Permissions” and “Access Permissions”.  In my experience, the users had to be in both.  I granted full rights in each section.

The final component is to add each user/group to the local group “Distributed COM Users”.  It seems it won’t even authenticate a user for the permissions you set earlier unless they are in this local group.  This will allow users to connect to the Integration Services instance.

At this point you should restart the Integration Services service.

If you haven’t already, you can go ahead and create your Catalog Database.  We won’t go over those steps here–perhaps in another post.

The last step is to allow them to deploy and manage the packages on the server.  I add the group to the SSISDB database and grant it the ssis_admin role.  Now your teams should be able to function completely within SSIS and not have permissions to drop database objects or modify security settings.
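As a sketch, the grant looks something like this–the domain group name here is a placeholder, and the Windows login must already exist on the instance:

```sql
USE SSISDB;
GO

-- DOMAIN\SSISDevelopers is a placeholder; substitute your actual group.
-- The login must already exist at the instance level.
CREATE USER [DOMAIN\SSISDevelopers] FOR LOGIN [DOMAIN\SSISDevelopers];
ALTER ROLE ssis_admin ADD MEMBER [DOMAIN\SSISDevelopers];
```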


The Local Node is Not Able to Communicate With the WSFC Cluster

I was setting up an availability group in SQL Server 2012 on a pair of servers and came across the following error: “The local node is not able to communicate with the WSFC cluster.”  Interestingly enough, I had already set up an availability group on these two servers in another domain.  I was changing domains (long story) and had decommissioned the cluster and wanted to set it back up.  I knew I had already done the following:

  • Installed the latest Service Pack (2)
  • Validated the cluster

The fix was simply to disable the Availability Groups option, re-enable it, and then restart the SQL Server service.  After that, I was off and running.


T-SQL Tuesday 59 – My Hero

Today’s T-SQL Tuesday is being hosted by Tracy McKibben (t|b), and he has invited us to talk about our heroes, it being Ada Lovelace Day and all.  As Ada is credited with being the first programmer, this topic got me thinking about firsts in my family.

My grandmother came to the United States in the early sixties looking for a better life for her family.  She left behind four children who would join her over the next four years.  She worked as a maid for a family in Connecticut and earned $50 a week–$45 of which would go back to her family in Costa Rica.  Those efforts led the family here, where my father earned his bachelor’s degree from BYU and later earned a master’s from Clemson University, the first in the family.  She continued to work until her body would not let her and always encouraged us to do our best.  Her sacrifices those 55 years ago led to all the blessings my family now enjoys.  Grandma is one of my heroes.

Of course, I would not be able to pursue independent employment as I know it today if it weren’t for my family–especially my wife who has supported me and my sometimes crazy whims.  I am grateful for their support and am glad I have a partner for all eternity. :)

Also, I need to give a shout out to Andy Leonard, who helped start the Richmond SQL Server Users Group.  Before we even knew what PASS was, Andy was there with a group of folks in Richmond talking shop, and even though he hasn’t had direct interaction with the group for some time, his influence is still felt and the core leadership team is involved because of Andy’s example.  The opportunity to belong to that group has made all the difference in what I have been able to do professionally.

Changes to the Identity Column

A recent upgrade to SQL 2012 reminded me of an issue most will face as they upgrade to the new version.  With the advent of sequences in SQL Server 2012, a change was made to the way identities are issued.  The short story is that each time you restart the instance, your identity values will jump by 1,000.  This is really only a big deal if

  1. You are close to the limit of an integer (2,147,483,647) or
  2. These values are shown to the end users in your system and they wonder why there was a big spike.

Ahasan Habib has a good write-up of the issue, so I won’t recreate all his steps; however, the good news is there is a workaround.  Like always, make sure you test before you implement the change, and hopefully you won’t have to answer the question–why is there a big gap in the numbers?  :)
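For reference, the workaround most often cited for SQL Server 2012 is trace flag 272, which disables the identity pre-allocation cache; treat this as a sketch to verify in your own environment, and note the table name below is a placeholder:

```sql
-- Commonly cited workaround for the SQL Server 2012 identity jump:
-- add -T272 as a startup parameter in SQL Server Configuration Manager
-- and restart the service to disable the identity pre-allocation cache.

-- To inspect the current identity value for a table (dbo.MyTable is a placeholder):
DBCC CHECKIDENT ('dbo.MyTable', NORESEED);
```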

SSPI handshake failed with error code 0x8009030c

The other day I was working in SSMS and connected to an instance.  Pretty normal there.  I later opened some window (I don’t recall exactly–perhaps a report or some db option) and was met with this error message.  I could then do nothing in the database.

SSPI handshake failed with error code 0x8009030c, state 14 while establishing a connection with integrated security; the connection has been closed. Reason: AcceptSecurityContext failed. The Windows error code indicates the cause of failure.  [CLIENT: 10.1.1.1].

This same error was found in the SQL Server log with the accompanying error message.

Error: 17806, Severity: 20, State: 14.

It turns out my Windows password had expired from the time I opened the instance to the time I was looking at something in the database.  Once I reset my password, I was good to go.

Cannot Start Mirroring

Mirroring is dead.  Long live mirroring.  While it will no longer be included in future versions of SQL Server, I still come across instances I have to support and thus am making some notes.  Occasionally, when setting up mirroring, you may get the following error:

The server network address “TCP://SQLServer:5022” can not be reached or does not exist. Check the network address name and that the ports for the local and remote endpoints are operational. (Microsoft SQL Server, Error: 1418)

I have found some common scenarios include:

  • Insufficient permission on both nodes.  Make sure the service account for each SQL Server instance has rights to the partner instance.  Seems simple enough, but this was an issue for me just last month.
  • The log backup has not been applied to the partner/mirrored instance.  While sometimes it might let you set up without taking a transaction log backup and applying it on the partner database, more often than not it requires it.
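For the first scenario, a quick sketch of the checks I run–the endpoint name and service account below are placeholders for whatever exists in your environment:

```sql
-- Verify the mirroring endpoint exists, is started, and note its port
SELECT e.name, e.state_desc, te.port
FROM sys.database_mirroring_endpoints AS e
JOIN sys.tcp_endpoints AS te ON e.endpoint_id = te.endpoint_id;

-- Grant the partner instance's service account rights to connect
-- (run on each node; names are placeholders)
GRANT CONNECT ON ENDPOINT::Mirroring TO [DOMAIN\SQLServiceAccount];
```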

Check out Robert Davis’s post for more helpful information if you are stuck.

GETUTCDATE Function

I came across a database that stores the execution date in UTC.  I needed to run a report with a sliding window to report errors during the day.  Because this server was in Virginia, it is subject to daylight saving time, so I needed to calculate the local time based off the UTC time, and the offset could be 4 or 5 hours depending on the season.

I found the GETUTCDATE function an easy way to get the current UTC date and time.  It should be noted that it derives this from the server’s clock, so if you have multiple servers in different time zones, different logic may be needed for each one.  In my case, I just needed to find the hour difference between my time zone and UTC; for that I use the DATEDIFF function.

Now that I know how far away from UTC I am, I can apply local-time logic to get the data I want.  This example uses an AdventureWorks table.

DECLARE @HourOffset INT;

-- Hours between this server's local time and UTC
SET @HourOffset = DATEDIFF(hh, GETUTCDATE(), GETDATE());
--SELECT @HourOffset;

SELECT DATEADD(hh, @HourOffset, ErrorTime) AS ExecutionTime
FROM [dbo].[ErrorLog];

Moving A Database With Attach and Detach

I found myself doing several database migrations this weekend, and because these databases were generally small (less than 10 GB), I chose the route of detaching the database, copying the files to the new location, and re-attaching.  To remove the databases, I detach them from the current instance.  I am not worried about backup history here, as the server will be decommissioned.

EXEC master.dbo.sp_detach_db @dbname = N'MyDatabase';
GO

If you refresh your database list, you will no longer see the database listed; however, the files are still on the server.  You can now copy those to the new location and attach them. You need only pass in the name of the database and the location of the files and SQL Server will add the database and make it available to you.

EXEC sp_attach_db @dbname = N'MyDatabase',
    @filename1 = N'E:\MSSQL\MyDatabase.mdf',
    @filename2 = N'F:\MSSQL\MyDatabase_log.ldf'; 
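Worth noting: sp_attach_db is deprecated, and the recommended replacement is CREATE DATABASE ... FOR ATTACH.  An equivalent sketch using the same files:

```sql
-- Preferred, non-deprecated form of the attach above
CREATE DATABASE MyDatabase
ON (FILENAME = N'E:\MSSQL\MyDatabase.mdf'),
   (FILENAME = N'F:\MSSQL\MyDatabase_log.ldf')
FOR ATTACH;
```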

Don’t forget to add your security and any jobs on the old server you need to copy over for this database.

Backup History

As someone who has helped design systems, I know auditing can take a back seat to more ‘pressing’ requirements; however, as anyone responsible for maintaining a system will know, a good audit trail can be immensely helpful in tracking down a problem. When it comes to backup information, SQL Server actually has a lot of good detail which can be very useful. For example, SQL Server can tell you the very first time you took a backup of your database. This script can give you this information by database.

SELECT MIN(backupset.backup_finish_date) AS first_backup_date, backupset.database_name, backupset.server_name
FROM msdb.dbo.backupset
INNER JOIN msdb.dbo.backupmediafamily ON backupset.media_set_id = backupmediafamily.media_set_id
GROUP BY backupset.database_name, backupset.server_name
ORDER BY backupset.database_name, backupset.server_name;

The very first backup?

While it may be nostalgic to look at the very first backup you took, it is not practical to keep all that information around.  Why is this?  Because, as Books Online puts it–“When a restore [or backup] is performed, backup history tables and restore history tables are modified.”  Just like any audit table, this information grows as each operation is performed.  Over time these operations can actually slow down because of the amount of data.

History Cleanup

Just like any good audit table, we need to have a plan to trim the fat on a regular basis.  What to trim?  Well, my thoughts are it should line up with your requirements for data longevity.  If the policy says we keep backups around (even on tape) for 6 months, then I would make it six months and run the cleanup process every week.  Your retention may vary by application, and in that case, you can set it for the longest requirement in the environment.  Regular cleanup is the key–you never know how long those servers are going to be around.  Isn’t that right, all you SQL 2000 boxes out there?   :)

Script It Out

Removing old records is straightforward with stored procedure sp_delete_backuphistory.  You only have to provide a date and SQL Server will remove ALL backup history up to that date.  I normally have a SQL Agent job, scheduled to run once a week, looking something like this.

USE msdb;
GO
DECLARE @BackupDate DATETIME;
SET @BackupDate = GETDATE() - 365;

EXEC sp_delete_backuphistory @oldest_date = @BackupDate;

We Can Pivot If You Want To

So I get that I am a little late to the party here; however, I find myself having to pivot from time to time and am always looking back for code as a refresher.  While PIVOT is pretty cool, it always takes me a second to connect all the dots, so I hope this post helps do that.

So, with PIVOT, we want to take data where there are multiple rows for each ProductID and get each ProductID on one line with the quantities from each shelf.

When we are finished, the data will look like this.


One row for each ProductID

In my example, I am using the [Production].[ProductInventory] table in the AdventureWorks2012 database.  To make rows become columns, we must know ALL possible values we want to turn into columns.  We specify these in the FOR line of the PIVOT syntax.  This query will create a column for each value I specified from the Shelf column.

SELECT [ProductID],[A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L]
	FROM [Production].[ProductInventory]
	 PIVOT
	 (
		SUM (Quantity)
		FOR SHELF IN ([A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L])
	 ) AS PVT 

I should see something like this.  We are getting closer–we have our columns, but we still have a line for each record.  We only want to see one line per ProductID.  This is where we put the columns we want from the table and the PIVOT syntax together in an almost subquery-like format and select the columns as we want them to display.

The syntax is something like this.  The first line lists the columns I want to display, the next lines form a derived table with the columns I need from the base table, and then comes the PIVOT syntax followed by an ORDER BY–and presto, you have your data the way you want it.

SELECT [ProductID],[A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L]
	FROM
	(SELECT [ProductID], Shelf, Quantity
	FROM [Production].[ProductInventory]) p
	 PIVOT
	 (
		SUM (Quantity)
		FOR SHELF IN ([A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L])
	 ) AS PVT
	 ORDER BY [ProductID]