T-SQL Tuesday 59–My Hero

Today’s T-SQL Tuesday is being hosted by Tracy Mckibben (t|b), and he has invited us to talk about our heroes, it being Ada Lovelace Day and all.  As Ada is credited with being the first programmer, this topic got me thinking about firsts in my family.

My grandmother came to the United States in the early sixties looking for a better life for her family.  She left behind four children, who would join her over the next four years.  She worked as a maid for a family in Connecticut and earned $50 a week, $45 of which went back to her family in Costa Rica.  Those efforts led the family here, where my father earned his bachelor’s degree from BYU and later a master’s from Clemson University, the first in the family.  She continued to work until her body would not let her and always encouraged us to do our best.  Her sacrifices those 55 years ago led to all the blessings my family now enjoys.  Grandma is one of my heroes.

Of course, I would not be able to pursue independent employment as I know it today if it weren’t for my family–especially my wife who has supported me and my sometimes crazy whims.  I am grateful for their support and am glad I have a partner for all eternity. :)

Also, I need to give a shout-out to Andy Leonard, who helped start the Richmond SQL Server Users Group.  Before we even knew what PASS was, Andy was there with a group of folks in Richmond talking shop, and even though he hasn’t had direct interaction with the group for some time, his influence is still felt; the core leadership team is involved because of Andy’s example.  The opportunity to belong to that group has made all the difference in what I have been able to do professionally.

Changes to the Identity Column

A recent upgrade to SQL Server 2012 reminded me of an issue most will face as they upgrade to the new version.  With the advent of sequences in SQL Server 2012, a change was made to the way identity values are issued.  The short story is that each time you restart the instance, your identity values can jump by 1,000.  This is really only a big deal if

  1. You are close to the limit of an integer (2,147,483,647), or
  2. These values are shown to the end users in your system and they wonder why there was a big spike.

Ahasan Habib has a good write-up of the issue, so I won’t recreate all his steps; the good news is there is a workaround for this.  As always, make sure you test before you implement the change, and hopefully you won’t have to answer the question, “Why is there a big gap in the numbers?”  :)
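The commonly cited workaround is to add trace flag 272 as a startup parameter (-T272), which disables the identity cache on SQL Server 2012; test it in your environment before relying on it.  Another option, sketched below with hypothetical object names, is to generate keys from a sequence created with NO CACHE, which avoids the jump at the cost of a log write for every value.

-- Hypothetical sketch: a NO CACHE sequence avoids the restart jump
-- (names are placeholders; test the performance impact before adopting).
CREATE SEQUENCE dbo.OrderIdSeq AS INT
    START WITH 1
    INCREMENT BY 1
    NO CACHE;

CREATE TABLE dbo.Orders
(
    OrderId   INT NOT NULL
        CONSTRAINT DF_Orders_OrderId DEFAULT (NEXT VALUE FOR dbo.OrderIdSeq)
        CONSTRAINT PK_Orders PRIMARY KEY,
    OrderDate DATETIME2 NOT NULL
);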

GETUTCDATE Function

I came across a database that stores execution dates in UTC.  I needed to run a report with a sliding window to report errors during the day.  Because this server was in Virginia, it is subject to daylight saving time, so I needed to calculate the local time from the UTC time, and the offset could be 4 or 5 hours depending on the season.

The GETUTCDATE function is an easy way to get the current UTC date and time.  Note that it derives this from the server’s clock and time zone, so if you have multiple servers in different time zones, each one may need different logic.  In my case, I just needed to find the hour difference between my time zone and UTC, which the DATEDIFF function handles.

Now that I know how far away from UTC I am, I can apply local time logic to get the data I want.  This example uses an AdventureWorks table.

DECLARE @HourOffset INT

-- Difference, in hours, between server local time and UTC right now
SET @HourOffset = DATEDIFF(hh, GETUTCDATE(), GETDATE())
--SELECT @HourOffset

-- Shift the stored UTC values into local time
SELECT DATEADD(hh, @HourOffset, ErrorTime) AS ExecutionTime
FROM [dbo].[ErrorLog]
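One caveat worth noting: DATEDIFF here compares the server’s clock to UTC at the moment the query runs, so the offset reflects today’s daylight saving rule, not the rule in effect when each row was written.  A quick sanity check of the calculation looks like this; the values in the comment are what I would expect for Eastern time.

-- Verify the offset calculation: on Eastern Daylight Time this returns -4,
-- on Eastern Standard Time it returns -5.
SELECT GETUTCDATE() AS UtcNow,
       GETDATE()    AS ServerLocalNow,
       DATEDIFF(hh, GETUTCDATE(), GETDATE()) AS HourOffset;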

Moving A Database With Attach and Detach

I found myself doing several database migrations this weekend, and because these databases were generally small (less than 10 GB), I chose the route of detaching the database, copying the files to the new location, and re-attaching.  To remove the databases, I detach them from the current instance.  I am not worried about backup history here, as the server will be decommissioned.

EXEC master.dbo.sp_detach_db @dbname = N'MyDatabase'
GO 

If you refresh your database list, you will no longer see the database listed; however, the files are still on the server.  You can now copy them to the new location and attach them.  You need only pass in the name of the database and the location of the files, and SQL Server will add the database and make it available to you.

EXEC sp_attach_db @dbname = N'MyDatabase',
    @filename1 = N'E:\MSSQL\MyDatabase.mdf',
    @filename2 = N'F:\MSSQL\MyDatabase_log.ldf'; 
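As a side note, sp_attach_db is on the deprecated list; CREATE DATABASE ... FOR ATTACH does the same thing and is the syntax recommended going forward.  A sketch using the same file locations:

CREATE DATABASE MyDatabase
ON (FILENAME = N'E:\MSSQL\MyDatabase.mdf'),
   (FILENAME = N'F:\MSSQL\MyDatabase_log.ldf')
FOR ATTACH;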

Don’t forget to add your security (logins and permissions) and to copy over any jobs from the old server that this database needs.

We Can Pivot If You Want To

[Screenshot: sample rows from the table before pivoting]

So I get that I am a little late to the party here; however, I find myself having to pivot from time to time and am always looking back for code as a refresher.  While PIVOT is pretty cool, it always takes me a second to connect all the dots, so I hope this post helps do that.

With PIVOT, we want to take data that looks like the screenshot above: there are multiple rows for each ProductID, and we would like to get each ProductID on one line with the quantities from each shelf location.

When we are finished, the data will look like this.

[Screenshot: the pivoted result, one row for each ProductID]

In my example, I am using the [Production].[ProductInventory] table in the AdventureWorks2012 database.  To make rows become columns, we must know ALL the possible values we want to turn into columns.  We specify these in the FOR line of the PIVOT syntax.  This query will create a column for each value I listed from the Shelf column.

SELECT [ProductID],[A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L]
	FROM [Production].[ProductInventory]
	 PIVOT
	 (
		SUM (Quantity)
		FOR SHELF IN ([A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L])
	 ) AS PVT 

[Screenshot: output of the query above]

I should see something like this.  We are getting closer: we have our columns, but we still have a line for each record, because every column we did not pivot or aggregate takes part in the implicit grouping.  We only want to see one line per ProductID.  This is where we put just the columns we need from the table and the PIVOT syntax together in an almost sub-query-like format and select the columns as we want them to display.

The syntax is something like this.  The first line lists the columns I want in the output, the next few lines form a derived table with only the columns I need from the source table, and then we add the PIVOT syntax followed by an ORDER BY, and presto, you have your data the way you want it.

SELECT [ProductID],[A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L]
	FROM
	(SELECT [ProductID], Shelf, Quantity
	FROM [Production].[ProductInventory]) p
	 PIVOT
	 (
		SUM (Quantity)
		FOR SHELF IN ([A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L])
	 ) AS PVT
	 ORDER BY [ProductID]
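If you are not sure which values need a column in the first place, a quick DISTINCT query against the source table will list them (this assumes the same AdventureWorks2012 table used above):

SELECT DISTINCT Shelf
FROM [Production].[ProductInventory]
ORDER BY Shelf;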

Enqueueing A Message

After restoring a SharePoint database to the development environment, I came across the following error which would repeat every minute.

An exception occurred while enqueueing a message in the target queue. Error: 15404, State: 19. Could not obtain information about Windows NT group/user ‘Domain\User’, error code 0x6e.

I saw a post about ensuring the SharePoint service account had the appropriate access, and after verifying mine was good, I saw a suggestion to change the database owner.  While it seemed rather strange, I figured I would at least check the owners of all the databases.  I used this query:

SELECT name, suser_sname(owner_sid) FROM sys.databases

This lists all the databases and their owners, and they all seemed good; a NULL in the second column would indicate an unresolved user.  However, I went ahead and changed them all to sa, as there were various owners.

USE DBX
GO
EXEC sp_changedbowner 'sa'
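As an aside, sp_changedbowner is deprecated; ALTER AUTHORIZATION is the newer way to do the same thing.  A sketch against the same database:

ALTER AUTHORIZATION ON DATABASE::DBX TO sa;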

The error stopped occurring in the log and life resumed. :)

Point In Time Recovery

It is nice to see features you would have liked in prior versions make it into the current one.  In SQL Server 2012, Point-In-Time recovery was made available via SQL Server Management Studio (SSMS).  Point-In-Time recoveries had been available via T-SQL previously; however, there is now a way to visualize the process.  This post demonstrates how to do this with the 2014 version of SSMS.

Full Recovery Model

Point-In-Time recoveries require the database to be in the full recovery model, as we are going to be using log backups to finalize the restore.  In this scenario, I take a full backup nightly and then transaction log backups regularly throughout the day.  We will need the full backup file and ALL the transaction log backups taken up to the point in time to which we want to recover.  In my case it is just one; however, there could be many, depending on your backup strategy and the time of the failure/error.

This post won’t go over the backups themselves, and I just realized I don’t have a post on those.  I will have to link once I set it up.
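In the meantime, a minimal sketch of the backups this scenario assumes might look something like the following; the database name and paths are placeholders, and the database must be in the full recovery model for the log backup to work.

-- Nightly full backup (placeholder names and paths)
BACKUP DATABASE AdventureWorks2012
TO DISK = N'C:\Temp\AW.bak'
WITH INIT, STATS = 10;

-- Transaction log backups throughout the day
BACKUP LOG AdventureWorks2012
TO DISK = N'C:\Temp\AW.trn'
WITH INIT, STATS = 10;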

Restoring A Database

[Screenshot: right-clicking Databases and choosing Restore Database…]

If the database doesn’t already exist, you would right-click on Databases, then choose Restore Database…


[Screenshot: the Restore Database dialog with the backup and log files selected]

If we are restoring a copy of the database on the same instance, we can choose Database as the source and then choose the files we wish to restore from the backup history.  In this example, I chose Device as the source and selected both the full backup file and the transaction log backup, which you can see highlighted in the box.  Clicking the Timeline button on the “Restore to” line allows us to specify the time to which we want to restore.

[Screenshot: the Backup Timeline dialog]

I only need to choose Specific date and time to see the options available to me.  I can either modify the time field or use what looks like a progress bar to slide to the time I want.  Because I chose the database name on the previous window, all the logic needed to restore my database has been updated, and the database will restore to the time I put in the window.

If the database already exists on the server, the steps are similar; however, we will have to break the restore into two steps.  First, we restore the full backup and leave the database in NORECOVERY.  The Tasks -> Restore -> Transaction Logs option is only available when the database is in the restoring state.

There’s Always T-SQL

We also have the option of restoring the database via T-SQL, and while there are lots of options, these seem to work well for me.  This is also done in two steps.  First the full restore; and I KNOW you don’t have your databases on the C:\ drive, so please forgive my poor example.  I’ve got kids to feed and can’t afford big servers with lots of disks.

Restore Database RestoreTest
FROM DISK = 'C:\TEMP\AW.bak'
WITH MOVE 'AdventureWorks2012_Data' TO 'C:\MSSQL\DATA\RestoreTest_Data.mdf'
, MOVE 'AdventureWorks2012_Log' TO 'C:\MSSQL\DATA\RestoreTest_log.ldf'
, NORECOVERY, STATS = 10

Then the transaction log.  The secret sauce is the STOPAT option.  That’s it.  I hope you won’t have to use it every day; however, when that error or issue comes your way, this process will get you back to the point in time you wanted.

RESTORE LOG RestoreTest
FROM DISK = 'C:\Temp\AW.trn'
WITH RECOVERY,
STOPAT = N'2014-07-28T21:48:47'

Unable To Login to SQL Server Error 18456

I installed SQL Server on a machine the other day, connected from my desktop, finished my normal server setup routine, and life went on. Later, I connected to the machine via Remote Desktop, fired up SSMS, and got this error message.
[Screenshot: Login failed for user dialog (error 18456)]

With an accompanying error in the log
Login failed for user ‘MyAccount’. [CLIENT: ]
Error: 18456, Severity: 14, State: 11.

Very strange, I thought to myself. It turns out I had added my account to the local Administrators group but had not created a login on the SQL Server. (It also appears I missed a step on my server checklist, as I normally remove the BUILTIN\Administrators group, but who is keeping score?)

I created the login, added it to the sysadmin role, and I was good to go.

CREATE LOGIN [Domain\Account] FROM WINDOWS;
EXEC sp_addsrvrolemember [Domain\Account], 'sysadmin';
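On SQL Server 2012 and later, sp_addsrvrolemember is deprecated in favor of ALTER SERVER ROLE; the equivalent would be:

ALTER SERVER ROLE sysadmin ADD MEMBER [Domain\Account];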

Failure to calculate the default value

I was installing SQL Server 2012 SP1 on a one-node cluster (long story) and got this error: “There was a failure to calculate the default value of setting PatchResult.” It turns out that when I launched SQLServer2012SP1-KB2674319-x64-ENU.exe, it put its temporary folder (you know, the xcdswer-wersdf-weradf kind) on a drive in the cluster. The patch restarted the SQL Server service, rendering the drive from which the service pack was trying to install unavailable. I copied the temp folder to the local C:\ drive and was able to install with no problems.

Finding Active Heap Tables

As part of a review of a system, I look for tables without clustered indexes; however, invariably I will provide the list of tables to a development team and they will say, “These are old tables we don’t use anymore,” so I am forced to provide additional statistics about how often each heap table is actually used in the database.  Based on a check found in Brent Ozar’s sp_blitz script, I pull usage information about the heap tables.  If there is no recorded usage, they are not reported.

-- For each database, list heap tables that show activity in sys.dm_db_index_usage_stats
EXEC dbo.sp_MSforeachdb 'USE [?]; 
      SELECT DB_NAME(), t.name as TableName, ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates, ius.last_user_scan
      FROM [?].sys.indexes i 
		INNER JOIN [?].sys.objects o ON i.object_id = o.object_id 
		INNER JOIN [?].sys.tables t ON o.object_id = t.object_id
        INNER JOIN [?].sys.partitions p ON i.object_id = p.object_id AND i.index_id = p.index_id 
        INNER JOIN sys.databases sd ON sd.name = ''?'' 
        LEFT OUTER JOIN [?].sys.dm_db_index_usage_stats ius ON i.object_id = ius.object_id AND i.index_id = ius.index_id AND ius.database_id = sd.database_id 
      WHERE i.type_desc = ''HEAP'' AND COALESCE(ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates) IS NOT NULL 
      AND sd.name <> ''tempdb'' AND o.is_ms_shipped = 0 AND o.type <> ''S''';  
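If a heap turns out to be active and the team agrees it should have a clustered index, the fix is the usual one; here is a sketch with placeholder table and column names.

-- Placeholder names: pick a narrow, unique, ever-increasing key where possible
CREATE CLUSTERED INDEX CIX_MyHeapTable_Id
    ON dbo.MyHeapTable (Id);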

Happy Hunting