Finding Active Heap Tables

As part of a system review, I look for tables without clustered indexes; however, invariably when I provide the list of tables to a development team, they will say "These are old tables we don't use anymore", so I have to provide additional statistics about how often each heap table is actually used in the database. Based on a check found in Brent Ozar's sp_Blitz script, I pull usage information about the heap tables. If there is no recorded usage, they are not reported.

EXEC dbo.sp_MSforeachdb 'USE [?];
      SELECT DB_NAME(), t.name AS TableName, ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates, ius.last_user_scan
      FROM [?].sys.indexes i
        INNER JOIN [?].sys.objects o ON i.object_id = o.object_id
        INNER JOIN [?].sys.tables t ON o.object_id = t.object_id
        INNER JOIN [?].sys.partitions p ON i.object_id = p.object_id AND i.index_id = p.index_id
        INNER JOIN sys.databases sd ON sd.name = ''?''
        LEFT OUTER JOIN [?].sys.dm_db_index_usage_stats ius ON i.object_id = ius.object_id AND i.index_id = ius.index_id AND ius.database_id = sd.database_id
      WHERE i.type_desc = ''HEAP''
        AND COALESCE(ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates) IS NOT NULL
        AND sd.name <> ''tempdb'' AND o.is_ms_shipped = 0 AND o.type <> ''S''';
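
One thing worth keeping in mind when presenting these numbers: sys.dm_db_index_usage_stats is cleared when the SQL Server instance restarts, so a heap that shows no usage may simply not have been touched since the last restart rather than being truly abandoned.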

Happy Hunting

Installing SQL 2005 Service Packs

Now hopefully no one out there will need this post, especially because mainstream support for SQL Server 2005 ended in 2011; however, because I learn something new every day, I feel compelled to put this post up. I came across a server that had automatic updates turned on and was trying to install SP2 for SQL Server 2005. It would error out and SQL Server would be shut down. I was trying to install the update for the SQL Server and Analysis Services services. The error in the log was:
Product: Microsoft SQL Server 2005 -- Error 29528. The setup has encountered an unexpected error while Setting Internal Properties. The error is: Fatal error during installation.
When I tried installing the service pack manually, this error was displayed in the GUI.
Service Pack 4 for SQL Server Database Services 2005 ENU (KB2463332) could not be installed. Error code 1603.

The resolution was to delete some registry keys, which still gives me the heebie-jeebies. These were the keys.

SQL Server Entries

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\Setup\SQLGroup
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\Setup\AGTGroup
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\Setup\FTSGroup

Analysis Services Entry

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.2\Setup\ASGroup

After I did this, the installation went as expected. Now I just have to upgrade these bad boys.

Remotely Changing your Password

As I often work remotely from client sites and connect via VPN and Remote Desktop services, I am not in a position to press Ctrl + Alt + Delete and change the password. In order to change the password within a Remote Desktop session, press Ctrl + Alt + End and this will bring up the change-password prompt.

Happy Trails.  :)

Creating a Linked Server to Oracle

I was recently asked to set up a linked server to Oracle, which is something I have done in the past. No problem, I thought. To connect to Oracle you need the Oracle Data Access Components, and I had asked the Oracle DBA if they were installed. He said yes and that everything was configured on his end. I referenced David Browne's blog for a quick review and ensured the Oracle directory was in the PATH variable.

I then proceeded to create the linked server and got the following error: ORA-12514: TNS:listener does not currently know of service requested in connect descriptor

There ended up being two issues. The first was that the TNSNames.ora file did not contain an entry for the database I was trying to connect to; I guess the Oracle DBA didn't think that was part of his setup. Secondly, for some reason, the Oracle_Home variable was not set, it was blank. I am not sure how that is possible; however, once I put the correct path in, I was golden and off and running.
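
For anyone setting this up from scratch, here is a minimal sketch of the linked server creation itself. The linked server name, TNS alias, and credentials below are all made up; the only real requirements are that the OraOLEDB.Oracle provider is registered and that the TNS alias exists in TNSNames.ora.

-- Create a linked server against the Oracle OLE DB provider.
-- ORCLTEST is a hypothetical TNS alias; use the entry from your TNSNames.ora.
EXEC master.dbo.sp_addlinkedserver
      @server     = N'ORACLE_LINK'
    , @srvproduct = N'Oracle'
    , @provider   = N'OraOLEDB.Oracle'
    , @datasrc    = N'ORCLTEST';

-- Map SQL Server logins to an Oracle account (hypothetical credentials).
EXEC master.dbo.sp_addlinkedsrvlogin
      @rmtsrvname  = N'ORACLE_LINK'
    , @useself     = 'FALSE'
    , @rmtuser     = N'oracle_user'
    , @rmtpassword = N'oracle_password';

-- Quick smoke test.
SELECT * FROM OPENQUERY(ORACLE_LINK, 'SELECT 1 FROM DUAL');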

Dropping a publisher when the distributor is down

Working on a new server, I came across a replication process that had been forgotten about, and the pieces were disconnected. The distributor server/process had been installed on another server, and when I ran the script below to remove the publisher from the database, I got the error that follows it.

DECLARE @publication AS sysname;
SET @publication = N'SmokeTest';

-- Remove a transactional publication.
USE [ReportServerTempDB]
EXEC sp_droppublication @publication = @publication;

OLE DB provider "SQLNCLI10" for linked server "repl_distributor" returned message "Login timeout expired".
OLE DB provider "SQLNCLI10" for linked server "repl_distributor" returned message "A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.".
Msg 11001, Level 16, State 1, Line 0
TCP Provider: No such host is known.

The linked server was pointing to a server that no longer existed, and I thought I was stuck. I tried lots of options but ultimately came across this blog, which demonstrated how to forcefully remove the publisher when I couldn't connect to the distributor.

Spoiler: The trick is to use the @ignore_distributor=1 parameter and you should be good to go.

DECLARE @publication AS sysname;
SET @publication = N'SmokeTest';

-- Remove a transactional publication.
USE [ReportServerTempDB]
EXEC sp_droppublication @publication = @publication
    , @ignore_distributor = 1;

You can’t shrink a logfile when . . .

I was asked to look at a database server for the first time. One of the tasks I was given was to shrink the log file of a database, as the previous owner could not. The DB was only 10 MB, but the log file was 50 GB. No problem, I said to myself; like taking candy from a baby. This baby, however, was trained in the art of MakeMeCrazy.

Most of the advice out there prompts you to find the root cause of the growth, as the log is likely to grow again; shrinking the log file is not a fix, just a band-aid. I should have pondered this question longer. This server was running SQL 2008, so the TRUNCATE_ONLY option wasn't available; my plan was to change the recovery model to simple and then shrink the log, only the database was already in simple mode. I tried shrinking the log and the command executed but did not change the size of the file.

Strange. Why would a 10 MB DB need a 50 GB log file? Can you think of a reason? I admit I was dumbfounded and actually had some strange ideas pop into my head.

The reason?? It was set up as a publisher at one point and had been forgotten about. I removed the publisher and bam, the log shrunk right down.
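
If you run into the same situation, a quick way to see what is holding the log hostage, along with one way to clear out the leftover replication objects, is sketched below. MyDatabase and the logical log file name are placeholders, and sp_removedbreplication is just one route; I removed the publisher as described in the earlier post.

-- What is preventing log truncation?  REPLICATION was the culprit in my case.
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyDatabase';

-- Strip the leftover replication objects from the database, then shrink the log.
EXEC sp_removedbreplication 'MyDatabase';

USE MyDatabase;
DBCC SHRINKFILE (MyDatabase_log, 100);   -- logical log file name and target size (MB) are placeholders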

Importing or Exporting to Access

I had a client request to export some data to Access 2010 for their downstream use. I have done this in the past and thought it a no-brainer; however, there is always a little trick, hence the post. :)

Because I had already installed Office 2010, I assumed the Microsoft Access Database Engine for 2010 would get me what I needed. Because I was using a new machine, I had to install the 2007 Office System Driver: Data Connectivity Components. After I did that, I could see the Microsoft Office 12.0 Access Database Engine OLE DB Provider. I put in the location of the Access DB, and I was off and running.
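
If you need to read from or write to the same Access file from T-SQL rather than the Import and Export Wizard, a minimal sketch using OPENROWSET with the ACE provider follows. The file path and table name here are made up, and the instance must have ad hoc distributed queries enabled.

-- Allow ad hoc OPENROWSET queries (server-wide setting).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'ad hoc distributed queries', 1;
RECONFIGURE;

-- Read from a hypothetical Access table via the ACE OLE DB provider.
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'C:\Exports\ClientData.accdb';'Admin';'',
                'SELECT * FROM Registrations');

For an export, INSERT INTO OPENROWSET(...) with the same provider works as well, provided the target table already exists in the Access database.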

Upgrading SharePoint List Adapter in SSIS

A server upgrade prompted me to migrate some SSIS packages. These packages were going from SQL 2008 R2 to SQL 2008 R2, so there was no change there; however, the new server had a different version of the SharePoint List Adapters, which allow you to easily write to and read data from SharePoint lists. When I opened the package on the new server, the list adapter was not recognized and I got the following error.

"Error 1 Validation error. %Job Name% The component metadata for "component "SharePoint List Destination" (529)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed. %Job Name% 0 0"

The 'version' of the adapter was the same; however, I was using an updated release, as shown by the release date for the adapter. Upgrading was straightforward once I found this post on CodePlex. My approach was to browse to the package in Windows Explorer and open the file in TextPad. Now, I guess I should lament here that I didn't do it in PowerShell, but one thing at a time! I then replaced the PublicTokenID as mentioned in the post, saved the text file, and had to re-open the SSIS project. There is also a newer version of the adapter itself; one would need to update both the Version and the PublicTokenID if upgrading to that latest version.

The last step was to add a SharePoint credential to the list adapter, as this is a new feature. I right-clicked in my connection manager and chose New Connection. I was then able to choose SPCRED and enter the appropriate information. My package was converted to the new server and worked well.

My Experience on SQL Cruise

Oh Yeah!

One of the great aspects of the SQL Server community is the willingness of people to share what they know as they discuss resolutions to the problems they face. There are a variety of opportunities to engage these people: blogs, user groups, SQL Saturdays, the Summit, and there may be a few I am not aware of; however, I can say that none of them are quite like SQL Cruise. I'm glad I found it; SQL Cruise is the real deal.

I did not, at first, consider SQL Cruise to be the right work/play model. I had not been on a cruise before and I wasn't sure what to expect. My track record shows I tend to do things on the cheap, and cruising screamed expensive, at least to my ears. The price of the training itself was another concern, as I would be paying for it out of my own pocket. While Tim Ford (blog|twitter) is a recognized name in the community, I hadn't met him, nor had I met the other speakers. A self-proclaimed people person, I needed a connection, and that connection came in the form of Wayne Sheffield (blog|twitter), who attended the Alaska 2012 cruise. I met Wayne at the Richmond SQL Server user group and had attended several of his SQL Saturday and user group sessions. Wayne gave it his seal of approval, and after my wife and I agreed the numbers would work out, we booked our ticket on the SQL Cruise.

The Cruise

After conference activity area

As a first-time cruiser, I was pleasantly surprised by the whole experience and would go again. If I were not part of a conference, a seven-day cruise might be a little long; however, there is plenty to choose from to occupy yourself on board. As a frequent foreign traveler, I enjoyed chatting with the staff about where they were from, which locations we had visited in common, and where they enjoyed going on the islands. Everyone was extremely friendly, and when I found out they worked 10 hours a day, 7 days a week, for 9 months, I was super impressed.

Upward and Onward!

Being in the Caribbean was very nice, and I had not traveled to that part of the world before. I was surprised how many Spanish speakers there were on the islands, and many had lived and worked on the different islands their whole lives. Again, being one to do things on the cheap, I had purchased a Lonely Planet book beforehand, and we went to several local places to get a better feel for life on the island. On St Thomas in the Virgin Islands, we boarded a local bus and I struck up a conversation with a high school student. I asked him what he called US citizens from the 'mainland'. His response: "tourists". :)

Beaches in St Maarten

Favorite island treat: French pastries on the French side of St Maarten
Favorite island activity: Snorkeling on St John
Favorite island memory: Traveling back to the cruise ship with Big Joe and his son, Jason Brimhall (blog|twitter) and his wife, and my wife. We had taken local transportation and no one, even me, was sure we were headed in the right direction until we could actually see the cruise ship. We shared the bus with several high school students.

Chris Bell (blog|twitter) is detailing his adventures on his site for a better day-by-day feel.

The Training

Cruise Schedule

You do, after all, sign up for SQL Cruise because of the training, and the technical presentations, given in two-hour blocks, were very good; however, they are what you would expect at any SQL Saturday. I did learn something new in each session, and I have a notebook with pages of writing to prove it. The real value came in less structured settings called office hours. In those sessions, I could ask 'my' questions of either a single person or the group I happened to be with at the time and get different viewpoints. We all know there are a million different ways to do the same thing, and I personally valued getting different opinions. In preparation for the cruise, I made a list of some of the areas I was struggling with or needed help with and asked about them. Some were discussed in front of the whole group, and I was a little nervous about being labeled a 'stupid question asker', but then I realized I was with the SQL community and EVERYONE was willing to help.

Getting my learn on!

Not all of the training is technical in nature. Most, if not all, of the attendees have been DBAs for 5+ years and, like me, have good technical skills. Many are looking to expand the nature of their work, and it was fascinating to discuss the paths they have traveled. For instance, talking with Tim about his experience going from developer to DBA to organizing SQL Cruise, and talking with Neil Hambly (blog|twitter) about how he got his job with Confio, showed me there are multiple ways to find fulfilling work in our field.

(Possible) Favorite training moment: The class reaction when Kevin Kline opened the execution plan I submitted for the execution plan contest. The collective groan in the room let me know I wasn't going crazy.
(Possible) Favorite lesson learned: The office hours discussion on communicating with management. See Takeaway #4.

The Takeaway

There are lots of little things I took away from the experience. I have areas I need to improve in, some of which I knew about before the cruise, and I have the following plan based on my experience on the cruise, in no particular order.

  1. Become proficient in PowerShell. This goal is really about being comfortable enough with the PowerShell language that I can begin to administer my environment with it. I would like to begin collecting server data/stats about the different environments and having that in a central place. Phase II would be to create my Minion (you should have been there), a repository of metadata that each system uses to perform admin functions; for example, the backup location is in this table, and an update to it would affect the next backup job cycle. I ordered "Learn Windows PowerShell 3 in a Month of Lunches" to start me in this process.
  2. Implement policy-based management. I have been bitten more than once by a 'standard' setting that was different on that one server. There are several ways to implement this; however, I just need to start.
  3. Master the execution plan. As attendees, we all received a copy of Grant Fritchey's "SQL Server Execution Plans" courtesy of Red Gate. As the winner of SQL Sentry's hairy execution plan contest, I received a copy of SQL Sentry's Plan Explorer Pro. No specific session focused on this; however, I realize I need to put a little more effort into reading execution plans, and now I have some awesome tools to do it with.
  4. Create a management report that allows management to easily see the health of the SQL Server. This is not the same type of report a DBA would want to see, just the number of executions per second and how long those executions are taking, with some history for comparison (see the sketch after this list). Stretch goal: make it available to the SQL community in a way users can easily deploy and use.
  5. Go on another SQL Cruise.  I would hop on the Alaska cruise in a minute, but my schedule won’t allow it this year.  I would surely enjoy chatting with Buck Woody and the others, but I will have to take a rain check and check out the 2014 offerings.
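
As mentioned in item 4, the executions-per-second number can come from the cumulative Batch Requests/sec counter. Here is a minimal sketch of one way to sample it; the 10-second interval is arbitrary.

-- Sample the cumulative Batch Requests/sec counter twice and difference the
-- values to get a per-second rate over the interval.
DECLARE @first BIGINT, @second BIGINT;

SELECT @first = cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Batch Requests/sec'
  AND object_name LIKE '%SQL Statistics%';

WAITFOR DELAY '00:00:10';

SELECT @second = cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Batch Requests/sec'
  AND object_name LIKE '%SQL Statistics%';

SELECT (@second - @first) / 10 AS BatchRequestsPerSec;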

As a side note, I went on the cruise with my wife and she had a great time as well. If you can swing it, I recommend bringing your spouse along. Besides the technical sessions, everyone has the opportunity to get together, and we enjoyed spending time with the other family members. My wife now has a few more Facebook friends she might also meet on the SQL trail. It was a great time and I am glad I had the opportunity to go. I would recommend it to anyone.

Tim posted some great photos on his site.  Check them out!

Coalesce instead of Max

I was asked to tune a query used in an export of registration data. A person registers for an activity and answers questions about it. This query pulls from 19 tables, one of which has a one-to-many relationship. The export creates a column for each row in the many-side table. It is on this subject I would like to focus. For simplicity, I have reduced my examples to two tables. My setup looks something like this.

(Diagram: QuestionAnswerSetup, showing the Registration and RegistrationAnswer tables)

The T-SQL below will create this setup. Our end goal is to match the questions answered by each registrant to the registration data.

CREATE TABLE Registration
( RegID    INT
, RegistrantFirstName    VARCHAR(25)
)
INSERT INTO Registration (RegID, RegistrantFirstName)
VALUES (1, 'Carlos'),
(2, 'Wayne'),
(3, 'Jeff')

CREATE TABLE RegistrationAnswer
( RegID    INT
, RegQuestion    VARCHAR(25)
, RegAnswer    VARCHAR(25)
)

INSERT INTO RegistrationAnswer (RegID, RegQuestion, RegAnswer)
VALUES (1,'Favorite Color','Red'),
(2,'Favorite Color','Blue'),
(3,'Favorite Color','Red'),
(1,'Favorite Food','Pizza'),
(2,'Favorite Food','Chimichangas'),
(3,'Favorite Food','Hamburgers')

The previous logic used the MAX() function to break the rows into columns. It seemed to work well.

--Old Way
SELECT Registration.RegID, RegistrantFirstName, Answers.*
FROM Registration
LEFT OUTER JOIN (SELECT MAX(CASE WHEN RegQuestion='Favorite Color' THEN RegAnswer ELSE null END) AS [Favorite Color]
, MAX(CASE WHEN RegQuestion='Favorite Food' THEN RegAnswer ELSE null END) AS [Favorite Food]
, reg.RegID
FROM [dbo].[Registration] reg
INNER JOIN [dbo].[RegistrationAnswer] ans ON reg.regid = ans.regid
GROUP BY reg.RegID) Answers ON Answers.regID = registration.regid;

I found, however, that if I commented out the LEFT OUTER JOIN portion of the query (remember, the real query has 19 tables), the export would suddenly become very fast. With that in mind, I began to look for ways of moving rows to columns and came across an example using the COALESCE function. The result is the following: I use a CTE to get the records from the many-side table, and then I use the COALESCE function with my GROUP BY, replacing the MAX function.

--New Way
WITH Answer_CTE (RegID, RegQuestion, RegAnswer) AS(
SELECT RegID, RegQuestion, RegAnswer
FROM RegistrationAnswer
)

SELECT Registration.RegID, Registration.RegistrantFirstName, Answers.*
FROM Registration
LEFT OUTER JOIN (SELECT RegID
, coalesce(min(case RegQuestion when 'Favorite Color' then RegAnswer else null end),'') as [Favorite Color]
, coalesce(min(case RegQuestion when 'Favorite Food' then RegAnswer else null end),'') as [Favorite Food]
FROM Answer_CTE cte
GROUP BY RegID) answers ON Registration.RegID = answers.RegID;
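
For reference, with the sample data above both queries return one row per registrant with the answers pivoted into columns (only the column order differs slightly between the two):

RegID  RegistrantFirstName  Favorite Color  Favorite Food
1      Carlos               Red             Pizza
2      Wayne                Blue            Chimichangas
3      Jeff                 Red             Hamburgers

Note that the new query happens to use MIN rather than MAX; with only one answer per question per registrant, either aggregate returns the same value.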

What was the impact? I turned STATISTICS IO and STATISTICS TIME on and got the following results. I ran each query several times so everything would be set (loaded into memory, plan cached, etc.). These results are from the query I was tuning. For readability purposes, I removed the read-ahead reads, lob logical reads, lob physical reads, and lob read-ahead reads, as each was zero for both queries.

Old Query

SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.

SQL Server Execution Times:
CPU time = 0 ms,  elapsed time = 0 ms.

SQL Server Execution Times:
CPU time = 0 ms,  elapsed time = 0 ms.

(1357 row(s) affected)
Table ‘Worktable’. Scan count 0, logical reads 0, physical reads 0
Table ‘two’. Scan count 1, logical reads 6896, physical reads 0
Table ‘three’. Scan count 1, logical reads 56069, physical reads 0
Table ‘four’. Scan count 3, logical reads 4628, physical reads 0
Table ‘five’. Scan count 2, logical reads 10090, physical reads 0
Table ‘six’. Scan count 0, logical reads 0, physical reads 0
Table ‘seven’. Scan count 0, logical reads 4165, physical reads 0
Table ‘eight’. Scan count 0, logical reads 4520, physical reads 0
Table ‘nine’. Scan count 1489, logical reads 9299, physical reads 0
Table ‘ten’. Scan count 0, logical reads 4569, physical reads 0
Table ‘eleven’. Scan count 0, logical reads 3, physical reads 0
Table ‘twelve’. Scan count 0, logical reads 2, physical reads 0
Table ‘thirteen’. Scan count 1, logical reads 35, physical reads 0
Table ‘fourteen’. Scan count 1, logical reads 2, physical reads 0
Table ‘fifteen’. Scan count 1, logical reads 2, physical reads 0
Table ‘sixteen’. Scan count 335, logical reads 720, physical reads 0
Table ‘Worktable’. Scan count 0, logical reads 0, physical reads 0
Table ‘seventeen’. Scan count 335, logical reads 721, physical reads 0
Table ‘eighteen’. Scan count 1, logical reads 22, physical reads 0
Table ‘nineteen’. Scan count 1, logical reads 2, physical reads 0
Table ‘twenty’. Scan count 1, logical reads 100039, physical reads 0

SQL Server Execution Times:    CPU time = 13093 ms,  elapsed time = 20205 ms.

The query ran in almost 20 seconds. The only change I made was to use the CTE and the COALESCE function instead of MAX.

New Query

SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.

SQL Server Execution Times:
CPU time = 0 ms,  elapsed time = 0 ms.

SQL Server Execution Times:
CPU time = 0 ms,  elapsed time = 0 ms.

SQL Server Execution Times:
CPU time = 0 ms,  elapsed time = 0 ms.

(1357 row(s) affected)
Table ‘Worktable’. Scan count 2714, logical reads 10947, physical reads 0
Table ‘two’. Scan count 0, logical reads 13422, physical reads 0
Table ‘three’. Scan count 1357, logical reads 17500, physical reads 0
Table ‘four’. Scan count 2, logical reads 8680, physical reads 0
Table ‘five’. Scan count 2, logical reads 10090, physical reads 0
Table ‘six’. Scan count 0, logical reads 0, physical reads 0
Table ‘seven’. Scan count 0, logical reads 4165, physical reads 0
Table ‘eight’. Scan count 0, logical reads 4520, physical reads 0
Table ‘nine’. Scan count 1489, logical reads 9299, physical reads 0
Table ‘ten’. Scan count 0, logical reads 4569, physical reads 0
Table ‘eleven’. Scan count 0, logical reads 3, physical reads 0
Table ‘twelve’. Scan count 0, logical reads 2, physical reads 0
Table ‘thirteen’. Scan count 1, logical reads 35, physical reads 0
Table ‘fourteen’. Scan count 1, logical reads 2, physical reads 0
Table ‘fifteen’. Scan count 1, logical reads 2, physical reads 0
Table ‘sixteen’. Scan count 335, logical reads 720, physical reads 0
Table ‘Worktable’. Scan count 0, logical reads 0, physical reads 0
Table ‘seventeen’. Scan count 335, logical reads 721, physical reads 0
Table ‘eighteen’. Scan count 1, logical reads 22, physical reads 0
Table ‘nineteen’. Scan count 1, logical reads 2, physical reads 0
Table ‘twenty’. Scan count 1, logical reads 100045, physical reads 0

SQL Server Execution Times:    CPU time = 5797 ms,  elapsed time = 7196 ms.

This query ran in just over 7 seconds. Because these are exports and the data is not always in memory, I have found there is a larger discrepancy in the elapsed time the first time each query is run. For the purposes of comparison, I ran each of them multiple times before taking the stats.