TempDB Configuration Matters

TempDB Database

A re-post of a posting originally on b2bsol.com.

For each SQL Server installation there are several accompanying system databases that help keep SQL Server running and functional.  They store information about security, what the server is doing, and how the system should process certain functions.  The TempDB database doesn't keep permanent information; in fact, each time SQL Server restarts, the TempDB database is purged and rebuilt as if it were a new database.  Why, then, should I care about the TempDB database?  Well, I am glad you asked.

Lots of Hats

The TempDB database wears several hats; its duties include sorting, local and global temporary tables, index operations, hash joins, XML variables, spooling, triggers, snapshot isolation, and other internal functions.  In short, it is the workhorse of the environment, and ALL databases on the system will interact with it at some point, so its configuration becomes very important.

Transaction Dependent

Because table variables and hash joins are stored in the TempDB database, queries that use these objects and operations do part of their processing in TempDB.  When a query needs to move a process to TempDB, it creates a worktable and writes the data out.  What happens when I have 1,000 transactions per second and every one of them has a hash join?  I will have a line out the door of threads waiting for their turn to write to TempDB, and I can potentially run into GAM (Global Allocation Map) and SGAM (Shared Global Allocation Map) contention.  Also, because the transaction depends on these worktables being processed, the speed at which they can be written and read becomes important.
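
A quick way to check whether a system is hitting this kind of allocation contention is to look for tasks waiting on page latches against TempDB (database_id 2); this is a rough diagnostic sketch rather than a complete monitoring approach.

SELECT  wt.session_id
      , wt.wait_type
      , wt.wait_duration_ms
      , wt.resource_description  /* formatted as database_id:file_id:page_id */
FROM    sys.dm_os_waiting_tasks AS wt
WHERE   wt.wait_type LIKE 'PAGELATCH%'
        AND wt.resource_description LIKE '2:%';  /* database_id 2 is TempDB */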

Providing some relief

There are two recommendations that can provide immediate relief to your TempDB environment.  The first is to create multiple TempDB data files of the same size, which helps alleviate the GAM/SGAM contention.  The second is to move the TempDB data files to the fastest disks in the system and/or spread them across as many spindles as possible.  TempDB is the first candidate for flash drives should the entire system not be able to take advantage of those disks.
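
As a rough illustration of the first recommendation, the files can be sized and added with ALTER DATABASE.  The logical name tempdev is the default primary TempDB data file; the second file's logical name, path, and the sizes shown are placeholders to adjust for your environment.

/* Make the existing primary data file the target size (example values only) */
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 2048MB, FILEGROWTH = 512MB);

/* Add another data file of the same size (logical name and path are placeholders) */
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf', SIZE = 2048MB, FILEGROWTH = 512MB);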

How large should my TempDB files be? Determining the size of the TempDB files may take a bit of trial and error; however, if you already have databases on the system, one way to help make the decision is to run DBCC CHECKDB WITH ESTIMATEONLY and look at how much TempDB space the CHECKDB command estimates it will need.
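
For example (the database name below is a placeholder), the output includes lines reporting the estimated TempDB space needed for the checks:

DBCC CHECKDB ('YourDatabase') WITH ESTIMATEONLY;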

How many TempDB Files should I have? While there are varying ideas, the one I hold to is 1 file per CPU core up to 8 files.  Monitoring will help provide insight if more are needed in the future.
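
One rough way to get a starting point for that count is to see how many schedulers SQL Server is actually using; this is a sketch, not an official sizing formula.

SELECT COUNT(*) AS VisibleOnlineSchedulers
FROM   sys.dm_os_schedulers
WHERE  status = 'VISIBLE ONLINE';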

The Importance of Backups

A re-post of a posting at b2bsol.com.

Too many organizations do not have adequate protection of their data and are susceptible to data loss.  While security is important and you might think this post is about implementing policy to limit your exposure to hacking, I am talking about something much more basic than that–I am talking about database backups.  A database backup is the first step in ensuring data availability and limiting exposure to corruption or human error.

What kind of backup strategy do you need?  Well, what is your tolerance for data loss?

To help you answer this question, there are two components you need to consider.

RPO – Recovery Point Objective

TechTarget defines RPO as “the age of files that must be recovered from backup storage for normal operations to resume if a computer, system, or network goes down as a result of a hardware, program, or communications failure.”  My definition would be something like: the moment in time you want to be able to restore to.  If you experience corruption or disk failure, how close do you need to get to that point in time?  Defining this metric, which will vary from system to system, gives you your RPO.

RTO – Recovery Time Objective

TechTarget defines RTO as “the maximum tolerable length of time that a computer, system, network, or application can be down after a failure or disaster occurs.”  My definition would be: the time needed to restore a system to usable functionality. I should note this time includes the alerting and response of your team if user intervention is required.

It would be easy to say, “I want less than 1 second of data loss (RPO) and less than 5 minutes of downtime (RTO)”; however, don't expect to pay TGIF prices for Ruth's Chris service.  Microsoft has made great strides in giving SQL Server many options for High Availability and Disaster Recovery and the ability to keep the system up; however, none of these solutions remove the requirement to take backups.  The amount of history you keep will depend on your business requirements and the costs associated with keeping that storage.
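
As a simple sketch (the database name, paths, and schedule are placeholders, and the log backups assume the FULL recovery model), pairing nightly full backups with frequent log backups bounds the RPO at roughly the log backup interval:

/* Nightly full backup */
BACKUP DATABASE YourDatabase
TO DISK = 'E:\Backup\YourDatabase_Full.bak'
WITH CHECKSUM, COMPRESSION;

/* Log backup, scheduled every 15 minutes, keeps potential data loss to about 15 minutes */
BACKUP LOG YourDatabase
TO DISK = 'E:\Backup\YourDatabase_Log.trn'
WITH CHECKSUM, COMPRESSION;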

If your organization does not have the RPO and RTO points defined, it is definitely time to make it happen.

Unable To Login to SQL Server Error 18456

I installed SQL Server on a machine the other day, connected from my desktop, finished my normal server setup routine, and life went on. Later, I connected to the machine via Remote Desktop, fired up SSMS, and got this error message.
LoginFailure18456

With an accompanying error in the log
Login failed for user ‘MyAccount’. [CLIENT: ]
Error: 18456, Severity: 14, State: 11.

Very strange, I thought to myself. It turns out I had added my account to the local Administrators group but had not created a login on the SQL Server. (It also appears I missed a step on my server checklist, as I normally remove the Builtin\Administrators group, but who is keeping score?)

I created the login, added it to the sysadmin role, and I was good to go.

CREATE LOGIN [Domain\Account] FROM WINDOWS;
EXEC sp_addsrvrolemember 'Domain\Account', 'sysadmin';

Newest Regional Mentor for PASS

This month I am pleased to accept the invitation to serve as a Regional Mentor for the Mid-Atlantic region of the PASS organization. It was not that long ago I stepped into my first chapter meeting in Richmond and discovered the unique mix that is the SQLFamily. Many local meetings, several SQLSaturdays, and the SQLSummit have only added to the experience and I am happy to be able to share some of these experiences with others.

As a Regional Mentor, I will serve as a resource for the local chapters, providing help as they operate. I will be looking to increase the number of chapters in the region, strengthen the ones we already have, and maybe even add another SQLSaturday to the mix (looking at you, Norfolk or Lynchburg). I am looking forward to it and, as always, I will see you on the SQL trail.

Verizon Fios setting

I recently switched to Verizon FIOS and spent more time than I care to admit getting my printer set up.  The printer is attached to the network and was working prior to the change; however, when I tried to connect to the printer from my laptop, I would get a connection failure error.

The resolution was to disable the IGMP Proxy under the advanced settings and ta-da!, I could connect again.

IGMPProxy

SSIS Conditional Logic Based On Stored Procedure Result

I was looking to implement some conditional logic in an SSIS package. In my case, I wanted the package to halt if some validation rules were not met. The logic for these validation rules lives in a stored procedure, and I needed a way to return a flag to the SSIS package indicating the success or failure of the validation. I found I could do this with the Result Set tab on an Execute SQL Task. Here is what I did to set this up.

First, I created a stored procedure with the validation rules. For simplicity, I have created a stored procedure that returns a value. You will notice I have named the returning result ‘ReturnResult’.

CREATE PROCEDURE mysp_ReturnAValue
AS
BEGIN
    DECLARE @ReturnVariable SMALLINT
    SET @ReturnVariable = 1

    SELECT @ReturnVariable AS ReturnResult
END

ConditionalLogic_ProcessOverview

Second, I created an SSIS package with a variable of type Int32 named ProcedureResults and an Execute SQL Task component. After establishing the connection on the SQL Task, I changed the ResultSet property to Single Row.

ConditionalLogic_ExecSQLTask

I then clicked on the Result Set tab on the left and linked the returning value from my stored procedure to the variable in the package.

ConditionalLogic_ResultSet

Third, I added two data flow tasks and linked them with precedence constraints so the Execute SQL Task would be the first item executed. I only want a data flow task to execute based on the results of the stored procedure.

Fourth, I modified the Evaluation Operation of the constraint to be of type Expression and added the expression it should use when the package runs. On the success constraint, the value was @ProcedureResults > 0.

ConditionalLogic_ResultSet

The failure constraint? You guessed it: @ProcedureResults <= 0.

**Interesting side note: My stored procedure returns a SMALLINT.  When I created this on a client system, the stored procedure returned a SMALLINT data type, just like my example above; however, the Execute SQL Task was unable to process the result set.  No matter what the stored procedure returned, execution would stop at the Execute SQL Task and the package would report success without running the downstream tasks.  It was not until I changed the SMALLINT to an INT that the task was able to process the result and continue in the package.  When I created the package on my local machine, I was unable to recreate the issue.  I haven't been able to put my finger on the difference between the environments, but it seems to have gotten fixed somewhere.  :)
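
For reference, the workaround was nothing more than changing the data type of the returned value so it maps cleanly to the Int32 package variable; a minimal version of the altered procedure might look like this:

ALTER PROCEDURE mysp_ReturnAValue
AS
BEGIN
    /* INT, rather than SMALLINT, mapped cleanly to the Int32 package variable on the client system */
    DECLARE @ReturnVariable INT
    SET @ReturnVariable = 1

    SELECT @ReturnVariable AS ReturnResult
END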

Free Space In Database Files

File growth can stall the system, as SQL Server has to wait for the file to grow before it can resume operations.  Of course, the impact depends on the size of the database, the growth rate, and so on; however, the size of your databases is something you will want to keep an eye on.  The script below gives you this information so you can better monitor growth and grow your databases during a maintenance window.

This query will only grab the information for the database you are connected to.

SELECT  s.name AS [Name]
, s.physical_name AS [FileName]
, s.size * CONVERT(FLOAT, 8) AS [Size] /*File size on disk, in KB*/
, CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT)
* CONVERT(FLOAT, 8) AS [UsedSpaceInKB]
, (CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8))
/ 1024 AS [UsedSpaceInMB]
, ((s.size * CONVERT(FLOAT, 8)
- CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8)))
/ 1024 AS 'Available Free Space In MB' /*Matches SSMS file report*/
, (s.size * CONVERT(FLOAT, 8)
- CAST(FILEPROPERTY(s.name, 'SpaceUsed') AS FLOAT) * CONVERT(FLOAT, 8))
/ (s.size * CONVERT(FLOAT, 8)) * 100 AS 'Percent Free'
, growth
, max_size
, state_desc
FROM    sys.master_files AS s
WHERE   s.database_id = DB_ID()

Getting a little fancy, we can grab the information for each database on the instance; however, you would have to insert the results into a table to do any ordering (a sketch of that follows the script and the attribution note below).

EXEC dbo.sp_MSforeachdb 'USE [?]; SELECT
s.name AS [Name],
s.physical_name AS [FileName],
s.size * CONVERT(float,8) AS [Size], /*File size on Disk*/
CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8) AS [UsedSpaceInKB]
, (CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8)) / 1024 AS [UsedSpaceInMB]
, ((s.size * CONVERT(float,8) - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8))) / 1024 AS ''Available Free Space In MB'' /*Matches SSMS file report*/
, (s.size * CONVERT(float,8) - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float)* CONVERT(float,8)) / (s.size * CONVERT(float,8)) * 100 AS ''Percent Free''
, growth, max_size, state_desc
FROM
sys.master_files AS s
WHERE s.database_id = db_id()'
**I was delayed in getting this post out and I no longer remember where I got the original code from. I am pretty sure it came from a trace file showing what SSMS did to pull this information. If this is your script (or it started as your script), I am sorry I did not note it above.**
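
Here is a minimal sketch of the "insert into a table" approach mentioned above; the temp table name and column list are my own (hypothetical), and the inner query is a trimmed-down version of the script above.

CREATE TABLE #FileSpace
(
    DatabaseName    SYSNAME,
    [Name]          SYSNAME,
    [FileName]      NVARCHAR(260),
    SizeInKB        FLOAT,
    UsedSpaceInKB   FLOAT,
    FreeSpaceInMB   FLOAT,
    PercentFree     FLOAT
);

/* Each database's result set is appended to the temp table */
INSERT INTO #FileSpace
EXEC dbo.sp_MSforeachdb 'USE [?];
SELECT  DB_NAME() AS DatabaseName
      , s.name
      , s.physical_name
      , s.size * CONVERT(FLOAT, 8)
      , CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS FLOAT) * CONVERT(FLOAT, 8)
      , (s.size - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS FLOAT)) * CONVERT(FLOAT, 8) / 1024
      , (s.size - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS FLOAT)) / NULLIF(s.size, 0) * 100
FROM    sys.master_files AS s
WHERE   s.database_id = DB_ID();';

/* Now the rows can be ordered however you like */
SELECT * FROM #FileSpace ORDER BY PercentFree;

DROP TABLE #FileSpace;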

Excel Connection Manager

I find myself needing to import files from time to time and usually use SSIS to import the data–normally to a temp table.  I did so the other day and got this error.

SSIS package “MyTest.dtsx” starting.
Information: 0x4004300A at 2008 10K, SSIS.Pipeline: Validation phase is beginning.
Error: 0xC00F9304 at MyTest, Connection manager “Excel Connection Manager”: SSIS Error Code DTS_E_OLEDB_EXCEL_NOT_SUPPORTED: The Excel Connection Manager is not supported in the 64-bit version of SSIS, as no OLE DB provider is available.
Error: 0xC020801C at 2008 10K, Excel Source [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager “Excel Connection Manager” failed with error code 0xC00F9304.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.Error: 0xC0047017 at 2008 10K, SSIS.Pipeline: component “Excel Source” (1) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at 2008 10K, SSIS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at 2008 10K: There were errors during task validation.SSIS package “MyTest.dtsx” finished: Failure.

I normally just run the package in Visual Studio, so I changed the Run64BitRuntime property to False in the project properties, and that allowed me to execute the package successfully.  Should you need to save the package and run it in a scheduled job, you will need to make this change elsewhere as noted here.

This setting can be found by clicking on Properties > [Package Name] Properties, where [Package Name] is the name of the package you are working with.

SSISPropertyConfig_64Bit

SQL Saturday 173 Washington DC

Saturday December 9th found me in Washington DC for SQL Saturday 173.  As we are planning an event in Richmond, I opted to volunteer more than I have in the past so I could be closer to the logistics and see what the organizers had done. #SQLSat173 was organized by Chris Bell, a first time organizer, with a little assistance from his wife.  :)

I knew putting on a SQL Saturday took a lot of work, and the DC team put it all together nicely.  While the content of the sessions alone would make most attendees happy, the DC team provided a few extras to give attendees lots of reasons to enjoy themselves: there was food all day long, a Microsoft Kinect in the lobby, a hot lunch from Maggiano's, and a Kindle Fire raffle for those that printed their SpeedPASS.  The DC team had wonderful sponsors that helped make most of that possible, and I hope the RVA team can replicate some of it, with our own twist of course.

The DC team provided vests as a thank you to the speakers and volunteers.  Here I am sporting mine.  :)

Vest

Of course, I did make it to several sessions and I had a great time.  My dad actually tagged along and enjoyed himself.

The one other treat was meeting with Karla Kay and chatting with her about some of the ideas we had for our event.  As always, the SQL community is wonderful and I hope to see you out on the trail!

Roanoke SQL Server Users Group

October 18th, I spoke at the Roanoke SQL Server users group.  While not a big group, there was good camaraderie and I enjoyed meeting the people there.  I spent the night in Roanoke with the family and we enjoyed seeing the town.  If anyone in Roanoke is looking for a DBA . . . I may know someone.  :)

I spoke on performance tuning with a focus on indexes.  This was my first presentation on this subject, and I had some trouble with the demo, so I have some work to do there; however, here are the scripts I used for the presentation.

If anyone from the group has some feedback, drop me a line.  I would like to hear it.