Ryan J. Adams for PASS Board

Voting for the PASS Board has opened, and I ask you to vote for Ryan in this election.  Our SQL community is the BEST on the planet, and I know Ryan will do everything in his power to help grow the community and help new members feel right at home.

Volunteer positions are difficult and can be demanding.  They take time.  Ryan has PROVEN he will devote the required time in his role as a Regional Mentor.  He is regarded as a thought leader among his peers and regularly offers other mentors ideas to help their chapters.  He has earned high marks from the chapters he serves, and the praise is well deserved.

Ryan has held the same position with Verizon for over 10 years and plans to stay with them.  This is important, as it means he has a solid personal foundation that gives him sufficient time and resources to help support the community.

I hope you will join me and vote for Ryan this election cycle.  Our community will only benefit from it.

Cannot enable the Microsoft .NET Framework 3.5 feature on Windows Server 2012

I was building a new SQL Server cluster and, going a little too fast, forgot to enable the .NET Framework 3.5 feature on Windows Server before I installed SQL Server.  .NET 3.5 SP1 is a requirement for SQL Server 2012 when you select Database Engine, Reporting Services, Master Data Services, Data Quality Services, Replication, or SQL Server Management Studio, and it is no longer installed by SQL Server Setup.

No problem, I thought–I know how to enable that using the Add Roles and Features wizard.  Only a weird thing happened on the way–the installation failed.  It didn’t even include an error message.  Weird, I thought.  When I tried the install using DISM, as suggested by Albert Morillo, I got the following:

The operation completed, but the NetFx3 feature was not enabled.  Ensure the parent feature(s) are enabled first.

Parent features?  This was news to me.  Then I found that a couple of recently released security updates will cause this to fail.  If you have the operations folks building the servers, they will need to hold off on installing security updates 2966827 and 2966828 until you enable the .NET 3.5 framework.  Once those updates were removed, I was able to enable the .NET 3.5 framework and continue my installation.
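
For reference, the commands look something like this, run from an elevated command prompt.  The D: drive in the /source path is an assumption that the Windows Server 2012 installation media is mounted there.

REM Remove the security updates that block the feature installation
wusa /uninstall /kb:2966827
wusa /uninstall /kb:2966828

REM Enable .NET 3.5 (the /all switch also enables any parent features), pointing at the sources\sxs folder on the install media
dism /online /enable-feature /featurename:NetFx3 /all /source:D:\sources\sxs /limitaccess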

Looks like I have another item to add to my server build checklist.

All the World’s a Stage

I didn’t realize Pinal Dave had published his site stats last year; I recently came across them and was very impressed with those numbers.  Kudos to him for all his effort.  My blog, on the other hand, would need him to chop off about 6 digits before we might be getting a little closer.  My blog won’t ever be that popular; however, it is interesting to see where people are coming from, and it makes me think the few posts I have up are helping someone.  The SQLFamily has always been about helping others and I am glad to play a small part.

Thanks Mom for all the hits.  :)

TempDB Configuration Matters

Every SQL Server installation includes several accompanying system databases that help keep SQL Server running and functional.  They store information about security, what the server is doing, and how the system should process certain functions.  The TempDB database doesn’t keep permanent information–in fact, each time SQL Server restarts, TempDB is purged and rebuilt as if it were a new database.  Why then should I care about the TempDB database?  Well, I am glad you asked.

Lots of Hats

The TempDB database handles several functions, including sorting, local and global temporary tables, index reorganization, hash joins, XML variables, spools, triggers, snapshot isolation row versions, and other internal operations.  In short–it is the workhorse of the environment, and ALL databases on the system will interact with it at some point, so its configuration becomes very important.

Transaction Dependent

Because local table variables and hash join worktables are stored in the TempDB database, queries that use these objects have part of their processing done in TempDB.  When a query needs to move a process to TempDB, it creates a worktable and writes data out.  What happens when I have 1,000 transactions per second and all of them have a hash join?  I will have a line out the door of threads waiting for their turn to write to TempDB.  I can potentially have GAM (Global Allocation Map) and SGAM (Shared Global Allocation Map) contention.  Also, because transactions depend on these processes completing, the speed at which the data can be written and read becomes important.
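
If you want to see whether this is happening on your system, a quick sketch against the waiting tasks DMV will show sessions waiting on TempDB allocation pages (TempDB is database id 2):

-- Sessions currently waiting on page latches in TempDB (database id 2);
-- resource_description is dbid:fileid:pageid, and pages 2:1:1 (PFS), 2:1:2 (GAM),
-- and 2:1:3 (SGAM) are the classic signs of allocation contention
SELECT session_id, wait_type, wait_duration_ms, resource_description
FROM sys.dm_os_waiting_tasks
WHERE wait_type LIKE 'PAGELATCH_%'
  AND resource_description LIKE '2:%';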

Providing some relief

There are two recommendations that can provide immediate relief to your TempDB environment.  The first is to create multiple TempDB data files of the same size, which helps alleviate the GAM and SGAM contention.  The second is to move the TempDB data files to the fastest disks in the system and/or spread them across as many spindles as possible.  TempDB is the first candidate for flash drives if the entire system cannot take advantage of those disks.
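
As a rough sketch of the first recommendation (the sizes, the number of files, and the T:\TempDB path below are placeholders, not a recommendation for your hardware):

-- Resize the existing primary data file, then add equally sized files
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 4096MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf', SIZE = 4096MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdev3.ndf', SIZE = 4096MB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev4, FILENAME = 'T:\TempDB\tempdev4.ndf', SIZE = 4096MB, FILEGROWTH = 512MB);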

How large should my TempDB files be? Determining the size of the TempDB files may be a bit of trial and error; however, if you already have databases on the system, one way to help make a decision is to run DBCC CHECKDB WITH ESTIMATEONLY against them to see how much TempDB space the CHECKDB command estimates it will need.
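
For example, against one of the user databases already on the instance (SalesDB here is just a placeholder name):

-- Reports the estimated TempDB space CHECKDB would need, without running the full checks
DBCC CHECKDB ('SalesDB') WITH ESTIMATEONLY;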

How many TempDB files should I have? While there are varying ideas, the one I hold to is 1 data file per CPU core, up to 8 files.  Monitoring will help provide insight if more are needed in the future.
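
A quick way to see where you stand today is to compare the core count SQL Server sees with the data files TempDB currently has:

-- Number of logical CPUs visible to SQL Server
SELECT cpu_count FROM sys.dm_os_sys_info;

-- Current TempDB files and their sizes (size is stored in 8 KB pages)
SELECT name, physical_name, size * 8 / 1024 AS size_mb
FROM tempdb.sys.database_files;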

The Importance of Backups

A re-post of an article originally published at b2bsol.com.

Too many organizations do not have adequate protection of their data and are susceptible to data loss.  While security is important and you might think this post is about implementing policy to limit your exposure to hacking, I am talking about something much more basic than that–I am talking about database backups.  A database backup is the first step in ensuring data availability and limiting exposure to corruption or human error.

What kind of backup strategy do you need?  Well, what is your tolerance for data loss?

To help you answer this question, there are two components you need to consider.

RPO – Recovery Point Objective

TechTarget defines RPO as “the age of files that must be recovered from backup storage for normal operations to resume if a computer, system, or network goes down as a result of a hardware, program, or communications failure.”  My definition would be something like–the moment in time you want to be able to restore to.  If you experience corruption or disk failure, how close do you need to get to that point in time?  Defining this metric–which will vary from system to system–will give you your RPO.

RTO – Recovery Time Objective

TechTarget defines RTO as “the maximum tolerable length of time that a computer, system, network, or application can be down after a failure or disaster occurs.”  My definition would be–the time needed to restore a system to usable functionality.  I should note this time would include the alerting and response of your team if user intervention is required.

It would be easy to say, “I want less than 1 second of data loss (RPO) and less than 5 minutes of downtime (RTO)”; however, don’t expect to pay TGIF prices for Ruth Chris service.  Microsoft has made great strides in giving SQL Server many options for High Availability and Disaster Recovery and the ability to keep the system up; however, none of these solutions removes the requirement to take backups.  The amount of history you keep will depend on your business requirements and the costs associated with keeping that storage.
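
To make the RPO tie-in concrete, here is a minimal backup sketch; the SalesDB database name, the D:\Backups path, and the 15-minute schedule are all hypothetical, and the database would need to be in the FULL (or BULK_LOGGED) recovery model for the log backups to work.

-- Nightly full backup
BACKUP DATABASE SalesDB TO DISK = N'D:\Backups\SalesDB_Full.bak' WITH INIT, CHECKSUM;

-- Transaction log backup, run (for example) every 15 minutes;
-- the interval between log backups is effectively the ceiling on your data loss, so size it to your RPO
BACKUP LOG SalesDB TO DISK = N'D:\Backups\SalesDB_Log.trn' WITH INIT, CHECKSUM;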

If your organization does not have the RPO and RTO points defined, it is definitely time to make it happen.
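
And if you want to check how your current backups stack up against those numbers, the backup history in msdb is a reasonable place to start:

-- Most recent backup per database and backup type (D = full, I = differential, L = log)
SELECT database_name, type, MAX(backup_finish_date) AS last_backup
FROM msdb.dbo.backupset
GROUP BY database_name, type
ORDER BY database_name, type;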

Unable To Login to SQL Server Error 18456

I installed SQL Server on a machine the other day, connected from my desktop, finished my normal server setup routine, and life went on. Later, I connected to the machine via Remote Desktop, fired up SSMS, and got this error message.

With an accompanying error in the log:
Login failed for user ‘MyAccount’. [CLIENT: ]
Error: 18456, Severity: 14, State: 11.

Very strange, I thought to myself. It turns out I had added my account to the local Administrators group and had not created a login on the SQL Server. (It also appears I missed a step on my server checklist, as I normally remove the BUILTIN\Administrators login, but who is keeping score?)

I created the login, specified the group, and I was good to go.

EXEC sp_addsrvrolemember [DomainAccount], 'sysadmin';
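
As a side note, sp_addsrvrolemember is marked as deprecated, so on SQL Server 2012 and later you can get the same result with CREATE LOGIN and ALTER SERVER ROLE (the DOMAIN\DBAdmins group name below is just a placeholder):

-- Create the login for the Windows group, then add it to the sysadmin server role
CREATE LOGIN [DOMAIN\DBAdmins] FROM WINDOWS;
ALTER SERVER ROLE sysadmin ADD MEMBER [DOMAIN\DBAdmins];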