Episode 09: Database Automation

Joey D’Antoni has been doing quite a bit with database automation, and I caught up with him to discuss how database automation has helped him be more efficient and have fewer support problems. We discuss strategies for starting your automation efforts, and he gives a few pie-in-the-sky examples if you are running a large environment. Compañeros, I hope you enjoy the program.


Show Notes:

Central Management Server
Building Perfect SQL Servers, Every Time
Build scripts
T-SQL Tuesday: Managing the Enterprise
September 2015 T-SQL Tuesday
Joey’s Personal Blog

Transcription: Database Automation

Carlos L. Chacon: Welcome to the SQL Data Partners podcast, my name is Carlos L. Chacon, your host. This is episode 9, and today we’re talking about automation. Interestingly enough, the September 2015 T-SQL Tuesday theme was, “Strategies for Managing an Enterprise.” I ended up writing a little bit about automation, and SQL Server setup in particular, and we’re going to continue with that theme today.

If you’ve never heard of T-SQL Tuesday, it was the brainchild of Adam Machanic. Several years ago, he thought of the idea of asking bloggers to host a topic or a theme, other bloggers could submit posts to that page, and the community could increase its knowledge about that specific theme.

The September theme was hosted by MidnightDBA, and again, they chose strategies for managing an enterprise. Today, we have Joey D’Antoni with us. I’m super excited to have Joey. He is well-versed in performance tuning and blogs over at joeydantoni.com.

He talks about several items, or several scripts that he makes available, several sessions that are available from some of his different conferences that he attends. All of those will be available on the show notes, at sqldatapartners.com/podcast. You can go there and get information about today’s show.

As always, thanks for tuning in. If you like what you hear on today’s episode, we invite you to give us feedback via iTunes or Stitcher. You can also leave some comments on the website. We’d love to hear from you.

If there’s some potential topics you want us to cover, please let us know about those as well. You can hit me up on twitter at @carloslchacon. Lots of information in today’s session, and as always, compañeros, welcome to the show.

Children: SQL Data Partners.[music]

Carlos: Joey, welcome to the show!

Joey: Thanks, Carlos.

Carlos: One of the things we wanted to talk a little bit about today was automation. I know you’ve shared some stories, and I’ve heard some things about one of your previous employers, working at Comcast. Kind of a large environment.

Joey: Yeah, we had roughly close to 1,000 servers total. I was the architect, but I had 4 DBAs working on a team under me. We didn’t have a whole lot of people to maintain that kind of environment. Realistically, when you get into larger environments, you never do, because neither your people nor your budget scale with it. We had to automate a lot, just so we didn’t have guys working 24-7. We had our build process and our deployment process fully automated. The deployment piece was interesting; we used SSIS for that. From a build perspective, it was pretty straightforward.

As part of our virtualization infrastructure, we built a service management layer, where business users or developers could come in and request a new virtual machine. We had that process built end to end, from the Windows perspective.

We used a lot of PowerShell, particularly in and around Windows, to do things like an offline domain join, to get the server joined to the domain, and to grab an IP address. The only issue we had was that we couldn’t get the DNS fully automated. It’s a long story.

Carlos: [laughs]

Joey: It was a long political story. SQL-wise, what we did was automate the installation. It’s a two-phased approach that I took to SQL automation. First, you do the install. SQL has a ton of bad defaults; we all know that. The install process itself is pretty vanilla. The only thing that matters before you start using SQL is that you get the right drives configured. You want the right volumes configured for [inaudible 04:33] sets, and where your data files are going to sit.

After that, what we did, and I still use this with my clients, is have a script that does the best practices after the fact. It goes in and sets max memory, sets max degree of parallelism, all based on what exists on the server and what is [inaudible 04:54] the DMVs.
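A minimal sketch of what such a post-install script can look like; the thresholds below are illustrative rules of thumb I am assuming for the example, not Joey's actual script:

```sql
-- Hypothetical post-install "best practices" sketch: read physical
-- memory and core counts from the DMVs, then configure max server
-- memory and MAXDOP from what we find.
DECLARE @physMemMB BIGINT, @cores INT;

SELECT @physMemMB = total_physical_memory_kb / 1024
FROM sys.dm_os_sys_memory;

SELECT @cores = cpu_count
FROM sys.dm_os_sys_info;

-- Assumed rule of thumb: leave 4 GB for the OS on larger boxes,
-- otherwise 20 percent of physical memory.
DECLARE @maxMemMB INT =
    CASE WHEN @physMemMB > 20480 THEN @physMemMB - 4096
         ELSE @physMemMB * 8 / 10 END;

-- Assumed guideline: cap MAXDOP at 8.
DECLARE @maxdop INT = CASE WHEN @cores > 8 THEN 8 ELSE @cores END;

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', @maxMemMB;
EXEC sp_configure 'max degree of parallelism', @maxdop;
RECONFIGURE;
```

The point of running this as a script rather than clicking through SSMS is that every server ends up with settings derived the same way from its own hardware.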

Carlos: There are lots of things that we can automate. As to why automate, you hit on a little bit of that: simply, your scale required it. Another reason I can think of is to have a consistent experience. You mentioned 1,000 servers; I don’t want to have my data files, for example, in three different places across 1,000 servers.

Joey: Totally. I don’t even like it with 10.

Carlos: [laughs]

Joey: That was really the other goal of our whole project. We had our build process for physical servers pretty well automated. There were only a couple of manual steps. Even then, you still had weird things happen on the Windows side: some hotfix doesn’t get applied, or a server has the wrong DNS address. We went through a DNS migration at some point. The other part of automation is, you need to build some intelligence into your health check process, to make sure everything… I read an article about this the other day. It wasn’t related directly to what we’re doing with SQL Server and Windows, but was more about automation generally, using [inaudible 06:26] and things like that.

One of the messages of the article was that you still need a process to go back and make sure nothing’s changed, everything is good, everything is happy in your environment.

Carlos: As much time as we spend setting up our automation process, we still need someone to check in and make sure that everything has done what we think it has.

Joey: We were using System Center Configuration Manager when I was at Comcast. It was really good, except for one problem: it does not know about the existence of SQL Server clustered instances, which were 80 percent of my production environment. It can only connect to a default instance, which is a shame, because it worked great for Windows. The checks were very easy to build for any kind of Windows task, anything you need to look up in the registry, anything you get from the [inaudible 07:20] line, about the [inaudible 07:21].

Since we couldn’t connect to SQL clusters, it was virtually useless for that. What we ended up doing, and I like this as the core of any real automation strategy, was we had an infrastructure server that the DBA team used, and that was the hub for all our scripts. They were all mounted off of there.

It was also our metadata repository for our SQL Servers. Since everything went back to that, that’s where my health check ran. I had a process that ran every morning that would grab a list of all of our instances, and then dynamically create a batch file that would run SQLCMD.

If you’re working on automation, you need to get familiar with SQLCMD. The process would then run that batch file, and I’d store the results. As part of that process, it would dynamically create a linked server back to our infrastructure server.

It was done that way, and I didn’t love doing that, but we had some network issues. We have different networks and various things, and that infrastructure server was one any SQL Server could always talk back to. It was clear. I didn’t have to worry about it.
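As a sketch of that generation step (the table and file names here are hypothetical, since we don't have Joey's actual repository schema):

```sql
-- Assumed metadata table dbo.Instances (InstanceName sysname) on the
-- infrastructure server. Each output row becomes one line of a .bat
-- file that runs the health check against one instance via SQLCMD.
SET NOCOUNT ON;
SELECT 'sqlcmd -S "' + InstanceName
     + '" -i HealthCheck.sql -o "'
     + REPLACE(InstanceName, '\', '$') + '_health.txt"'
FROM dbo.Instances;
```

Saving that result set to a .bat file and running it gives you one health check output file per instance, named after the instance.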

It ran through all of our instances, ran the health check script that we put together, and returned the results. The other [inaudible 08:42] was we would create a nice tabular report for the managers and VPs to look at and say, “Hey, we’re in compliance.”

We created a scorecard. It gave us something that was a measurable goal, the kind of crap I used to have to do when I worked in big companies.


Carlos: That’s an interesting process. Even better, that automation process then provides a way to report on itself, so that you can ensure that, “Yes, we’re in compliance.”

Joey: Yeah.

Carlos: Interesting. You described what I’m going to consider a fairly complex, or at least robust, environment. Some of our listeners are thinking, “Wow! That could be a little bit tough.” Before we get to some of the specific things that we could do to help start our automation processes, let’s talk a little bit first about what it is that we want to automate.

Joey: There are a few things. To be honest, to go a little bit to the side, we were forced down that path strictly because of our scale. There are a lot of great tools built in, like central management server and policy-based management, that work really well. None of these will scale much beyond 200 servers, though, so don’t be scared off if you have to build your own thing; Microsoft still has some stuff that’s built in. The number one thing I always tell people to automate is… I do a lot of health checks as a consultant, and I do a lot of the same write-ups every time.

Automate your build process, and automate the “best practices” script. It’s not that hard. You can download it off my blog, or watch my presentation from Summit last year; it’s on YouTube. You don’t even have to pay for it.

It’s pretty easy. SQL Server generates an INI file for you: if you go through the install process, set everything up the way you want it, and click “Cancel” at the end, there’s an INI file there. All you have to do at that point is run setup.exe from the command line and pass it the config file.
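For illustration, a cut-down version of what that generated file looks like; the feature list and instance name below are placeholders, not recommendations:

```
; ConfigurationFile.ini (illustrative excerpt of the file the setup
; wizard writes out when you cancel at the final page)
[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE,CONN,SSMS
INSTANCENAME="MSSQLSERVER"
```

You then run something like `setup.exe /ConfigurationFile=ConfigurationFile.ini` from the installation media, and every install driven by the same file comes out the same way.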

If people did that, it would eliminate many of the findings I see in a lot of health checks, the common ones. The tempdb one is going away in 2016, but: one tempdb data file, MAXDOP left at zero, cost threshold for parallelism at five.

Data files and tempdb on the same drive, that one really depends on your SAN configuration, but those are a lot of the common ones. You hear guys say to make sure they’re friends with their Windows team.

That’s because a lot of these automation things depend on a consistent Windows build. If it’s inconsistent, it’s harder to automate. You want to make sure your drives are always configured the same way on every server.

You typically also want some of the other things that are associated with the build, that are Windows related, like configuring instant file initialization, and lock pages in memory if you want to do that.

If you can get those things configured at the AD level, that’s one less thing you have to worry about when you’re configuring your servers.

Carlos: Compañeros, listen to that. Go grab your Windows guys, grab them, sing “Kumbaya” and make it happen. [laughs] That’s mostly on the build side. Then you mentioned putting in some process for SQL Server standards, or the default settings. I tend to think about this as, “These are all my jobs and things as well that I’m going to put out there.” For example, my backup routines and things like that. After my build process, I have another set of scripts that run, that will go and do all of that index maintenance, things of that nature.

Joey: I throw all that in my build script. The one I did at Summit last year, I messed around with it. I use Ola Hallengren’s scripts. I hear there are a few other ones out there, Minion and a couple of other things, but I’ve been really happy with all of Ola’s stuff. What I do with that is, I have a part in my build script; it’s a batch file. The batch file does a few things: it calls setup.exe with the config file, then it runs a PowerShell script that goes to Ola’s website and grabs the latest version of his scripts, which he promises me is a consistent URL.

It writes those to a SQL file, and then, after I run the best practices script that sets max memory and the like, I run Ola’s script. The one thing that’s a little challenging, that I haven’t taken the time to do (I did it for one client, but I haven’t automated it) is the schedules.

Ola doesn’t schedule the jobs by default, so you’re building a script for that. That’s one of those things you have to customize for your environment, because not everybody is going to be doing their index rebuild on Sunday morning at 3:00 AM. Most people aren’t.
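As one hedged example of what scripting a schedule can look like: the job name below matches one of Ola Hallengren's default job names, but the Sunday 3:00 AM schedule itself is exactly the kind of assumption you would customize for your environment:

```sql
-- Attach a weekly Sunday 3:00 AM schedule to one of the maintenance
-- jobs. Run against msdb after the jobs have been created.
USE msdb;
EXEC dbo.sp_add_jobschedule
    @job_name = N'IndexOptimize - USER_DATABASES',
    @name = N'Weekly index maintenance',
    @freq_type = 8,               -- weekly
    @freq_interval = 1,           -- Sunday
    @freq_recurrence_factor = 1,  -- every week
    @active_start_time = 30000;   -- 03:00:00, encoded as HHMMSS
```

Wrapping calls like this in your own script is what turns "install the jobs" into "install the jobs, scheduled the way this server needs them."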

Carlos: That’s an important process, then: building your own customizations into that environment. Even in something like a schedule, your larger environments might be different from a single instance.

Joey: Definitely. Then you have things like that which may or may not be consistent across the environment. You may have different places, or different servers, and things like that.

Carlos: My thought there is that I generally have a couple of buckets for those schedules, and then I can deploy those schedules depending on the situation: “This is a single install for some small department,” and you’re looking at deploying schedule one. If this is the box for my production environment, where the whole company uses it, then I am in bucket three, and I can use those processes there.

Joey: That’s a really good approach. You can easily deploy as many schedules as you like. There’s no real negative to that, other than making the table in msdb a bit bigger. I really buy into that approach, because, from an automation perspective, you can have something in your naming convention that ties back to the environment. I hate naming convention debates. They’re my least favorite thing in IT, because everybody has a point.

This is one of those places where it matters, because you can use some sort of regular expression on your naming convention, even in the case you’re talking about. I like the number idea: you want that number to be, say, the third character, or wherever it sits in the server name [inaudible 15:50].

You can parse on that, and then as part of your automation you can apply those schedules accordingly, because you string out the server name in a case statement, and then run the following [inaudible 16:03].
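A sketch of that parsing idea; the convention here (third character of the server name as an environment digit) is my assumption for illustration, not Joey's actual standard:

```sql
-- Hypothetical naming convention: the 3rd character of the host name
-- encodes the tier (1 = departmental single instance, 3 = company-wide
-- production).
DECLARE @name sysname = CAST(SERVERPROPERTY('MachineName') AS sysname);
DECLARE @tier CHAR(1) = SUBSTRING(@name, 3, 1);

SELECT CASE @tier
           WHEN '1' THEN 'Deploy schedule bucket one'
           WHEN '3' THEN 'Deploy schedule bucket three'
           ELSE 'Unknown tier; review manually'
       END AS ScheduleAction;
```

In a real deployment script the CASE branches would call the scheduling procedures for the matching bucket instead of returning a label.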

Carlos: You’ve talked a little bit about some of the things that you’ve used. You mentioned SQLCMD, and of course, PowerShell has come up. On getting started with that process: I’m not the PowerShell guy, and you’ve probably forgotten more about PowerShell than I currently know. If I were starting out with the automation process, and you’ve mentioned even the install process of getting an INI, what would be some of the steps, or what were your experiences going through that process of, “Here’s how I’m going to start automating some things?”

Joey: Good question. First of all, PowerShell, especially with SQL Server, is like this weird religion, and I’m definitely not a part of it. The guys who really like to do PowerShell really like to do PowerShell. I like to use it from a Windows perspective; for SQL Server I find it a painful process. I do it when I have to. Microsoft does a really good job of instrumenting SQL Server with T-SQL to do just about everything. Especially for DBAs, who can live in T-SQL every day, it tends to be a lot easier to stay in that world.

What I typically tell people is: if you can do something easily in T-SQL, do it in T-SQL. It’s pretty easy to automate there. However, for tasks where you’re dealing with the operating system, or you’re making calls…

One of the things that I worked on last year was a script for a client where we would dynamically extend a volume. We called that using PowerCLI, which is VMware’s set of cmdlets for PowerShell.

Basically, if we detected that a database volume was getting too full, we’d make an OS call to extend the volume, which would in turn make a [inaudible 18:14] call. These are the kinds of things I focused on: figure out a task that you can really only do in PowerShell.

I forget what it was, but I had to do some task on about 150 servers when I was at Comcast. It was…


Carlos: [laughs] I don’t want to do this 150 times.

Joey: Yeah. It was something like changing DNS, or something stupid like that. I went and figured out how to do it. The language, for the basics, is pretty simple. The other thing I’d say, especially with cloud becoming so prevalent (and this is slowly changing; the new Azure portal doesn’t really work for half the stuff they say it does) is that a lot of the stuff in Azure isn’t exposed through the portal, and you can only do it in PowerShell. I will say Azure’s PowerShell is probably a little bit simpler than regular Windows PowerShell. It’s the same language, but a lot of the things I’ve had to do with Azure in PowerShell are pretty basic.

The basic syntax of PowerShell is get-dash-something: Get-Process will get you a list of processes, Get-Service will get you a list of services. There’s no crazy language involved there. It’s really about finding a couple of tasks that you want to do.

One thing I did that would be a nice entry to PowerShell: I wrote a script to automate patching, applying service packs to SQL Server. There are two conditions that Microsoft tells you you need to meet for a service pack to be applied successfully.

They are that the build number of SQL Server matches the expected build, and that the service is online. I wrote a quick script that sets the expected build number as a variable. You have to pass that in [inaudible 20:16], because I hadn’t had time to grab it automatically. I’m sure I can get it from setup somehow.

It takes that as the build number and runs the patch. I did it really inelegantly: I had it sleep for roughly how long the patch takes to apply, and then do a check against SQL Server, make sure the service is running, and if the service isn’t running, email me.

If the service is running, it skips that alert, goes on to do a build check, makes sure it all matches, and then emails me the build and the expected build. Find a task you want to do. That’s the best way to learn a language, I find: find something you need to do, and then do it.

I forget a lot. It’s funny, I did a lot of shell scripting with Oracle and UNIX, and that whole scripting world is a little bit more intuitive to me. They’re [inaudible 21:11] and stuff. I was working on a UNIX client the other day, and I hadn’t been on UNIX for like eight months, and I was pretty quick to pick the scripting back up. PowerShell took me a crazy amount of time.


Joey: I knew the concepts, but there are still some subtleties.

Carlos: Anytime you have a problem to solve, it makes it a little bit easier, because you know there is something you’re solving. There’s a goal there. Something you said that was important is that you want to do that process once: figure out how the process can be done in a language like T-SQL, which a lot of our listeners are going to be using, or wanting to use.

Even inside something as simple as SQL Server Management Studio, you can set that up, script out that process, take a look at it, and start to pull it apart to say, “OK, what are these pieces doing here?” Now that you have that script, you can save it and use it again.

Joey: Definitely. That’s the whole thing with scripting: it’s the repeatability, so you don’t have to deal with that going forward. One of my favorite scripting techniques, if you want to do a quick little automation task, is building SQL with SQL, using concatenation to build SQL statements. The example I show in the presentation (I do have [inaudible 22:37], a real basic backup automation script) is, I say, “SELECT ‘BACKUP DATABASE’” plus… The plus sign is the concatenation operator in SQL Server.

The only thing that gets a little dicey there, is you have to convert any [inaudible 22:54] values to…


Joey: …because SQL Server will try to add the [inaudible 22:59] to the…[crosstalk]

Joey: You create a script, or you create a file, that’s a bunch of BACKUP DATABASE statements. With SQLCMD having an output flag, you can create a quick batch script that says, “SQLCMD, run genbackup.sql, output the results to runbackup.sql.” Then the next line says, “Run runbackup.sql,” and outputs those results to a log file. If you want to take that one step further, you can use PowerShell to read the log file and look for any errors.

Carlos: [laughs]

Joey: That’s a real pretty quick example, but…

Carlos: Let me step through that. I’m taking a script that creates the backup statements; I have another script that executes those statements, and then puts the results into a log.

Joey: Right, well, not quite. You write the script that generates the statements, you output that result to a SQL file, you then run that SQL file, and you output that result. It’s two lines of SQLCMD: line one runs genbackup.sql and outputs to runbackup.sql. The second line runs runbackup.sql, and that outputs to the log.
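A sketch of the generator half of that pattern; the file names and backup path are placeholders:

```sql
-- genbackup.sql: emit one BACKUP DATABASE statement per user database.
-- SET NOCOUNT ON keeps the "(n rows affected)" message out of the
-- generated file so the output is pure, runnable T-SQL.
SET NOCOUNT ON;
SELECT 'BACKUP DATABASE [' + name
     + '] TO DISK = N''C:\Backups\' + name + '.bak'' WITH INIT;'
FROM sys.databases
WHERE database_id > 4;  -- skip the system databases
```

The two driver lines would then look something like `sqlcmd -S . -i genbackup.sql -o runbackup.sql -h -1` followed by `sqlcmd -S . -i runbackup.sql -o backup.log`, with the `-h -1` flag suppressing the column header in the generated file.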

Carlos: To start automating, we talked a little bit about SQLCMD. As for assistance from the community, you mentioned you had some scripts. Are those scripts available to share? You mentioned one was on your website.

Joey: Yeah, I’ll find the URL for that. My build process is definitely on my website. Like I said, the presentation I did at PASS last year on automating your build process is on PASS TV, which I forget if you have to register for, but I know you don’t have to pay for it.

Carlos: We can put those in the show notes. We’ll make sure that those get available.

Joey: Allen White’s got some pretty good sessions on doing automation with PowerShell. I’m not sure if any of those are available online for free, but he’s helped me a lot with some of the PowerShell stuff. I still struggle with it. [laughs]

Carlos: There you go.

Joey: I still use batch files anytime I’m combining three or four things, like SQLCMD; I don’t run these through PowerShell. One of the challenges with SQLCMD, in the example we were talking about, is headers, and row counts, and things like that. If you’re automating the building of a file, you don’t want headers in your results, because when you go to run that SQL, they’ll show up and SQL won’t know what to do with them. In SQLCMD, you can pass a -h -1 flag to strip the headers out, and there’s also a NOCOUNT option.

You add SET NOCOUNT ON in your SQL script, and you get the results you’re looking for. Invoke-SqlCmd, though, doesn’t have that remove-headers option.

Carlos: Ah.

Joey: Some people have rewritten it. There are things like Invoke-SqlCmd2 that people have written themselves, but I generally don’t like to recommend third-party stuff, because you never know what you can do in a client [inaudible 26:28]. I try to make stuff work with the core product.

Carlos: It’s another thing you have to keep up with in your environment.

Joey: Yeah. If Microsoft’s auto updating it, that’s fine, but if it’s something I have to go download from somebody, it’s not as good.

Carlos: Great. Joey, I appreciate the conversation. It’s been some interesting information. We’ll make sure we share some of those ideas and the scripts that you mentioned in the show notes. We have a couple more questions for you. One of the things we want to try to do is provide a little bit more value for folks who are listening. We’ve mentioned some great ideas now, but one of the things we’d like to talk a little bit about is your favorite SQL tool. This could be a paid tool or a free tool, but why do you like it, and why are you using it?

Joey: This is dicey, because I’ve done work for every vendor this year.

Carlos: [laughs]

Joey: I really like SQLCMD, for the automation value.

Carlos: There you go.

Joey: I’ll name two other ones. Red Gate… I’m going to fall down at this: their schema compare tool, and I don’t think it’s actually called schema compare. And the tool that I use more than any other would be SQL Sentry Plan Explorer. It’s really nice, especially for complex execution plans.

Carlos: Much easier to digest.

Joey: Yeah.

Carlos: Very good. Thanks for sharing that. One other thing, you’re now working with Denny Cherry and Associates…

Joey: Yeah.

Carlos: …or with them. This doesn’t all have to be about them, but we’re looking for an experience that you’ve had that helps illustrate why you enjoy being a database administrator, or a data professional.

Joey: It’s helping deliver business value to my clients. The feeling you get when you take a query down from… Actually, specifically around this data warehouse we architected, the client’s seeing their run times 100 to 1,000 times better for some things. Some of that’s new hardware, but some of it’s that we’re using 2016 with columnstore indexes, and it’s really fast. Seeing those kinds of benefits. This is a data company; they sell data as their business. Them being able to deliver results to their customers faster is adding value.

Carlos: Very good. Before we ask you the last question, let’s take a look at one more opportunity folks have to learn about SQL Server.

Announcer: Hello there, compañeros. I want to tell you about a unique SQL Server training opportunity that’s unlike anything you’ve encountered. As a listener of this podcast, you’re entitled to a special offer. SQL Cruise is a premier training experience set in the most unlikely of places: a cruise ship. Tim and Amy Ford have created a wonderful culture on SQL Cruise. With five or six technical leads from various industry sectors, you and roughly 20 other students will talk shop in classes while at sea, and while you’re in port you can enjoy the wonderful world we live in, either by yourself or with the group.

The beauty of SQL Cruise is, the unique relationships you will form with other SQL Cruisers. Worried about leaving your spouse? Bring them along! I did, and she had a great time with the group. In fact, I have been on SQL Cruise twice, and I’m looking to go a third time.

You may be wondering if this is serious training, and I can assure you it is as good as any conference you would attend on land. It never crossed my mind that I could be talking about SQL Server, with several Microsoft MVPs on the beaches of Saint John’s.

I know you must have other questions, so I suggest you head over to sqlcruise.com, and check out the site. Once you are satisfied and you want to sign up, when you get to the question, “would you be so kind to share the name of who referred you, and what source material led you to us?”

You enter SQL Data Partners, and Tim will ensure you get $100 off the training registration. This is a great offer, and I hope this $100 savings will entice you to check out the site. I did; I went on the Caribbean cruise, had a great experience, and I know you will too. Maybe I’ll see you on board.

Head on over to sqlcruise.com, and use the code “SQL data partners” to save $100.

Carlos: Joey, our final question. If you could have one super hero power what would it be, and why would you want it?

Joey: To bend the laws of physics so data goes faster.[laughter]

Carlos: That’s very interesting. You’re the first one…

Joey: Actually, my other super hero power that I’d want would be a private jet. That’s not really a super hero power, I know, but to greatly simplify aviation.

Carlos: [laughs] There you go. You’ve been the first one who’s indicated that their power would go to helping the client, with the getting data moved faster. That shows your commitment there. Hey, thanks for another great episode. Joey, it’s been great having you on the show, we’ve really appreciated it.

Joey: Cool.

Carlos: Compañeros, thanks again, and we’ll see you on the SQL trail.[music]

Children: SQL Data Partners…

Abbreviations for SQL Server Components when Installing with PowerShell

I have (finally) been introduced to installing SQL Server with PowerShell, and it has made those installations much easier. Each of my installs, however, may require different SQL Server components, so one of the items I change in my configuration file is the Features= section of the INI file. To help keep all this information in one place, here is a list of items you can choose to install. Remember, most of these will require other parameters to be set; this is only for the Features= section. Each feature should be separated by a comma.


SQL Server Component Install Options

This list specifies the items you can choose to install.

Database engine = SQLENGINE
Replication = REPLICATION
Full-text and semantic extractions for search = FULLTEXT
Data quality services = DQ
Analysis services = AS
Reporting services – native = RS
Reporting services – sharepoint = RS_SHP
Reporting services add-in for sharepoint products = RS_SHPWFE
Data quality client = DQC
SQL Server data tools = BIDS
Client tools connectivity = CONN
Integration services = IS
Client tools backwards compatibility = BC
Client tools SDK = SDK
Documentation components = BOL
Management tools – basic = SSMS
Management tools – advanced = ADV_SSMS
Distributed replay controller = DREPLAY_CTLR
Distributed replay client = DREPLAY_CLT
SQL client connectivity SDK = SNAC_SDK
Master data services = MDS
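For example, an engine-plus-tools install might use a line like this (an illustrative combination, not a recommendation):

```
FEATURES=SQLENGINE,REPLICATION,FULLTEXT,CONN,SSMS,ADV_SSMS
```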

Happy installing. 🙂

Updating Default Values for Columns

To make life a little simpler, a default value can be added to a column, so if the information is not passed with the row data, SQL Server can plug in some information for you.  A common scenario is to add a default DateTime of the current system time.  This image shows what that looks like when you design the table in SSMS.


What happens if you want to change the default value option for those tables?  For example, we are preparing to move to Azure SQL, and our default options need to be GETUTCDATE() instead of GETDATE().  I put together this script to help me identify all the constraints in the database that need to be updated. The code includes newlines, so you should be able to run the output directly. Don’t forget to update your data before you make the change. You can find that example here.

[sourcecode language="SQL"]
SELECT 'ALTER TABLE ' + t.name + ' DROP CONSTRAINT ' + dc.name + ';
ALTER TABLE ' + t.name + ' ADD CONSTRAINT ' + dc.name + ' DEFAULT GETUTCDATE() FOR ' + c.name + ';'
FROM sys.default_constraints dc
INNER JOIN sys.tables t ON dc.parent_object_id = t.object_id
INNER JOIN sys.columns c ON t.object_id = c.object_id
WHERE dc.parent_column_id = c.column_id
AND dc.definition = '(getdate())'
[/sourcecode]

Converting to UTC Time

In preparation for a migration to Azure SQL Database, we found we had to adjust our dates to UTC time, as this is the way dates are stored in Azure because your databases could be located anywhere in the world.  For new systems, this is very useful; however, what about migrating existing data to Azure?  I have tables with dates based on a GETDATE() default, and those times are in the time zone of the server where the record was created.  I wanted to convert my dates to UTC time and came up with this script.  It is based on servers in the Eastern time zone.  As the date for daylight saving time changes each year, I used timeanddate.com for the historical data.

Suppose I have a table named Customers with a column called CreationDate, and I want to update my times to UTC. I would apply this logic. As always, test before implementation. I put an ELSE statement at the end so I could check whether any dates fell outside my range. This only goes back to 2009; you would have to consult timeanddate.com for more dates if you have older data.

[sourcecode language="sql"]
UPDATE Customers
SET CreationDate = CASE
WHEN YEAR(CreationDate) = 2009 AND CreationDate BETWEEN '2009-03-08 02:00:00.0000000' AND '2009-11-01 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2009 AND CreationDate NOT BETWEEN '2009-03-08 02:00:00.0000000' AND '2009-11-01 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2010 AND CreationDate BETWEEN '2010-03-14 02:00:00.0000000' AND '2010-11-07 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2010 AND CreationDate NOT BETWEEN '2010-03-14 02:00:00.0000000' AND '2010-11-07 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2011 AND CreationDate BETWEEN '2011-03-13 02:00:00.0000000' AND '2011-11-06 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2011 AND CreationDate NOT BETWEEN '2011-03-13 02:00:00.0000000' AND '2011-11-06 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2012 AND CreationDate BETWEEN '2012-03-11 02:00:00.0000000' AND '2012-11-04 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2012 AND CreationDate NOT BETWEEN '2012-03-11 02:00:00.0000000' AND '2012-11-04 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2013 AND CreationDate BETWEEN '2013-03-10 02:00:00.0000000' AND '2013-11-03 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2013 AND CreationDate NOT BETWEEN '2013-03-10 02:00:00.0000000' AND '2013-11-03 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2014 AND CreationDate BETWEEN '2014-03-09 02:00:00.0000000' AND '2014-11-02 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2014 AND CreationDate NOT BETWEEN '2014-03-09 02:00:00.0000000' AND '2014-11-02 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2015 AND CreationDate BETWEEN '2015-03-08 02:00:00.0000000' AND '2015-11-01 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2015 AND CreationDate NOT BETWEEN '2015-03-08 02:00:00.0000000' AND '2015-11-01 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
WHEN YEAR(CreationDate) = 2016 AND CreationDate BETWEEN '2016-03-13 02:00:00.0000000' AND '2016-11-06 02:00:00.0000000' THEN DATEADD(hh,4,CreationDate)
WHEN YEAR(CreationDate) = 2016 AND CreationDate NOT BETWEEN '2016-03-13 02:00:00.0000000' AND '2016-11-06 02:00:00.0000000' THEN DATEADD(hh,5,CreationDate)
ELSE '2046-01-01 11:07:20.6500000'
END
FROM Customers
[/sourcecode]

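If you are on SQL Server 2016 or later (or already in Azure SQL Database), the AT TIME ZONE syntax can do this daylight-saving arithmetic for you, which avoids maintaining the date ranges by hand. Here is a sketch of the same update under that assumption, using the same hypothetical Customers table:

[sourcecode language="sql"]UPDATE Customers
SET CreationDate = CONVERT(datetime2,
    CreationDate AT TIME ZONE 'Eastern Standard Time' AT TIME ZONE 'UTC');
[/sourcecode]

The first AT TIME ZONE stamps the naive value as Eastern (the Windows time zone name 'Eastern Standard Time' covers daylight saving automatically), and the second converts it to UTC.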
Moving A Database With Attach and Detach

I found myself doing several database migrations this weekend, and because these databases were generally small (less than 10 GB), I chose the route of detaching the database, copying the files to the new location, and re-attaching.  To remove the databases, I detach them from the current instance.  I am not worried about backup history with this, as the server will be decommissioned.

[sourcecode language="sql"]EXEC master.dbo.sp_detach_db @dbname = N'MyDatabase';
GO [/sourcecode]

If you refresh your database list, you will no longer see the database listed; however, the files are still on the server.  You can now copy those to the new location and attach them. You need only pass in the name of the database and the location of the files and SQL Server will add the database and make it available to you.

[sourcecode language="sql"]EXEC sp_attach_db @dbname = N'MyDatabase',
@filename1 = N'E:\MSSQL\MyDatabase.mdf',
@filename2 = N'F:\MSSQL\MyDatabase_log.ldf'; [/sourcecode]

Don’t forget to add your security and any jobs on the old server you need to copy over for this database.
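As a side note, sp_attach_db is deprecated in current versions of SQL Server; the equivalent on newer builds is CREATE DATABASE with FOR ATTACH. A sketch using the same hypothetical file paths as above:

[sourcecode language="sql"]CREATE DATABASE MyDatabase
ON (FILENAME = N'E:\MSSQL\MyDatabase.mdf'),
   (FILENAME = N'F:\MSSQL\MyDatabase_log.ldf')
FOR ATTACH; [/sourcecode]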

We Can Pivot If You Want To

So I get that I am a little late to the party here; however, I find myself having to pivot from time to time and am always looking back for code as a refresher.  While PIVOT is pretty cool, it always takes me a second to connect all the dots, so I hope this post helps do that.

So, with PIVOT, we want to take data where there are multiple rows for each ProductID and get each ProductID on one line, with the quantities from each Shelf as columns.

When we are finished, the data will have one row for each ProductID, with one column per shelf.

In my example, I am using the [Production].[ProductInventory] table in the AdventureWorks2012 database.  To turn row values into columns, we must know ALL possible values we want to become columns.  We specify these in the FOR line of the PIVOT syntax.  This query will create a column for each value I specified from the Shelf column.

[sourcecode language="sql"]SELECT [ProductID],[A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L]
FROM [Production].[ProductInventory]
PIVOT (
SUM (Quantity)
FOR Shelf IN ([A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L])
) AS PVT [/sourcecode]

I should see something like this.  We are getting closer: we have our columns, but we still have a line for each record.  We only want to see one line per ProductID.  This is where we put the columns we want from the table and the PIVOT syntax together in an almost subquery-like format and select the columns as we want them to display.

The syntax is something like this.  First come the columns I want to display, then a subquery with only the columns I need from the table, then the PIVOT syntax, followed by an ORDER BY, and presto, you have your data the way you want it.

[sourcecode language="sql"]SELECT [ProductID],[A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L]
FROM
(SELECT [ProductID], Shelf, Quantity
FROM [Production].[ProductInventory]) p
PIVOT (
SUM (Quantity)
FOR Shelf IN ([A],[B],[C],[D],[E],[F],[G],[H],[I],[J],[K],[L])
) AS PVT
ORDER BY [ProductID] [/sourcecode]
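
One catch with PIVOT is that the shelf values have to be hard-coded in the IN list.  If new shelf values can show up over time, a common workaround is a dynamic PIVOT that builds the column list at runtime with dynamic SQL.  This is a sketch of that approach against the same table, not something from the original post:

[sourcecode language="sql"]DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Build a comma-separated, bracketed list of the distinct shelf values
SELECT @cols = STUFF((SELECT DISTINCT ',' + QUOTENAME(Shelf)
    FROM [Production].[ProductInventory]
    FOR XML PATH('')), 1, 1, '');

-- Plug the column list into the same pivot query used above
SET @sql = N'SELECT [ProductID], ' + @cols + N'
FROM (SELECT [ProductID], Shelf, Quantity
      FROM [Production].[ProductInventory]) p
PIVOT (SUM(Quantity) FOR Shelf IN (' + @cols + N')) AS PVT
ORDER BY [ProductID];';

EXEC sp_executesql @sql; [/sourcecode]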