Padding Fields for Fixed-Position Data Formats

Fixed-position data formats will seemingly be with us forever. Despite the relative ease of parsing CSV (or other delimited formats), or even XML, many data exchanges still require fixed-position input. Characters 1-10 are X, characters 11-15 are Y, and if the source value is shorter than five characters we have to left-pad it with a filler character, and so on. When you’re accustomed to working with data that says what it means and means what it says, having to add “extra fluff” like left-padding your integers with a half-dozen zeroes can be a hassle.

I recently received a draft of a stored procedure that had to do exactly this. The intent is for the procedure to output the data almost entirely formatted as required, one record per line, so that the result set can be dumped straight to a file on disk. As it was given to me, the procedure was peppered with CASE statements like this (only more complex) in the SELECT clause:

-- Method 1
select case len(cast(logid as varchar))
when 9 then '0' + cast(logid as varchar)
when 8 then '00' + cast(logid as varchar)
when 7 then '000' + cast(logid as varchar)
when 6 then '0000' + cast(logid as varchar)
when 5 then '00000' + cast(logid as varchar)
when 4 then '000000' + cast(logid as varchar)
when 3 then '0000000' + cast(logid as varchar)
when 2 then '00000000' + cast(logid as varchar)
when 1 then '000000000' + cast(logid as varchar)
when 0 then '0000000000' + cast(logid as varchar)
end as logid
,logtext from cachedb.dbo.logs;

It’s perfectly valid, it works, and there’s nothing inherently wrong with it. But I find it a bit tough to read, and it could become trouble if the format changes later, as additional (or fewer) cases will have to be accounted for. Fortunately, the day I received this procedure was right around the day I learned about the REPLICATE() T-SQL function. Maybe we can make this simpler:

select replicate('0',10-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

Not bad. But it leaves us with a magic number, and as with the previous example, if the file format changes we have to seek out these magic numbers and fix them. This is easily remedied by defining the field lengths at the beginning of the procedure, so that they’re all in one place if anything needs to change.

-- Method 2
declare @paddedlength int = 10;
select replicate('0',@paddedlength-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

Yet another approach would be to pad out the value beyond what we need, then trim the resulting string back to the required length. Again, we have to be careful to not leave ourselves with magic numbers; the solution is the same as when using REPLICATE():

-- Method 3
select right('0000000000' + cast(logid as varchar), 10) as logid,logtext from cachedb.dbo.logs;
-- Or, with more flexibility/fewer magic numbers
-- Method 4
declare @paddedlength int = 10;
select right(replicate('0',@paddedlength) + cast(logid as varchar), @paddedlength) as logid,logtext from cachedb.dbo.logs;

All four methods yield the same results, as far as the data itself is concerned. But what about performance? For a table with 523,732 records, execution times were:

  1. 2,000ms CPU time, 261,785ms elapsed
  2. 2,265ms CPU time, 294,399ms elapsed
  3. 2,000ms CPU time, 297,593ms elapsed
  4. 2,078ms CPU time, 302,045ms elapsed

All four methods produced identical execution plans, so I’m probably going to opt for the code that’s more readable and maintainable – method 2 or 4.

As with any tuning, be sure to test with your own data & queries.


Connecting SQLite to SQL Server with PowerShell

This post is part of Ed Leighton-Dick’s SQL New Blogger Challenge. Please follow and support these new (or reborn) bloggers.

I’m working with a number of SQLite databases as extra data sources in addition to the SQL Server database I’m primarily using for a project. Brian Davis (b|t) wrote a blog post a few years ago that covers setting up the connection quite well. In my case, I’ve got nine SQLite databases to connect to, and that gets tedious. PowerShell to the rescue!

I started by installing the SQLite ODBC Drivers and creating one ODBC connection for reference. Brian’s post linked above covers that well. But I don’t want to do it eight more times, so I’ll use the first DSN as a template so I can script the creation of the rest.

I named my DSN GSAKMyFinds. To inspect the DSN, I can use the Get-OdbcDsn cmdlet.

Get-OdbcDsn -Name GSAKMyFinds;
Name : GSAKMyFinds
DsnType : System
Platform : 64-bit
DriverName : SQLite3 ODBC Driver
Attribute : {Database, Description, NoTXN, LongNames...}

This looks pretty simple, but there’s a collection of Attributes I need to look at too. I’ll do this by expanding that property with Select-Object.

Get-OdbcDsn -Name GSAKMyFinds | Select-Object -ExpandProperty Attribute | Format-Table -AutoSize;
Name        Value                                               
----        -----                                               
Database    C:\Users\andy\Dropbox\GSAK8\data\My Finds\sqlite.db3
NoTXN       0                                                   
LongNames   0                                                   
FKSupport   0                                                   
JDConv      0                                                   
StepAPI     0                                                   
BigInt      0                                                   
NoWCHAR     0                                                   
OEMCP       0                                                   
NoCreat     0                                                   
ShortNames  0                                                   

Now I have everything I need to create a new DSN with Add-OdbcDsn. All of my SQLite databases are stored in a directory structure under C:\Users\andy\Dropbox\GSAK8\data\, with each one in a different subdirectory. For now, I’ll just create one to make sure that I’m doing it right, then use Get-OdbcDsn to see if it matches with my GUI-created DSN.

Add-OdbcDsn -Name GSAKPuzzles -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=C:\Users\andy\Dropbox\GSAK8\data\Far-off puzzles\sqlite.db3";
Get-OdbcDsn -Name GSAKPuzzles;
Get-OdbcDsn -Name GSAKPuzzles | Select-Object -ExpandProperty Attribute | Format-Table -AutoSize;


Name       : GSAKPuzzles
DsnType    : System
Platform   : 64-bit
DriverName : SQLite3 ODBC Driver
Attribute  : {Database}

Name     Value                                                      
----     -----                                                      
Database C:\Users\andy\Dropbox\GSAK8\data\Far-off puzzles\sqlite.db3

Looks pretty good! Note that not all of the Attributes seen above are here; those are default values that get set when creating the DSN through the GUI. After deleting my first two test DSNs, I can move on to looping through all of my SQLite databases and creating DSNs for each of them. SQLite databases are just files on the filesystem, so by iterating over all of the db3 files under the parent directory I can build the list of files to point my DSNs at.

Get-ChildItem -Path C:\users\andy\Dropbox\gsak8\data -Recurse -Filter sqlite.db3 | Select-Object -ExpandProperty FullName | ForEach-Object {
    $DSNName = $_.Split("\")[6];
    Add-OdbcDsn -Name $DSNName -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=$_";
}
Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";

In a few seconds, the DSNs are created.

Name               DsnType Platform DriverName          Attribute                                   
----               ------- -------- ----------          ---------                                   
GSAKMain           System  64-bit   SQLite3 ODBC Driver {Database, Description, NoTXN, LongNames...}
Far-off puzzles    System  64-bit   SQLite3 ODBC Driver {Database}                                  
Home200            System  64-bit   SQLite3 ODBC Driver {Database}                                  
My Finds           System  64-bit   SQLite3 ODBC Driver {Database}                                  
My Hides           System  64-bit   SQLite3 ODBC Driver {Database}                                  
New England        System  64-bit   SQLite3 ODBC Driver {Database}                                  
Niagara Falls      System  64-bit   SQLite3 ODBC Driver {Database}                                  
NJ                 System  64-bit   SQLite3 ODBC Driver {Database}                                  
Seattle            System  64-bit   SQLite3 ODBC Driver {Database}
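As an aside, deleting those first two test DSNs didn’t need the GUI either; the same module that provides Get-OdbcDsn and Add-OdbcDsn includes Remove-OdbcDsn. A quick sketch, using the DSN names from earlier in this post:

```powershell
# Remove the test DSNs created above; -DsnType and -Platform match how they were created.
Remove-OdbcDsn -Name GSAKMyFinds -DsnType System -Platform 64-bit;
Remove-OdbcDsn -Name GSAKPuzzles -DsnType System -Platform 64-bit;
```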

Next up is creating the linked servers in SQL Server. I created one with Management Studio using all the defaults, then scripted it to see what I need to do. The only parts I really need are sp_addlinkedserver and sp_addlinkedsrvlogin; the defaults for the other options are good enough for what I’m doing here (this may not be true for you, so be sure to check!).

EXEC master.dbo.sp_addlinkedserver @server = N'GSAKMAIN', @srvproduct=N'GSAKMain', @provider=N'MSDASQL', @datasrc=N'GSAKMain'
 /* For security reasons the linked server remote logins password is changed with ######## */
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'GSAKMAIN',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL

Now I can put this into a PowerShell loop and run it for all of my other DSNs.

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
foreach ($DSN in $AllDSNs) {
    $CreateLinkedServerSP = @"
EXEC master.dbo.sp_addlinkedserver @server = N'$($DSN.Name)', @srvproduct=N'$($DSN.Name)', @provider=N'MSDASQL', @datasrc=N'$($DSN.Name)';
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'$($DSN.Name)',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL;
"@;
    Invoke-Sqlcmd -Query $CreateLinkedServerSP -ServerInstance sql2014 -Database master;
}

I let this run and when it’s finished, all my DSNs are linked servers ready to be queried.


Because I’m going to be querying all of these linked servers together, I wrote some additional code to give me a skeleton query performing a UNION across all of my linked servers which I can use as a starting point.

Here’s the final script:

#requires -version 3.0
#requires -module sqlps
Set-StrictMode -Version Latest;
Set-Location C:;

Get-ChildItem -Path C:\users\andy\Dropbox\gsak8\data -Recurse -Filter sqlite.db3 | Select-Object -ExpandProperty FullName | ForEach-Object {
    $DSNName = $_.Split("\")[6];
    Add-OdbcDsn -Name $DSNName -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=$_";
}

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
foreach ($DSN in $AllDSNs) {
    $CreateLinkedServerSP = @"
EXEC master.dbo.sp_addlinkedserver @server = N'$($DSN.Name)', @srvproduct=N'$($DSN.Name)', @provider=N'MSDASQL', @datasrc=N'$($DSN.Name)';
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'$($DSN.Name)',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL;
"@;
    Invoke-Sqlcmd -Query $CreateLinkedServerSP -ServerInstance sql2014 -Database master;
}

$AllDatabasesUnion = "";
foreach ($DSN in $AllDSNs) {
    $AllDatabasesUnion += "SELECT * FROM OPENQUERY([$($DSN.Name)], 'select * from caches') UNION ALL`n";
}
$AllDatabasesUnion = $AllDatabasesUnion.Substring(0, $AllDatabasesUnion.Length - 10);
Write-Output $AllDatabasesUnion;

And the query that it generated for me:

SELECT * FROM OPENQUERY([GSAKMain], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Far-off puzzles], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Home200], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([My Finds], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([My Hides], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([New England], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Niagara Falls], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([NJ], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Seattle], 'select * from caches')

With a little exploration of the PowerShell OdbcDsn cmdlets, I’ve eliminated a tedious process and prevented any accidental mouse clicks in a GUI.

PASS Summit: Things to Do, People to See

PASS Summit is nearly upon us. I’m excited to be attending my second Summit in Seattle and cannot wait to get there to see everyone. With one Summit and a few SQL Saturdays under my belt I’ve got a laundry list of things and people I can’t miss, and very little time to pack it all into.

Let’s Meet!

The greatest part of Summit (and SQL Saturday) for me is meeting people and exchanging ideas. If you haven’t experienced it, #SQLFamily is amazing. When I reached the convention center two years ago, the first feeling that hit me was “I finally found my people!” We’re all friendly, I swear. Just say “hi, I’m <your name here>.” I guarantee you will find people who are into the same stuff you’re into, and I’m not just talking about SQL Server. Music, dance, outdoor activities, all kinds of stuff. We have a common thing that brought us together, but that’s not what keeps us together. It is an amazing community and it just keeps getting better. On Sunday, as you’re decompressing from the event and travel, you will miss these people you didn’t even know a week before.

You can even connect strangers with common interests. In 2012, I met someone over a power outlet who asked if I’d done anything with a particular piece of hardware and what I thought of it. It turned out that I hadn’t, but I knew that a former co-worker who had used the hardware was also in attendance, so I gave them each other’s contact information.

Ping me on Twitter, find me at one of the places/events listed below or at breakfast or lunch in the dining hall, or if you think you see me passing in the hall (my picture is on my Twitter profile), say something (and if it’s not me, you’ll meet someone else, which is still awesome). Maybe we’ll even share dinner at the airport on Friday evening.

Get on Twitter

So many things happen at Summit which are announced and/or organized via Twitter. The main hashtag to follow is (I think) #summit14 but once you hit the ground you’ll start figuring out who and what to follow to get all the dirt.



I’m arriving in Seattle late Tuesday morning and doing some sightseeing before checking into the hotel and Summit late in the afternoon. Then it’s off to the welcome reception. The first of several visits to Tap House Grill may be in order too.


Wednesday starts dark & early with #SQLRun at 6 AM. I had a great time getting a 5K in before dawn at my first Summit and I’m sure this one will be great too. Don’t forget to bring the right gear; it’s pre-dawn and right now the forecast is for 50°F and rain (in Seattle. Go figure).

Aside from the sessions throughout the day, I’ll probably be found in the Community Zone. I’ll also be serving as an Ambassador helping to direct people to the dining hall for lunch, posted outside room 4C so stop by and say hi.

Wednesday evening, I’m hosting a dinner for geocachers at the Daily Grill at 6:15 PM. If you’re a cacher, or just curious about it, stop by!

Once we’ve wrapped up there, I’ll go wherever the wind may take me; probably back to the Tap House.


Thursday is my light day at Summit. I don’t have any sessions double-booked and the only event I really need to catch is the Argenis Without Borders folks in their fuzzy rainbow leggings.

Thursday evening I’ll be at the Ray Gun Lounge for Table Top Game Night. I’m looking forward to getting to know folks there and learn some new games. We don’t play a lot of table top games at home and I’d like to change that.


Lots more sessions on Friday, plus winding everything down. By the afternoon, I’ll probably be beat and just trying to rest at the Community Zone.

I fly out late Friday night, so I’ll be trying to find dinner somewhere between the convention center and airport. I’ll probably kill a lot of time in the terminal by wandering around, playing Ingress.

Packing List

At my first Summit, I learned a few lessons about what to take and what not to take. The most important thing to bring: empty space for all the stuff you’ll bring home. SWAG from the exhibitors, souvenirs, books and more. Next most important: power! Electrical outlets are few and far between, and there will be 5000 people vying for them to top off their phones and tablets. A quick rundown of some of the stuff that might not be obvious to bring (or easily forgotten) that I’m packing:

  • Small (1 pint) widemouth water bottle. I’m partial to this Nalgene bottle I got at a 5K earlier this year.
  • NUUN electrolyte tabs. Water gets boring after a while. These will help you stave off SQLPlague (don’t forget your vitamins too!).
  • Comfortable shoes. You’ll be on your feet a lot and walking even more; the convention center is big. Not to mention the evening activities.
  • A small notepad for taking non-session notes – phone numbers, names, etc. I love my Field Notes notebook.
  • A larger notepad for taking notes in sessions. Oh, and don’t forget a pen or three. I’ve tried doing notes on a tablet and on a computer, and it just doesn’t work as well as paper & pen for me. Bonus: no batteries!
  • Hand sanitizer. Because when you have 5000 people in one place, germs get around in a hurry no matter how careful you are.
  • A good wall charger for your devices. I found myself short of chargers last time and had to buy one at Radio Shack; it didn’t cut it. This one has two USB ports that charge at 2.1A, which will give you a good boost when you get near a plug, and you can share with a friend. It’ll also recharge pretty much anything while you sleep. Best of all, it’s really compact.
  • A good external battery pack. Matt Slocum (b | t) got me hooked on the Anker E5 15,000 mAh battery. Two ports so you can share with a friend, and it’ll recharge most phones 4-5 times from completely empty.
  • Plenty of USB cords to go with both of the above.
  • Business cards! I ordered mine at the office too late last time and had to get some made at Staples in a pinch.
  • A small, light backpack to carry all of this in (well, not the shoes). Session rooms get cramped, so carrying a big pack can be a pain.
  • A lock code on my phone and tablet. I normally don’t use one but at any large gathering like this, it’s better to be safe.
  • A list of the people I need to see/find/meet/reconnect with.

This Summit is going to be a blast. I cannot wait. There are only two things I’m not looking forward to:

  1. Having to sleep (I’ll miss stuff!)
  2. It’ll eventually end

Next Tuesday cannot come soon enough.

SQL Saturday Trip Report – Cleveland 2014

This past weekend I made the journey to Cleveland, OH (Westlake, actually) for SQL Saturday #241. I’ve attended two local SQL Saturdays in the past (helping organize/run one), but seeing the list of speakers and knowing a few of the local chapter members, I couldn’t pass up the opportunity to visit.


I packed my bags and hit the road. It’s about a 300 mile trip so I gassed up, settled in with my backlog of podcasts and set the cruise control. The drive was almost zen-like. The sky was clear and the sun shining (almost painfully bright) and I don’t recall the last time I took a road trip solo. It was a very relaxing drive.

After arriving at the hotel and settling in, I went out for a bit to stretch my legs and get some fresh air before searching for dinner. I’d forgotten that most people coming from out of town are speakers, and that there’s usually a speakers’ dinner on the Friday night before a SQL Saturday, so I had trouble finding people to meet for dinner. Ultimately I met up with Travis Garland (t) and his team at the Winking Lizard in downtown Cleveland after a 20-minute search for parking. I wasn’t able to stay too late, as I wanted to be up early on Saturday.


6 AM is pretty early when you’re on the road. I pulled myself out of bed, packed up my stuff, got ready & headed down the road to Hyland Software, the venue for the event. The place is set up perfectly for a SQL Saturday. Great layout, a good amount of space, and terrific technical facilities.

After getting registered, I made the rounds of the vendor tables & caught up with several of the people I was looking forward to seeing again, including Karla Landrum (t), Hope Foley (b|t), Allen White (b|t), and Kendal Van Dyke (b|t). While I was at it, I managed to meet one new person, Wendy Pastrick (b|t) – so immediately the weekend was a networking success (one of my goals was to meet & talk with at least one new person this weekend, and I was up to 5 by 8:15 AM on Saturday). After the opening remarks from Allen, Thomas LaRock (b|t) and Adam Belebczuk (b|t), it was time to get our learn on.

Session 1 – Query Performance Tuning: A 12 Step Program

Query performance is always a concern for DBAs, developers and users alike. Thomas & Tim Chapman (t) presented a great series of steps to take while investigating slow queries, and did it in a very engaging way. Some of these techniques I was already familiar with, but there was plenty of new material in here as well. The biggest surprise to me was that they put examining execution plans halfway down the list – 5 stages of investigation before even looking at the plan! The execution plan is usually one of my first stops in trying to optimize queries, but I’m going to adjust that thinking going forward.

Session 2 – Writing Better Queries with T-SQL Window Functions

Confession time: this was my “backup” session. I had planned on attending Grant Fritchey’s (b|t) Statistics and Query Optimization session, but by the time I got to the room it was standing room only with a line out the door. So I crossed the hall to this session, presented by Kathi Kellenberger (t). I’d heard about window functions before, and seen them in blog posts, but had no idea what they were really good for. Kathi presented the material in a very clear, relatable way and by the end of it my mind was racing, trying to find scenarios at work where I could put them to use. I’m very happy that I landed in this session – lots of great stuff to work with. It’s rare that I sit in a session where I know nothing coming into it and feel like I can apply the knowledge gained immediately. This was one of those sessions – highly recommended if you see Kathi presenting at another PASS event.

Session 3 – Code Management with SQL Server Data Tools

I’ve dabbled with SSDT a bit, but never been able to put it to good use for a variety of reasons. Lisa Gardner (t) presented this session and provided good information on some of the things I’m trying to implement better in my environments – namely automated deploys and keeping environments in sync. I’m not sure exactly when I’ll get to use all of it, but now I know more about which situations call for it, and I have a start down the path of using it to its fullest potential.


The provided lunch was good – a self-serve taco bar with plenty of seating in the common areas. Several sponsors also had presentations in the training rooms, and I took a seat with SQL Sentry. I’ve been using their Performance Monitor and Event Manager products for a couple of years now, but I always forget about Plan Explorer, so I soaked up the demo.

Session 4 – Discover, Document & Diagnose Your Servers on Your Coffee Break

I’ve been working with Kendal for a while and I was eager to see what SQL Power Doc was all about. I came for the PowerShell and stayed for the amazingly detailed documentation of any SQL Server environment. I lack the credentials to run it at work, but I know that it’s being run on a regular basis on our network. It’s everything you wanted to know about your environment but didn’t know it could be documented. The volume of information it churns out is amazing. Note to self: run SQL Power Doc against my computers where I have dev instances set up.

Session 5 – Spatial Data: Cooler Than You’d Think!

I’ve wanted to see this session since I first heard about it last year. I have a project I’ve been working on for a bit that has a lot of GPS data in it and I was eager to learn from Hope what else I could/should be doing with it, as well as validating that I was already on the right track. I got some good ideas here with regard to how my tables are defined and what options are available for mapping the data I’ve captured – something that I hadn’t even considered yet.

Session 6 – Making the Leap from Profiler to Extended Events

I met Erin Stellato (b|t) at PASS Summit 2012 during #SQLRun, and her session about DBCC was the first regular session I attended that week (portions of which were way over my head – not an unexpected phenomenon). So I had to check this session out. I don’t do much with Profiler but I know it’s going away and I want to be ready. This session was a great introduction to XE without getting into too much complexity, and she did a great job of showing off some of the aspects of XE that make it so much better than Profiler.

Closing Ceremony

Raffle drawings! I continued my non-winning streak.

The Afterparty

After wrapping up at Hyland Software and dropping all my stuff at the hotel, I headed over to Dave & Buster’s for the after-party. Some game credits were provided, but I was there more to talk to people than play games (aside: I’m really bad at meeting people & “working the room.” I’m better at it than I used to be, but it’s still something I have to work very hard at). I spent most of the event camped out at a table with a rotating cast, but I left without catching up with everyone I wanted to speak with.

The Afterafterparty

After Dave & Buster’s, a few people were headed to the “other” hotel for a few hours of Cards Against Humanity & invited me to join. I’d never played before, and did pretty poorly, but it was a hell of a time. I just wish I was staying at that hotel so I didn’t have to worry about making the drive back to mine.


My view most of the way home

After being out until 1 AM, my 7 AM alarm sounded way too early. I slowly rolled out of bed, packed up & prepped to hit the road. In sharp contrast to my drive to Cleveland, the drive home was looking a bit rough, with lake-effect snow on the radar until my halfway point. Of course, as I approached that point I checked again and saw that snow would follow me all the way home. It was slow, treacherous going in places, but I managed to stay on the road and got home without any drama.


Everyone involved with this SQL Saturday did an amazing job. Terrific venue, terrific speakers, and everything seemed to go off without a hitch. If you haven’t been to a SQL Saturday and there’s one coming up near you, take advantage of the opportunity to attend. Just being around the PASS community can stir up or re-ignite a passion for all things data-related. And as always, the biggest thing I learned this weekend is that there are so many more things out there for me to learn.

I brought a backpack and PC with me to collect swag and possibly fiddle with a project while I was at the event, but it just ended up being a lot of dead weight I carried around all day. Next time, I’m leaving the computer at home and replacing it with a big battery pack for my phone.

I can’t wait to make the trip to Cleveland again next year – in the meantime, there’s #302 & #303 this summer, both of which I’ll be attending.

My First Windows Update Gone Bad

I don’t think I’ve ever had a Windows Update go bad – until this week.

I recently upgraded to Office 2013 and late Monday afternoon, decided to check in with Windows Update prior to our company’s normal monthly patching to see how bad the damage would be. Nearly 1 GB of updates, thanks to my fresh Office install. But there were also a couple optional updates, including a .NET Framework update. So I figured I may as well go ahead and do everything while I was at it. This way I could control the reboot instead of being forced into one in the middle of important tasks.

Tuesday morning, I got a call asking if we were having any issues with one of our key systems. I fired up my trusty SQL Sentry client to check things out. And failed. I couldn’t connect to the server to start monitoring. Never a good sign. Then I tried SSMS 2012. Again, couldn’t connect to any servers. I got the following error:

Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

That sounds pretty ominous. Try a few more times, no luck. Reboot, no better. Uninstall SQL Sentry Client and attempt to reinstall – still nothing. Things were going from bad to worse in a hurry.

I bounced the error messages off Kendal Van Dyke (b | t) and he suggested that it might be an issue with the SQL Native Client. So I reinstalled that from the SQL Server 2012 Feature Pack. And still, I couldn’t connect. I even tried PowerShell (v3), both the SQLPS module and SqlServerCmdletSnapin100 Snap-In and got the same errors SSMS 2012 threw out.

Taking a deep breath and stepping back, I started reviewing the situation. What did all of these have in common? They all use the latest (or at least a fairly recent) version of the .NET Framework. So let’s try something older. SSMS 2008 R2 – works fine. PowerShell v2 – that works too. Now we’re onto something!

The Framework update I installed was KB2858725, bringing it to version 4.5.1. My first thought was that the installation had been botched somehow, so I downloaded the installer and tried again, but to no avail. So I uninstalled 4.5.1 entirely and reinstalled 4.5. That finally did the trick.

Due to other commitments, I haven’t had a chance yet to re-try the 4.5.1 update, but I’m also in no rush. While it won’t take me 4+ hours to troubleshoot and fix if it breaks things the next time around, I need my system stable for the next few weeks so I won’t be trying again until 2014.

T-SQL Tuesday #39 – Here’s what my PoSH is cooking

My first official entry for T-SQL Tuesday (my first appearance was a guest post hosted by Kendal Van Dyke (b|t), so I’m not really counting it) is brought to you by PowerShell, or PoSH. Ever since I discovered PoSH and really dove into learning it a couple years ago, my co-workers have gotten a bit annoyed by my insistence on using it for everything. It is my favorite hammer, and around me I see nothing but acres and acres of nails.

I’m not a DBA, so I don’t do as much managing of databases with it as most people joining this party, but I still use PoSH with SQL Server pretty often. I spend a lot of time pulling data from SQL Server & doing crunching/analysis, sometimes merging it with some filesystem data as well.

So what have I done lately?

  • Space usage analysis. I work with a system which generates PDF documents, saves them to the server filesystem, and records the details to a table. But how quickly are we consuming space? When will we run out? I whipped up a quick PoSH script to pull the details from the table, locate the files on the filesystem, and record everything to another table for slicing & dicing.
  • Quick ad-hoc data dumps. Sometimes we just need to pull some data out of the database, crunch a few numbers, and send it off to someone upstairs. Before PoSH, I’d run the query in SSMS, copy the data, paste it into Excel, drop in a few formulas and maybe a graph, and be done. But I’d spend more time fighting Excel on formatting & getting the columns right than I did getting the data into it. Invoke-SQLCmd piped to Export-CSV solves that really quickly.
  • I’ve been working on upgrading a system we purchased from a vendor and migrating everything to a new server at the same time. Moving & updating XML configuration files, thousands of check images, restarting services, migrating databases. And this isn’t a one-time event – we have to do this over 200 times! The SQL Server portion of this process isn’t particularly involved, but it’s vitally important:
    • After most of the heavy lifting of moving things around is complete, one table has to be updated to point at the new path for the images that were migrated.
    • When all of the migrations are finished, we have to validate that everything moved over properly. A few carefully-selected queries to compare critical tables between the old version of the database and the new version and minds are put at ease that all of our data has come over cleanly. Those queries, along with the other components of the validation, are run via the PoSH script & output to a file for review.
  • For reporting purposes, we load a small subset of data from Active Directory into a pair of SQL Server tables on a nightly basis. Previously, it was only 2 fields but recently this has been expanded to about 8. Once again, PoSH to the rescue! Pull the AD user accounts, select the properties we need, and insert it all into the tables. Originally I used my old standby Invoke-SQLCmd, but with the addition of new fields I got concerned about building queries via string concatenation using arbitrary data retrieved from another system. System.Data.SqlClient & prepared statements to the rescue! It’s more code, but it’s a lot safer.
  • Dave Ballantyne reminded me that I have two PowerShell scripts for SSRS.
    • The first is based upon this script from Randy Aldrich Paulo and deploys reports into SSRS. My addition was the option to pull an RDL file straight from our Subversion repository instead of the person doing the deployment having to check it out.
    • The second is based upon this StackOverflow post, and is used to render reports to file. In one case, I then SFTP the generated report to a vendor immediately after.

These are pretty mundane compared to what I’m sure a lot of people participating will be posting, but the key is this: PoSH is saving me time and by scripting everything I can, I’m significantly reducing the chance for errors. Of course, when there is an error, it’s magnified tremendously – so it all gets tested against a non-production server first.