T-SQL Tuesday #61 – Giving Back

Wayne Sheffield (b|t) is hosting this month’s T-SQL Tuesday and his topic is Giving Back to the SQL Community. More specifically, he’s asking how each of us is planning on giving something back to the SQL Community in 2015. He offers up a few suggestions, so I’ll start by addressing those and then move on to additional ideas.

  • Are you going to start speaking at your local user group?
    Yes, I expect that by the end of 2015 I will have spoken to our local chapter at least once. I spoke to various groups at work in 2014 and plan to continue doing so in 2015 as well.
  • Perhaps step up and help run your local user group?
    I was named the Vice President of our local chapter a couple months ago, and I will continue in that capacity.
  • Do you want to start becoming an active blogger – or increase your blogging?
    Yes! At the time of this writing I’ve only published 7 posts here, and I have 6 others in various stages of preparation. I have some ideas brewing, I just need to get things written and then actually press that Publish button. Part of it is fear/insecurity, and I need to get out of my comfort zone a little and Just Do It.
  • Do you plan on volunteering your time with larger organizations (such as PASS), so that SQL Training can occur at a larger level?
    If I have the opportunity to attend PASS Summit in 2015, I will volunteer at the event. When the call for pre-event volunteers goes out, I’ll look at what’s needed and try to step a little out of my comfort zone & do something there as well.
  • Other ways of contributing
    • For the 3rd year, I will be helping to organize and run SQL Saturday Rochester in 2015. If you’re reading this, you probably know about SQL Saturday, and have probably even been to one. Next time, bring a friend!
    • I’ve been promoting PASS and our local chapter at work for a while and will be more vocal in 2015. There are a lot of people with knowledge and experience they can share who aren’t even aware that PASS and the local and virtual user groups exist. I want to help bring those people into the community.

Lightning Talks at SQL Saturday?

We’re already in the early stages of preparing for our 2015 SQL Saturday. One thing that was missing from this year’s event was local speakers; we just didn’t have many, and I’m hoping we can change that the next time around.

For a lot of people (myself included), getting into speaking can be intimidating. Do I even have something interesting to say? What if I can’t fill an entire hour (or 75 minutes)? What if I get everything all wrong?

One session held each day at PASS Summit may hold the answer – Lightning Talks. In a standard session time block, five to seven speakers each present for about 10 minutes. Step up, hook up your laptop, talk about a tightly focused, narrowly scoped topic, and then yield the floor to the next speaker.

I held my own “lightning talk” at work one day as an experiment. While working on some reports, I found an alternative way to write a portion of the query which resulted in a significant improvement in the execution plan. I brought my team together, gave a brief intro (basically, the preceding sentence here), then gave a demo (a scripted sketch of the stats comparison follows the list):

  • Present both versions of the query
  • Explain the change made for the faster version
  • Run the query
  • Compare I/O and time statistics
  • Compare the execution plans
  • Wrap up
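
If you wanted to script that stats comparison instead of flipping between SSMS tabs, something like the rough PowerShell sketch below could work. The instance name, database, and query files are all placeholders, and I’m relying on Invoke-Sqlcmd surfacing the STATISTICS IO/TIME messages on its verbose stream.

# A hedged sketch, not the demo itself; server, database, and file names are made up.
Import-Module SqlServer   # or SQLPS on older installs

foreach ($file in 'query-original.sql', 'query-rewritten.sql') {
    Write-Host "=== $file ==="
    $batch = "SET STATISTICS IO ON; SET STATISTICS TIME ON;`n" + (Get-Content $file -Raw)
    # The STATISTICS output arrives as informational messages, surfaced by -Verbose
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'AdventureWorks' -Query $batch -Verbose | Out-Null
}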

With a couple of questions and some technical difficulties sprinkled in, it was over in just under 15 minutes. Not bad, and with some practice & honing, I think I could get it to 10 minutes.

Could this work at SQL Saturday scale? I think it’s mostly a matter of finding volunteers, and that should be easier than lining up full sessions. Everyone has at least one thing they’ve discovered, learned, experimented with, or implemented in a creative way that’s worth sharing – even if it’s something very brief. At only 10-15 minutes, you don’t have to worry about filling a full session slot or losing the audience.

PASS Summit: Things to Do, People to See

PASS Summit is nearly upon us. I’m excited to be attending my second Summit in Seattle and cannot wait to get there to see everyone. With one Summit and a few SQL Saturdays under my belt I’ve got a laundry list of things and people I can’t miss, and very little time to pack it all into.

Let’s Meet!

The greatest part of Summit (and SQL Saturday) for me is meeting people and exchanging ideas. If you haven’t experienced it, #SQLFamily is amazing. When I reached the convention center two years ago, the first feeling that hit me was “I finally found my people!” We’re all friendly, I swear. Just say “hi, I’m <your name here>.” I guarantee you will find people who are into the same stuff you’re into, and I’m not just talking about SQL Server. Music, dance, outdoor activities, all kinds of stuff. We have a common thing that brought us together, but that’s not what keeps us together. It is an amazing community and it just keeps getting better. On Sunday, as you’re decompressing from the event and travel, you will miss these people you didn’t even know a week before.

You can even connect strangers with common interests. In 2012, I met someone over a power outlet who asked if I’d done anything with a particular piece of hardware and what I thought of it. Turns out that I hadn’t, but I knew that a former co-worker was also in attendance and he had used the hardware, so I gave them each other’s contact information.

Ping me on Twitter, find me at one of the places/events listed below, catch me at breakfast or lunch in the dining hall, or if you think you see me passing in the hall (picture on my Twitter profile), say something (and if it’s not me, you’ll meet someone else, which is still awesome). Maybe even dinner at the airport on Friday evening.

Get on Twitter

So much of what happens at Summit is announced and/or organized via Twitter. The main hashtag to follow is (I think) #summit14, but once you hit the ground you’ll start figuring out who and what to follow to get all the dirt.

Schedule

Tuesday

I’m arriving in Seattle late Tuesday morning and doing some sightseeing before checking into the hotel and Summit late in the afternoon. Then it’s off to the welcome reception. The first of several visits to Tap House Grill may be in order too.

Wednesday

Wednesday starts dark & early with #SQLRun at 6 AM. I had a great time getting a 5K in before dawn at my first Summit and I’m sure this one will be great too. Don’t forget to bring the right gear; it’s pre-dawn and right now the forecast is for 50°F and rain (in Seattle. Go figure).

Aside from the sessions throughout the day, I’ll probably be found in the Community Zone. I’ll also be serving as an Ambassador, helping direct people to the dining hall for lunch; I’ll be posted outside room 4C, so stop by and say hi.

Wednesday evening, I’m hosting a dinner for geocachers at the Daily Grill at 6:15 PM. If you’re a cacher, or just curious about it, stop by!

Once we’ve wrapped up there, I’ll go wherever the wind may take me; probably back to the Tap House.

Thursday

Thursday is my light day at Summit. I don’t have any sessions double-booked, and the only thing I really need to catch is the Argenis Without Borders folks in their fuzzy rainbow leggings.

Thursday evening I’ll be at the Ray Gun Lounge for Table Top Game Night. I’m looking forward to getting to know folks there and learning some new games. We don’t play a lot of table top games at home, and I’d like to change that.

Friday

Lots more sessions on Friday, plus winding everything down. By the afternoon, I’ll probably be beat and just trying to rest at the Community Zone.

I fly out late Friday night, so I’ll be trying to find dinner somewhere between the convention center and airport. I’ll probably kill a lot of time in the terminal by wandering around, playing Ingress.

Packing List

At my first Summit, I learned a few lessons about what to take and what not to take. The most important thing to bring: empty space for all the stuff you’ll bring home – SWAG from the exhibitors, souvenirs, books and more. Next most important: power! Electrical outlets are few and far between, and there will be 5,000 people vying for them to top off their phones and tablets. A quick rundown of what I’m packing that might not be obvious (or is easily forgotten):

  • Small (1 pint) widemouth water bottle. I’m partial to this Nalgene bottle I got at a 5K earlier this year.
  • NUUN electrolyte tabs. Water gets boring after a while. These will help you stave off SQLPlague (don’t forget your vitamins too!).
  • Comfortable shoes. You’ll be on your feet a lot and walking even more; the convention center is big. Not to mention the evening activities.
  • A small notepad for taking non-session notes – phone numbers, names, etc. I love my Field Notes notebook.
  • A larger notepad for taking notes in sessions. Oh, and don’t forget a pen or three. I’ve tried doing notes on a tablet and on a computer, and it just doesn’t work as well as paper & pen for me. Bonus: no batteries!
  • Hand sanitizer. Because when you have 5000 people in one place, germs get around in a hurry no matter how careful you are.
  • A good wall charger for your devices. I found myself short on chargers last time and had to buy one at Radio Shack, which didn’t cut it. This one has two USB ports that charge at 2.1A, which will give you a good boost when you get near a plug, and you can share with a friend. It’ll also recharge pretty much anything while you sleep. Best of all, it’s really compact.
  • A good external battery pack. Matt Slocum (b | t) got me hooked on the Anker E5 15,000 mAh battery. Two ports so you can share with a friend, and it’ll recharge most phones 4-5 times from completely empty.
  • Plenty of USB cords to go with both of the above.
  • Business cards! I ordered mine at the office too late last time and had to get some made at Staples in a pinch.
  • A small, light backpack to carry all of this in (well, not the shoes). Session rooms get cramped, so carrying a big pack can be a pain.
  • A lock code on my phone and tablet. I normally don’t use one but at any large gathering like this, it’s better to be safe.
  • A list of the people I need to see/find/meet/reconnect with.

This Summit is going to be a blast. I cannot wait. There are only two things I’m not looking forward to:

  1. Having to sleep (I’ll miss stuff!)
  2. It’ll eventually end

Next Tuesday cannot come soon enough.

T-SQL Tuesday #58 – Passwords

This month’s T-SQL Tuesday topic is passwords. I’m neither a DBA nor a server/system admin, so the only passwords I get to manage are my own. But there’s still lots to talk about. Passwords (or rather, weak passwords) have been in the news a lot over the past two weeks, so it’s timely.

This is the password story I’d like to tell my kids, but they’re too young to understand yet.

What’s Your Password?

I can count on both hands the number of accounts whose passwords I have actually memorized:

  • Personal Laptop
  • Personal email (2 accounts)
  • Active Directory (work)
  • One non-production service account
  • 2 non-AD-integrated applications
  • Amazon
  • 1Password

Pretty much everything else is a very random string that I have no hope of memorizing. For example, n9;r27LBL8x2x6=X. It’s that last item above that lets me get away with it. Between password complexity rules, the increasing sophistication of attackers, and the frequency of major data breaches, it’s almost impossible to get by without some kind of password manager. You need to be changing your passwords regularly, and using strong ones. 1Password helps me with both of those; it shows me how old each password is, and it generates good, random passwords as seen above. All I have to remember is my master password. I rarely type a password now; it’s copy/pasted from 1Password, or automatically entered thanks to the browser extension.
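
Just for illustration, here’s roughly what generating a password like that looks like in PowerShell. To be clear, this is not how 1Password does it; a real password manager uses a cryptographically secure RNG, which Get-Random is not.

# Toy password generator; for real secrets, use a proper manager or a crypto RNG.
$chars = 33..126 | ForEach-Object { [char]$_ }   # printable ASCII, no spaces
$password = -join (1..16 | ForEach-Object { Get-Random -InputObject $chars })
$password   # something like n9;r27LBL8x2x6=X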

But that’s not enough. A lot of websites insist that you provide more information for account “security,” such as the name of your first pet or what your first car was. But answering those questions truthfully doesn’t provide as much security as one might think, so I use 1Password to generate random strings for these answers and store those Q&A pairs along with the credentials.

Is that enough? No! Even if you did all of that, your credentials were still easy to capture for about two years thanks to the Heartbleed SSL bug.

More Layers

Security should be like an ogre…er…onion. You need layers. Passwords aren’t enough. More and more websites and services are offering Two-Factor Authentication (2FA) now, but they aren’t making it very well known. Google, Dropbox, WordPress, Evernote, Facebook, Microsoft and GitHub (and that’s just the list I’ve got registered on my phone) will let you further secure your account by requiring a second code after your password, either sent to your phone via text message (or phone call) or generated by an app like Google Authenticator (not unlike a SecurID token). It’s an extra step, but it makes things a lot safer. Even if someone were to get your credentials in a Heartbleed-type attack, they’d be pretty useless with 2FA enabled – at least on that site.
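
Those generated codes aren’t magic, by the way; apps like Google Authenticator implement TOTP (RFC 6238): an HMAC-SHA1 over the current 30-second interval, keyed with a secret you share with the site when you enroll. Here’s a minimal PowerShell sketch, assuming you already have that shared secret as raw bytes (real apps take it base32-encoded, which I’m skipping for brevity).

# A minimal TOTP (RFC 6238) sketch for illustration only.
function Get-TotpCode {
    param([byte[]]$Secret)
    # Count 30-second intervals since the Unix epoch
    $epoch = New-Object DateTime -ArgumentList 1970, 1, 1, 0, 0, 0, ([DateTimeKind]::Utc)
    $step = [long][Math]::Floor(([DateTime]::UtcNow - $epoch).TotalSeconds / 30)
    $counter = [BitConverter]::GetBytes($step)
    [Array]::Reverse($counter)   # the HMAC input is big-endian
    $hmac = New-Object System.Security.Cryptography.HMACSHA1 -ArgumentList @(,$Secret)
    $hash = $hmac.ComputeHash($counter)
    # Dynamic truncation: take four bytes starting at the offset in the last nibble
    $offset = $hash[$hash.Length - 1] -band 0x0F
    $code = $hash[$offset] -band 0x7F
    $code = ($code * 256) + $hash[$offset + 1]
    $code = ($code * 256) + $hash[$offset + 2]
    $code = ($code * 256) + $hash[$offset + 3]
    '{0:D6}' -f ($code % 1000000)
}

If the clocks on your phone and the server agree, both sides derive the same six digits without the code ever crossing the wire.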

Be Careful Out There

Many years ago, when I was in college, I thought my password was safe. I wasn’t sharing it with anyone, and it was reasonably complex – for the time. Then over one Christmas break, my account was compromised. I had been checking my mail via an unencrypted POP3 connection; the password got sniffed, and someone got into my account. Fortunately, no damage was done, but the lesson was learned. With so much of our identities and our personal & financial lives kept behind these virtual doors, it’s vital that we take every possible precaution in securing those accounts (unfortunately, I know too many people who still aren’t doing this). Sure, POP3 was quick & convenient, but it was extremely dangerous.

There are still websites/services that aren’t so careful. If a website limits your password to fewer than about 30 characters, isn’t case-sensitive, or doesn’t allow certain characters, there’s a chance it isn’t storing passwords safely (I won’t get into that in this post, as I’m sure another T-SQL Tuesday author will dive in and do it better than I would). And if any website sends you your password (the one you created for the site) via email, run away (and drop a line to Plain Text Offenders), because they aren’t doing anything to protect you.

Wrap Up

  • Make sure the login form you’re using is properly secured
  • Use strong passwords
  • Use Two-Factor Authentication everywhere you can
  • Watch out for careless site operators
  • Don’t use weak passwords & skip other security measures because “it’s not convenient”

But just so we’re clear: Unlike the hundred-plus celebrities who were compromised, there’s nothing on my iCloud account that anyone should be subjected to.

SQL Saturday Trip Report – Cleveland 2014

This past weekend I made the journey to Cleveland, OH (Westlake, actually) for SQL Saturday #241. I’ve attended two local SQL Saturdays in the past (and helped organize/run one), but seeing the list of speakers and knowing a few of the local chapter members, I couldn’t pass up the opportunity to visit.

Friday

I packed my bags and hit the road. It’s about a 300-mile trip, so I gassed up, settled in with my backlog of podcasts and set the cruise control. The drive was almost zen-like. The sky was clear, the sun was shining (almost painfully bright), and I don’t recall the last time I took a road trip solo. It was a very relaxing drive.

After arriving at the hotel and settling in, I went out for a bit to stretch my legs and get some fresh air before searching for dinner. Forgetting that most people coming from out of town are speakers and that there’s usually a speakers’ dinner on the Friday night before SQL Saturday, I had trouble finding people to meet for dinner. Ultimately I met up with Travis Garland (t) and his team at the Winking Lizard in downtown Cleveland after a 20-minute search for parking. I wasn’t able to stay too late, as I wanted to be up early on Saturday.

Saturday

6 AM comes pretty early when you’re on the road. I pulled myself out of bed, packed up my stuff, got ready & headed down the road to Hyland Software, the venue for the event. The place is set up perfectly for a SQL Saturday: great layout, a good amount of space, and terrific technical facilities.

After getting registered, I made the rounds of the vendor tables & caught up with several of the people I was looking forward to seeing again, including Karla Landrum (t), Hope Foley (b|t), Allen White (b|t), and Kendal Van Dyke (b|t). While I was at it, I managed to meet one new person, Wendy Pastrick (b|t) – so immediately the weekend was a networking success (one of my goals was to meet & talk with at least one new person this weekend, and I was up to 5 by 8:15 AM on Saturday). After the opening remarks from Allen, Thomas LaRock (b|t) and Adam Belebczuk (b|t), it was time to get our learn on.

Session 1 – Query Performance Tuning: A 12 Step Program

Query performance is always a concern for DBAs, developers and users alike. Thomas & Tim Chapman (t) presented a great series of steps to take while investigating slow queries, and did it in a very engaging way. Some of these techniques I was already familiar with, but there was plenty of new material in here as well. The biggest surprise to me was that they put examining execution plans halfway down the list – 5 stages of investigation before even looking at the plan! The execution plan is usually one of my first stops in trying to optimize queries, but I’m going to adjust that thinking going forward.

Session 2 – Writing Better Queries with T-SQL Window Functions

Confession time: this was my “backup” session. I had planned on attending Grant Fritchey’s (b|t) Statistics and Query Optimization session, but by the time I got to the room it was standing room only with a line out the door. So I crossed the hall to this session, presented by Kathi Kellenberger (t). I’d heard about window functions before, and seen them in blog posts, but had no idea what they were really good for. Kathi presented the material in a very clear, relatable way and by the end of it my mind was racing, trying to find scenarios at work where I could put them to use. I’m very happy that I landed in this session – lots of great stuff to work with. It’s rare that I sit in a session where I know nothing coming into it and feel like I can apply the knowledge gained immediately. This was one of those sessions – highly recommended if you see Kathi presenting at another PASS event.

Session 3 – Code Management with SQL Server Data Tools

I’ve dabbled with SSDT a bit, but never been able to put it to good use for a variety of reasons. Lisa Gardner (t) presented this session and provided good information about some of the things I’m trying to implement better in my environments – namely automated deploys and keeping environments in sync. I’m not sure when exactly I’ll get to use all of it, but now I know more about which situations call for it, and I’ve made a start down the path of using it to its fullest potential.

Lunch

The provided lunch was good: a self-serve taco bar, with plenty of seating available in the common areas. Several sponsors had presentations in the training rooms, though, and I took a seat with SQL Sentry. I’ve been using their Performance Monitor and Event Manager products for a couple of years now, but always forget about Plan Explorer, so I soaked up the demo.

Session 4 – Discover, Document & Diagnose Your Servers on Your Coffee Break

I’ve been working with Kendal for a while, and I was eager to see what SQL Power Doc was all about. I came for the PowerShell and stayed for the amazingly detailed documentation of any SQL Server environment. I lack the credentials to run it at work, but I know that it’s being run on a regular basis on our network. It’s everything you wanted to know about your environment but didn’t know could be documented. The volume of information it churns out is amazing. Note to self: run SQL Power Doc against my computers where I have dev instances set up.

Session 5 – Spatial Data: Cooler Than You’d Think!

I’ve wanted to see this session since I first heard about it last year. I have a project I’ve been working on for a while that has a lot of GPS data in it, and I was eager to learn from Hope what else I could/should be doing with it, as well as to validate that I was already on the right track. I got some good ideas here with regard to how my tables are defined and what options are available for mapping the data I’ve captured – something I hadn’t even considered yet.

Session 6 – Making the Leap from Profiler to Extended Events

I met Erin Stellato (b|t) at PASS Summit 2012 during #SQLRun, and her session about DBCC was the first regular session I attended that week (portions of which were way over my head – not an unexpected phenomenon). So I had to check this session out. I don’t do much with Profiler but I know it’s going away and I want to be ready. This session was a great introduction to XE without getting into too much complexity, and she did a great job of showing off some of the aspects of XE that make it so much better than Profiler.

Closing Ceremony

Raffle drawings! I continued my non-winning streak.

The Afterparty

After wrapping up at Hyland Software and dropping all my stuff at the hotel, I headed over to Dave & Buster’s for the after-party. Some game credits were provided, but I was there more to talk to people than play games (aside: I’m really bad at meeting people & “working the room.” I’m better at it than I used to be, but it’s still something I have to work very hard at). I spent most of the event camped out at a table with a rotating cast, but I left without catching up with everyone I wanted to speak with.

The Afterafterparty

After Dave & Buster’s, a few people were headed to the “other” hotel for a few hours of Cards Against Humanity & invited me to join. I’d never played before, and did pretty poorly, but it was a hell of a time. I just wish I’d been staying at that hotel so I didn’t have to worry about making the drive back to mine.

Sunday

My view most of the way home

After being out until 1 AM, my 7 AM alarm sounded way too early. I slowly rolled out of bed, packed up & prepped to hit the road. In sharp contrast to my drive to Cleveland, the drive home was looking a bit rough with lake-effect snow on the radar until my halfway point. Of course, as I approached that point I checked again and saw that snow would follow me all the way home. It was slow, treacherous going in places but I managed to stay on the road and got home without any drama.

Recap

Everyone involved with this SQL Saturday did an amazing job. Terrific venue, terrific speakers, and everything seemed to go off without a hitch. If you haven’t been to a SQL Saturday and there’s one coming up near you, take advantage of the opportunity to attend. Just being around the PASS community can stir up or re-ignite a passion for all things data-related. And as always, the biggest thing I learned this weekend is that there are so many more things out there for me to learn.

I brought a backpack and PC with me to collect swag and possibly fiddle with a project while I was at the event, but it just ended up being a lot of dead weight I carried around all day. Next time, I’m leaving the computer at home and replacing it with a big battery pack for my phone.

I can’t wait to make the trip to Cleveland again next year – in the meantime, there’s #302 & #303 this summer, both of which I’ll be attending.

My First Windows Update Gone Bad

I don’t think I’ve ever had a Windows Update go bad – until this week.

I recently upgraded to Office 2013, and late Monday afternoon I decided to check in with Windows Update ahead of our company’s normal monthly patching to see how bad the damage would be. Nearly 1 GB of updates, thanks to my fresh Office install. But there were also a couple of optional updates, including a .NET Framework update. So I figured I may as well do everything while I was at it. This way I could control the reboot instead of being forced into one in the middle of important tasks.

Tuesday morning, I got a call asking if we were having any issues with one of our key systems. I fired up my trusty SQL Sentry client to check things out. And failed. I couldn’t connect to the server to start monitoring. Never a good sign. Then I tried SSMS 2012. Again, couldn’t connect to any servers. I got the following error:

Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

That sounds pretty ominous. Try a few more times, no luck. Reboot, no better. Uninstall SQL Sentry Client and attempt to reinstall – still nothing. Things were going from bad to worse in a hurry.

I bounced the error messages off Kendal Van Dyke (b | t) and he suggested that it might be an issue with the SQL Native Client, so I reinstalled that from the SQL Server 2012 Feature Pack. Still, I couldn’t connect. I even tried PowerShell (v3), both the SQLPS module and the SqlServerCmdletSnapin100 snap-in, and got the same errors SSMS 2012 threw out.

Taking a deep breath and stepping back, I started reviewing the situation. What do all of these have in common? They’re all using the latest (or at least a fairly recent) version of the .NET Framework. Let’s try something older: SSMS 2008 R2 – works fine. Start up PowerShell v2 – that works too. Now we’re onto something!

The Framework update I installed was KB2858725, bringing the Framework to version 4.5.1. My first thought was that maybe the installation had been botched somehow, so I downloaded the installer and tried again, but to no avail. So I uninstalled 4.5.1 entirely and reinstalled 4.5. That finally did the trick.

Due to other commitments, I haven’t had a chance yet to re-try the 4.5.1 update, but I’m also in no rush. While it won’t take me 4+ hours to troubleshoot and fix if it breaks things the next time around, I need my system stable for the next few weeks so I won’t be trying again until 2014.

Shorten Your PowerShell Prompt

Recently, I’ve been getting very annoyed by the length of the default PowerShell prompt. Most of my work starts in my Documents folder, so with the default prompt, I’m working with C:\Users\username\Documents. But more often, it’s closer to C:\Users\username\Documents\_Projects\Project\Section\ and with some projects, even longer. Before you know it, you’re line-wrapping for anything more than running a cmdlet with no parameters.

Sure, it’s better than C:\Documents and Settings\username\My Documents (props to Microsoft for cleaning that up in post-XP releases), but sometimes it’s still not enough.

So this weekend, I cooked up an alternative. It’s pretty much the same as the standard prompt, with one important difference: it dynamically shortens the displayed path based on the width of your window.

At a minimum, you’ll get the first & last components of the path, regardless of the total length of the current directory – when you’re working on a regular filesystem, that’ll be the drive letter & directory name. As space allows, it walks up the tree, adding each parent directory until it runs out of room.

<#
.Synopsis
    Dynamically shortens the prompt based upon window size
.Notes
    I got really annoyed by having my PowerShell prompt extend across 2/3 of my window when in a deeply-nested directory structure.
    This shortens the prompt to roughly 1/3 of the window width, at a minimum showing the first and last piece of the path (usually the PSPROVIDER & the current directory)
    Additional detail is added, starting at the current directory's parent and working up from there.
    The omitted portion of the path is represented with an ellipsis (...)
#>

# Window title borrowed from Joel Bennett @ http://poshcode.org/1834
# This should go OUTSIDE the prompt function, it doesn't need re-evaluation
# We're going to calculate a prefix for the window title
# Our basic title is "PoSh - C:\Your\Path\Here" showing the current path
if(!$global:WindowTitlePrefix) {
   # But if you're running "elevated" on vista, we want to show that ...
   if( ([System.Environment]::OSVersion.Version.Major -gt 5) -and ( # Vista and ...
         new-object Security.Principal.WindowsPrincipal (
            [Security.Principal.WindowsIdentity]::GetCurrent()) # current user is admin
            ).IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator) )
   {
      $global:WindowTitlePrefix = "PoSh (ADMIN)"
   } else {
      $global:WindowTitlePrefix = "PoSh"
   }
}

function prompt {
# Put the full path in the title bar for reference
    $host.ui.rawui.windowtitle = $global:WindowTitlePrefix + " - " + $(get-location);

# Capture the maximum length of the prompt. If you want a longer prompt, adjust the math as necessary.
    $winWidth = $host.UI.RawUI.WindowSize.Width;
    $maxPromptPath = [Math]::Round($winWidth/3);

# In the PowerShell ISE (version 2.0 at least), $host.UI.RawUI.WindowSize.Width is $null.
# For now, I'm just going to leave the default prompt for this scenario, as I don't work in the ISE.
    if ($winWidth -ne $null) {
        $currPath = (get-location).path;
        if ($currPath.length -ge $maxPromptPath){
            $pathParts = $currPath.split([System.IO.Path]::DirectorySeparatorChar);
# Absolute minimum path - PSPROVIDER and the current directory
            $myPrompt = $pathParts[0] + [System.IO.Path]::DirectorySeparatorChar + "..." + [System.IO.Path]::DirectorySeparatorChar + $pathParts[$pathParts.length - 1];
            $counter = $pathParts.length - 2;
# This builds up the prompt until it reaches the maximum length we set earlier.
# Start at the current directory's parent and keep going up until the whole prompt reaches the previously-determined limit.
            while( ($myPrompt.replace("...","..."+[System.IO.Path]::DirectorySeparatorChar+$pathParts[$counter]).length -lt $maxPromptPath) -and ($counter -ne 0)) {
                $myPrompt = $myPrompt.replace("...","..."+[System.IO.Path]::DirectorySeparatorChar+$pathParts[$counter]);
                $counter--;
            }
            $($myPrompt) + ">";
        } else{
# If there's enough room for the full prompt, use the PowerShell default prompt
            $(if (test-path variable:/PSDebugContext) { '[DBG]: ' } else { '' }) + 'PS ' + $(Get-Location) + $(if ($nestedpromptlevel -ge 1) { '>>' }) + '> '
        }
    }
}

I’ve also uploaded it to PoshCode.

T-SQL Tuesday #39 – Here’s what my PoSH is cooking

My first official entry for T-SQL Tuesday (my actual first was a guest post hosted by Kendal Van Dyke (b|t), so I’m not really counting it) is brought to you by PowerShell, or PoSH. Ever since I discovered PoSH and really dove into learning it a couple of years ago, my co-workers have gotten a bit annoyed by my insistence upon using it for everything. It is my favorite hammer, and around me I see nothing but acres and acres of nails.

I’m not a DBA, so I don’t do as much managing of databases with it as most people joining this party, but I still use PoSH with SQL Server pretty often. I spend a lot of time pulling data out of SQL Server and crunching/analyzing it, sometimes merging in some filesystem data as well.

So what have I done lately?

  • Space usage analysis. I work with a system which generates PDF documents, saves them to the server filesystem, and records the details to a table. But how quickly are we consuming space? When will we run out? I whipped up a quick PoSH script to pull the details from the table, locate the files on the filesystem, and record everything to another table for slicing & dicing.
  • Quick ad-hoc data dumps. Sometimes we just need to pull some data out of the database, crunch a few numbers, and send it off to someone upstairs. Before PoSH, I’d run the query in SSMS, copy the data, paste it into Excel, drop in a few formulas and maybe a graph, and be done. But I’d spend more time fighting Excel on formatting & getting the columns right than I did getting the data into it. Invoke-SQLCmd piped to Export-CSV solves that really quickly (see the first sketch after this list).
  • I’ve been working on upgrading a system we purchased from a vendor and migrating everything to a new server at the same time. Moving & updating XML configuration files, thousands of check images, restarting services, migrating databases. And this isn’t a one-time event – we have to do this over 200 times! The SQL Server portion of this process isn’t particularly involved, but it’s vitally important:
    • After most of the heavy lifting of moving things around is complete, one table has to be updated to point at the new path for the images that were migrated.
    • When all of the migrations are finished, we have to validate that everything moved over properly. A few carefully-selected queries compare critical tables between the old version of the database and the new one, putting minds at ease that all of our data has come over cleanly. Those queries, along with the other components of the validation, are run via the PoSH script & output to a file for review.
  • For reporting purposes, we load a small subset of data from Active Directory into a pair of SQL Server tables on a nightly basis. Previously it was only 2 fields, but recently this has been expanded to about 8. Once again, PoSH to the rescue! Pull the AD user accounts, select the properties we need, and insert it all into the tables. Originally I used my old standby Invoke-SQLCmd, but with the addition of new fields I got concerned about building queries via string concatenation using arbitrary data retrieved from another system, so I switched to System.Data.SqlClient & prepared statements (a sketch follows this list). It’s more code, but it’s a lot safer.
  • Dave Ballantyne reminded me that I have two PowerShell scripts for SSRS.
    • The first is based upon this script from Randy Aldrich Paulo and deploys reports into SSRS. My addition was the option to pull an RDL file straight from our Subversion repository instead of the person doing the deployment having to check it out.
    • The second is based upon this StackOverflow post and is used to render reports to a file. In one case, I then SFTP the generated report to a vendor immediately afterward.
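
Two of those patterns are worth sketching. First, the ad-hoc dump; the server, database, and query here are placeholders, but the shape is the same:

# Hypothetical names throughout; query SQL Server, dump straight to CSV.
Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Database 'Sales' `
    -Query 'SELECT OrderID, OrderDate, Total FROM dbo.Orders WHERE OrderDate >= ''20130101''' |
    Export-Csv -Path .\Orders2013.csv -NoTypeInformation

Second, a rough sketch of the prepared-statement version of the AD load. The table and column names are made up, and it assumes the ActiveDirectory module for Get-ADUser; the point is that the values travel as parameters, never as concatenated SQL.

# Made-up table/columns; parameterized, prepared inserts instead of string concatenation.
Import-Module ActiveDirectory
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=SQLPROD01;Database=Reporting;Integrated Security=SSPI'
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'INSERT INTO dbo.ADUser (SamAccountName, DisplayName) VALUES (@sam, @name)'
[void]$cmd.Parameters.Add('@sam', [System.Data.SqlDbType]::NVarChar, 128)
[void]$cmd.Parameters.Add('@name', [System.Data.SqlDbType]::NVarChar, 256)
$cmd.Prepare()   # compile once, bind new values for each row
foreach ($user in (Get-ADUser -Filter * -Properties DisplayName)) {
    $cmd.Parameters['@sam'].Value = $user.SamAccountName
    $cmd.Parameters['@name'].Value = [string]$user.DisplayName
    [void]$cmd.ExecuteNonQuery()
}
$conn.Close()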

These are pretty mundane compared to what I’m sure a lot of the people participating will be posting, but the key is this: PoSH is saving me time, and by scripting everything I can, I’m significantly reducing the chance for errors. Of course, when there is an error, it’s magnified tremendously – so it all gets tested against a non-production server first.