PSPowerHour v1.0 Wrap-Up

The first edition of the PSPowerHour is in the books and it looks like it was a big success. This one was dbatools-heavy but I chalk that up to the dbatools community having lots of free time because we’ve automated so many of our tasks 🙂

Overall Impressions

I signed in about half an hour ahead of the webcast and was the first one there. Shortly thereafter, I was joined by Michael Lombardi (t), then Jess Pomfret (b|t) and Chrissy LeMaire (b|t). After ironing out a few glitches, we got everyone in the right place and kicked off the broadcast. Everything ran very smoothly, especially considering the number of people involved – Michael and Warren F. (b|t) did a terrific job of orchestrating everything.

While watching and listening to Chrissy, Doug, Andrew & Jess give their demos, I ran through my own in my head a couple of times, adding and rearranging a few things as I observed how they were doing theirs. The big dilemma for me was whether to run the camera or exclusively screen share (I ended up going with the screen share only). Having not rehearsed my demo enough in the weeks leading up to the event, I was still unsure where to dip into more detail and where to dial things back, and seeing what others were doing helped quite a bit. Having familiar faces & voices ahead of me in the queue put my nerves to rest.

I wasn’t able to watch the sessions after mine in their entirety due to family commitments. Joshua’s BurntToast module looks like it’ll be fun to experiment with and add some nice functionality to scripts (I got to see about half of his demo), and I’m really looking forward to catching a replay of Daniel’s demo of PowerShell on the Raspberry Pi – I didn’t realize that it had been ported already!

My Demo

I demoed Invoke-DbaSqlQuery and why one should use it over Invoke-SqlCmd – primarily for protection from SQL injection. Things didn’t go exactly the way I’d practiced; I ran short on time despite feeling like I rushed and cut back on some of what I had planned to say. The latter was partly because of the lead-ins from Chrissy, Andrew, and Jess: they did such a good job introducing dbatools that I was able to skip over it. But I was able to throw in a teaser for Matt Cushing’s (b|t) demo at the next PSPowerHour.
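
For illustration (not my exact demo script), here’s the shape of the safe approach – a minimal sketch with placeholder server, database, and table names. dbatools has since renamed the function to Invoke-DbaQuery, whose -SqlParameter parameter accepts a hashtable:

```powershell
# The user-supplied value is bound as a parameter, never spliced into the query text
$query = "SELECT * FROM dbo.Users WHERE LastName = @LastName"
Invoke-DbaQuery -SqlInstance MySqlServer -Database MyDatabase -Query $query -SqlParameter @{ LastName = "O'Brien" }
```

Because the value travels server-side as a parameter, even a hostile input can’t break out of the string and rewrite the query.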

Running the demos inside a VM and screen-sharing just that VM made things easier for me as opposed to flipping between apps. My scripts will be available on GitHub along with the other presenters’ once the pull request is approved.

I achieved my goals:

  1. I did it
  2. I successfully demonstrated a SQL injection problem and explained why it’s so bad
  3. I demonstrated how to make database queries from PowerShell both more reliable and safer
  4. I learned about some new stuff that I desperately want to experiment with

Next time around, I definitely need to rehearse more and get my timing down better, but overall, I’m happy.

Check it out!


I Will See You in Seattle!

A few weeks ago, I teased good news.

One person hypothesized that I’m joining Microsoft (it seems to be the thing to do lately) and another jumped to the conclusion that I must be pregnant. Both creative responses, but not quite correct.

I’ll be at PASS Summit 2018!

So much to do!

  • Pick some sessions
  • Make my checklist of #sqlfamily I need to see
  • Find a way to pack lighter (I think the iPad will stay home this time)
  • Up my selfie game
  • Get back into shape for #sqlrun
  • Print up some more dbatools ribbons
  • Figure out the social media photo situation (see above, “Up my selfie game”)

If you’re attending Summit, let’s meet up! I’ll be on Twitter, Slack, and Instagram (@alevyinroc across the board) all week and roaming the convention center & various evening events so ping me there to find out where I am.

Because Summit starts on Election Day here in the USA, be sure to either get to your local polling place that morning or follow your state’s process to request & submit an absentee ballot. Every election is more important than the one before it (that’s the most political I’ll get on this blog, I swear).

Speaking: PowerHour, August 21st 2018

It’s official! I will be speaking at the inaugural PowerHour online lightning demo event on Tuesday, August 21st at 2200 UTC. I’ll be demoing Better, Safer SQL Queries from PowerShell.

If you’re working with SQL Server from PowerShell, either as a DBA, analyst, or anyone else running queries, you’ve probably used Invoke-SqlCmd. But depending on how you’re building your queries, this can be error-prone or a huge security exposure! With the help of the dbatools module, I’ll show you how to write and run these queries better and safer – and make them easier to work into your scripts to boot.
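
To illustrate the exposure (with hypothetical server and table names), this is the string-concatenation pattern the demo warns against:

```powershell
# Dangerous: the value is spliced into the query text, so it can rewrite the query itself
$lastName = "x'; DROP TABLE dbo.Users; --"
Invoke-Sqlcmd -ServerInstance MySqlServer -Database MyDatabase -Query "SELECT * FROM dbo.Users WHERE LastName = '$lastName'"
```

An innocent O’Brien breaks that query; the value above does far worse.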

I’m excited to be a part of this – it’s been far too long since I’ve done a presentation. Please join us on the YouTube channel/stream next Tuesday!

PowerHour – Online PowerShell Lightning Talks!

Earlier this week, the PowerHour was announced. What is it? It’s kind of like a virtual user group. One hour, 6(ish) lightning demos (10 minutes or less), centered on PowerShell. All community-sourced and driven – anyone can submit a proposal for a demo, and if it’s accepted, you’ll be slotted into an available spot.

They’ve already set up a YouTube Channel so you can either watch live or catch up later on, and the whole deal is being organized and managed through GitHub. Got something you want to show off? Log an issue using the template!

I’ve submitted my first (but hopefully not last) proposal. Yep, it involves dbatools because that’s my jam.

It’ll be fun for speakers and attendees alike! You can even use your demo(s) for user group meetings or SQL Saturdays – anywhere lightning talk/demo spots are available. Several SQL Server community folks have tossed proposals in and with so many DBAs getting hooked on PowerShell, it’s a great way for these two communities to come together.

Quick Table Info Export with PowerShell

This week I had a user come to me asking how fields were defined on a few tables he was using in writing some reports. Long story short, he’s been tasked with writing some new reports and updating existing ones, but he doesn’t have visibility into the database itself, so he’s left with the “ok, let’s try this” approach, then reading error messages to debug when things go sideways. Very tedious.

I asked him for a list of the tables he was most interested in, and while he worked on that I set to work on (of course) a quick & dirty PowerShell script to collect the data he needed – field names, types, and whether they’re nullable.

With the help of dbatools and Doug Finke’s ImportExcel module, the resulting script to retrieve the requisite data and export it to an Excel workbook is very short. Each table lands on its own worksheet, with the header row bold and frozen so it stays in view while scrolling.

Line by line (a sketch follows the list):

  • Fetch the list of tables from a file provided to me
  • Gather a collection of SMO objects for the tables from the server
  • Loop over each table and extract the relevant data about each
  • Write the table’s information out to a worksheet named for the table in an Excel workbook
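
As a rough sketch of those steps – using today’s function names and placeholder server, database, and path names, not the exact script:

```powershell
# Export name, type, length & nullability for each table, one worksheet per table
$tables = Get-Content -Path .\TableList.txt
$smoTables = Get-DbaDbTable -SqlInstance MySqlServer -Database MyDatabase -Table $tables

foreach ($table in $smoTables) {
    $table.Columns |
        Select-Object Name,
            @{ Name = 'DataType'; Expression = { $_.DataType.Name } },
            # SMO buries field lengths in each column's Properties collection
            @{ Name = 'Length';   Expression = { $_.Properties['Length'].Value } },
            Nullable |
        Export-Excel -Path .\TableInfo.xlsx -WorksheetName $table.Name -BoldTopRow -FreezeTopRow
}
```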

The only tricky part was fetching the field lengths for the nvarchar fields. I had to go spelunking into each field’s Properties property to get that. The result:

Export-TableInfo Results

But I can tell this isn’t disco enough for you, so I’m gonna put it right here.
– Justin Hammer, Iron Man 2

You can do this in a one-liner too. It’s not as pretty, but it’s quite doable and the resulting Excel file is identical.
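
Something along these lines, wrapped here for readability (same placeholder names as above):

```powershell
Get-DbaDbTable -SqlInstance MySqlServer -Database MyDatabase -Table (Get-Content .\TableList.txt) |
    ForEach-Object { $_.Columns |
        Select-Object Name, @{n='DataType';e={$_.DataType.Name}}, @{n='Length';e={$_.Properties['Length'].Value}}, Nullable |
        Export-Excel -Path .\TableInfo.xlsx -WorksheetName $_.Name -BoldTopRow -FreezeTopRow }
```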

End to end, I turned this around in about 20 minutes, and the requestor’s life got easier in a big hurry. Best of all, when he comes back with another batch of tables for his next set of reports, I can turn the request around in less than five minutes.

How to Help with dbatools Comment-based Help

I wrote a post over on the dbatools website about how to get involved with improving comment-based help.

Working on the CBH is a great way to get started with the dbatools project, even (especially) if you’re not a PowerShell expert or MVP-level DBA. Getting everything clean and consistent in the CBH is an important step on the road to 1.0. Along the way, you’ll pick up on how dbatools is put together, discover functions that you can use in your day-to-day work, and get a feel for PowerShell best practices. You will learn from this experience!

Join us!

One Line of Code Is All It Takes

This tweet showed up in the dbatools Slack channel Friday afternoon.

My first thought was “huh? John (t) hadn’t kicked code in previously? I thought he had.” Once I was over that, I reflected a bit on what John wrote here, and was reminded of how I felt when I started helping out with dbatools.

It’s similar to Impostor Syndrome – I felt like I wasn’t doing much: small things here and there, in large part “just” documentation cleanup. It felt like I was throwing changes into the codebase just for the sake of making changes. It took me a couple of months and conversations with several people before I understood that what I was doing was useful to someone other than myself, and internalized what I was hearing.

Here’s the thing I have finally come to realize: every contribution to an open source project is beneficial, no matter how small it may seem. I’d heard this over the years but didn’t really understand it until very recently.

John’s single line of code, no matter how he got it into the dbatools codebase, made it better. His code will be executed by thousands of dbatools users the world over.

Most open source project maintainers/leaders are looking for help. Get out there on GitHub and look up a project you use. Find an issue tagged good first issue or help wanted. Hop over to Up For Grabs and find a project that needs a little help. If your PR isn’t immediately accepted, work with the maintainers to get it into a condition where it can be merged.

Single lines of code are welcome improvements to projects. Find yours.

dbatools at PASS Summit 2017

I registered for Summit about a month before getting actively involved in the dbatools project, so when I saw the team was running a pre-con and I was going to meet them, I was pretty excited. It was amazing getting to meet and hang out with Chrissy, Rob, CK, Shane, Jess, John, Shawn, Aaron, Ben, Kiril, and Drew (sorry if I forgot anyone!), even if it was only for a moment.

But I’ll have another post about the people of Summit. This one’s about dbatools being talked about all over Summit and my experience with that as a member of the team. I’m certain there’s a heavy amount of confirmation bias here, but dbatools seems to have caught fire in the SQL Server community. And with good reason!

I was able to hand out about 300 of the dbatools fan ribbons I brought with me; half went to pre-con attendees, and the rest were handed out on the conference center floor at random. Sitting at the PowerShell table at the BoF lunches, people would join us and say “hey, I’ve heard about this dbatools thing but haven’t had a chance to learn it yet.” Others would see my ribbons and ask for one, having heard about the project or even used it themselves.

Rob Sewell talked about it at the SentryOne booth. I heard on Twitter and around the conference center that dbatools was getting mentioned in a number of speakers’ sessions, even ones that didn’t advertise it in their abstracts. There was a panel discussion about PowerShell in general, spearheaded by key dbatools team members, and of course dbatools was talked about there. But the star of that session was Ken Van Hyning, aka SQL Tools Guy (t), talking about the roots and evolution of many of the tools we use and where he sees them going. He also told us how we can influence the direction of the current tools and make pitches for new ones. Key takeaways:

  • Cross-platform, open-source where possible seems to be the way of the future
  • There’s a lot of work to be done migrating the infrastructure and tooling around the existing tools to get them there (I think this is why we’re seeing new tooling come out instead of direct ports)
  • The squeaky wheel gets the love, so make your voice heard on Microsoft Connect and Twitter!

We even managed to get a group photo with the dbatools team members who were in the building!

After all the “I can’t believe this is happening!” moments through the week, the final session on Friday was the icing on the cake. I was in Carlos L Chacon’s session Measuring Performance Through Baselines and dbatools popped up on one of his slides.

dbatools on Carlos’s slide

Later, Carlos demonstrated a couple of functions, Get-DbaAgentAlert and Get-DbaUptime. The latter sounded familiar, so I jumped on GitHub and checked the history to confirm. Yep, it’s one of the functions I’d done some (non-CBH) work on. Which means that code I wrote was executed in a PASS Summit presentation! Yes, it’s a small thing, and I’m probably the only person who even knew it as it was happening, but it happened. Which is pretty awesome.

My First Migration with dbatools

I’ve been a proponent of dbatools for close to a year now and have even contributed to the project, but surprisingly I haven’t been a heavy user of it, mostly due to a lack of opportunity. I’m aware of many of the functions by virtue of working on the built-in documentation and following the project and presentations about it.

So when the need arose to move a development/test instance of SQL Server from a VM onto a physical server, I knew exactly what I wanted to do. I was warned that the contents of this instance had been moved once before and it resulted in over a week of work and a bunch of trouble. I can’t speculate on why this was as I wasn’t there to see it, but I wasn’t going to let that happen on my watch. So, with equal parts hubris and stubbornness (and a dash of naïveté), I dove in. We have the technology. We will migrate this thing.

The advertising for Start-DbaMigration makes it look so easy. Source, destination, your method of moving the data, and you’re done. Right? Well, sure – in a small, controlled sandbox. This one was neither. About 150 databases. Two dozen Agent jobs. User account cleanup. Different drive letters and sizes. And when it was all over, the server name, instance name, and IP of the new box had to match the old one so that we didn’t disrupt production or the developers.
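
That happy path really is just one line. A minimal sketch with assumed server names (the share parameter has been called -NetworkShare and, more recently, -SharedPath):

```powershell
# Back up on the source, restore on the destination, and bring jobs, logins & more along
Start-DbaMigration -Source OldServer -Destination NewServer -BackupRestore -SharedPath \\fileserver\sqlmigration
```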

Of course we’re going to rehearse this. But with the destination being a physical machine, I didn’t have the luxury of rolling back a snapshot each time, or restarting from a golden image. And I couldn’t do everything because it wasn’t an isolated environment – I couldn’t test all the Agent jobs (don’t want emails going out in error) and couldn’t reconfigure the IP or server name. Which meant that my script had to clean up any artifacts from previous runs before doing the migration. Each time.

I also wanted to bring the new instance up in a controlled fashion, as opposed to just moving everything and letting it go, so that I could check things out before letting them break. I also had to work in checkpoints so the network/server admin could do his pieces. That meant that after the migration, everything on the old server had to be stopped and the Agent jobs on the new one disabled – but with a record of what was enabled/disabled on the source, so I could replicate it (see the sketch below).
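
A sketch of that record-then-disable step, with assumed instance and file names and today’s dbatools parameter names:

```powershell
# Capture which jobs are enabled on the source, then disable everything on the new instance
Get-DbaAgentJob -SqlInstance OldServer |
    Select-Object Name, IsEnabled |
    Export-Csv -Path .\SourceJobState.csv -NoTypeInformation

Get-DbaAgentJob -SqlInstance NewServer | Set-DbaAgentJob -Disabled
```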

I rehearsed what I could about a half-dozen times. Each time through took about 4 hours (having multiple tests helps build confidence in your elapsed time estimates), primarily because of the amount of data that had to be moved (about 700GB). Each time, I found another tweak needed. Maybe not entirely necessary, but I was out to prove something. I didn’t want this migration to be “good enough, a little rough around the edges” – this had to work right, right away.

This is truly standing on the shoulders of giants. Without the thousands of person-hours put in by Chrissy and the rest of the team, a short script like this doing a mountain of work simply isn’t possible. It’s not just the huge amount of code to build on – it’s the suite of tests they run with every pull request that tells me I can trust it to work right.

Looking back on it, there’s definitely a few things I’d change in this script, and more dbatools functions I could have used. But after successfully testing a couple times, I didn’t want to break what was working.

When the migration was complete, I did a brief checkout and then gave my server admin the green light. He flipped names & IPs around, and then I ran Repair-DbaServerName, which I had discovered just a few days earlier. I had expected to do this step manually, but I trusted the dbatools crew and their test suite more than myself on this one, as I’d never done it before. When that was complete, I had a grand total of three issues (that I could find):

  • Database owners weren’t set appropriately. I was able to resolve this easily enough via Set-DbaDatabaseOwner (see the sketch after this list).
  • Outgoing dbmail didn’t work. Turns out the SMTP relay on the new server wasn’t started. Easy fix.
  • I had a Linked Server on my production instance which was unable to communicate with the new test server. This took me the longest to figure out. We checked everything – SQL Server Configuration Manager, the network itself – and then finally my colleague suggested testing something outside SQL Server: mapping a drive from production to test. That test succeeded, which pointed us at the SQL Server connection specifically. The root cause: I had two firewall rules on the new server that blocked connections from all but servers on the local subnet, and the production server isn’t on the local subnet.
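
For reference, that database owner fix was a one-liner along these lines (instance and login names assumed; the function has since been renamed Set-DbaDbOwner):

```powershell
# Point every database owner on the new instance at a known login
Set-DbaDatabaseOwner -SqlInstance NewServer -TargetLogin sa
```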

None of these are total showstoppers. I had workarounds (or quick solutions) for them and as this is a test instance we could live with minor inconvenience for a day or two. One or two final tests, and I was satisfied that everything was working properly so I went ahead and enabled my Agent jobs. Some of them still have incorrect owners but I can fix that later – they were wrong on the source instance too.

I consider this migration a huge success. We had 95% functionality by 9am. By 3pm, the last real problems were resolved (and only that late due to a series of meetings keeping me away from my desk). Most importantly, it was achieved with minimal downtime for the development and QA teams. I’m now one week post-migration and everything is still running smoothly on the new instance.

dbatools Badge Ribbons at PASS Summit

One of the (many) fun things to do at PASS Summit is to check out the ribbons people have attached to their badges. Some are witty or goofy, others informational, others technical, and still more that let you express how you identify with a community within the community.

To celebrate dbatools and the awesome team & community around it, two limited edition badge ribbons will be available from me and a handful of other folks all week at Summit. Check ’em out:

Be on the lookout for these ribbons and talk to us about dbatools! What you like, what you’d like to see changed, new feature ideas, questions about how to use functions, anything at all. Even if you’ve never used dbatools, we love talking about it and showing people the awesome things they can do with it, so please, introduce yourself!