Processing SQL Saturday Raffle Tickets with PowerShell

Every year, I spend the Sunday after SQL Saturday Rochester scanning & processing raffle tickets for our wonderful sponsors. Here’s how the system works:

  • Attendees get tickets (one ticket per sponsor) with their name, the sponsor’s name, and a QR code on them
  • Each QR code represents a URI, unique to the combination of event, attendee, and sponsor
  • Attendees drop their tickets in a box to enter the sponsor’s raffle prize drawing
  • When the URI from the QR code is accessed, it registers in the SQL Saturday system
  • Organizers run a report for each sponsor that includes the contact info of all attendees who dropped off a raffle ticket, then email the report to the sponsor

It works pretty well, but the hang-up is that most QR scanners will open your web browser (or prompt you to open it) for the URI on each scan. For 150+ tickets, this takes a long time. Every year, I lament “oh, how I wish I could just scan these, collect the URIs into a nicely formatted file, and script this whole thing”.

Finally, this year, I found a way to do it with my iPhone, MacBook Pro & PowerShell. Here’s what I did:

  1. Get Beep for iOS.
  2. Scan the tickets. This app is really fast; it may scan before you even realize it. I just stacked them up, pointed the phone at the pile, and as the app beeped (to tell me it had scanned successfully), I tossed the ticket to the side.
  3. When done, tap the file box icon in the upper-right corner
  4. Tap the Share icon
  5. Save the file out to a CSV on iCloud (you can email it if you like, but iCloud is a little easier for me)
  6. On the Mac, open up Terminal and navigate to /Users/YOURNAME/Library/Mobile Documents/com~apple~CloudDocs
  7. Fire up PowerShell (I installed it via Homebrew with brew install powershell and start it by running pwsh).
  8. Run the following one-liner:
    Import-Csv './FILENAME.csv' -Header URI,Type,Date | Select-Object -Unique -Property URI | ForEach-Object { Invoke-WebRequest $PSItem.URI }

This bit of PowerShell:

  • Imports the CSV file and forces column names (as the file doesn’t include them) of my choosing
  • Extracts the unique URIs from the data
  • Loops through all the URIs and invokes a web request to each one of them
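
If one bad scan shouldn’t halt the whole batch, the same pipeline can be expanded with a basic failure check. This is a sketch, not part of the original one-liner – the try/catch is my addition, and FILENAME.csv is still the placeholder from step 8:

$tickets = Import-Csv './FILENAME.csv' -Header URI,Type,Date | Select-Object -Unique -Property URI;
foreach ($ticket in $tickets) {
    try {
        # Requesting the URI is all it takes to register the ticket
        Invoke-WebRequest -Uri $ticket.URI | Out-Null;
    }
    catch {
        # Log the failure and keep going rather than stopping the run
        Write-Warning "Failed to register $($ticket.URI): $_";
    }
}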

It’s the same process I’ve used in the past, just much faster because I’m not pausing after each scan to load a URI in my web browser.

With nearly 300 raffle and attendance tickets scanned, this zipped through all of them in less than 90 seconds. Best of all, I could start it and walk away to do something else. Doing it this way made my SQL Saturday “closeout” process a little less stressful.

My First Migration with dbatools

I’ve been a proponent of dbatools for close to a year now and even contributed to the project, but surprisingly I haven’t been a heavy user of it, mostly due to a lack of opportunity. I’m aware of many of the functions by virtue of working on the built-in documentation, following the project, and watching presentations about it.

So when the need arose to move a development/test instance of SQL Server from a VM onto a physical server, I knew exactly what I wanted to do. I was warned that the contents of this instance had been moved once before and it resulted in over a week of work and a bunch of trouble. I can’t speculate on why this was as I wasn’t there to see it, but I wasn’t going to let that happen on my watch. So, with equal parts hubris and stubbornness (and a dash of naïveté), I dove in. We have the technology. We will migrate this thing.

The advertising for Start-DbaMigration makes it look so easy. Source, destination, your method of moving the data, and you’re done. Right? Well, sure – in a small, controlled sandbox. This one was neither. About 150 databases. Two dozen Agent jobs. User account cleanup. Different drive letters and sizes. And when it was all over, the server name, instance name, and IP of the new box had to match the old one so that we didn’t disrupt production or the developers.

Of course we’re going to rehearse this. But with the destination being a physical machine, I didn’t have the luxury of rolling back a snapshot each time, or restarting from a golden image. And I couldn’t do everything because it wasn’t an isolated environment – I couldn’t test all the Agent jobs (don’t want emails going out in error) and couldn’t reconfigure the IP or server name. Which meant that my script had to clean up any artifacts from previous runs before doing the migration. Each time.

I also wanted to bring the new instance up in a controlled fashion, as opposed to just moving everything and letting it go, so that I could check things out before anything broke. And I had to work in checkpoints so the network/server admin could do his pieces. This meant that after the migration, everything on the old server had to be stopped and the Agent jobs on the new one disabled (but with a record of what was enabled/disabled on the source, so I could replicate it).

I rehearsed what I could about a half-dozen times. Each time through took about 4 hours (having multiple tests helps build confidence in your elapsed time estimates), primarily because of the amount of data that had to be moved (about 700GB). Each time, I found another tweak needed. Maybe not entirely necessary, but I was out to prove something. I didn’t want this migration to be “good enough, a little rough around the edges” – this had to work right, right away.

Here’s the script I ended up with:

Import-Module dbatools;
Start-Transcript;
$VerbosePreference = 'Continue';
$WhatIfPreference = $false;
$SrcSQLInstance = 'SOURCE\INSTNAME';
$DestSQLInstance = 'DEST\INSTNAME';
# Clean up artifacts from previous rehearsal runs on the destination
Get-DbaAgentJob -SqlInstance $DestSQLInstance | ForEach-Object { Remove-DbaAgentJob -SqlInstance $PSItem.SqlInstance -Job $PSItem };
Get-DbaAgentSchedule -SqlInstance $DestSQLInstance | ForEach-Object { Remove-DbaAgentSchedule -SqlInstance $DestSQLInstance -Schedule $PSItem };
Get-DbaDatabase -SqlInstance $DestSQLInstance | Remove-DbaDatabase;
Copy-DbaLogin -Source $SrcSQLInstance -Destination $DestSQLInstance -ExcludeLogin 'DOMAIN\User1','DOMAIN\User2','User3';
# Record each Agent job's enabled/disabled state on SOURCE so it can be replicated later
Get-DbaAgentJob -SqlInstance $SrcSQLInstance | Select-Object -Property SqlInstance, Name, IsEnabled | Export-Csv -Path "jobstate_$(Get-Date -Format 'yyyyMMddHHmmss').csv" -NoTypeInformation;
Start-DbaMigration -Source $SrcSQLInstance -Destination $DestSQLInstance -DisableJobsOnDestination -BackupRestore -NetworkShare '\\DEST\SQLMigration' -WithReplace -Force -NoLogins -DisableJobsOnSource;
# Stop services on both servers; SOURCE stays down for good
$DestServices = @('sqlagent$INSTNAME','mssqlfdlauncher$INSTNAME','mssql$INSTNAME');
foreach ($Service in $DestServices) {
    Get-Service -ComputerName $DestSQLInstance.Split('\')[0] -Name $Service | Stop-Service;
    Get-Service -ComputerName $SrcSQLInstance.Split('\')[0] -Name $Service | Stop-Service;
}
# Bring destination services up in reverse order (engine first, Agent last)
[array]::Reverse($DestServices);
foreach ($Service in $DestServices) {
    Get-Service -ComputerName $DestSQLInstance.Split('\')[0] -Name $Service | Start-Service;
}
$VerbosePreference = 'SilentlyContinue';
Stop-Transcript;
Remove-Module dbatools;

This is truly standing on the shoulders of giants. Without the thousands of person-hours put in by Chrissy and the rest of the team, a short script like this that does a mountain of work simply would not be possible. It’s not just having the huge amount of code to build on – it’s the suite of tests they run with every pull request that tells me I can trust it’ll work right.

Looking back on it, there are definitely a few things I’d change in this script, and more dbatools functions I could have used. But after successfully testing a couple of times, I didn’t want to break what was working.

When the migration was complete, I did a brief checkout and then gave my server admin the green light. He flipped names & IPs around, and then I ran Repair-DbaServerName, which I had discovered just a few days earlier. I had expected to do that step manually, but I trust the dbatools crew and their test suite more than I trust myself here – I’d never done this before. When that was complete, I had a grand total of three issues (that I could find):

  • Database owners weren’t set appropriately. I was able to resolve this via Set-DbaDatabaseOwner easily enough.
  • Outgoing dbmail didn’t work. Turns out the SMTP relay on the new server wasn’t started. Easy fix.
  • I had a Linked Server on my production instance which was unable to communicate with the new test server. This took me the longest to figure out. We checked everything – SQL Server Configuration Manager, the network itself – and then finally my colleague suggested testing something outside SQL Server: mapping a drive from production to test. That test succeeded, which pointed us at the SQL Server connection specifically. The root cause: two firewall rules on the new server blocked connections from all but servers on the local subnet, and the production server isn’t on the local subnet.
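
For the record, the database owner fix is a one-liner. A minimal sketch, assuming 'sa' is the desired owner (substitute whatever your standard dictates) and reusing the instance placeholder from the script above:

# Updates every database whose owner doesn't match -TargetLogin
Set-DbaDatabaseOwner -SqlInstance 'DEST\INSTNAME' -TargetLogin 'sa';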

None of these are total showstoppers. I had workarounds (or quick solutions) for them and as this is a test instance we could live with minor inconvenience for a day or two. One or two final tests, and I was satisfied that everything was working properly so I went ahead and enabled my Agent jobs. Some of them still have incorrect owners but I can fix that later – they were wrong on the source instance too.
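
Re-enabling the jobs is where that jobstate CSV from the migration script pays off. A sketch under a couple of assumptions – the file name/timestamp is a placeholder, and I’m using Set-DbaAgentJob’s -Enabled switch to flip each job back on:

# Re-enable only the jobs that were enabled on the source instance
Import-Csv '.\jobstate_TIMESTAMP.csv' |
    Where-Object { $_.IsEnabled -eq 'True' } |
    ForEach-Object { Set-DbaAgentJob -SqlInstance $DestSQLInstance -Job $_.Name -Enabled };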

I consider this migration a huge success. We had 95% functionality by 9am. By 3pm, the last real problems were resolved (and only that late due to a series of meetings keeping me away from my desk). Most importantly, it was achieved with minimal downtime for the development and QA teams. I’m now one week post-migration and everything is still running smoothly on the new instance.

SQL New Blogger Digest – Week 2

I didn’t intend for last week’s digest to also serve as my post for week two of the challenge, but life got in the way and I wasn’t able to complete the post I really wanted to write in time. So, that post will be written much earlier in week three and completed well ahead of the deadline.

Here are the posts collected from week two of the SQL New Blogger Challenge. It’s been compiled the same way last week’s was.

Author Post
@AaronBertrand #SQLNewBlogger Roundup – SQL Sentry Team Blog
@MtnDBA #SQLNewBlogger Week 2 – Teach Something New | DBA With Altitude
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Back on the Job Hunt
@DwainCSQL An Easter SQL | dwaincsql
@DwainCSQL An Even Faster Method of Calculating the Median on a Partitioned Heap | dwaincsql
@arrowdrive Anders On SQL: Stupid stuff I have done. 1/? Or, How I learned to stop GUIing and love the script
@MattBatalon Another TRUNCATE vs. DELETE tidbit… | Matt Batalon
@way0utwest April Blogging Challenge 2 – Primary Key in CREATE TABLE | Voice of the DBA
@GuruArthur SQL Server error 17310 – Arthur Baan
@Pittfurg Blog Series: SQL Server Backup and Restores with PowerShell – Port 1433
@fade2blackuk Checking SQL Server User Role Membership with PowerShell « SQL DBA with A Beard
@gorandalf Database Compatibility Level 101 | Gorandalf’s SQL Blog
@nocentino Designing for offloaded log backups in AlwaysOn Availability Groups – Monitoring – Centino Systems Blog
@SqlrUs Detaching a Database – File Security Gotcha | John Morehouse | sqlrus.com
@MartynJones76 Devon DBA: Check Database Integrity Task Failed … Oh Dear Master Luke!
@toddkleinhans How Do You Visualize Abstractions? | toddkleinhans.com
@AalamRangi How to Have Standard Logging in SSIS and Avoid Traps | SQL Erudition
@gorandalf How to Test Existing T-SQL Code Before Changing the Compatibility Level | Gorandalf’s SQL Blog
@EdDebug HOWTO-Get-T-SQL-Into-SSDT | the.agilesql.club
@DBAFromTheCold In-Memory OLTP: Part 2 – Indexes | The DBA Who Came In From The Cold
@nicharsh It’s a Harsh Reality – SQL Sentry Team Blog
@SQLBek Learn Something New – SSMS Tips & Tricks « Every Byte Counts
@cjsommer Modify SQL Agent Jobs using PowerShell and SMO | cjsommer.com
@MyHumbleSQLTips My Humble SQL Tips: Full List of SQL Server 2014 DMVs
@MyHumbleSQLTips My Humble SQL Tips: Running DBCC CHECKDB on TEMPDB
@way0utwest New Blogger Challenge 1 – Adding a Primary Key | Voice of the DBA
@uMa_Shankar075 Querying Microsoft SQL Server: In Memory Optimized Table in SQL Server 2014
@Jorriss Random Thoughts of Jorriss
@pmpjr Sidenote, the 4200 databases are a different story for another week… | I have no idea what I’m doing
@ALevyInROC SQL New Blogger Challenge Weekly Digest | The Rest is Just Code
@jh_randall SQL Server Monitoring – Getting it Right – SQL Sentry
@cathrinew Table Partitioning in SQL Server – The Basics – Cathrine Wilhelmsen
@eleightondick Teach Something New: PowerShell Providers [T-SQL Tuesday #065] | The Data Files
@rabryst The Art of Improvisation – Born SQL
@DBAFromTheCold The DBA Who Came In From The Cold | Advice on working as a SQL Server DBA
@Lance_LT The estimated query plan and the plan cache (Part 2) | Lance Tidwell the Silent DBA
@SQLJudo TSQL Tue 65: Memory Optimized Hash Indexes | Russ Thomas – SQL Judo
@sqlsanctum T-SQL Tuesday #065 – Teach Something New – APPLY | SQL Sanctum
@_KenWilson T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@m82labs Untangling Dynamic SQL · m82labs
@cathrinew Using a Numbers Table in SQL Server to insert test data – Cathrine Wilhelmsen
@tpet1433 Why yes I can do table level restores – Tim Peters
@Jorriss Why You Absolutely Need Alternate Keys: A Unique Constraint Story

SQL New Blogger Challenge Weekly Digest

Watching all of the tweets as people posted their first entries in the SQL New Blogger Challenge earlier this week, I quickly realized that keeping up was going to be a challenge of its own. Fortunately, there are ways to rein it in.

My first stop was IFTTT (If This Then That). IFTTT allows you to create simple “recipes” to watch for specific events/conditions, then perform an action. They have over 175 “channels” to choose from, each of which has one or more triggers (events) and actions. I have IFTTT linked to both my Google and Twitter accounts, which allowed me to create a recipe which watches Twitter for the #sqlnewblogger hashtag, and writes any tweets that match it to a spreadsheet on my Google Drive account (I’ll make the spreadsheet public for now, why not?).

The next step is to export the spreadsheet to CSV. I don’t have this automated, and may not be able to (I may have to find another workaround). Once it’s a CSV, I can go to PowerShell to parse my data. I want the end result to be an HTML table showing each post’s author (with a link to their Twitter stream) and a link to the post (using the title of the post itself).

Once I import the CSV file into an object in my PowerShell script, I need to do some filtering. I don’t want to be collecting all the retweets (posts starting with RT), and I should probably exclude any post that doesn’t contain a URL (looking for the string HTTP).

To extract the post URLs, I ran a regular expression against each tweet. Twitter uses their own URL shortener (of course), which makes this pretty easy – I know the hostname is t.co, and after the slash is an alphanumeric string. The regex to match this is fairly simple: https?://t\.co/[a-zA-Z0-9]+

Then, for each URL found in the tweet, I use Invoke-WebRequest to fetch the page. This cmdlet automatically follows any HTTP redirects (I was afraid I’d have to do this myself), so the object returned is the real destination page. Invoke-WebRequest also returns the parsed HTML of the page (assuming you use the right HTTP method), so I can extract the title easily instead of having to parse the content myself. It’ll also give me the “final” URL (the destination reached after all the redirects). Easy!

My full script:

#requires -version 3
[CmdletBinding()]
param ()
Set-StrictMode -Version Latest;
Add-Type -AssemblyName System.Web;
$AllTweets = Import-Csv -Path 'C:\Dropbox\MiscScripts\Sqlnewblogger tweets - Sheet1.csv' | Where-Object { $_.Text -notlike 'RT *' -and $_.Text -like '*http*' } | Select-Object -Property 'Tweet By', Text, Created | Sort-Object -Property Created -Unique;
$TweetLinks = @();
foreach ($Tweet in $AllTweets) {
    # Find every t.co link in the tweet
    $URLs = [regex]::Matches($Tweet.Text, 'https?://t\.co/[a-zA-Z0-9]+');
    foreach ($URL in $URLs) {
        $MyURL = $URL.Value;
        # Invoke-WebRequest automatically follows HTTP redirects. We could override this with -MaximumRedirection 0, but in this case we want it!
        $URLCheck = Invoke-WebRequest -Method Get -Uri $MyURL;
        Write-Debug $Tweet.'Tweet By';
        Write-Debug $URLCheck.ParsedHtml.title;
        Write-Debug $URLCheck.BaseResponse.ResponseUri;
        $TweetLinks += New-Object -TypeName PSObject -Property @{'Author' = $Tweet.'Tweet By'; 'Title' = $URLCheck.ParsedHtml.title; 'URL' = $URLCheck.BaseResponse.ResponseUri;};
    }
}
Write-Debug "$TweetLinks";
$TableOutput = '<table><thead><tr><td>Author</td><td>Post</td></tr></thead><tbody>';
foreach ($TweetLink in $TweetLinks) {
    $TableOutput += "<tr><td><a href=""https://twitter.com/$($TweetLink.Author.Replace('@',''))"">$($TweetLink.Author)</a></td><td><a href=""$($TweetLink.URL)"">$([System.Web.HttpUtility]::HtmlEncode($TweetLink.Title))</a></td></tr>";
}
$TableOutput += '</tbody></table>';
$TableOutput;

And now, my digest of the first week of the SQL New Blogger Challenge. This is not a complete listing because I didn’t think to set up the IFTTT recipe until after things started. I also gave people the benefit of the doubt on the timing (accounting for timezones, etc.) and included a few posted in the early hours of April 8th. For week 2, it will be more complete.

Author Post
@eleightondick Kevin Kline on Twitter: “Advice to New Bloggers http://t.co/o1jfLOR4QI”
@BarbiducH Safe exit from WHILE loop using ##global temp tables | One developer’s SQL blog
@eleightondick Mike Donnelly on Twitter: “T-SQL Tuesday #065 – Teach Something New http://t.co/LoyFbhVOpw #tsql2sday”
@GuruArthur Trace flags in SQL Server – Arthur Baan
@cjsommer Blogging and Intellectual Property Law | legalzoom.com
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Answering a Blog Challenge (Plus, Why I Like Data)
@arrowdrive Anders On SQL: A bit about me continued. Anders meets SQL
@SQLJudo Experience Is Overated | Russ Thomas – SQL Judo
@SQLBek T-SQL Tuesday #065 – Teach Something New | Mike Donnelly, SQLMD
@MtnDBA #SQLNewBlogger Week 1 “Eye of the Tiger” | DBA With Altitude
@Lance_LT The estimated query plan and the plan cache (Part 1) | Lance Tidwell the Silent DBA
@AalamRangi How to Use Temp Table in SSIS | SQL Erudition
@DwainCSQL An Easter SQL

Limitations

There are a couple of limitations and gotchas with this process:

  • The IFTTT recipe only runs every 15 minutes (all IFTTT triggers run on 15 minute intervals) and only fetches 15 tweets each time it runs (again, IFTTT’s configuration). So if there’s a flood of tweets, they won’t all be captured.
  • I don’t really check the final destination of a link. For example, one of the first tweets captured contained a link to another tweet, which then linked to a blog post. Could I detect this & resolve the true final destination? Probably. But it’d involve a lot more logic, and it’s getting late.
  • I also don’t pick up every duplicate link/post. Again, I could probably get past this with some extra logic (see the sketch at the end of this post), but I don’t think it’s necessary right now.
  • It doesn’t post automatically to my blog, or anywhere else. I have to manually paste the HTML into my blog post(s).
  • I had to manually remove one link as it didn’t actually point to a post written for the challenge; it was tweeted with the hashtag, so my IFTTT recipe caught it.
  • I started collecting these tweets mid-day April 7th. If you posted before that, I’ve likely missed your post. You will be picked up for Week Two!
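
As for the duplicate links, the “extra logic” could be as simple as deduplicating on the resolved URL before building the HTML table. A minimal sketch against the $TweetLinks array from the script above (the string cast is there because the URL property holds a System.Uri object):

# Keep one row per final destination URL
$TweetLinks = $TweetLinks | Sort-Object -Property { [string]$_.URL } -Unique;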