I Spoke at SQLSat (and I Liked It)

That is the first and last Katy Perry reference you will find on this blog or anywhere else in my life.

Last weekend I spoke at the 4th edition of my “home” SQL Saturday, SQL Saturday #383. This was the end of a path that started four years ago, and the beginning of an exciting new one.

About four years ago, I was introduced to PASS. It didn’t take long for people to start talking to me about public speaking. I went to my first-ever SQL Saturday, and kept thinking to myself “I could never do that.” Then I was given the opportunity to attend PASS Summit 2012 and was hooked on the PASS community – SQL Family. I stepped onto the floor at the convention center and felt comfortable immediately. I think my exact words when I called home that evening were “I’m home. I found my people.” Mid-Summit, in a 10-minute conversation with a chapter leader, I was told “you should speak at one of my user group meetings.”

But I have nothing to talk about. I’m terrified of public speaking. I’ve only ever done it in a classroom, in college or high school and I hated it. It terrified me. And I’m not an expert on anything. Well…maybe. Someday. A long time from now.

Time passed. I got involved with my local PASS chapter, got heavily involved with our annual SQL Saturday events, and got to know (or at least meet) more people in the SQL Server community. And I kept hearing the question “so when are you going to start speaking?”

But I have nothing to talk about. I’m not a speaker. I don’t have the polish that all these people on stage at Summit or in the front of the room at SQL Saturday have. I’m not even a DBA!

In 2014, one of my professional development goals at work was to give at least two presentations. I pretty much didn’t have a choice now; I had to get up in front of a crowd. So I wrote & delivered two sessions:

  • An introduction to PowerShell. Adoption of PowerShell had been slow in my office and I wanted to demonstrate how it could benefit the entire IT department. This wasn’t targeted at any particular job role; I was addressing the whole department.
  • A demo of SQL Sentry Performance Advisor & Event Monitor. We’ve been using this software for a few years now and I’ve spent quite a bit of time getting comfortable with these two portions of it, mostly in the course of figuring out why our systems were running poorly.

I was starting to get a bit more relaxed about talking in front of people. But this was a comfortable environment – I knew everyone in the room. That summer, I attended Mark Vaillancourt’s (b | t) session DANGER! The Art and Science of Presenting at SQL Saturday Albany, looking to fill in some gaps and figure out how to put myself at ease in less familiar territory.

Well, maybe I can put together a beginner-level session.

In February 2015, I attended SQL Saturday Cleveland. One of my goals for the day was to catch beginner-level sessions. I wanted to study the type and depth of the material, as well as how it was presented. Late in the day I had my breakthrough moment. The room was completely packed and the crowd was hanging on the presenter’s every word. I finally had a grasp of how to tailor a topic to a “beginner” audience.

I don’t have to put on a flashy show with 20 different advanced features and techniques. There’s room for the basics because there are always people who are new to this stuff and they want sessions too!

That same month, we needed a speaker for our chapter meeting and rather than find someone to do a remote presentation, I decided to dust off my PowerShell talk from work, retool it for a DBA crowd, and go for it. It went pretty well, and the next week I took the plunge. I wrote up an abstract and submitted it for SQL Saturday.

Pressing this button is one of the most nerve-wracking things I’ve done. Deep breath…go.

At the chapter meeting, I’d gone over 90 minutes with my slides and demos. At SQL Saturday, I’d only have 60. I had my work cut out for me. I spent April tweaking and tuning my slide deck, honing my demos. I felt like I had a pretty solid setup. The Sunday before SQL Saturday, I sent myself to the basement and started rehearsing my presentation. I went 48 minutes. Without demos or questions from an audience (proving that cats don’t care about PowerShell).

Hard stop at 60 minutes. What can I cut? Where did I waste time? Am I speaking too slowly?

Every night that week I was in the basement, running through my presentation and demos. I got myself to 55 minutes for the whole package.

That’ll have to do. If I get questions mid-session, I’ll just drop a demo or two to make up the time.

I arrived home from the speaker dinner Friday night and did one last run through my deck. I had just redone one of my big slides Thursday night. Friday was a terrible run, but it was getting late. I had 38 minutes on the slides themselves.

Saturday morning, I awoke at 6 and my brain was already in overdrive; on a scale of one to ten, I was at an eleven. I fired up my Azure VMs so they’d be ready well ahead of time and hit the road for RIT. I found my room (I was speaking in the first slot) and got myself set up. I wanted to check and re-check everything. I was not about to let a technical problem take me down.

That settled, I milled around a bit and as 8:15 arrived, I found myself escalating from 11 to 15. People started filtering into the room and I tried to chat with them a bit, something I’d read about in Grant Fritchey’s (b | t) most recent Speaker of the Month post. That helped calm me down a bit.

8:30. Showtime. Breathe.

I feel like I fumbled a little bit on my intro (before I even got off my title slide), but by the time I hit my 3rd slide, a calm fell over me. I got out of my head and cruised through the material. It seemed like it was going smoother than any of my rehearsals. I wasn’t relying on my written notes. I got a couple chuckles out of the audience before I reached my demos. As I returned to the keyboard, I glanced at the clock.

What? 9:00? I burned through my slides in 30 minutes and I’d planned for close to 40. Am I speaking that quickly? Did I stumble that much when I practiced?

Fortunately, I’d set up my demos in preparation for such an event. I had a set of “must do” demos, and then a bunch of alternates which I could bring in to fill some time. I got through my demos, answered the lone question I was asked, and wrapped up right on time.

As people filtered out of the room and I started packing up, an enormous weight was lifted off my shoulders. I was done. I survived. And scanning through the feedback, it looked like I did an OK job. Reading through it later, I saw a few notes that meshed with things I was thinking during the session, which I will definitely take into consideration next time.

Yes, the next time. I’m doing this again. I’m hooked.

Slides & demos from SQL Saturday Rochester

Slides & demos from my SQL Saturday Rochester presentation “Easing into Scripting with Windows PowerShell” have been posted on the SQL Saturday site.

Thank you to everyone who came out for my session and all of SQL Saturday!

Speaking at SQL Saturday Rochester this weekend!

I’ll be presenting my session Easing into Scripting with Windows PowerShell this Saturday, May 16th at SQL Saturday Rochester.

SQL Saturday is a free, all-day event for learning about SQL Server and related technologies, and networking with like-minded professionals in the region.

In addition to speaking, I’ll be tweeting out live updates all day long and posting pictures to both Instagram and Twitter. Watch for the hashtag #sqlsatroc (links to searches on both services).

The event is being held at RIT in Golisano Hall. Registration starts at 7:30 AM and sessions start at 8:30. We even have raffle prizes at the end of the day! If you’re in the area, come check it out!

SQL New Blogger Challenge Digest – Week 4

This week marks the end of Ed Leighton-Dick’s New Blogger Challenge. It’s terrific seeing everyone sticking with the challenge all month and I’m looking forward to catching up with all the posts. Great job, everyone! Keep going!

Author Post
@MtnDBA #SQLNewBlogger Week 4 – My 1st SQLSaturday session | DBA With Altitude
@Lance_LT “MongoDB is the WORST!” | Lance Tidwell the Silent DBA
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Big Data for Diagnosis
@Jorriss A Podcast Is Born
@toddkleinhans A Tale of SQL Server Disk Space Trials and Tribulations | toddkleinhans.com
@arrowdrive Anders On SQL: First “real” job with SQL.
@arrowdrive Anders On SQL: Stupid Stuff I have done. 2/?. Sometimes even a dev server is not a good dev environment
@way0utwest April Blogger Challenge 4–Filtered Index Limitations | Voice of the DBA
@ALevyInROC Are You Backing Everything Up? | The Rest is Just Code
@DesertIsleSQL Azure Data Lake: Why you might want one |
@EdDebug BIML is better even for simple packages | the.agilesql.club
@tpet1433 Corruption – The Denmark of SQL Instances – Tim Peters
@eleightondick Creating a Self-Contained Multi-Subnet Test Environment, Part II – Adding a Domain Controller | The Data Files
@MattBatalon Creating an Azure SQL Database | Matt Batalon
@pshore73 Database on the Move – Part I | Shore SQL
@pmpjr Do you wanna build a cluster?! | I have no idea what I’m doing
@DwainCSQL Excel in T-SQL Part 1 – HARMEAN, GEOMEAN and FREQUENCY | dwaincsql
@AalamRangi Gotcha – SSIS ImportExport Wizard Can Kill Your Diagrams | SQL Erudition
@toddkleinhans How Do Blind People Use SQL Server? | toddkleinhans.com
@DBAFromTheCold In-Memory OLTP: Part 4 – Native Compilation | The DBA Who Came In From The Cold
@AaronBertrand It’s a Harsh Reality – Listen Up – SQL Sentry Team Blog
@GuruArthur Looking back at April – Arthur Baan
@nocentino Moving SQL Server data between filegroups – Part 2 – The implementation – Centino Systems Blog
@MyHumbleSQLTips My Humble SQL Tips: Tracking Query Plan Changes
@m82labs Reduce SQL Agent Job Overlaps · m82labs
@fade2blackuk Rob Sewell on Twitter: “Instances and Ports with PowerShell http://t.co/kwN2KwVDOS”
@DwainCSQL Ruminations on Writing Great T-SQL | dwaincsql
@sqlsanctum Security of PWDCOMPARE and SQL Hashing | SQL Sanctum
@Pittfurg SQL Server Backup and Restores with PowerShell Part 1: Setting up – Port 1433
@cjsommer Using PowerShell to Export SQL Data to CSV. How well does it perform? | cjsommer.com
@gorandalf Using SSIS Lookup Transformation in ETL Packages | Gorandalf’s SQL Blog
@nicharsh Words on Words: 5 Books That Will Improve Your Writing

Are You Backing Everything Up?

We hear the common refrain among DBAs all the time. Back up your data! Test your restores! If you can’t restore the backup, it’s worthless. And yes, absolutely, you have to back up your databases – your job, and the company, depend upon it.

But are you backing everything up?

Saturday night was an ordinary night. It was getting late, and I was about to put my computer to sleep so I could do likewise. Suddenly, everything on my screen was replaced with a very nice message telling me that something had gone wrong and my computer needed to be restarted.

Uh oh.

In 7 1/2 years of using OS X, I’ve had something like this happen maybe 4 times.

After waiting what felt like an eternity, the system finished booting & I got back into my applications. I opened up PowerPoint, as I had it open before the crash so I could work on my SQL Saturday Rochester slide deck whenever inspiration struck. I opened my file, and was greeted by nothingness. I flipped over to Finder and saw zero bytes displayed as the file size.

Uh oh.

“But Andy,” you say, “you use CrashPlan, right? Can’t you just recover the file from there?” Well, you’re half right. I do use CrashPlan. I even have a local, external hard drive (two, actually) that I back up to in addition to CrashPlan’s cloud service. But I couldn’t recover from any of those.

CrashPlan configuration - oops

Because Dropbox is already “in the cloud”, I had opted not to back it up with CrashPlan when I first set it up. After all, it’s already a backup, right? It’s not my only copy, it’s offsite, it’s all good.

Not so fast. When my system came back up, Dropbox dutifully synced everything that had changed – including my now-empty file.

Dropbox - 0 bytes

So, now what? Fortunately, Dropbox allows you to revert to older versions, and I was able to select my last good version and restore it.

Screenshot 2015-04-26 21.04.48

Lessons Learned

I broke The Computer Backup Rule of Three and very nearly regretted it. For my presentation:

  • I had copies in two different formats – Dropbox & my local (internal) hard drive
  • I had one copy offsite (Dropbox)
  • I only had two copies, not three (local and Dropbox).

Even scarier, if Dropbox didn’t have a version history or it had taken me more than 30 days to realize that this file had been truncated, I’d have lost it completely.

Everything else on my computer was in compliance with the Rule Of Three; I just got lazy with the data in my Dropbox and Google Drive folders. I’ve since updated my CrashPlan settings to include my local Dropbox and Google Drive folders so that my presentation should now be fully protected:

  • Five copies
    • Local drive
    • Two external drives w/ CrashPlan
    • CrashPlan cloud service
    • Dropbox/Google Drive (different content in each)
  • Three formats
    • Spinning platters in my possession
    • Dropbox/Google Drive
    • CrashPlan
  • Two copies offsite
    • CrashPlan cloud
    • Dropbox/Google Drive

And don’t forget to test those backups before you need to use them. Dropbox, Google Drive and other online file storage/sync solutions are very useful, but you cannot rely upon them as backups. I don’t think you’ll ever regret having “extra” backups of your data, as long as that process is automatic.

SQL New Blogger Digest – Week 3

Here are the posts collected from week three of the SQL New Blogger Challenge. It’s been compiled the same way previous weeks’ posts were. Everyone’s doing a great job keeping up with the challenge!

Author Post
@MtnDBA #SQLNewBlogger Week 3 – PowerShell Aliases | DBA With Altitude
@ceedubvee A Insider's View of the Autism Spectrum: Autism and Information Technology: New Efforts for Kids to Code
@arrowdrive Anders On SQL: Stupid Stuff I have done. 2/?. Sometimes even a dev server is not a good dev environment
@way0utwest April Blogger Challenge 3 – Filtered Indexes | Voice of the DBA
@eleightondick Creating a Self-Contained Multi-Subnet Test Environment, Part I – Networking | The Data Files
@ceedubvee Empower Individuals With Autism Through Coding | Indiegogo
@MattBatalon EXCEPT and INTERSECT… | Matt Batalon
@cjsommer Follow the yellow brick what? My road to public speaking. | cjsommer.com
@DBAFromTheCold In-Memory OLTP: Part 3 – Checkpoints | The DBA Who Came In From The Cold
@MattBatalon Introduction to Windowing Functions | Matt Batalon
@nocentino Moving SQL Server data between filegroups – Part 1 – Database Structures – Centino Systems Blog
@Lance_LT My first year as a speaker | Lance Tidwell the Silent DBA
@MyHumbleSQLTips My Humble SQL Tips: Tracking Page Splits
@ALevyInROC Padding Fields for Fixed-Position Data Formats | The Rest is Just Code
@tpet1433 Sir-Auto-Completes-A-Lot a.k.a. how to break IntelliSense, SQL Prompt and SQL Complete – Tim Peters
@pmpjr stats, yeah stats. | I have no idea what I'm doing
@DwainCSQL Stupid T-SQL Tricks – Part 3: A Zodiacal SQL | dwaincsql
@cathrinew Table Partitioning in SQL Server – Partition Switching – Cathrine Wilhelmsen
@gorandalf The MERGE Statement – One Statement for INSERT, UPDATE and DELETE | Gorandalf's SQL Blog
@SQLJudo The Road to SQL Server 2014 MCSE | Russ Thomas – SQL Judo
@GGreggB T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@AalamRangi What is the RetainSameConnection Property of OLEDB Connection in SSIS? | SQL Erudition
@EdDebug What Permissions do I need to generate a deploy script with SSDT? | the.agilesql.club
@_KenWilson Windowing using OFFSET-FETCH | Ken Wilson
@DesertIsleSQL What Does Analytics Mean?
@DesertIsleSQL Azure ML, SSIS and the Modern Data Warehouse
@DesertIsleSQL Musing about Microsoft’s Acquisition of DataZen and Power BI
@GuruArthur Check for database files not in default location

Padding Fields for Fixed-Position Data Formats

Fixed-position data formats will seemingly be with us forever. Despite the relative ease of parsing CSV (or other delimited formats), or even XML, many data exchanges require a fixed-position input. Characters 1-10 are X, characters 11-15 are Y and if the source data is fewer than 5 characters, we have to left-pad with a filler character, etc. When you’re accustomed to working with data that says what it means and means what it says, having to add “extra fluff” like left-padding your integers with a half-dozen zeroes can be a hassle.

I received a draft of a stored procedure recently which had to do exactly this. The intent is for the procedure to output the data almost entirely formatted as required, one record per line in the output file, and dump the result set to a file on disk. As it was given to me, the procedure was peppered with CASE statements like this (only more complex) in the SELECT clause:

-- Method 1
select case len(cast(logid as varchar))
when 9 then '0' + cast(logid as varchar)
when 8 then '00' + cast(logid as varchar)
when 7 then '000' + cast(logid as varchar)
when 6 then '0000' + cast(logid as varchar)
when 5 then '00000' + cast(logid as varchar)
when 4 then '000000' + cast(logid as varchar)
when 3 then '0000000' + cast(logid as varchar)
when 2 then '00000000' + cast(logid as varchar)
when 1 then '000000000' + cast(logid as varchar)
when 0 then '0000000000' + cast(logid as varchar)
end as logid
,logtext from cachedb.dbo.logs;

It’s perfectly valid, it works, and there’s nothing inherently wrong with it. But I find it a bit tough to read, and it could become trouble if the format changes later, as additional (or fewer) cases will have to be accounted for. Fortunately, the day I received this procedure was right around the day I learned about the REPLICATE() T-SQL function. Maybe we can make this simpler:

select replicate('0',10-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

Not bad. But it leaves us with a magic number, and as in the previous example, if the file format changes we have to seek out these magic numbers and fix them. This is easily remedied by defining these field lengths at the beginning of the procedure, so that they’re all in one place if anything needs to change.

-- Method 2
declare @paddedlength int = 10;
select replicate('0',@paddedlength-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

Yet another approach would be to pad out the value beyond what we need, then trim the resulting string back to the required length. Again, we have to be careful to not leave ourselves with magic numbers; the solution is the same as when using REPLICATE():

-- Method 3
select right('0000000000' + cast(logid as varchar), 10) as logid,logtext from cachedb.dbo.logs;
-- Or, with more flexibility/fewer magic numbers
-- Method 4
declare @paddedlength int = 10;
select right(replicate('0',@paddedlength) + cast(logid as varchar), @paddedlength) as logid,logtext from cachedb.dbo.logs;

All four methods yield the same results, as far as the data itself is concerned. But what about performance? For a table with 523,732 records, execution times were:

  1. 2,000ms CPU time, 261,785ms elapsed
  2. 2,265ms CPU time, 294,399ms elapsed
  3. 2,000ms CPU time, 297,593ms elapsed
  4. 2,078ms CPU time, 302,045ms elapsed
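For reference, these timings can be captured with SET STATISTICS TIME, which reports CPU and elapsed time in the Messages tab. A minimal sketch, assuming the same cachedb.dbo.logs table as above:

```sql
-- Timing sketch for method 2; assumes the cachedb.dbo.logs table used above.
set statistics time on;

declare @paddedlength int = 10;
select replicate('0',@paddedlength-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

set statistics time off;
-- Messages tab: "SQL Server Execution Times: CPU time = ... ms, elapsed time = ... ms."
```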

Each method had an identical execution plan, so I’m probably going to opt for the code that’s more readable and maintainable – method 2 or 4.

As with any tuning, be sure to test with your own data & queries.

SQL New Blogger Digest – Week 2

I didn’t intend for last week’s digest to also be my post for week two of the challenge, but life got in the way and I wasn’t able to complete the post that I really wanted in time. So, that post will be written much earlier in week three and completed well ahead of the deadline.

Here are the posts collected from week two of the SQL New Blogger Challenge. It’s been compiled the same way last week’s was.

Author Post
@AaronBertrand #SQLNewBlogger Roundup – SQL Sentry Team Blog
@MtnDBA #SQLNewBlogger Week 2 – Teach Something New | DBA With Altitude
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Back on the Job Hunt
@DwainCSQL An Easter SQL | dwaincsql
@DwainCSQL An Even Faster Method of Calculating the Median on a Partitioned Heap | dwaincsql
@arrowdrive Anders On SQL: Stupid stuff I have done. 1/? Or, How I learned to stop GUIing and love the script
@MattBatalon Another TRUNCATE vs. DELETE tidbit… | Matt Batalon
@way0utwest April Blogging Challenge 2 – Primary Key in CREATE TABLE | Voice of the DBA
@GuruArthur SQL Server error 17310 – Arthur Baan
@Pittfurg Blog Series: SQL Server Backup and Restores with PowerShell – Port 1433
@fade2blackuk Checking SQL Server User Role Membership with PowerShell « SQL DBA with A Beard
@gorandalf Database Compatibility Level 101 | Gorandalf’s SQL Blog
@nocentino Designing for offloaded log backups in AlwaysOn Availability Groups – Monitoring – Centino Systems Blog
@SqlrUs Detaching a Database – File Security Gotcha | John Morehouse | sqlrus.com
@MartynJones76 Devon DBA: Check Database Integrity Task Failed … Oh Dear Master Luke!
@toddkleinhans How Do You Visualize Abstractions? | toddkleinhans.com
@AalamRangi How to Have Standard Logging in SSIS and Avoid Traps | SQL Erudition
@gorandalf How to Test Existing T-SQL Code Before Changing the Compatibility Level | Gorandalf’s SQL Blog
@EdDebug HOWTO-Get-T-SQL-Into-SSDT | the.agilesql.club
@DBAFromTheCold In-Memory OLTP: Part 2 – Indexes | The DBA Who Came In From The Cold
@nicharsh It’s a Harsh Reality – SQL Sentry Team Blog
@SQLBek Learn Something New – SSMS Tips & Tricks « Every Byte Counts
@cjsommer Modify SQL Agent Jobs using PowerShell and SMO | cjsommer.com
@MyHumbleSQLTips My Humble SQL Tips: Full List of SQL Server 2014 DMVs
@MyHumbleSQLTips My Humble SQL Tips: Running DBCC CHECKDB on TEMPDB
@way0utwest New Blogger Challenge 1 – Adding a Primary Key | Voice of the DBA
@uMa_Shankar075 Querying Microsoft SQL Server: In Memory Optimized Table in SQL Server 2014
@Jorriss Random Thoughts of Jorriss
@pmpjr Sidenote, the 4200 databases are a different story for another week… | I have no idea what I’m doing
@ALevyInROC SQL New Blogger Challenge Weekly Digest | The Rest is Just Code
@jh_randall SQL Server Monitoring – Getting it Right – SQL Sentry
@cathrinew Table Partitioning in SQL Server – The Basics – Cathrine Wilhelmsen
@eleightondick Teach Something New: PowerShell Providers [T-SQL Tuesday #065] | The Data Files
@rabryst The Art of Improvisation – Born SQL
@DBAFromTheCold The DBA Who Came In From The Cold | Advice on working as a SQL Server DBA
@Lance_LT The estimated query plan and the plan cache (Part 2) | Lance Tidwell the Silent DBA
@SQLJudo TSQL Tue 65: Memory Optimized Hash Indexes | Russ Thomas – SQL Judo
@sqlsanctum T-SQL Tuesday #065 – Teach Something New – APPLY | SQL Sanctum
@_KenWilson T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@m82labs Untangling Dynamic SQL · m82labs
@cathrinew Using a Numbers Table in SQL Server to insert test data – Cathrine Wilhelmsen
@tpet1433 Why yes I can do table level restores – Tim Peters
@Jorriss Why You Absolutely Need Alternate Keys: A Unique Constraint Story

SQL New Blogger Challenge Weekly Digest

Watching all of the tweets as people posted their first entries in the SQL New Blogger Challenge earlier this week, I quickly realized that keeping up was going to be a challenge of its own. Fortunately, there are ways to rein it in.

My first stop was IFTTT (If This Then That). IFTTT allows you to create simple “recipes” to watch for specific events/conditions, then perform an action. They have over 175 “channels” to choose from, each of which has one or more triggers (events) and actions. I have IFTTT linked to both my Google and Twitter accounts, which allowed me to create a recipe which watches Twitter for the #sqlnewblogger hashtag, and writes any tweets that match it to a spreadsheet on my Google Drive account (I’ll make the spreadsheet public for now, why not?).

The next step is to export the spreadsheet to CSV. I don’t have this automated, and may not be able to (I may have to find another workaround). Once it’s a CSV, I can go to PowerShell to parse my data. I want the end result to be an HTML table showing each post’s author (with a link to their Twitter stream) and a link to the post (using the title of the post itself).

Once I import the CSV file into an object in my PowerShell script, I need to do some filtering. I don’t want to be collecting all the retweets (posts starting with RT), and I should probably exclude any post that doesn’t contain a URL (looking for the string HTTP).

To extract the post URLs, I ran a regular expression against each tweet. Twitter uses their own URL shortener (of course), which makes this pretty easy – I know the hostname is t.co, and after the slash is an alphanumeric string. The regex to match this is fairly simple: https?://t\.co/[a-zA-Z0-9]+
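As a quick sanity check, a pattern along these lines can be exercised right at the PowerShell prompt (the tweet text below is made up for illustration):

```powershell
# Hypothetical tweet text, just to exercise the pattern
$text = 'Check out my new post! http://t.co/AbC123xyz #sqlnewblogger';

# -match returns $true and populates the automatic $Matches variable on success
if ($text -match '(https?://t\.co/[a-zA-Z0-9]+)') {
    $Matches[0];    # http://t.co/AbC123xyz
}
```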

Then, for each URL found in the tweet, I use Invoke-WebRequest to fetch the page. This cmdlet automatically follows any HTTP redirects (I was afraid I’d have to do this myself), so the object returned is the real destination page. Invoke-WebRequest also returns the parsed HTML of the page (assuming you use the right HTTP method), so I can extract the title easily instead of having to parse the content myself. It’ll also give me the “final” URL (the destination reached after all the redirects). Easy!

My full script:

#requires -version 3
param ()
set-strictmode -Version latest;
Add-Type -AssemblyName System.Web;
$AllTweets = import-csv -path 'C:\Dropbox\MiscScripts\Sqlnewblogger tweets - Sheet1.csv' | where-object {$_.text -notlike "RT *" -and $_.text -like "*http*"} | select-object -property "Tweet By",Text,Created | Sort-Object -property created -Unique;
$TweetLinks = @();
foreach ($Tweet in $AllTweets) {
    $Tweet.text -match '(https?://t\.co/[a-zA-Z0-9]+)' | out-null;
    foreach ($URL in $Matches) {
        $MyURL = $URL.Item(0);
        # Invoke-WebRequest automatically follows HTTP redirects. We can override this with -MaximumRedirection 0 but in this case, we want it!
        $URLCheck = Invoke-WebRequest -Method Get -Uri $MyUrl;
        $OrigUrl = $URLCheck.BaseResponse.ResponseUri;
        write-debug $Tweet.'Tweet By';
        write-debug $URLCheck.ParsedHtml.title;
        write-debug $URLCheck.BaseResponse.ResponseUri;
        $TweetLinks += new-object -TypeName PSObject -Property @{"Author"=$Tweet.'Tweet By';"Title"=$URLCheck.ParsedHtml.title;"URL"=$URLCheck.BaseResponse.ResponseUri;};
    }
}
Write-debug $TweetLinks;
$TableOutput = "<table><thead><tr><td>Author</td><td>Post</td></tr></thead><tbody>";
foreach ($TweetLink in $TweetLinks) {
    $TableOutput += "<tr><td><a href=""https://twitter.com/$($TweetLink.Author.replace('@',''))"">$($TweetLink.Author)</a></td><td><a href=""$($TweetLink.URL)"">$([System.Web.HttpUtility]::HtmlEncode($TweetLink.Title))</a></td></tr>";
}
$TableOutput += "</tbody></table>";

And now, my digest of the first week of the SQL New Blogger Challenge. This is not a complete listing because I didn’t think to set up the IFTTT recipe until after things started. I also gave people the benefit of the doubt on the timing (accounting for timezones, etc.) and included a few posted in the early hours of April 8th. For week 2, it will be more complete.

Author Post
@eleightondick Kevin Kline on Twitter: “Advice to New Bloggers http://t.co/o1jfLOR4QI”

Safe exit from WHILE loop using ##global temp tables | One developer’s SQL blog
@eleightondick Mike Donnelly on Twitter: “T-SQL Tuesday #065 – Teach Something New http://t.co/LoyFbhVOpw #tsql2sday”
@GuruArthur Trace flags in SQL Server – Arthur Baan
@cjsommer Blogging and Intellectual Property Law | legalzoom.com
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Answering a Blog Challenge (Plus, Why I Like Data)
@arrowdrive Anders On SQL: A bit about me continued. Anders meets SQL
@SQLJudo Experience Is Overated | Russ Thomas – SQL Judo
@SQLBek T-SQL Tuesday #065 – Teach Something New | Mike Donnelly, SQLMD
@MtnDBA #SQLNewBlogger Week 1 “Eye of the Tiger” | DBA With Altitude
@Lance_LT The estimated query plan and the plan cache (Part 1) | Lance Tidwell the Silent DBA
@AalamRangi How to Use Temp Table in SSIS | SQL Erudition
@DwainCSQL An Easter SQL


There are a couple limitations and gotchas with this process:

  • The IFTTT recipe only runs every 15 minutes (all IFTTT triggers run on 15 minute intervals) and only fetches 15 tweets each time it runs (again, IFTTT’s configuration). So if there’s a flood of tweets, they won’t all be captured.
  • I don’t really check the final destination of a link. For example, one of the first tweets captured contained a link to another tweet, which then linked to a blog post. Could I detect this & resolve the true final destination? Probably. But it’d involve a lot more logic, and it’s getting late.
  • I also don’t pick up every duplicate link/post. Again, I can probably get by this with some extra logic, but I don’t think it’s necessary right now.
  • It doesn’t post automatically to my blog, or anywhere else. I have to manually paste the HTML into my blog post(s).
  • I had to manually remove one link as it didn’t actually point to a post written for the challenge; it was tweeted with the hashtag, so my IFTTT recipe caught it.
  • I started collecting these tweets mid-day April 7th. If you posted before that, I’ve likely missed your post. You will be picked up for Week Two!

Connecting SQLite to SQL Server with PowerShell

This post is part of Ed Leighton-Dick’s SQL New Blogger Challenge. Please follow and support these new (or reborn) bloggers.

I’m working with a number of SQLite databases as extra data sources in addition to the SQL Server database I’m primarily using for a project. Brian Davis (b|t) wrote a blog post a few years ago that covers setting up the connection quite well. In my case, I’ve got nine SQLite databases to connect to, and that gets tedious. PowerShell to the rescue!

I started by installing the SQLite ODBC Drivers and creating one ODBC connection for reference. Brian’s post linked above covers that well. But I don’t want to do it eight more times, so I’ll use the first DSN as a template so I can script the creation of the rest.

I named my DSN GSAKMyFinds. To inspect the DSN, I can use the Get-OdbcDsn cmdlet.

Get-OdbcDsn -Name GSAKMyFinds;
Name : GSAKMyFinds
DsnType : System
Platform : 64-bit
DriverName : SQLite3 ODBC Driver
Attribute : {Database, Description, NoTXN, LongNames...}

This looks pretty simple, but there’s a collection of Attributes I need to look at too. I’ll do this by expanding that property with Select-Object.

Get-OdbcDsn -Name GSAKMyFinds | Select-Object -ExpandProperty Attribute | Format-Table -AutoSize;
Name        Value                                               
----        -----                                               
Database    C:\Users\andy\Dropbox\GSAK8\data\My Finds\sqlite.db3
NoTXN       0                                                   
LongNames   0                                                   
FKSupport   0                                                   
JDConv      0                                                   
StepAPI     0                                                   
BigInt      0                                                   
NoWCHAR     0                                                   
OEMCP       0                                                   
NoCreat     0                                                   
ShortNames  0                                                   

Now I have everything I need to create a new DSN with Add-OdbcDsn. All of my SQLite databases are stored in a directory structure under C:\Users\andy\Dropbox\GSAK8\data\, with each one in a different subdirectory. For now, I’ll just create one to make sure that I’m doing it right, then use Get-OdbcDsn to see if it matches with my GUI-created DSN.

Add-OdbcDsn -Name GSAKPuzzles -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=C:\Users\andy\Dropbox\GSAK8\data\Far-off puzzles\sqlite.db3";
Get-OdbcDsn -Name GSAKPuzzles;
Get-OdbcDsn -Name GSAKPuzzles | Select-Object -ExpandProperty Attribute | Format-Table -AutoSize;


Name       : GSAKPuzzles
DsnType    : System
Platform   : 64-bit
DriverName : SQLite3 ODBC Driver
Attribute  : {Database}

Name     Value                                                      
----     -----                                                      
Database C:\Users\andy\Dropbox\GSAK8\data\Far-off puzzles\sqlite.db3

Looks pretty good! Note that not all of the Attributes seen above are here; those are default values that get set when creating the DSN through the GUI. After deleting my two test DSNs, I can move on to looping through all of my SQLite databases and creating DSNs for each of them. SQLite databases are just files on the filesystem, so by iterating over all of the db3 files under the parent directory I can build the list of files to point my DSNs at.

Get-ChildItem -Path C:\users\andy\Dropbox\gsak8\data -Recurse -Filter sqlite.db3 | Select-Object -ExpandProperty FullName | ForEach-Object {
    $DSNName = $_.Split("\")[6];
    Add-OdbcDsn -Name $DSNName -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=$_";
}
Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";

In a few seconds, the DSNs are created.

Name               DsnType Platform DriverName          Attribute                                   
----               ------- -------- ----------          ---------                                   
GSAKMain           System  64-bit   SQLite3 ODBC Driver {Database, Description, NoTXN, LongNames...}
Far-off puzzles    System  64-bit   SQLite3 ODBC Driver {Database}                                  
Home200            System  64-bit   SQLite3 ODBC Driver {Database}                                  
My Finds           System  64-bit   SQLite3 ODBC Driver {Database}                                  
My Hides           System  64-bit   SQLite3 ODBC Driver {Database}                                  
New England        System  64-bit   SQLite3 ODBC Driver {Database}                                  
Niagara Falls      System  64-bit   SQLite3 ODBC Driver {Database}                                  
NJ                 System  64-bit   SQLite3 ODBC Driver {Database}                                  
Seattle            System  64-bit   SQLite3 ODBC Driver {Database}
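One caveat about the loop above: `$_.Split("\")[6]` only works because of where the per-database subdirectory happens to sit in my path, so the index is tied to that exact directory depth. A quick sketch of the indexing (the path is just an example from this post):

```powershell
# Splitting the full path on "\" yields indexed segments; for my layout,
# segment 6 is the per-database subdirectory under ...\GSAK8\data\.
$path = 'C:\Users\andy\Dropbox\GSAK8\data\My Finds\sqlite.db3';
$segments = $path.Split('\');
# 0=C:  1=Users  2=andy  3=Dropbox  4=GSAK8  5=data  6=My Finds  7=sqlite.db3
$segments[6];   # "My Finds"
```

If your databases live at a different depth, adjust the index accordingly, or take the leaf of the file’s parent directory (for example with Split-Path) instead of counting segments.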

Next up is creating the linked servers in SQL Server. I created one with Management Studio using all the defaults, then scripted it to see what I need to do. The only parts I really need are sp_addlinkedserver and sp_addlinkedsrvlogin; the defaults for the other options are good enough for what I’m doing here (this may not be true for you, so be sure to check!).

EXEC master.dbo.sp_addlinkedserver @server = N'GSAKMAIN', @srvproduct=N'GSAKMain', @provider=N'MSDASQL', @datasrc=N'GSAKMain'
 /* For security reasons the linked server remote logins password is changed with ######## */
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'GSAKMAIN',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL

Now I can put this into a PowerShell loop and run it for all of my other DSNs.

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
foreach ($DSN in $AllDSNs) {
    $CreateLinkedServerSP = @"
EXEC master.dbo.sp_addlinkedserver @server = N'$($DSN.Name)', @srvproduct=N'$($DSN.Name)', @provider=N'MSDASQL', @datasrc=N'$($DSN.Name)';
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'$($DSN.Name)',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL;
"@;
    Invoke-Sqlcmd -Query $CreateLinkedServerSP -ServerInstance sql2014 -Database master;
}

I let this run, and when it’s finished all of my DSNs are linked servers, ready to be queried.


Because I’m going to be querying all of these linked servers together, I wrote some additional code to give me a skeleton query performing a UNION across all of my linked servers which I can use as a starting point.
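As an aside on that skeleton-building code: trimming the trailing separator off the accumulated string with Substring works, but PowerShell’s -join operator sidesteps the length arithmetic entirely. A sketch using stand-in DSN names in place of the Get-OdbcDsn results:

```powershell
# Build the UNION ALL skeleton with -join instead of trimming a trailing
# separator. $names stands in for the DSN names Get-OdbcDsn would return.
$names = 'GSAKMain', 'My Finds', 'NJ';
$AllDatabasesUnion = ($names | ForEach-Object {
    "SELECT * FROM OPENQUERY([$_], 'select * from caches')"
}) -join " UNION ALL`n";
Write-Output $AllDatabasesUnion;
```

Joining puts the separator only between elements, so there’s nothing to trim afterward.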

Here’s the final script:

#requires -version 3.0
#requires -modules sqlps
Set-StrictMode -Version Latest;
Set-Location C:;

Get-ChildItem -Path C:\users\andy\Dropbox\gsak8\data -Recurse -Filter sqlite.db3 | Select-Object -ExpandProperty FullName | ForEach-Object {
    $DSNName = $_.Split("\")[6];
    Add-OdbcDsn -Name $DSNName -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=$_";
}

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
foreach ($DSN in $AllDSNs) {
    $CreateLinkedServerSP = @"
EXEC master.dbo.sp_addlinkedserver @server = N'$($DSN.Name)', @srvproduct=N'$($DSN.Name)', @provider=N'MSDASQL', @datasrc=N'$($DSN.Name)';
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'$($DSN.Name)',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL;
"@;
    Invoke-Sqlcmd -Query $CreateLinkedServerSP -ServerInstance sql2014 -Database master;
}

$AllDatabasesUnion = "";
foreach ($DSN in $AllDSNs) {
    $AllDatabasesUnion += "SELECT * FROM OPENQUERY([$($DSN.Name)], 'select * from caches') UNION ALL`n";
}
# Trim the trailing " UNION ALL" and newline from the final line
$AllDatabasesUnion = $AllDatabasesUnion.Substring(0, $AllDatabasesUnion.Length - 11);
Write-Output $AllDatabasesUnion;

And the query that it generated for me:

SELECT * FROM OPENQUERY([GSAKMain], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Far-off puzzles], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Home200], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([My Finds], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([My Hides], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([New England], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Niagara Falls], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([NJ], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Seattle], 'select * from caches')

With a little exploration of the PowerShell OdbcDsn cmdlets, I’ve eliminated a tedious process and prevented any accidental mouse clicks in a GUI.