SQL New Blogger Digest – Week 3

Here are the posts collected from week three of the SQL New Blogger Challenge. It’s been compiled the same way previous weeks’ posts were. Everyone’s doing a great job keeping up with the challenge!

Author Post
@MtnDBA #SQLNewBlogger Week 3 – PowerShell Aliases | DBA With Altitude
@ceedubvee A Insider's View of the Autism Spectrum: Autism and Information Technology: New Efforts for Kids to Code
@arrowdrive Anders On SQL: Stupid Stuff I have done. 2/?. Sometimes even a dev server is not a good dev environment
@way0utwest April Blogger Challenge 3 – Filtered Indexes | Voice of the DBA
@eleightondick Creating a Self-Contained Multi-Subnet Test Environment, Part I – Networking | The Data Files
@ceedubvee Empower Individuals With Autism Through Coding | Indiegogo
@MattBatalon EXCEPT and INTERSECT… | Matt Batalon
@cjsommer Follow the yellow brick what? My road to public speaking. | cjsommer.com
@DBAFromTheCold In-Memory OLTP: Part 3 – Checkpoints | The DBA Who Came In From The Cold
@MattBatalon Introduction to Windowing Functions | Matt Batalon
@nocentino Moving SQL Server data between filegroups – Part 1 – Database Structures – Centino Systems Blog
@Lance_LT My first year as a speaker | Lance Tidwell the Silent DBA
@MyHumbleSQLTips My Humble SQL Tips: Tracking Page Splits
@ALevyInROC Padding Fields for Fixed-Position Data Formats | The Rest is Just Code
@tpet1433 Sir-Auto-Completes-A-Lot a.k.a. how to break IntelliSense, SQL Prompt and SQL Complete – Tim Peters
@pmpjr stats, yeah stats. | I have no idea what I'm doing
@DwainCSQL Stupid T-SQL Tricks – Part 3: A Zodiacal SQL | dwaincsql
@cathrinew Table Partitioning in SQL Server – Partition Switching – Cathrine Wilhelmsen
@gorandalf The MERGE Statement – One Statement for INSERT, UPDATE and DELETE | Gorandalf's SQL Blog
@SQLJudo The Road to SQL Server 2014 MCSE | Russ Thomas – SQL Judo
@GGreggB T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@AalamRangi What is the RetainSameConnection Property of OLEDB Connection in SSIS? | SQL Erudition
@EdDebug What Permissions do I need to generate a deploy script with SSDT? | the.agilesql.club
@_KenWilson Windowing using OFFSET-FETCH | Ken Wilson
@DesertIsleSQL What Does Analytics Mean?
@DesertIsleSQL Azure ML, SSIS and the Modern Data Warehouse
@DesertIsleSQL Musing about Microsoft’s Acquisition of DataZen and Power BI
@GuruArthur Check for database files not in default location

Padding Fields for Fixed-Position Data Formats

Fixed-position data formats will seemingly be with us forever. Despite the relative ease of parsing CSV (or other delimited formats), or even XML, many data exchanges require a fixed-position input. Characters 1-10 are X, characters 11-15 are Y, and if the source data is fewer than 5 characters, we have to left-pad with a filler character, etc. When you’re accustomed to working with data that says what it means and means what it says, having to add “extra fluff” like left-padding your integers with a half-dozen zeroes can be a hassle.
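
For example, a layout might specify a 10-character, zero-padded ID followed by free text. Records in the output file (values invented here) would look like:

```
0000042513Logged in from workstation 42
0000000097Nightly batch completed
```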

I received a draft of a stored procedure recently which had to do exactly this. The intent is for the procedure to output the data almost entirely formatted as required, one record per line in the output file, and dump the result set to a file on disk. As it was given to me, the procedure was peppered with CASE statements like this (only more complex) in the SELECT clause:

-- Method 1
select case len(cast(logid as varchar))
when 9 then '0' + cast(logid as varchar)
when 8 then '00' + cast(logid as varchar)
when 7 then '000' + cast(logid as varchar)
when 6 then '0000' + cast(logid as varchar)
when 5 then '00000' + cast(logid as varchar)
when 4 then '000000' + cast(logid as varchar)
when 3 then '0000000' + cast(logid as varchar)
when 2 then '00000000' + cast(logid as varchar)
when 1 then '000000000' + cast(logid as varchar)
when 0 then '0000000000' + cast(logid as varchar)
end as logid
,logtext from cachedb.dbo.logs;

It’s perfectly valid, it works, and there’s nothing inherently wrong with it. But I find it a bit tough to read, and it could become trouble if the format changes later, as additional (or fewer) cases will have to be accounted for. Fortunately, the day I received this procedure was right around the day I learned about the REPLICATE() T-SQL function. Maybe we can make this simpler:

select replicate('0',10-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

Not bad. But it leaves us with a magic number, and, as in the previous example, if the file format changes we have to seek out these magic numbers and fix them. This is easily remedied by defining these field lengths at the beginning of the procedure, so that they’re all in one place if anything needs to change.

-- Method 2
declare @paddedlength int = 10;
select replicate('0',@paddedlength-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;

Yet another approach would be to pad out the value beyond what we need, then trim the resulting string back to the required length. Again, we have to be careful to not leave ourselves with magic numbers; the solution is the same as when using REPLICATE():

-- Method 3
select right('0000000000' + cast(logid as varchar), 10) as logid,logtext from cachedb.dbo.logs;
-- Or, with more flexibility/fewer magic numbers
-- Method 4
declare @paddedlength int = 10;
select right(replicate('0',@paddedlength) + cast(logid as varchar), @paddedlength) as logid,logtext from cachedb.dbo.logs;

All four methods yield the same results, as far as the data itself is concerned. But what about performance? For a table with 523,732 records, execution times were:

  1. 2,000ms CPU time, 261,785ms elapsed
  2. 2,265ms CPU time, 294,399ms elapsed
  3. 2,000ms CPU time, 297,593ms elapsed
  4. 2,078ms CPU time, 302,045ms elapsed

Each method had an identical execution plan, so I’m probably going to opt for the code that’s more readable and maintainable – method 2 or 4.
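
For anyone who wants to reproduce this kind of comparison, the CPU and elapsed figures can be captured with SET STATISTICS TIME. A minimal sketch, using method 4 and assuming the same cachedb.dbo.logs table from the examples above:

```sql
-- Sketch only: assumes the cachedb.dbo.logs table used throughout this post.
SET STATISTICS TIME ON;

DECLARE @paddedlength int = 10;
SELECT RIGHT(REPLICATE('0', @paddedlength) + CAST(logid AS varchar(10)), @paddedlength) AS logid,
       logtext
FROM cachedb.dbo.logs;

SET STATISTICS TIME OFF;
-- CPU time and elapsed time are reported on the Messages tab in Management Studio.
```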

As with any tuning, be sure to test with your own data & queries.

SQL New Blogger Digest – Week 2

I didn’t intend for last week’s digest to also be my post for week two of the challenge, but life got in the way and I wasn’t able to complete the post that I really wanted in time. So, that post will be written much earlier in week three and completed well ahead of the deadline.

Here are the posts collected from week two of the SQL New Blogger Challenge. It’s been compiled the same way last week’s was.

Author Post
@AaronBertrand #SQLNewBlogger Roundup – SQL Sentry Team Blog
@MtnDBA #SQLNewBlogger Week 2 – Teach Something New | DBA With Altitude
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Back on the Job Hunt
@DwainCSQL An Easter SQL | dwaincsql
@DwainCSQL An Even Faster Method of Calculating the Median on a Partitioned Heap | dwaincsql
@arrowdrive Anders On SQL: Stupid stuff I have done. 1/? Or, How I learned to stop GUIing and love the script
@MattBatalon Another TRUNCATE vs. DELETE tidbit… | Matt Batalon
@way0utwest April Blogging Challenge 2 – Primary Key in CREATE TABLE | Voice of the DBA
@GuruArthur SQL Server error 17310 – Arthur Baan
@Pittfurg Blog Series: SQL Server Backup and Restores with PowerShell – Port 1433
@fade2blackuk Checking SQL Server User Role Membership with PowerShell « SQL DBA with A Beard
@gorandalf Database Compatibility Level 101 | Gorandalf’s SQL Blog
@nocentino Designing for offloaded log backups in AlwaysOn Availability Groups – Monitoring – Centino Systems Blog
@SqlrUs Detaching a Database – File Security Gotcha | John Morehouse | sqlrus.com
@MartynJones76 Devon DBA: Check Database Integrity Task Failed … Oh Dear Master Luke!
@toddkleinhans How Do You Visualize Abstractions? | toddkleinhans.com
@AalamRangi How to Have Standard Logging in SSIS and Avoid Traps | SQL Erudition
@gorandalf How to Test Existing T-SQL Code Before Changing the Compatibility Level | Gorandalf’s SQL Blog
@EdDebug HOWTO-Get-T-SQL-Into-SSDT | the.agilesql.club
@DBAFromTheCold In-Memory OLTP: Part 2 – Indexes | The DBA Who Came In From The Cold
@nicharsh It’s a Harsh Reality – SQL Sentry Team Blog
@SQLBek Learn Something New – SSMS Tips & Tricks « Every Byte Counts
@cjsommer Modify SQL Agent Jobs using PowerShell and SMO | cjsommer.com
@MyHumbleSQLTips My Humble SQL Tips: Full List of SQL Server 2014 DMVs
@MyHumbleSQLTips My Humble SQL Tips: Running DBCC CHECKDB on TEMPDB
@way0utwest New Blogger Challenge 1 – Adding a Primary Key | Voice of the DBA
@uMa_Shankar075 Querying Microsoft SQL Server: In Memory Optimized Table in SQL Server 2014
@Jorriss Random Thoughts of Jorriss
@pmpjr Sidenote, the 4200 databases are a different story for another week… | I have no idea what I’m doing
@ALevyInROC SQL New Blogger Challenge Weekly Digest | The Rest is Just Code
@jh_randall SQL Server Monitoring – Getting it Right – SQL Sentry
@cathrinew Table Partitioning in SQL Server – The Basics – Cathrine Wilhelmsen
@eleightondick Teach Something New: PowerShell Providers [T-SQL Tuesday #065] | The Data Files
@rabryst The Art of Improvisation – Born SQL
@DBAFromTheCold The DBA Who Came In From The Cold | Advice on working as a SQL Server DBA
@Lance_LT The estimated query plan and the plan cache (Part 2) | Lance Tidwell the Silent DBA
@SQLJudo TSQL Tue 65: Memory Optimized Hash Indexes | Russ Thomas – SQL Judo
@sqlsanctum T-SQL Tuesday #065 – Teach Something New – APPLY | SQL Sanctum
@_KenWilson T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@m82labs Untangling Dynamic SQL · m82labs
@cathrinew Using a Numbers Table in SQL Server to insert test data – Cathrine Wilhelmsen
@tpet1433 Why yes I can do table level restores – Tim Peters
@Jorriss Why You Absolutely Need Alternate Keys: A Unique Constraint Story

SQL New Blogger Challenge Weekly Digest

Watching all of the tweets as people posted their first entries in the SQL New Blogger Challenge earlier this week, I quickly realized that keeping up was going to be a challenge of its own. Fortunately, there are ways to rein it in.

My first stop was IFTTT (If This Then That). IFTTT allows you to create simple “recipes” to watch for specific events/conditions, then perform an action. They have over 175 “channels” to choose from, each of which has one or more triggers (events) and actions. I have IFTTT linked to both my Google and Twitter accounts, which allowed me to create a recipe which watches Twitter for the #sqlnewblogger hashtag, and writes any tweets that match it to a spreadsheet on my Google Drive account (I’ll make the spreadsheet public for now, why not?).

The next step is to export the spreadsheet to CSV. I don’t have this automated, and may not be able to (I may have to find another workaround). Once it’s a CSV, I can go to PowerShell to parse my data. I want the end result to be an HTML table showing each post’s author (with a link to their Twitter stream) and a link to the post (using the title of the post itself).

Once I import the CSV file into an object in my PowerShell script, I need to do some filtering. I don’t want to be collecting all the retweets (posts starting with RT), and I should probably exclude any post that doesn’t contain a URL (looking for the string HTTP).
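
That filtering is a one-liner with Where-Object; here’s a quick sketch against some invented sample rows (the full script works the same way against the imported CSV):

```powershell
# Invented stand-ins for rows imported from the CSV.
$Tweets = @(
    [pscustomobject]@{ Text = 'RT @someone: great post http://t.co/abc1234' },
    [pscustomobject]@{ Text = 'My week 3 post is up! http://t.co/def5678' },
    [pscustomobject]@{ Text = 'No link in this tweet' }
);
# Drop retweets and anything without a URL; only the second row survives.
$Tweets | Where-Object { $_.Text -notlike 'RT *' -and $_.Text -like '*http*' };
```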

To extract the post URLs, I ran a regular expression against each tweet. Twitter uses their own URL shortener (of course), which makes this pretty easy – I know the hostname is t.co, and after the slash is an alphanumeric string. The regex to match this is fairly simple: [http|https]+://t.co/[a-zA-Z0-9]+
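
In PowerShell, -match applies the pattern and fills the automatic $Matches collection, which is where the full script pulls the URL from. A quick demonstration with an invented tweet:

```powershell
# Invented tweet text for illustration.
$Text = 'Week 3 post is live! http://t.co/AbC123 #sqlnewblogger';
if ($Text -match '([http|https]+://t.co/[a-zA-Z0-9]+)') {
    $Matches.Item(0);   # the full match: http://t.co/AbC123
}
```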

Then, for each URL found in the tweet, I use Invoke-WebRequest to fetch the page. This cmdlet automatically follows any HTTP redirects (I was afraid I’d have to do this myself), so the object returned is the real destination page. Invoke-WebRequest also returns the parsed HTML of the page (assuming you use the right HTTP method), so I can extract the title easily instead of having to parse the content myself. It’ll also give me the “final” URL (the destination reached after all the redirects). Easy!

My full script:

#requires -version 3
[cmdletbinding()]
param ()
set-strictmode -Version latest;
Add-Type -AssemblyName System.Web;
$AllTweets = import-csv -path 'C:\Users\andy.LEVYNETSCUS\Dropbox\MiscScripts\Sqlnewblogger tweets - Sheet1.csv' | where-object {$_.text -notlike "RT *" -and $_.text -like "*http*"} | select-object -property "Tweet By",Text,Created | Sort-Object -property created -Unique;
$TweetLinks = @();
foreach ($Tweet in $AllTweets) {
    $Tweet.text -match '([http|https]+://t.co/[a-zA-Z0-9]+)' | out-null;
    foreach ($URL in $Matches) {
        $MyURL = $URL.Item(0);
        # Invoke-WebRequest automatically follows HTTP redirects. We can override this with -MaximumRedirection 0, but in this case, we want it!
        $URLCheck = Invoke-WebRequest -Method Get -Uri $MyUrl;
        $OrigUrl = $URLCheck.BaseResponse.ResponseUri;
        write-debug $Tweet.'Tweet By';
        Write-debug $URLCheck.ParsedHtml.title;
        write-debug $URLCheck.BaseResponse.ResponseUri;
        $TweetLinks += new-object -TypeName PSObject -Property @{"Author"=$Tweet.'Tweet By';"Title"=$URLCheck.ParsedHtml.title;"URL"=$URLCheck.BaseResponse.ResponseUri;};
    }
}
Write-debug $TweetLinks;
$TableOutput = "<table><thead><tr><td>Author</td><td>Post</td></tr></thead><tbody>";
foreach ($TweetLink in $TweetLinks) {
$TableOutput += "<tr><td><a href=""https://twitter.com/$($TweetLink.Author.replace('@',''))"">$($TweetLink.Author)</a></td><td><a href=""$($TweetLink.URL)"">$([System.Web.HttpUtility]::HtmlEncode($TweetLink.Title))</a></td></tr>";
}
$TableOutput += "</tbody></table>";
$TableOutput;

And now, my digest of the first week of the SQL New Blogger Challenge. This is not a complete listing because I didn’t think to set up the IFTTT recipe until after things started. I also gave people the benefit of the doubt on the timing (accounting for timezones, etc.) and included a few posted in the early hours of April 8th. For week 2, it will be more complete.

Author Post
@eleightondick Kevin Kline on Twitter: “Advice to New Bloggers http://t.co/o1jfLOR4QI”
@BarbiducH Safe exit from WHILE loop using ##global temp tables | One developer’s SQL blog
@eleightondick Mike Donnelly on Twitter: “T-SQL Tuesday #065 – Teach Something New http://t.co/LoyFbhVOpw #tsql2sday”
@GuruArthur Trace flags in SQL Server – Arthur Baan
@cjsommer Blogging and Intellectual Property Law | legalzoom.com
@ceedubvee A Insider’s View of the Autism Spectrum: Autism and Information Technology: Answering a Blog Challenge (Plus, Why I Like Data)
@arrowdrive Anders On SQL: A bit about me continued. Anders meets SQL
@SQLJudo Experience Is Overated | Russ Thomas – SQL Judo
@SQLBek T-SQL Tuesday #065 – Teach Something New | Mike Donnelly, SQLMD
@MtnDBA #SQLNewBlogger Week 1 “Eye of the Tiger” | DBA With Altitude
@Lance_LT The estimated query plan and the plan cache (Part 1) | Lance Tidwell the Silent DBA
@AalamRangi How to Use Temp Table in SSIS | SQL Erudition
@DwainCSQL An Easter SQL

Limitations

There are a couple limitations and gotchas with this process:

  • The IFTTT recipe only runs every 15 minutes (all IFTTT triggers run on 15 minute intervals) and only fetches 15 tweets each time it runs (again, IFTTT’s configuration). So if there’s a flood of tweets, they won’t all be captured.
  • I don’t really check the final destination of a link. For example, one of the first tweets captured contained a link to another tweet, which then linked to a blog post. Could I detect this & resolve the true final destination? Probably. But it’d involve a lot more logic, and it’s getting late.
  • I also don’t pick up every duplicate link/post. Again, I can probably get by this with some extra logic, but I don’t think it’s necessary right now.
  • It doesn’t post automatically to my blog, or anywhere else. I have to manually paste the HTML into my blog post(s).
  • I had to manually remove one link as it didn’t actually point to a post written for the challenge; it was tweeted with the hashtag, so my IFTTT recipe caught it.
  • I started collecting these tweets mid-day April 7th. If you posted before that, I’ve likely missed your post. You will be picked up for Week Two!

Connecting SQLite to SQL Server with PowerShell

This post is part of Ed Leighton-Dick’s SQL New Blogger Challenge. Please follow and support these new (or reborn) bloggers.

I’m working with a number of SQLite databases as extra data sources in addition to the SQL Server database I’m primarily using for a project. Brian Davis (b|t) wrote a blog post a few years ago that covers setting up the connection quite well. In my case, I’ve got nine SQLite databases to connect to, and that gets tedious. PowerShell to the rescue!

I started by installing the SQLite ODBC Drivers and creating one ODBC connection for reference. Brian’s post linked above covers that well. But I don’t want to do it eight more times, so I’ll use the first DSN as a template so I can script the creation of the rest.

I named my DSN GSAKMyFinds. To inspect the DSN, I can use the Get-OdbcDsn cmdlet.

Get-OdbcDsn -Name GSAKMyFinds;
Name : GSAKMyFinds
DsnType : System
Platform : 64-bit
DriverName : SQLite3 ODBC Driver
Attribute : {Database, Description, NoTXN, LongNames...}

This looks pretty simple, but there’s a collection of Attributes I need to look at too. I’ll do this by expanding that property with Select-Object.

Get-OdbcDsn -Name GSAKMyFinds | Select-Object -ExpandProperty Attribute |Format-Table -AutoSize;
Name        Value                                               
----        -----                                               
Database    C:\Users\andy\Dropbox\GSAK8\data\My Finds\sqlite.db3
Description                                                     
NoTXN       0                                                   
LongNames   0                                                   
Timeout                                                         
FKSupport   0                                                   
JDConv      0                                                   
StepAPI     0                                                   
BigInt      0                                                   
NoWCHAR     0                                                   
SyncPragma                                                      
LoadExt                                                         
OEMCP       0                                                   
NoCreat     0                                                   
ShortNames  0                                                   

Now I have everything I need to create a new DSN with Add-OdbcDsn. All of my SQLite databases are stored in a directory structure under C:\Users\andy\Dropbox\GSAK8\data\, with each one in a different subdirectory. For now, I’ll just create one to make sure that I’m doing it right, then use Get-OdbcDsn to see if it matches with my GUI-created DSN.

Add-OdbcDsn -Name GSAKPuzzles -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=C:\Users\andy\Dropbox\GSAK8\data\Far-off puzzles\sqlite.db3";
Get-OdbcDsn -Name GSAKPuzzles;
Get-OdbcDsn -Name GSAKPuzzles | Select-Object -ExpandProperty Attribute |Format-Table -AutoSize;

Results:

Name       : GSAKPuzzles
DsnType    : System
Platform   : 64-bit
DriverName : SQLite3 ODBC Driver
Attribute  : {Database}

Name     Value                                                      
----     -----                                                      
Database C:\Users\andy\Dropbox\GSAK8\data\Far-off puzzles\sqlite.db3

Looks pretty good! Note that not all of the Attributes seen above are here; those are default values that are set when creating the DSN through the GUI. After deleting my first two test DSNs, I can move on to looping through all of my SQLite databases and creating DSNs for all of them. SQLite databases are just files on your filesystem, so by iterating over all of the db3 files in the parent directory I can build the list of files to point my DSNs at.

Get-ChildItem -Path C:\users\andy\Dropbox\gsak8\data -Recurse -Filter sqlite.db3 | Select-Object -ExpandProperty FullName | ForEach-Object {
 $DSNName = $_.split("\")[6]; # element 6 of the path is the subdirectory name, e.g. "My Finds"
 Add-OdbcDsn -Name $DSNName -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=$_";
};
Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";

In a few seconds, the DSNs are created.

Name               DsnType Platform DriverName          Attribute                                   
----               ------- -------- ----------          ---------                                   
GSAKMain           System  64-bit   SQLite3 ODBC Driver {Database, Description, NoTXN, LongNames...}
Far-off puzzles    System  64-bit   SQLite3 ODBC Driver {Database}                                  
Home200            System  64-bit   SQLite3 ODBC Driver {Database}                                  
My Finds           System  64-bit   SQLite3 ODBC Driver {Database}                                  
My Hides           System  64-bit   SQLite3 ODBC Driver {Database}                                  
New England        System  64-bit   SQLite3 ODBC Driver {Database}                                  
Niagara Falls      System  64-bit   SQLite3 ODBC Driver {Database}                                  
NJ                 System  64-bit   SQLite3 ODBC Driver {Database}                                  
Seattle            System  64-bit   SQLite3 ODBC Driver {Database}

Next up is creating the linked servers in SQL Server. I created one with Management Studio using all the defaults, then scripted it to see what I need to do. The only parts I really need are sp_addlinkedserver and sp_addlinkedsrvlogin; the defaults for the other options are good enough for what I’m doing here (this may not be true for you, so be sure to check!).

EXEC master.dbo.sp_addlinkedserver @server = N'GSAKMAIN', @srvproduct=N'GSAKMain', @provider=N'MSDASQL', @datasrc=N'GSAKMain'
 /* For security reasons the linked server remote logins password is changed with ######## */
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'GSAKMAIN',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL

Now I can put this into a PowerShell loop and run it for all of my other DSNs.

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
foreach ($DSN in $AllDSNs) {
    $CreateLinkedServerSP =@"
EXEC master.dbo.sp_addlinkedserver @server = N'$($DSN.Name)', @srvproduct=N'$($DSN.Name)', @provider=N'MSDASQL', @datasrc=N'$($DSN.Name)';
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'$($DSN.Name)',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL;
"@;
    invoke-sqlcmd -query $CreateLinkedServerSP -serverinstance sql2014 -database master;
}

I let this run and when it’s finished, all my DSNs are linked servers ready to be queried.


Because I’m going to be querying all of these linked servers together, I wrote some additional code to give me a skeleton query performing a UNION across all of my linked servers which I can use as a starting point.

Here’s the final script:

#requires -version 3.0
#requires -module sqlps
set-strictmode -Version latest;
set-location c:;

Get-ChildItem -Path C:\users\andy\Dropbox\gsak8\data -Recurse -Filter sqlite.db3|Select-Object -ExpandProperty FullName | ForEach-Object {
	$DSNName = $_.split("\")[6];
	Add-OdbcDsn -Name $DSNName -DriverName "SQLite3 ODBC Driver" -Platform 64-bit -DsnType System -SetPropertyValue "Database=$_";
};

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
foreach ($DSN in $AllDSNs) {
    $CreateLinkedServerSP =@"
EXEC master.dbo.sp_addlinkedserver @server = N'$($DSN.Name)', @srvproduct=N'$($DSN.Name)', @provider=N'MSDASQL', @datasrc=N'$($DSN.Name)';
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'$($DSN.Name)',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL;
"@;
    invoke-sqlcmd -query $CreateLinkedServerSP -serverinstance sql2014 -database master;
}

$AllDSNs = Get-OdbcDsn -DriverName "SQLite3 ODBC Driver";
$AllDatabasesUnion = "";
foreach ($DSN in $AllDSNs) {
    $AllDatabasesUnion += "SELECT * FROM OPENQUERY([$($DSN.Name)], 'select * from caches') UNION ALL`n";
}
$AllDatabasesUnion = $AllDatabasesUnion.Substring(0,$AllDatabasesUnion.Length - 10); # remove the trailing "UNION ALL" and newline
Write-Output $AllDatabasesUnion;

And the query that it generated for me:

SELECT * FROM OPENQUERY([GSAKMain], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Far-off puzzles], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Home200], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([My Finds], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([My Hides], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([New England], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Niagara Falls], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([NJ], 'select * from caches') UNION ALL
SELECT * FROM OPENQUERY([Seattle], 'select * from caches')

With a little exploration of the PowerShell OdbcDsn cmdlets, I’ve eliminated a tedious process and prevented any accidental mouse clicks in a GUI.

Rochester SQL Server User Group February Meeting – Slides & Demos

On Thursday, February 26th I presented “Easing Into Windows PowerShell” to a packed house at the Rochester SQL Server User Group meeting. Thanks to Matt Slocum (b | t) for being my semi-official photographer.

Me, presenting!

Presenting Easing Into Windows PowerShell at the Rochester SQL Server User Group February 26, 2015

We set a chapter attendance record! I had a lot of fun presenting this (my first time speaking outside my company) and we had some great conversations during and after the meeting.

I’ve posted my slides & demos for your enjoyment.

Rochester PASS Chapter February Meeting – I’m Speaking!

On Thursday, February 26th at 6:00 PM EST I will be speaking at the Rochester PASS chapter meeting. The topic is “Easing Into PowerShell – What’s It All About?”.

You’ve been hearing a lot about Windows PowerShell, but you’re wondering if it’s something you should be looking into. In this introductory session, we’ll talk about what PowerShell is, where it came from, how it works, and what it can do for you. Whether you’re a junior DBA or a seasoned veteran, you’ll find something that PowerShell can help you do more easily.

If you’re planning to attend, please let us know by RSVPing at Nextplex. Slides will be posted here the following day.

T-SQL Tuesday #61 – Giving Back

Wayne Sheffield (b|t) is hosting this month’s T-SQL Tuesday and his topic is Giving Back to the SQL Community. More specifically, he’s asking how each of us is planning on giving something back to the SQL Community in 2015. He offers up a few suggestions, so I’ll start by addressing those and then move on to additional ideas.

  • Are you going to start speaking at your local user group?
    Yes, I expect that by the end of 2015 I will have spoken to our local chapter at least once. I spoke to various groups at work in 2014 and plan to continue doing so in 2015 as well.
  • Perhaps step up and help run your local user group?
    I was named the Vice President of our local chapter a couple months ago, and I will continue in that capacity.
  • Do you want to start becoming an active blogger – or increase your blogging?
    Yes! At the time of this writing I’ve only published 7 posts here, and I have 6 others in various stages of preparation. I have some ideas brewing, I just need to get things written and then actually press that Publish button. Part of it is fear/insecurity, and I need to get out of my comfort zone a little and Just Do It.
  • Do you plan on volunteering your time with larger organizations (such as PASS), so that SQL Training can occur at a larger level?
    If I have the opportunity to attend PASS Summit in 2015, I will volunteer at the event. When the call for pre-event volunteers goes out, I’ll look at what’s needed and try to step a little out of my comfort zone & do something there as well.
  • Other ways of contributing
    • For the 3rd year, I will be helping to organize and run SQL Saturday Rochester in 2015. If you’re reading this, you probably know about SQL Saturday, and have probably even been to one. Next time, bring a friend!
    • I’ve been promoting PASS and our local chapter for a while at work and will be more vocal in 2015. There are a lot of people with knowledge and experience they can share who aren’t even aware that PASS and the local and virtual user groups exist. I want to help bring those people into the community.

Lightning Talks at SQL Saturday?

We’re already in the early stages of preparing for our 2015 SQL Saturday. One thing that was missing from this year’s event was local speakers; we just didn’t have many, and I’m hoping we can change that the next time around.

For a lot of people (myself included), getting into speaking can be intimidating. Do I even have something interesting to say? What if I can’t fill an entire hour (or 75 minutes)? What if I get everything all wrong?

One session held each day at PASS Summit may hold the answer – Lightning Talks. In a standard session time block, 5-7 speakers each present for only about 10 minutes. Step up, hook up your laptop, talk about a very focused, narrowly-scoped topic, and then yield the floor to the next speaker.

I held my own “lightning talk” at work one day as an experiment. While working on some reports, I found an alternative way to write a portion of the query which resulted in a significant improvement in the execution plan. I brought my team together, gave a brief intro (basically, the preceding sentence here), then gave a demo:

  • Present both versions of the query
  • Explain the change made for the faster version
  • Run the query
  • Compare I/O and time statistics
  • Compare the execution plans
  • Wrap up

With a couple of questions and some technical difficulties sprinkled in, it was over in just under 15 minutes. Not bad, and with some practice & honing, I think I could get it to 10 minutes.

Could this work at SQL Saturday scale? I think it’s mostly a matter of finding volunteers. But this should be easier than filling full sessions. Everyone has at least one thing they’ve discovered, learned, experimented with or implemented in a creative way that’s worth sharing – even if it’s something very brief. With it being only 10-15 minutes, you don’t have to worry about filling a full session slot or losing the audience.

PASS Summit: Things to Do, People to See

PASS Summit is nearly upon us. I’m excited to be attending my second Summit in Seattle and cannot wait to get there to see everyone. With one Summit and a few SQL Saturdays under my belt I’ve got a laundry list of things and people I can’t miss, and very little time to pack it all into.

Let’s Meet!

The greatest part of Summit (and SQL Saturday) for me is meeting people and exchanging ideas. If you haven’t experienced it, #SQLFamily is amazing. When I reached the convention center two years ago, the first feeling that hit me was “I finally found my people!” We’re all friendly, I swear. Just say “hi, I’m <your name here>.” I guarantee you will find people who are into the same stuff you’re into, and I’m not just talking about SQL Server. Music, dance, outdoor activities, all kinds of stuff. We have a common thing that brought us together, but that’s not what keeps us together. It is an amazing community and it just keeps getting better. On Sunday, as you’re decompressing from the event and travel, you will miss these people who you didn’t even know a week before.

You can even connect strangers with common interests. In 2012, I met someone over a power outlet who asked if I’d done anything with a particular piece of hardware and what I thought of it. Turns out that I hadn’t, but I knew that a former co-worker was also in attendance and he had used the hardware, so I gave them each other’s contact information.

Ping me on Twitter, find me at one of the places/events listed below, grab breakfast or lunch with me in the dining hall, or – if you think you see me passing in the hall (picture on my Twitter profile) – say something (and if it’s not me, you’ll meet someone else, which is still awesome). Maybe even dinner at the airport on Friday evening.

Get on Twitter

So many things happen at Summit that are announced and/or organized via Twitter. The main hashtag to follow is (I think) #summit14, but once you hit the ground you’ll start figuring out who and what to follow to get all the dirt.

Schedule

Tuesday

I’m arriving in Seattle late Tuesday morning and doing some sightseeing before checking into the hotel and Summit late in the afternoon. Then it’s off to the welcome reception. The first of several visits to Tap House Grill may be in order too.

Wednesday

Wednesday starts dark & early with #SQLRun at 6 AM. I had a great time getting a 5K in before dawn at my first Summit and I’m sure this one will be great too. Don’t forget to bring the right gear; it’s pre-dawn, and right now the forecast calls for 50°F and rain (in Seattle – go figure).

Aside from the sessions throughout the day, I’ll probably be found in the Community Zone. I’ll also be serving as an Ambassador, helping to direct people to the dining hall for lunch. I’ll be posted outside room 4C, so stop by and say hi.

Wednesday evening, I’m hosting a dinner for geocachers at the Daily Grill at 6:15 PM. If you’re a cacher, or just curious about it, stop by!

Once we’ve wrapped up there, I’ll go wherever the wind may take me; probably back to the Tap House.

Thursday

Thursday is my light day at Summit. I don’t have any sessions double-booked and the only event I really need to catch is the Argenis Without Borders folks in their fuzzy rainbow leggings.

Thursday evening I’ll be at the Ray Gun Lounge for Table Top Game Night. I’m looking forward to getting to know folks there and learn some new games. We don’t play a lot of table top games at home and I’d like to change that.

Friday

Lots more sessions on Friday, plus winding everything down. By the afternoon, I’ll probably be beat and just trying to rest at the Community Zone.

I fly out late Friday night, so I’ll be trying to find dinner somewhere between the convention center and airport. I’ll probably kill a lot of time in the terminal by wandering around, playing Ingress.

Packing List

At my first Summit, I learned a few lessons about what to take and what not to take. The most important thing to bring: empty space for all the stuff you’ll bring home. SWAG from the exhibitors, souvenirs, books and more. Next most important: power! Electrical outlets are few and far between, and there will be 5000 people vying for them to top off their phones and tablets. A quick rundown of some of the stuff that might not be obvious to bring (or easily forgotten) that I’m packing:

  • Small (1 pint) widemouth water bottle. I’m partial to this Nalgene bottle I got at a 5K earlier this year.
  • NUUN electrolyte tabs. Water gets boring after a while. These will help you stave off SQLPlague (don’t forget your vitamins too!).
  • Comfortable shoes. You’ll be on your feet a lot and walking even more; the convention center is big. Not to mention the evening activities.
  • A small notepad for taking non-session notes – phone numbers, names, etc. I love my Field Notes notebook.
  • A larger notepad for taking notes in sessions. Oh, and don’t forget a pen or three. I’ve tried doing notes on a tablet and on a computer, and it just doesn’t work as well as paper & pen for me. Bonus: no batteries!
  • Hand sanitizer. Because when you have 5000 people in one place, germs get around in a hurry no matter how careful you are.
  • A good wall charger for your devices. I found myself short on chargers last time and had to buy one at Radio Shack, and it didn’t cut it. This one has two USB ports that charge at 2.1A, which will give you a good boost when you get near a plug, and you can share with a friend. It’ll also recharge pretty much anything while you sleep. Best of all, it’s really compact.
  • A good external battery pack. Matt Slocum (b | t) got me hooked on the Anker E5 15000 mAh battery. Two ports so you can share with a friend, and it’ll recharge most phones 4-5 times from a completely empty battery.
  • Plenty of USB cords to go with both of the above.
  • Business cards! I ordered mine at the office too late last time and had to get some made at Staples in a pinch.
  • A small, light backpack to carry all of this in (well, not the shoes). Session rooms get cramped, so carrying a big pack can be a pain.
  • A lock code on my phone and tablet. I normally don’t use one but at any large gathering like this, it’s better to be safe.
  • A list of the people I need to see/find/meet/reconnect with.

This Summit is going to be a blast. I cannot wait. There are only two things I’m not looking forward to:

  1. Having to sleep (I’ll miss stuff!)
  2. Knowing it’ll eventually end

Next Tuesday cannot come soon enough.