SQL New Blogger Challenge November 2015 Edition – Week 3 Digest

This week’s #sqlnewblogger posts!

Author Post
@eleightondick [T-SQL Tuesday] Data modeling: The trouble with prefixes | The Data Files
@tomsql Adventures With TomSQL, aka Tom Staab
@EdDebug Automatically name primary key constraints in SSDT | the.agilesql.club
@rabryst Born SQL on Twitter: “Temporal Tables – Under the Covers with the Transaction Log”
@YatesSQL Community Involvement–Why Wait? | The SQL Professor
@cjsommer Identity Column Increment Value (EVEN/ODD) | cjsommer.com
@DBA_ANDY Nebraska SQL from @DBA_ANDY: CHECKDB – The database could not be exclusively locked to perform the operation
@ALevyInROC Selectively Locking Down Data – Gracefully – The Rest is Just Code
@eleightondick SQLNewBlogger, Week 3 | The Data Files
@tomsql Being Our Collective Best
@SQLMickey T-SQL Tuesday #72 Summary – Data Modeling Gone Wrong | Mickey’s T-SQL Ponderings

Selectively Locking Down Data – Gracefully

I have a situation where I need to retrieve the data in an encrypted column, but I don’t want to give all my users access to the symmetric key used to encrypt that column. The data is of the sort where it’s important for the application to produce the required output, but if a user runs the stored procedure to see what the application is getting, it’s not critical that they see this one field.

The catch is that the stored procedure assumes the caller has permission to access the encryption key and its certificate, so anyone without that permission gets an error. After a bit of research and pondering, I came up with two options:

  1. Create the stored procedure with EXECUTE AS OWNER (the owner in this case is dbo). This would let all users see the encrypted data; not an ideal solution (sketched below for reference).
  2. Use SQL Server’s TRY/CATCH construct to gracefully handle the error thrown when the user attempts to open the key, but doesn’t have permission to do so.
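For reference, option 1 would look roughly like the sketch below. It’s a minimal illustration, not my actual procedure; the procedure, table, key, and certificate names are all placeholders.

-- Option 1 sketch: the procedure runs as dbo, so any caller who can EXECUTE it sees decrypted data
CREATE PROCEDURE dbo.GetAccountData
	@BankId varchar(6)
WITH EXECUTE AS OWNER
AS
BEGIN
	OPEN SYMMETRIC KEY AccountKey DECRYPTION BY CERTIFICATE AccountCert;

	SELECT CONVERT(varchar(50), DECRYPTBYKEY(AccountNum)) AS AccountNum
	FROM dbo.Accounts
	WHERE OriginatingBank = @BankId
	AND AccountType = 'Checking';

	CLOSE SYMMETRIC KEY AccountKey;
END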

Let’s check out option 2. This example is simplified from my actual scenario to demonstrate the idea.

declare @BankId varchar(6) = '123456';
declare @AcctNo varchar(50);

-- Stage the rows the caller may see, with a placeholder for the account number
SELECT cast('' as varchar(50)) AS AccountNum
INTO #AccountData
FROM dbo.Accounts
WHERE OriginatingBank = @BankId
AND AccountType = 'Checking';

BEGIN TRY
	-- Key and certificate names are illustrative
	OPEN SYMMETRIC KEY AccountKey DECRYPTION BY CERTIFICATE AccountCert;

	SELECT @AcctNo = CONVERT(varchar(50), decryptbykey(AccountNum))
	FROM dbo.Accounts
	WHERE OriginatingBank = @BankId
	AND AccountType = 'Checking';

	CLOSE SYMMETRIC KEY AccountKey;
END TRY
BEGIN CATCH
	-- The caller can't open the key; return a placeholder instead of failing
	SET @AcctNo = 'Access Restricted';
END CATCH

UPDATE #AccountData SET AccountNum = @AcctNo;

SELECT * FROM #AccountData;

DROP TABLE #AccountData;

TRY/CATCH in T-SQL works similarly to how it does in languages like C# or PowerShell. It allows you to attempt an operation and take care of any error conditions fairly easily.

In this case, I’m attempting to open the encryption key. But if the user doesn’t have permission to do so, it doesn’t terminate the stored procedure with an error. Instead, it jumps to the CATCH block, where I’ve defined an alternate way of handling the situation. Here, if the user doesn’t have the appropriate permissions, they’ll just get “Access Restricted” for the account number, and access to that sensitive data is a little more tightly controlled – while still letting users access the data they do need.
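For the users who do need to see the real account numbers, permission can be granted on the key itself rather than changing the procedure. A minimal sketch, assuming the key is protected by a certificate and using the same illustrative names as above (the role name is a placeholder as well):

-- Allow members of this role to open the key and decrypt the column
GRANT VIEW DEFINITION ON SYMMETRIC KEY::AccountKey TO AccountReaders;
GRANT CONTROL ON CERTIFICATE::AccountCert TO AccountReaders;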

SQL New Blogger Challenge November 2015 Edition – Week 2 Digest

This week’s #sqlnewblogger posts!

Author Post
@arrowdrive Anders On SQL: T-SQL Tuesday #72: Data modelling gone extremely wrong
@rabryst Time After Time – An Introduction to Temporal Tables in SQL Server 2016 using a DeLorean
@EdDebug Deploy SSDT INSERTS in Batches | the.agilesql.club
@ALevyInROC Don’t Trust the Wizard – The Rest is Just Code
@DBA_ANDY Nebraska SQL from @DBA_ANDY: T-SQL Tuesday #72 – Implicit Conversion Problems
@eleightondick SQL New Blogger Challenge: Week 1 recap | The Data Files
@eleightondick SQL New Blogger Challenge: Week 2 ideas | The Data Files
@BeginTry SQL Server 2012 Upgrade: The RPC Server is Unavailable | It's All Just Electrons

Don’t Trust the Wizard

The one wizard you can trust

If you need to move data from one table into a new table, or even tables in a database into another database, the Import/Export Wizard in SQL Server Management Studio looks pretty tempting. Set up a source & destination, click a few buttons, kick back with a cup of tea and watch the progress bars, right?

It turns out that the wizard just isn’t as smart as it may seem. If you’re not careful, you won’t get what you’re expecting. Let’s check it out.

We’ll start by creating a real simple table in a database, containing a primary key and a computed column.

Create table sample.dbo.SourceTable (
	RowId int identity(1,1) not null primary key,
	Num1 int not null,
	Num2 int not null,
	Total as (Num1+Num2)
);

Let’s populate it with a few rows of data, then update some of that data to make sure the computed column is working. Remember, this is just to demonstrate the idea.

insert into sample.dbo.SourceTable (Num1, Num2) values (1,2);
go 100
select top 5 * from sample.dbo.SourceTable order by RowId;
update sample.dbo.SourceTable set Num1 = Num1 * RowId where RowId <= 3;
select top 5 * from sample.dbo.SourceTable order by RowId;

wizard_Screenshot 1

Great! We’ve got data, the computed column is working, let’s copy it over to a new table in another database. We’re just going to click Next, Next, Next through the wizard this time around.

wizard_Screenshot 2
wizard_Screenshot 3
wizard_Screenshot 4
wizard_Screenshot 5
Success! Our table has been copied and the data’s all there.

select top 5 * from Sample2.dbo.SourceTable order by RowId;

wizard_Screenshot 6

Let’s do some work on our new table and check out the results.

select top 5 * from Sample2.dbo.SourceTable order by RowId;
update Sample2.dbo.SourceTable set Num2 = Num2 * RowId where RowId < 3;
select top 5 * from Sample2.dbo.SourceTable order by RowId;

wizard_Screenshot 7

Woah! That’s not right. That Total column is supposed to be Num1 + Num2, and last time I checked 2 + 4 was not 4. Let’s keep going and try adding a new record the same way it was done earlier.

insert into Sample2.dbo.SourceTable (Num1, Num2) values (100,200);

Cannot insert the value NULL into column 'RowId', table 'Sample2.dbo.SourceTable'; column does not allow nulls. INSERT fails.

Huh. Now that’s really odd, isn’t it? RowId is supposed to be an identity – we shouldn’t have to populate it. What is going on here? Let’s script out the table.

USE [Sample2]

/****** Object: Table [dbo].[SourceTable] Script Date: 2015-11-10 22:36:23 ******/

CREATE TABLE [dbo].[SourceTable](
	[RowId] [int] NOT NULL,
	[Num1] [int] NOT NULL,
	[Num2] [int] NOT NULL,
	[Total] [int] NULL
)

This is all kinds of wrong! What happened to the primary key? Or the computed column? Well, it turns out that the wizard isn’t that smart, and if you just take all the default values, you’re going to get burned. Let’s go back to the wizard and click that Edit Mappings button in the Select Source Tables and Views screen.
wizard_Screenshot 8

Well…that looks like what we got above. And it’s not what we wanted. If we click Edit SQL, this is confirmed – the table being created is not defined the same way the source table is.
wizard_Screenshot 9
Fortunately, we can edit the SQL here and make it match the source table definition, then finish the wizard.
wizard_Screenshot 10
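The corrected statement looks roughly like this – a sketch of the edited SQL rather than a capture of the screenshot – with the identity, primary key, and computed column restored:

CREATE TABLE [dbo].[SourceTable](
	[RowId] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
	[Num1] [int] NOT NULL,
	[Num2] [int] NOT NULL,
	[Total] AS ([Num1]+[Num2])
)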
OK, data’s copied – what do we have?

select top 5 * from Sample3.dbo.SourceTable order by RowId;
update Sample3.dbo.SourceTable set Num2 = Num2 * RowId where RowId < 3;
select top 5 * from Sample3.dbo.SourceTable order by RowId;
insert into Sample3.dbo.SourceTable (Num1, Num2) values (100,200);
select * from sample3.dbo.SourceTable where rowid >= 100 order by RowId;

wizard_Screenshot 11

Everything’s there, and it’s working the way it’s supposed to. Lesson learned: don’t blindly trust the defaults, especially the ones in a wizard. Double-check everything, and then verify that your schema works the way you expect it to before doing any business with it.
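One quick way to do that double-check is to compare column metadata on both sides; here’s a minimal sketch using the system catalog views:

-- Run against both the source and destination databases; differences in
-- is_identity or is_computed are exactly the things the wizard silently dropped.
SELECT c.name, t.name AS data_type, c.is_nullable, c.is_identity, c.is_computed
FROM sys.columns AS c
JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.SourceTable')
ORDER BY c.column_id;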

SQL New Blogger Challenge, November Edition, Week 1 Digest

Ed Leighton-Dick has renewed his New Blogger Challenge this month. Here are all (I think) the posts for this week after Ed posted his announcement. If I’ve missed any, please let me know and I’ll update.

Author Post
@arrowdrive Anders On SQL: First Timer Summit impressions.
@EdDebug Deploy SSDT INSERTS in Batches | the.agilesql.club
@EdDebug Looking at SSDT upgrade scripts | the.agilesql.club
@DBA_ANDY Nebraska SQL from @DBA_ANDY: PASS Summit 2015 Recap
@eleightondick PASS Summit 2015 Highlights | The Data Files
@OliverAsmus PASS Summit 2015: My Experience | OliverAsmus.com
@EdDebug ScriptDom Visualizer | the.agilesql.club
@eleightondick SQL New Blogger Challenge: Looking back… and a new challenge! | The Data Files
@Clem1029 Tearing down the wall | SQLDEV@Clemsplace
@ALevyInROC Why Ask Why? – The Rest is Just Code
@rabryst The SQL Server Family

Why Ask Why?

Spend any time around a 4-year-old, and you will inevitably find yourself involved in a conversation which devolves into this:

  • Please do this thing
  • Why?
  • Reasonable answer
  • Why?
  • Restatement of reasonable answer
  • Why?
  • Shorter, more frustrated restatement of reasonable answer
  • Why?
  • Because that’s what has to be done
  • Why?
  • Because
  • Why?
  • I give up. Go ask your other parent

It’s a simple but powerful and important question. The trouble is that when it’s a 4-year-old asking it, in a lot of cases they can’t understand the answer. More often, they aren’t interested in understanding it.

Fortunately, there aren’t any 4-year-olds in the average IT shop (although it may not be too far off).

A while ago, a data issue came to my team. Nothing major, but enough that it caused problems for a user. It’s a small glitch with an application component which pops up maybe once or twice a year, so it’s been decided that it’s better to just fix the data in those rare cases as opposed to spending 20 hours tracking down the root cause & fixing it there (I’m the SME for this component).

The requested correction was to delete the entire record, based on a previous fix to a similar but unrelated data problem. By the time I saw the request, a teammate had picked it up & started working on it.

“Wait! Don’t do it that way!” I said. “All we should be doing here is fixing the one erroneous field on that record.” This had come up in the past, but with it happening so rarely it’s easy to forget about.

I paused to catch my breath, then heard it.

“Why?”

I had to pause even longer to collect my thoughts. I don’t often get asked questions on things like this, but I wish it happened daily.

This is the moment in which knowledge is gained, even by the answerer.

When you live & breathe a system for years on end, it’s easy to take certain aspects of it for granted. You respond without having to think about why things work the way they do – you just know that’s how it is.

The ensuing conversation was productive and I hope informative for my co-workers. While deleting the record would have the desired short-term result (making the application function properly), in the long term it would break the link between the data and a document which is referenced by that data. A net loss. Fixing the one column (setting it to the value which it should have been in the first place) allows the application to function correctly and retain access to that referenced document.

The conversation also forced me to take a closer look at my own understanding of the issue and re-evaluate what I thought I knew about it. It turns out, I had some bad assumptions in there too, which I was able to correct.

Not only did my teammates learn, I learned too. Everyone wins.

So why was the original solution of deleting the whole record requested? The answer isn’t too far removed from the idea of cargo cult programming. That is, someone saw the solution of deleting the whole record used in a similar case years ago, documented it, and it was seen as the One True Answer from that point forward – regardless of its applicability to the situation at hand.  A detailed explanation of “why” isn’t usually written for every issue that comes to our team for resolution, for a few reasons:

  • We don’t think to do it.
  • There isn’t a good way to distinguish between this bug in the system and others without having a fairly deep knowledge of the system.
  • There isn’t a way in our ticketing system to record information that isn’t visible to everyone, and the whole company does not need to see the dirty details of the internals of every system – in fact, it would probably be counterproductive.

In hindsight, a carefully-written, more thorough explanation many years ago may have prevented this particular request from being written as it was.

Asking why became the basis for Toyota’s approach to improving their manufacturing processes, and is built into Six Sigma and many other process improvement methodologies. This one word is the gateway to understanding, and once we understand, we can find ways to do things better.

If you’re curious about something, release your inner 4-year-old. Just don’t act like a 4-year-old when you do it. Keep asking why, get to the answers – and make sure you understand them.

If someone asks you why, embrace the question. This person is interested, they’re engaged, they want to learn! Take advantage of that opportunity to teach and spread that knowledge. Along the way, you just might learn something yourself.

Hello GETDATE() My Old Friend…

So you’ve decided that your new web application needs to record some page load time metrics so you can keep tabs on performance. Terrific! You set up a couple of page load/complete functions to write to a logging table when a page request comes in, and then update the record when it finishes loading.
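The “request comes in” half isn’t shown here, but it might look something like this sketch (the RequestTime column name is an assumption):

    -- Hypothetical sketch: log the page request as it arrives
    INSERT INTO PageLogs (SessionId, RequestTime)
    VALUES ('883666b1-99be-48c8-bf59-5a5739bc7d1d', GETDATE());

Then, when the page finishes loading, the matching record gets updated: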

    UPDATE PageLogs
    SET ResponseTime = GETDATE()
    WHERE SessionId = '883666b1-99be-48c8-bf59-5a5739bc7d1d';

You set up an hourly job to delete any logs older than 2 weeks (just to prevent information overload) and you call it a day. Each morning, you run a report to look at the previous day’s performance, watch the trends over the past week or so, and you’re pretty happy with things. Pages are loading in a fraction of a second, according to the logs. People find the application useful, word spreads around the office, and adoption takes off. The project is a success!
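That hourly cleanup job, by the way, is just a big DELETE. It isn’t shown here either, but it might look something like this sketch (again, the RequestTime column name is an assumption) – and it matters in a moment:

    -- Hypothetical sketch of the hourly cleanup: purge logs older than two weeks
    DELETE FROM PageLogs
    WHERE RequestTime < DATEADD(DAY, -14, GETDATE());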

Then the support calls start rolling in. Users say it’s taking “forever” to load pages (they don’t have exact numbers, but it’s just too slow). This can’t be possible. The report says everything’s running just as fast as it did in test!

You walk down the hall and visit your friendly Senior DBA. He pulls up his monitoring tools and shows you that the hourly maintenance that keeps the PageLogs table fit & trim is causing a bunch of blocking while it does lots of DELETEs. And your INSERT queries are the victims.

Here’s the thing: GETDATE() (like any other function) doesn’t get evaluated until that query executes. Not when you call ExecuteNonQuery(), not even when SQL Server receives the query. So even if your INSERT isn’t holding up the execution of your page (because you’ve executed it asynchronously), it won’t accurately represent when the page load started. Instead, it tells you when your query executed. In this context that can be misleading because it won’t tell you how long it really took for your page to load.
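You can see this behavior with a quick sketch: capture a “requested at” time, simulate a delay standing in for the blocking, and compare it with what GETDATE() returns when the statement finally runs.

    DECLARE @Requested datetime = GETDATE();  -- when the work was queued up

    WAITFOR DELAY '00:00:05';                 -- stand-in for being blocked

    -- EvaluatedAt lands roughly five seconds later: GETDATE() reflects when the
    -- statement executes, not when the page load actually started.
    SELECT @Requested AS RequestedAt, GETDATE() AS EvaluatedAt;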

If you need to log the time an event transpired accurately, GETDATE() isn’t your friend. You need to explicitly set the time in the query.

    UPDATE PageLogs
    SET ResponseTime = '2015-05-15T09:45:02Z'
    WHERE SessionId = '883666b1-99be-48c8-bf59-5a5739bc7d1d';

If you aren’t used to seeing significant blocking in your databases, you may not have run into this. But get into this habit anyway. At some point you probably will see blocking on a table like this, and logging with GETDATE() will make the data you attempted to write during that blocking invalid. If you can’t trust all of your data, can you trust any of it?

Getting Over It or: How I Learned to Stop Worrying and Love Speaking

Consider this the outtakes from my previous post about speaking at SQL Saturday.

It took a while for me to build up the courage to finally get up in the front of a room at SQL Saturday. As I mentioned in my prior post, I studied quite a few other people’s sessions, read other people’s reviews of sessions (Grant Fritchey’s “Speaker of the Month” series), and talked to a few people at the speakers’ dinner. Here are a few of the key things I learned which put me more at ease.

Everyone gets a little nervous

Feeling a little twinge of nerves is completely normal, even for seasoned speakers. Those feelings are what keep you on your toes. Get “comfortable”, get complacent, and you’ll probably overlook something.

Your audience is there for you

If you’ve only ever spoken in a classroom setting or while making a pitch at work, SQL Saturday is very different. In those other scenarios, you have a mandated audience. People are there because they have to be there. They don’t really care much about you or what you have to say. At SQL Saturday, your audience has opted into your session. They’re there because you have something they want. They’re receptive. They’re giving you their time and attention.

It’s OK to unwind after you speak

I don’t mean you should run out of the room as soon as you’ve finished the last slide. People may have questions they want to ask you. But if you need to go to the speaker’s room for a bit to decompress and unwind afterwards, it’s OK.

You’ll never finish the slide deck

I completely redid one slide on Thursday night, and was still fiddling with a few others Saturday morning. Just don’t tell people that you were working on it right up until the last moment; as long as what you say matches up with what’s on the screen, they don’t have to know.

It’s only SQL Saturday

That’s not meant to diminish SQL Saturday at all. But you’re not hosting the Oscars. If something goes wrong, it’s not happening live on TV with 50 million people (including your parents and kids) watching. You aren’t a paid professional speaker – you’re just there to share with people. People will give you some slack if you aren’t perfect.

It’s important to look at the feedback you get (attendees: please fill out those evals!). Reflecting on what went well is just as important as looking for areas of improvement.

What went well

  • I hit all but one of the points I wanted to hit. The one I missed wasn’t critical.
  • I didn’t run short on time. I think I paced myself pretty well, and took a sip of water when I felt I needed to slow myself down.
  • All my demos & equipment worked. My demos depended upon Azure, and RIT (our hosts) made major improvements to their guest network since last year.
  • I picked up a Logitech R400 remote so I wasn’t tethered to the podium for changing slides. I’m a “fidgety” kind of person, so in addition to achieving that goal, it gave my hands something to do without attracting attention.
  • I didn’t spontaneously combust.

What I need to work on

  • People want demos, not talk and slides. I’m already working on trimming the deck down to give myself more time to show and explain code.
  • I had trouble reading the audience. This is something I have trouble with elsewhere as well. Maybe I need to pick up a book on body language.
  • Most of my attempts at levity fell flat. I knew I was rolling the dice and while I didn’t roll snake eyes, I didn’t roll a 7 or 11 either. I also had one obscure reference which I knew only one person in the building would be likely to pick up on, but he wasn’t in the room. That one didn’t hurt me, but had I been able to find the image I really wanted, it would have worked better.
  • I didn’t move around as much as I wanted or expected to. I thought I was going to make more use of the notes I’d written in PowerPoint but to read them, I would have had to stay too close to the podium. Next time, fewer notes & more moving around.

I’m looking forward to working on that last point. I was disappointed with how few demos I had for my session, and that feeling was backed up by some of the feedback I got. Next time, it’ll be better.

I’m not sure when the next time will be. Unfortunately, the SQL Saturdays close enough for me to attend conflict with other obligations on my calendar.

I Spoke at SQLSat (and I Liked It)

That is the first and last Katy Perry reference you will find on this blog or anywhere else in my life.

Last weekend I spoke at the 4th edition of my “home” SQL Saturday, SQL Saturday #383. This was the end of a path that started four years ago, and the beginning of an exciting new one.

About four years ago, I was introduced to PASS. It didn’t take long for people to start talking to me about public speaking. I went to my first-ever SQL Saturday, and kept thinking to myself “I could never do that.” Then I was given the opportunity to attend PASS Summit 2012 and was hooked on the PASS community – SQL Family. I stepped onto the floor at the convention center and felt comfortable immediately. I think my exact words when I called home that evening were “I’m home. I found my people.” Mid-Summit, in a 10-minute conversation with a chapter leader, I was told “you should speak at one of my user group meetings.”

But I have nothing to talk about. I’m terrified of public speaking. I’ve only ever done it in a classroom, in college or high school and I hated it. It terrified me. And I’m not an expert on anything. Well…maybe. Someday. A long time from now.

Time passed. I got involved with my local PASS chapter, got heavily involved with our annual SQL Saturday events, and got to know (or at least meet) more people in the SQL Server community. And I kept hearing the question “so when are you going to start speaking?”

But I have nothing to talk about. I’m not a speaker. I don’t have the polish that all these people on stage at Summit or in the front of the room at SQL Saturday have. I’m not even a DBA!

In 2014, one of my professional development goals at work was to give at least two presentations. I pretty much didn’t have a choice now; I had to get up in front of a crowd. So I wrote & delivered two sessions:

  • An introduction to PowerShell. Adoption of PowerShell had been slow in my office and I wanted to demonstrate how it could benefit the entire IT department. This wasn’t targeted at any particular job role; I was addressing the whole department.
  • A demo of SQL Sentry Performance Advisor & Event Monitor. We’ve been using this software for a few years now and I’ve spent quite a bit of time getting comfortable with these two portions of it, mostly in the course of figuring out why our systems were running poorly.

I was starting to get a bit more relaxed about talking in front of people. But this was a comfortable environment – I knew everyone in the room. That summer, I attended Mark Vaillancourt’s (b | t) session DANGER! The Art and Science of Presenting at SQL Saturday Albany, looking to fill in some gaps and figure out how to put myself at ease in less familiar territory.

Well, maybe I can put together a beginner-level session.

In February 2015, I attended SQL Saturday Cleveland. One of my goals for the day was to catch beginner-level sessions. I wanted to study the type and depth of the material, as well as how it was presented. Late in the day I had my breakthrough moment. The room was completely packed and the crowd was hanging on the presenter’s every word. I finally had a grasp of how to tailor a topic to a “beginner” audience.

I don’t have to put on a flashy show with 20 different advanced features and techniques. There’s room for the basics because there are always people who are new to this stuff and they want sessions too!

That same month, we needed a speaker for our chapter meeting, and rather than find someone to do a remote presentation, I decided to dust off my PowerShell talk from work, retool it for a DBA crowd, and go for it. It went pretty well, and the next week I took the plunge. I wrote up an abstract and submitted it for SQL Saturday.

Pressing this button is one of the most nerve-wracking things I’ve done. Deep breath…go.

At the chapter meeting, I’d gone over 90 minutes with my slides and demos. At SQL Saturday, I’d only have 60. I had my work cut out for me. I spent April tweaking and tuning my slide deck, honing my demos. I felt like I had a pretty solid setup. The Sunday before SQL Saturday, I sent myself to the basement and started rehearsing my presentation. I went 48 minutes. Without demos or questions from an audience (proving that cats don’t care about PowerShell).

Hard stop at 60 minutes. What can I cut? Where did I waste time? Am I speaking too slowly?

Every night that week I was in the basement, running through my presentation and demos. I got myself to 55 minutes for the whole package.

That’ll have to do. If I get questions mid-session, I’ll just drop a demo or two to make up the time.

I arrived home from the speaker dinner Friday night and did one last run through my deck. I had just redone one of my big slides Thursday night. Friday was a terrible run, but it was getting late. I had 38 minutes on the slides themselves.

Saturday morning, I awoke at 6 and my brain was already in overdrive; on a scale of one to ten, I was at an eleven. I fired up my Azure VMs so they’d be ready well ahead of time and hit the road for RIT. I found my room (I was speaking in the first slot) and got myself set up. I wanted to check and re-check everything. I was not about to let a technical problem take me down.

That settled, I milled around a bit and as 8:15 arrived, I found myself escalating from 11 to 15. People started filtering into the room and I tried to chat with them a bit as I’d read about doing so in Grant Fritchey’s (b | t) most recent Speaker of the Month post. That helped calm me down a bit.

8:30. Showtime. Breathe.

I feel like I fumbled a little bit on my intro (before I even got off my title slide), but by the time I hit my 3rd slide, a calm fell over me. I got out of my head and cruised through the material. It seemed like it was going smoother than any of my rehearsals. I wasn’t relying on my written notes. I got a couple chuckles out of the audience before I reached my demos. As I returned to the keyboard, I glanced at the clock.

What? 9:00? I burned through my slides in 30 minutes and I’d planned for close to 40. Am I speaking that quickly? Did I stumble that much when I practiced?

Fortunately, I’d set up my demos in preparation for such an event. I had a set of “must do” demos, and then a bunch of alternates which I could bring in to fill some time. I got through my demos, answered the lone question I was asked, and wrapped up right on time.

As people filtered out of the room and I started packing up, an enormous weight was lifted off my shoulders. I was done. I survived. And scanning through the feedback, it looked like I did an OK job. Reading through it later, I saw a few notes that meshed with things I was thinking during the session, and I will definitely take into consideration for the next time.

Yes, the next time. I’m doing this again. I’m hooked.

Slides & demos from SQL Saturday Rochester

Slides & demos from my SQL Saturday Rochester presentation “Easing into Scripting with Windows PowerShell” have been posted on the SQL Saturday site.

Thank you to everyone who came out for my session and all of SQL Saturday!