Change My Mind… T-SQL Tuesday #146

Andy Yun (t | b) is this month’s host. Our challenge is:

 …to think about something you’ve learned, that subsequently changed your opinion/viewpoint/etc. on something.

I’m going to give a twofer for this one, since he’s also allowing non-technical answers.

8K Blocks

“SQL Server writes everything in 8K blocks.” I recall learning this close to two decades ago. And it makes sense on a lot of levels. It was even “confirmed” when we reformatted the disks on one of our production servers to 64K blocks so SQL Server could read/write 8 blocks at a time; performance definitely improved. But then I learned from Argenis Fernandez that this isn’t necessarily true. SQL Server will write what it wants to write. And if you think about it, that makes sense. If you update a single value in one record, SQL Server isn’t going to sit there and refuse to write your 1-byte change to disk, and it’s not going to pad it with 8,191 made-up bytes just to satisfy the rule of 8K. Yes, SQL Server will try to consolidate disk I/O and be efficient about it, but even then it may not matter. Gone are the days when we wrote data to bare metal (some of us are old enough to recall an option in early versions of SQL Server to create a database on a “raw” partition for a performance boost). No, today we’re probably writing through more layers than we realize. For one client, for example, a disk write from SQL Server passes through an encryption layer, then to the OS, which passes it through a disk driver layer, which passes it on to the VM, which has its own layers. At that point, even if SQL Server were trying to write only 8K blocks, it’s quite possible every other layer has its own rules.
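If you want to poke at this yourself, here’s a minimal sketch (my own query, not Argenis’s demo) using sys.dm_io_virtual_file_stats to compute the average write size per database file. If SQL Server truly wrote everything in 8K blocks, that average would always come out to 8,192 bytes; in practice it doesn’t.

-- Average write size per database file, from SQL Server's cumulative I/O stats.
SELECT
    DB_NAME(vfs.database_id) AS database_name,
    mf.name AS file_name,
    vfs.num_of_writes,
    vfs.num_of_bytes_written,
    vfs.num_of_bytes_written / NULLIF(vfs.num_of_writes, 0) AS avg_bytes_per_write
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
    ON mf.database_id = vfs.database_id
   AND mf.file_id = vfs.file_id
ORDER BY database_name, file_name;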

Yes, changing our disk formatting from 8K blocks to 64K blocks helped. It helped us. But your requirements and situation may be different, and ultimately you may find your writes are larger or smaller than 8K much of the time. (And I hope I’ve summed up Argenis’s position accurately.)

Toss the Rope Down

As many know, I’m a caver. I’ve been caving for decades. Early in my caving career I learned vertical caving, and back then we still used what was known as a “3-knot” system, or “prusiks.” That gear has since improved and changed. But one habit took longer to change. It was (and unfortunately still is) common to tie one end of the rope to a tree or other rigging point and drop the rest down the pit. Sure, you ended up with a pile of rope at the bottom, but no one really cared, as long as you didn’t step on it (which is another myth for another time). This helped guarantee that your rope actually reached the bottom. The only thing that sucks more than rappelling down a pit and reaching the knot at the end of the rope 50′ from the bottom is rappelling down a pit and realizing 50′ from the bottom that you forgot to put a knot in your rope.

But somewhere along the line, folks started to realize, “hey, that rope at the bottom of the pit is useless. It’s FAR more useful if we can leave it at the top of the pit.” Since the depths of most pits are known, it’s actually not that hard to measure out the rope you think you need (plus a bit extra) and then rig the rope so that the extra stays at the top. Why is this important? Some call this “rigging for rescue” (or, more accurately, one part of that bigger concept).

Imagine the scenario where you’re ascending the rope and have an equipment failure. You can’t go up and can’t go down. If all the extra rope is below you, it doesn’t do you any good; you can’t do anything with it. But if that extra 10′ or 20′ (or more) is at the top and you’ve rigged correctly, someone at the top can, without too much effort, safely change the rigging (with you still on the rope) to a 3:1 or, if nothing else, a 2:1 haul system. Suddenly that extra rope sitting at the top of the pit becomes useful.

Beginners will still often toss the extra rope to the bottom of the pit, but more experienced cavers will rig it to stay at the top and this may literally save lives.

Conclusion

Stop and think about the practices you follow today that you learned long ago and that may be wrong or no longer applicable. And more importantly, do those outdated practices interfere with doing something that’s better or safer? With SQL Server, a lot has changed and improved over the past few decades, but are you still doing something you were taught two decades ago because “well, that’s the way things are done”? A lot has changed in two decades. Make sure your knowledge is still valid!

What T-SQL Do You Find Difficult?

It’s been a while since I’ve focused exclusively on SQL Server or even T-SQL, and perhaps this would make a great question for someone to ask on a T-SQL Tuesday, but for now I’ll ask it myself.

An open question to any of my readers who work with T-SQL: “What command or construct do you find difficult?”

Let’s be honest, we all have something about T-SQL that trips us up. I’m hoping the basic SELECT statement doesn’t. Though once you start to throw in things like an outer join or a correlated subquery, it can get complicated pretty quickly. But even those I can generally get written without too much effort.
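For instance, here’s a small sketch, with hypothetical Customers and Orders tables, of the sort of query that starts simple and grows: a LEFT OUTER JOIN to find customers with no orders in the past year, plus a correlated subquery for each customer’s most recent order date.

-- Hypothetical tables (dbo.Customers, dbo.Orders) purely for illustration.
SELECT
    c.CustomerID,
    c.CustomerName,
    (SELECT MAX(o2.OrderDate)
     FROM dbo.Orders AS o2
     WHERE o2.CustomerID = c.CustomerID) AS LastOrderDate  -- correlated subquery
FROM dbo.Customers AS c
LEFT OUTER JOIN dbo.Orders AS o
    ON o.CustomerID = c.CustomerID
   AND o.OrderDate >= DATEADD(YEAR, -1, GETDATE())         -- only "recent" orders
WHERE o.OrderID IS NULL;                                   -- keep customers with none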

The one construct I encounter infrequently, but often enough that I feel I should fully grok it by now, is a DELETE with a join. It’s actually not all that complicated, but I often have to look up the syntax to remind myself.
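For the record, the pattern I end up looking up goes something like this (again with hypothetical Orders and Customers tables): you delete through the alias, not the table name.

-- Delete the orders that belong to customers flagged as inactive.
-- Table and column names are made up for the example.
DELETE o
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID
WHERE c.IsActive = 0;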

But the one that gets me every single time: PIVOT. I’ve probably only written a half-dozen pivots in my life, and that’s six too many in my opinion. I’m not sure what it is about them, but my mind still doesn’t grasp the syntax easily.

Thinking about it, I think one reason is that whenever I’ve had to use one, it has never been a trivial case.

Fortunately, since my primary focus is on the operations side of being a DBA, and most of the code I write is to support that, I don’t come across the need for a pivot very often. I suppose, though, that if I were writing a lot more code, I’d use it more and understand it better.
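To have it written down somewhere, here’s the shape of the syntax, sketched against a hypothetical dbo.Sales table with one row per region per quarter: a derived table feeds PIVOT, which turns the Quarter values into columns.

-- Hypothetical dbo.Sales(Region, Quarter, Amount), one row per region/quarter.
SELECT Region, [Q1], [Q2], [Q3], [Q4]
FROM
(
    SELECT Region, [Quarter], Amount
    FROM dbo.Sales
) AS src
PIVOT
(
    SUM(Amount)                                  -- aggregate for each cell
    FOR [Quarter] IN ([Q1], [Q2], [Q3], [Q4])    -- values that become columns
) AS pvt;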

So, what’s your T-SQL nemesis?

And Fun Was Had By All

I want to give shout-outs to Rob Farley, Peter Shore, Deborah Melkin, and Rob Sewell for making the final Capital Area SQL Server User Group meeting of the year a rousing success. And this being T-SQL Tuesday, I’m going to try to very loosely tie this into this month’s topic, “How much do you love meeting in person, where would you like for your next event to take place, and why Costa Rica?”, as invited by Xavier Morera.


First, let me get his questions out of the way:

  1. Favorite Conference: The easy answer has been SQL PASS, but honestly, at this point, anywhere I get to see folks in person!
  2. Best Venue: Ignoring PASS in Seattle, I have to say Manchester, UK was nice, simply because it was my first overseas SQL Saturday; or perhaps the Virginia Beach SQL Saturday, because Monica Rathbun and her group provided a nice charcuterie board!
  3. Best Presenter: Oh, this is a tough one. I’m going to take a pass. But then cheat and answer below. Sort of.
  4. Next event and why it’ll be Costa Rica: I’m suspecting sort of a bias in this question, but to be honest, I’d love to go. I think 2022 will be a bit too busy for me to visit, but perhaps 2023 or 2024. Maybe I can work in some caving then too!

That all said, I want to get back to my shout-outs above and tie that into this T-SQL Tuesday.

As the coordinator for the Capital Area SQL Server User Group, one of my primary roles, in fact perhaps the most important, is finding speakers to present. Over the past few years I’ve tried to bring in a good variety of speakers and topics. We haven’t really missed a meeting since the pandemic started, but we have been virtual for well over a year now. This has presented both drawbacks and opportunities. The biggest drawback, of course, is the lack of actual in-person interaction and the feeling of connectedness it brought. On a personal note, it also means that not only have I not gotten out of cooking dinner the night of meetings, but often I’m juggling getting something together for dinner and getting the session started (though last night my wonderful wife did take care of dinner for me).

On the flip side, being virtual has allowed me to invite speakers who might not otherwise be willing or able to travel in person to Albany NY and for attendees from across the country to show up. It has also given me the opportunity to experiment a bit more with formats.

Last year, instead of our traditional in-person holiday party format, we did a version of “Bluff the Listener” where I asked various presenters to tell their worst IT/SQL horror stories, but one was lying. It was a success and a lot of fun.

Not wanting to repeat that, this year I decided to ask the four presenters above to present lightning rounds. That’s not so bad, except I added a twist: they didn’t get to choose their topics, they were given them, 10 minutes before they were scheduled to present. (And yes, some may say I stole this idea from Buck Woody; I’d like to say I was inspired.)

I’ll admit I was very nervous about this idea. It seemed a bit gimmicky and it could have been a complete disaster with lesser speakers. Fortunately, all four brought their A-Game.

Rob Farley, presenting from the future, in what I believe was a public workspace, managed to give one of the best talks on columnstore indexes I’ve seen. Given he had only 10 minutes of prep, I was impressed. His presentation included the use of PowerPoint in a sort of “green screen” mode so he could draw on his screen and we could see what he was drawing.

Peter Shore followed up talking about Tips in Advancing a Career in Data. Again, off-the-cuff with limited prep time, he did very well with this topic. I think in some ways this was almost harder than the more technical topics because you can’t fall back on a demo or graphics.

Deborah Melkin followed, talking about the best new SQL Server features (2019, 2022, Azure). I had announced previously that the best speaker would be awarded a prize. By what I think was unanimous declaration, even before Rob Sewell finished out the night with his presentation, the other speakers decided Deborah was the winner. She included some demos in her presentation, which, given the lead time, really impressed folks.

Closing out the evening, Rob Sewell entertained us with a demo of SQL Injection. Not surprisingly, he made use of PowerShell and Notebooks.

As I said, it was an entertaining and educational evening. I purposely set expectations low and made sure folks understood that the entertainment value was as important as, if not more important than, the actual educational value. But I was very pleased with how educational it turned out to be. It was a nice way to end the year and, honestly, I think a decent way to get a break from the bad news that seems to have surrounded us lately.

I do have a theory, though, about why the educational part turned out as well as it did. In general I’ve always enjoyed lightning talks, and I honestly think they’re among the hardest types of talks to give. Sometimes people promote them as a good introduction to speaking for novice speakers, but I’m not so sure. To give a successful lightning talk, one really has to strip a presentation down to the bare essentials and focus on just one or two key concepts. This can be difficult. But done well, I think it really makes those concepts stick.

Now combine that with topics only being given out 10 minutes in advance, and I think that forces a presenter to focus on key concepts even more. I wouldn’t give an inexperienced presenter a random topic, and even with an experienced presenter, I’d give them a chance to decline a topic if they feel it’s completely outside their wheelhouse. But otherwise, give them a chance to see what they can do. It might surprise you. Heck, it might surprise them.

So, to go back and answer a question from above: Best Presenter… at least last night, Deborah Melkin, who if nothing else proved her Google-fu was impressive.

And I think if I can find volunteers, I will definitely try to do an in-person version of this at a future SQL Saturday or Data Saturday or other conference.

Thanks to all who participated and joined us. It was a blast. But honestly, next year, I hope to see you all in person at our holiday party!

Changing Technologies – T-SQL Tuesday

SELECT <columns> FROM Some_Table WHERE Condition = 'Some Value'


The above statement is pretty much the basis of what started my current career. Of course, it actually goes back further than that. I have a Computer Science degree from RPI, so I’ve done programming, learned hardware, and more. I even took an Intro to Databases course while at RPI. I still recall the professor talking about IBM and something called Structured Query Language. The book had a line that went something like “while not the most popular database technology, its use may grow in the future.” Boy, did it.

When I first started working with SQL Server, it was version 4.21, and I was at a startup. I had a lot to learn. Back then, a lot of that learning came from experience. Sometimes I made mistakes. But I learned.

When I started at that startup, if one could write basic queries and back up and restore a database, one was a halfway decent DBA. Knowing how to tune indices was a definite bonus, as was knowing things like how to set up log shipping and replication.

Back then, besides experience, I learned new things two ways: SQL Server Magazine and the SQL Connections conference. Work paid for both, and it was worth it. Honestly, there wasn’t too much to learn, but there also weren’t nearly as many resources as there are today.

Fast forward 30+ years, and I’ve now written a book, worked for several startups, regularly write about databases and database-related topics, and have often presented at user groups, SQL Saturdays, and the now-defunct PASS Summit. Today, as a consultant, I regularly touch the SQL Server query engine, SSAS, SSRS, and SSIS, use PowerShell, write the occasional bit of C# and VB.Net, sometimes work on a Linux machine or VM, and more. A lot has changed.

Obviously the technology has changed. So how have I responded? By doing what I said above. This may sound like a tautology or even circular reasoning, but it’s true. When I would go to a SQL Saturday, I’d often attend 3-5 different sessions and learn something. But then I started presenting, and that forced me to learn. As much as I may like to think I know about a topic, when I go to present on it, I like to ensure I know even more. This forces me to read white papers and other articles, and perhaps attend other sessions.

When I’ve written an article, I’ve often had to do a lot of research for it.

So, strangely, I would say a big part of keeping my skills up to date is not just learning from others, but teaching. Teaching forces me to keep my skills up.

In summation, I’ve responded by learning from others, but also by forcing myself to teach myself before I taught others. It’s a feedback loop: the more technology changes, the more I reach out and learn, and the more I learn, the more outreach I do.

The impetus for this week’s blog was Andy Leonard’s call for a T-SQL Tuesday topic.