About Greg Moore

Founder and owner of Green Mountain Software, a consulting firm based in the Capital District of New York focusing on SQL Server. Lately I've been doing as much programming as DBA work, so I'm learning a lot more about C# and VB.Net than I knew a couple of years ago. When I'm not in front of a computer or with my family, I'm often out caving or teaching cave rescue skills.

The Value of Testing

This is one of those posts where I wish I could show actual code snippets, but since it involves a 3rd party vendor for one of my clients and I don’t have permission, I can’t.

So, I’m forced unfortunately to talk about the issue in a roundabout way.

My client uses a 3rd party tool to track documents. I’ve mentioned this before. They’ve been growing fairly fast and running into performance issues. I suppose growing fast is a good thing, but having performance issues is not.

In any case, using Query Store, I was able to send the vendor a list of queries and stats about them for them to review and to ideally improve the queries that needed work.
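For context, the stats I sent came from Query Store's catalog views. Here's a minimal sketch of the sort of aggregation involved (simplified; the real report included more columns and time-window filtering, and the view names are the standard SQL Server 2016+ Query Store views):

```sql
-- Top queries by average logical reads, per Query Store.
-- A simplified sketch, not the exact report sent to the vendor.
SELECT TOP (20)
       qt.query_sql_text,
       SUM(rs.count_executions)      AS executions,
       AVG(rs.avg_duration)          AS avg_duration_us,
       AVG(rs.avg_logical_io_reads)  AS avg_logical_reads
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
     ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p
     ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
     ON rs.plan_id = p.plan_id
GROUP BY qt.query_sql_text
ORDER BY avg_logical_reads DESC;
```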

Yesterday they got back to me. The email essentially said: we took this first query (let’s call it Doubly-Joined) and rewrote it as this second query (let’s call it Singly-Joined). I looked at the two queries, which join 4 tables. They’re very similar to each other, but the first one joined the main table in a second time (hence why I’m calling it Doubly-Joined). It’s not clear why this was done. The second query basically removed the second join and, in the select clause, changed the aliases from the second join to the first. This does give them a slightly different query plan, but ultimately, they return the same number of rows.
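Since I can't show the vendor's actual SQL, here's a hedged sketch of the shape of the change, using entirely hypothetical table and column names:

```sql
-- Doubly-Joined (before): the main table joined in a second time.
SELECT d2.DocName, d2.DocStatus, f.FolderName, u.UserName
FROM Documents AS d
JOIN Documents AS d2 ON d2.DocID = d.DocID   -- redundant self-join
JOIN Folders   AS f  ON f.FolderID = d.FolderID
JOIN Users     AS u  ON u.UserID = d.OwnerID;

-- Singly-Joined (after): second join removed, aliases repointed
-- from d2 to d in the select clause. Same rows returned.
SELECT d.DocName, d.DocStatus, f.FolderName, u.UserName
FROM Documents AS d
JOIN Folders   AS f ON f.FolderID = d.FolderID
JOIN Users     AS u ON u.UserID = d.OwnerID;
```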

The first query plan
The second query plan

As you can see, the 2nd query plan is definitely a bit simpler (ignore the one warning, it’s not something that appears to be fixable here).

So, a naive take would be “we removed an unnecessary join, so of course it should be faster!” But is it?

Sometimes intuition can be correct, sometimes not so much. In this case though, it’s easy to confirm by seeing exactly how many rows are being read in each query.

I wrapped each query in a

SET STATISTICS IO ON;
SET STATISTICS TIME ON;
-- query under test here
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;

block and ran it. Here are the results:

The Doubly-Joined

Table 'Table1'. Scan count 0, logical reads 337264, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table2'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table3'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table4'. Scan count 1, logical reads 396, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

The Singly-Joined

Table 'Table1'. Scan count 0, logical reads 337260, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table2'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table3'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table4'. Scan count 1, logical reads 396, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

I’ve highlighted the relevant change. The singly-joined query consistently performed with 4 fewer logical reads. Now, if the original number had been 8 and had dropped in half to 4, I’d be happy. But the change from 337264 to 337260 leaves me a bit underwhelmed. Furthermore, over multiple runs, the second query did not consistently use less CPU time; sometimes it ran faster, sometimes slower. Further testing was consistent in the lack of apparent improvement.

Needless to say, I don’t think this query improvement will help much. I’ve reached out to the vendor to see if they can provide more details, but honestly, I’m not hoping for much.

Goals

A week ago an email alerting me to a new post from Steve Jones showed up in my in-box. I had to chuckle a bit because the topic, Goal Progress, was one I had seriously thought about writing about for last week’s blog. But I decided to write about teaching instead and put off a post on goals. There was a reason for that: I thought it would make a better post for this week.

There are some goals I’m fairly public about. I’ve started to write about my yearly goals and my results at the start and end of each year. But often I have more private goals. These are usually of a more personal nature. This post is about one such goal.

My Biking History

Ever since I got my first 10 speed, on I think my 10th birthday, I’ve loved to bike. Learning to read gave me intellectual freedom, but learning to bike gave me physical freedom. Growing up in a small town meant that to go any place interesting, biking was often the fastest and easiest way to do so. I could cover miles in fairly short order. Before I knew it, I was biking everywhere.

In high school, in the spring sports season I continued my relationship with the outdoors through the Outdoor Experience (OE) program. Part of that time was spent biking. I quickly upgraded my original 10 speed to one I bought from my original OE coach, Ben Kaghan. It was a beautiful Italian bike I continued to ride through high school and college until I was in a serious accident with it, where I ended up going head over the handlebars and destroying the front forks. Twice in high school, as part of the OE program, I managed to get in a Century ride, once my freshman year and once my senior year. They were nice bookends to my high school experience. Surprisingly to me, my senior year ride was a bit tougher; I think the route selection was a bit worse and we ended up with more hills.

For college graduation, I received a beautiful Trek 520 from my mom. I continued to ride, though not as much as I’d like. But several times, in September or October, I would ride to an annual event that my college outing club hosted known as “Fall Lake George”. This was an IOCA event where as many as a dozen colleges would canoe, sail, swim (yes, a few have over the years) or powerboat to Turtle Island and camp for the weekend. I’d send my gear up in another vehicle and arrange for a car ride back. Depending on the exact route, this was generally about a 65-70 mile ride. The last time I did this was in 2015, the day before my father died. For various reasons, including changing college policies which have resulted in cancellations of this event, I haven’t done it since.

That said, most years I’ve routinely gotten in at least a few rides over 30 miles, with a few 50 mile rides snuck in there. But the elusive 100 miles had not been accomplished.

Due in part to COVID and wanting to get out more, and long stretches of good weather, I’ve found myself riding more this year than any year in memory. As of this morning my mileage for the year is over 1200 miles in the saddle. That was sort of a personal goal I had set for myself without publicizing it.

But there was one more goal. One I finally managed to realize this past weekend: the Century ride.  I had actually already accomplished 2 half-Century rides this year which I had felt good about, including one with some significant altitude gain. So I felt good about my chances. But I still wasn’t 100% sure.

Near Lake George is an amusement park, The Great Escape. My entire family has season passes, and while I enjoy a day or two there, the rest of the family loves it there. My original goal was to bike there, join them for a ride or two, and then bike home. And if I felt my legs weren’t up to it, toss the bike on the back of my wife’s car and ride home with them. Alas, due to Covid-19, The Great Escape has not been open at all this year.

But, I still sort of wanted a bail-out option. So I came up with a back-up plan. Right across from The Great Escape is a really incredible ice-cream place known as Martha’s Dandee Creme, which is a traditional stop for the family after a day at The Great Escape. So, about a month ago, I figured, if the weather worked, this past Labor Day weekend would be a good weekend to attempt my ride. And I’ve got to say the weather was nearly perfect for it. Not too warm nor too cold (though the first few miles from the house, which were mostly downhill so I had lots of wind but little muscle action, and still early in the morning, were noticeably chilly).

So at 7:45 AM I set out. The first 2-3 miles are basically coasting downhill from my house to the Hudson River and then the next 5 or so are avoiding potholes and city traffic as I make my way north to the first river crossing.


32 miles in and going strong, but time for a fuel and bio break.

I find that on longer rides, after about 2 hours, I generally need some sort of refueling stop. Fortunately, in my area there are lots of Stewart’s Shops. It’s generally easy to plan a break around them and I did.


Milk (and a brownie) does the body good!

At this point I was more than half-way to my turn-around point, which was actually at about 55 miles, not 50. Once I had used the facilities and refueled, I was back in the saddle for another stint. This was a shorter stint, but finally I would be gaining some altitude. Up until now the ride was basically along the Hudson, but in about 15 miles, I’d pass through Fort Edward and Hudson Falls, where the river turns west and I would have to do some, albeit minor, hill-climbing.

My original goal had been to hit Martha’s by 12:30 PM. I had given myself lots of time because I had no idea how fast or slow I’d be. Well at this point I texted my wife to say that my new goal was now Noon.


11:45 at Martha’s Dandee Creme!

I beat even that goal and that included a stop to make sure I hadn’t missed a turn.

One of the peculiarities I’ve found with endurance events like this is that my appetite essentially disappears. I knew I needed calories and as such ordered a soda, some chicken fingers and fries. The soda I had no problem drinking. Of the 5 chicken fingers, I managed to down 2 and of the fries, not even 1/2 of them. I saved the rest for my family.

Once they showed up, we ordered ice cream. I mean, what’s the point of biking to an ice cream place if you don’t get ice cream?

A very surreal experience, The Great Escape on Labor Day weekend, empty, and soundless

Finally it was time to get back in the saddle and head southbound. I felt strong about it, but wasn’t sure about the wind and all. And the wind at spots was definitely a factor and it definitely slowed me down.

But….

The big 100!

I made it. There was just one problem. As I had mentioned, my turnaround point was about 55 miles from my house. This meant I was still over 10 miles from home. I made a quick pit stop just south of here (again at a Stewart’s) and then raced home. Honestly, the last 2 miles or so near the house were the worst, not because of the distance per se, but because I finally had to regain all the altitude I had lost from my house to the Hudson 9 hours earlier.

But I pulled into my driveway right before 5:00 PM.

Finally home, 111.6 miles later

I had, for the first time in 35 years, finally completed another Century ride. Actually a bit more than a Century ride. Goal accomplished.

Final stats:

  • 9:44 Door to door including stops
  • 111.6 miles covered
  • Crossing the Hudson, 4 times
  • Altitude changes: 515->13 feet and then slowly back up to 476 feet. And then all that in reverse
  • 1 Stewart’s Chocolate Milk consumed
  • 1 Stewart’s large brownie devoured
  • 2 chicken fingers digested
  • Some french fries eaten
  • 1 small (which is actually quite large at Martha’s) salted-caramel soft-serve cream in a cone ingested
  • 7:16:12 actual riding time
  • 29.20 mph top speed on some random hill.
  • 15.3 mph average speed overall (down from 16.3 mph at the turn-around and 15.8 at the 100 mile mark; the hills near home and city traffic killed me)

Conclusion

So, the first question that comes to mind, “would I have written this blog post had I failed to achieve my private goal or always kept the failure private?” – Good question. I think it depends on the reason for the failure.

“How did you feel the next day?” – Honestly? Pretty good. Other than my knees, I find if I’m in shape, long rides like this don’t really leave me overly sore the next day. And taking doses of ibuprofen definitely helps with the knees and more.

And of course, “Would you do it again?” – Well I probably won’t wait another 35 years. We’ll see what happens next year or the year after that.

“What other private goals do you have?” – That would be telling! 🙂

Seriously, it was a great time for me. I’m SO glad I did it and am feeling great. Not too bad for someone my age in a year of Covid.

Learning and Teaching

This past weekend was the first of 3 weekends I’ll be spending teaching a cave rescue class. As I’ve written before, I usually spend at least 1 week a year teaching students how to help rescue folks out of caves. I don’t get paid money, and in fact have to pay for my own travel and sometimes other expenses. But, I love it. Unfortunately, the large event we had planned for NY this year had to be postponed due to Covid-19.

A Little Background

Fortunately, New York is one state where folks have been very good about social distancing and wearing masks, so that gave me the opportunity to try something new: teaching what we call a “Modular Level 1” class. Instead of taking an entire week off to teach, we spread the teaching out over three weekends and several nights. This can often better accommodate people’s schedules. After a lot of planning and discussions I finally decided to go ahead and see if I could host a class. Through a series of fortunate events, by the time I was ready to close registration, I actually had more than enough students. What makes this class different from other classes I’ve taught is that more than 1/2 the students have never been in a cave. However, most of those are in medical school, and a goal of mine has been to get more highly trained medical folks into cave rescue. So, we greenlighted the class.

Teaching

The first day of class is really mostly about “check-ins”. Each student must demonstrate a certain set of skills. When I teach the Level 2 class, this generally goes quickly because the students have already gone through Level 1 and tend to be more serious in general about their caving skills. But for Level 1, we get a broader range of students with a broader range of skills. And in this case, some folks were just being introduced to knot tying and SRT (Single Rope Technique).

There’s a mantra, I first heard among the medical education community, but is hardly unique to them, “See one, do one, teach one.” There’s a logic to this. Obviously you have to see or learn a skill first. Then obviously you need to be able to do it. However, the purpose and goal of that last one eludes some people.

Without getting too technical, let me give an example: in SRT, cavers and rescuers need the ability to climb the rope and, while attached to the rope, successfully change over to be able to descend the rope. I’ve literally done this 100s of times in my life. I obviously have the first two parts of that mantra down: I’ve seen it, and done it. But teaching it is a whole other ball game. Being able to DO something doesn’t mean you can successfully teach it. We do many things based strictly on experience and muscle memory. If you think about walking, you may realize you do it naturally without any real thought. But imagine trying to teach someone how to do it. You probably can’t, unless you’re a trained physical therapist.

Much is the same with the aforementioned change-over. Just because I could do it didn’t mean I could successfully teach it. However, over the years, as I’ve taught it more and more, I’ve come to recognize certain mistakes and certain areas I need to focus on. I’ve gotten better at teaching it. So by teaching more, I’m learning to become a better teacher. By being able to teach it, I also understand it and know it better. The “teach one” part of the mantra is important because it means you can pass on the skills you’ve learned, but it also means you have a better understanding of them in the first place. You can’t effectively teach what you don’t understand.

In addition to learning how to teach better, I’ve also realized that some approaches work better than others for different people. There’s a common knot we tie in the rope community called an “alpine butterfly”. There are at least four ways I’m aware of to teach it. One method involves looping the rope over your hand 3 times in a certain pattern and then pulling the right loop in the right way through the others, so the knot “magically” appears. I’ll admit I’ve never been able to master this and as a result, obviously don’t teach this way. The method I use is a bit more off-color in its description. Written down, it comes to:

  1. Take a bight of the rope
  2. Put two twists in it
  3. Take the loop, aka head, pass it between the legs of the rope
  4. Shove the head through the asshole formed between the two twists
  5. Pull tight and dress

At the end of that, you have a beautiful alpine butterfly. On Saturday night I was helping a student perfect her butterfly. She was having trouble with the 3 loops over the hand method. I showed her the asshole method. She almost instantly got it. Now, that’s NOT to say the asshole way is the better way, it’s simply the way that worked better for her.

Learning

Besides learning how to teach better, I actually learn a lot from my students. For example, one of the students who does have extensive alpine rescue experience was asking about our use of what are known as Prusik loops to tie Prusik Knots. In her training and experience she uses something similar called a VT Prusik. I had seen these before in previous training, but had not had a chance to see them in action or play with them. She did a quick demonstration and then on Monday sent me a link with more information. Needless to say, by the end I was ordering a pair so I could start to play with them myself. I can already see where I might use them in certain cases.

Another example of learning is that I’m starting to adopt a different way of tying what’s known as a Münter hitch. I’ve been tying these successfully for decades, but started noticing another method that’s fairly common and in my mind, if not more intuitive, it is at least a bit more of a visual mnemonic. I think it’ll reduce my chances of tying one poorly so I’ve started using it more and more. And this is because I saw how quickly students would pick it up.

Gelling

By Saturday night most of the students had passed their check-offs, but not in what I’d call a solid fashion. They were still at the stage where they were simply reproducing what they saw. This is common in the early stages of learning. As a result, I decided to adjust the Sunday morning schedule and spend a bit more time on simply practicing and honing their skills. What we really want at some point is for the skills to “gel” (i.e. go from a liquid state where their ability is in flux to a state where their abilities are more solid). What can be interesting about this is that for some folks this can be a fairly quick process, and in fact I noticed that by lunchtime, for a number of students, their abilities had gone from simple rote reproduction to an actual, more gelled state. After lunch we put in some more time, and with some of the students I’d simply walk up, call out a knot for them to tie, walk away, give them a minute or so, and come back to see what they had done. In most cases, they were successful. The night before, that would not have worked. They still have a long way to go before they’re as good as I or they might like, but they were now ready to go out in the field and safely put a patient over the edge.

Level 1 students pull a patient up over a cliff

Safely getting a patient over the edge

Concluding

So we have two more weekends to go before they can call themselves trained as Level 1 students, and hopefully they’ll keep learning and improving beyond that. For me, as long and tiring as the weekend was (I think I got about 5-6 hours of sleep each night, at most), it was rewarding because I got to see students learn the skills we taught AND because I got to learn stuff too. It was a great weekend and I look forward to the next two.

Caving and SQL

Longtime readers know that I spend a lot of my time talking about and teaching caving, more specifically cave rescue, and SQL Server, more specifically the operations side. While in some ways they are very different, there are areas where they overlap. In fact I wrote a book taking lessons from both, and airplane crashes to talk about IT Disaster Management.

Last week was a week where both overlapped. One of the grottoes in the NSS (think of it like a SQL User Group) sponsored a talk on Diversity and Inclusion in the caving community. The next day, SQL PASS had a virtual panel on the exact same subject.

Welcoming

Let me start by saying that one thing I appreciate about both communities is that they will welcome pretty much anyone. You show up and ask to be involved, and someone will generally point you in the right direction. In fact, several years ago, I heard an Oracle DBA mention how different the SQL community was from his Oracle experience, and how welcoming and sharing we could be.

This is true in the caving community. I recall an incident decades ago where someone from out of town called up a caving friend he found in the NSS membership manual and said, “hey, I hear you go caving every Friday, can I join you?” The answer was of course yes. I know I can go many places in this country, look up a caver, and instantly be pointed to a great restaurant, some great caves, and even possibly some crash space to sleep.

So let’s be clear, BOTH communities are very welcoming.

And I hear that a lot when the topic of diversity and inclusion comes along. “Oh we welcome anyone. They just have to ask.”

But…

Well, there are two issues there, and they’re similar in both communities. The less obvious one is that often anyone is welcome, but after that, there are barriers, some obvious, some less so. Newcomers start to hear the subtle comments, the subtle behaviors. For example, in caving, modesty is often not a big deal. After crawling out of a wet muddy hole, you may think nothing of tearing off your clothes in the parking lot and changing. Perhaps you’re standing behind a car door, but that’s about it. It’s second nature; it’s no big deal. But imagine now that you’re the only woman in that group. Sure, you were welcomed into the fold and had a blast caving, but how comfortable are you with this sudden lack of modesty? Or you’re a man, but come from a cultural or religious background where modesty is at a high premium?

In the SQL world, no one is getting naked in the datacenters (I hope). But it can be subtle things there too. “Hey dudes, you all want to go out for drinks?” Now many folks will argue, “dudes is gender neutral”. And I think in most cases it’s INTENDED to be. But turn around and ask them, “are you attracted to dudes?” and suddenly you see there is still a gender attached. There are other behaviors too. There’s the classic case of a manager who switched email signatures with one of his reports and saw how the attitudes of the customers changed, simply based on whose signature was on the email.

So yes, both groups definitely can WELCOME new folks and folks outside of the majority, but do the folks they welcome remain welcome? From talking to people who aren’t in the majority, the answer I often get is “not really.”

An Interlude

“But Greg, I know….” insert BIPOC or woman or other member of a minority. “They’re a great DBA” or “They’re a great caver! Really active in the community.” And you’re right. But you’re also seeing survivorship bias. In some cases, they did find themselves in a more welcoming space that continued to be welcoming. In some cases you’re seeing the ones who forged on anyway. But think about it: half our population is made up of women. Why aren’t 1/2 our DBAs? In fact, the number of women in IT is declining! And if you compare the number of women in high school or college who express an interest in IT with the number in their 30s, you’ll find it drops. Women are welcome, until they’re not.

In the caving community, during an online discussion where people of color were speaking up about the barriers they faced, one person, a white male, basically said, “there’s no racism in caving, we’ll welcome anyone.” A POC pointed out, “as a black man in the South, trust me, I do NOT feel safe walking through a field to a cave.” The white man continued to say, “sure, but there’s no racism in caving,” completely dismissing the other responder’s concerns.

There’s Still More…

The final point I want to make, however, is that “we welcome people” is a necessary, but not sufficient, step. Yes, I will say pretty much every caver I know will welcome anyone who shows an interest. But that’s not enough. For one thing, for many communities, simply enjoying the outdoors is not a large part of their culture. This may mean that they’re not even aware that caving is a possibility. Or that even if they are, they may not know how to reach out and find someone to take them caving.

Even if they overcome that hurdle, while caving can be done on the cheap, there is still the matter of getting some clothing, a helmet, some lights. There’s the matter of getting TO the cave.

In the SQL world, yes, anyone is welcome to a SQL Saturday, but what if they don’t have a car? Is mass transit an option? What if they are hearing impaired? (I’ve tried unsuccessfully 2 years in a row to provide an ASL interpreter for our local SQL Saturday. I’m going to keep trying.) What if they’re a single parent? During the work week they may have school and daycare options, but that may not be possible for a SQL Saturday or even an after-hours event. I even had something pointed out to me, during my talk on how to present: someone in the audience had not realized, up until I mentioned it, that I was using a laser pointer. Why? Because they were colorblind and never saw the red dot. It was something that I, a non-colorblind person, had never even considered. And now I wonder, how many other colorblind folks had the same issue but never said anything?

In Conclusion

It’s easy and honestly tempting to say, “hey, we welcome anyone” and think that’s all there is to it. The truth is, it takes a LOT more than that. If nothing else, if you’re like me, an older, cis-het white male, take the time to sit in on various diversity panels and LISTEN. If you’re invited to ask questions or participate, do so, but in a way that acknowledges your position. Try not to project your experiences on to another. Only once have I avoided a field to get to a cave, because the farmer kept his bull there. But I should not project MY lack of fear about crossing a field onto members of the community who HAVE experienced that.

Listen for barriers and work to remove them. Believe others when they mention a barrier. They may not be barriers for you, but they are for others. When you can, try to remove them BEFORE others bring them up. Don’t assume a barrier doesn’t exist because no one mentions it. Don’t say, “is it ok if I use a red laser pointer?” because you’re now putting a colorblind person on the spot and singling them out. That will discourage them. Instead, for example, find a “software” pointer (on my list of things to do) that highlights items directly on the screen. This also works great for large rooms where there may be multiple projection screens in use.

If caving, don’t just assume, “oh folks know how to find us” reach out to community groups and ask them if they’re interested and offer to help. (note I did try this this year, but never heard back and because of the impact of Covid, am waiting until next year to try again.)

Don’t take offense. Unless someone says, “hey, Greg, you know you do…”, they’re not talking about you specifically, but about an entire system. And no one is expecting you to personally fix the entire system, but simply to work to improve it where you can. It’s a team effort. That said, maybe you do get called out. I had a friend call me out on a tweet I made. She did so privately. And she did so because she knew I’d listen. I appreciated that. She recognized I was human, that I make mistakes, and that given the chance, I’ll listen and learn. How can one take offense at that? I saw it as a sign of caring.

Finally realize, none of us are perfect, but we can always strive to do better.

So, today give some thought about how you can not only claim your community, whatever it may be, is welcoming, but what efforts you can make to ensure it is.

 

On a separate note, check out my latest writing for Red-Gate, part II on Parameters in PowerShell.

Let me Try this… in Prod

A more light-hearted look at things today. There are certain phrases or ideas you hear that should give you pause. There’s the classic, “here, hold my beer and watch this.”

And of course what I heard yesterday while on a call with a client and their developer said, “well let me try this, what’s the worst that could happen?”

Just the other day, fellow #SQLFamily member David Klee tweeted:

A software vendor just told my client to restart their SQL Server after every backup. I am beyond speechless.

I read that tweet and literally sat there slack-jawed for half a minute. I swear I felt a disturbance in the Force as a million DBAs cried out in terror.

But this got me thinking of other bad advice I’ve seen over the years, such as “we reboot our SQL Server nightly because it has a memory leak and uses all the memory on our server. Oh and by the way, can you tell us why our server is so slow in the morning?” (Ok the 2nd sentence is partly made up, but I’ve had clients complain about performance issues which were due in part to them restarting their SQL Server.)

Or, “don’t index, keep everything as a heap.” Yes, I saw that someplace. I’m still not quite sure what reason they had for it.

“Oh, we had a problem with this stored procedure sometimes running really slowly, so we hardcoded WITH RECOMPILE in. Now it runs consistently.” Fortunately, by the time I arrived they had stopped this particular process; instead they just had a scheduled task that recompiled it at least once a day. This one was interesting. After determining a couple of performance issues with this sproc, including parameter sniffing and its use of SQL Server 2005 XML parsing, I developed a far better solution that eliminated the parameter sniffing and in most cases eliminated the need to parse the XML at all. The client didn’t adopt it. A year later, a DBA brought in for another project took a stab at this sproc and came up with a similar solution (though he didn’t reduce the XML parsing like I did). They didn’t adopt it. Finally, over 2 years after my initial recommendation, I was able to convince them to implement it.
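As an aside, the statement-level tools available today make a scheduled recompile unnecessary. A sketch, with a hypothetical procedure and table (not the client's actual code), of confining the recompile to the one sniffing-prone statement:

```sql
-- Hypothetical example: instead of WITH RECOMPILE on the whole procedure
-- (or a scheduled task forcing it), hint only the problem statement.
-- CREATE OR ALTER requires SQL Server 2016 SP1 or later.
CREATE OR ALTER PROCEDURE dbo.GetOrdersForCustomer
    @CustomerID int
AS
BEGIN
    SELECT o.OrderID, o.OrderDate, o.Total
    FROM dbo.Orders AS o
    WHERE o.CustomerID = @CustomerID
    OPTION (RECOMPILE);  -- or OPTION (OPTIMIZE FOR UNKNOWN) to avoid
                         -- per-execution compile cost
END;
```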

“Oh, I have covering indexes for every column and include all the other columns!” Ok, I haven’t seen it quite this bad, but I’ve seen indexing that approached this level.

“We use NOLOCK, because it’s faster.” This is a common one. Now, I’ll admit years ago on a platform we built we did use NOLOCK because it was faster, BUT we also actually understood what it could do and honestly didn’t care about inconsistent results (we were serving up classified ads, so if you saw a different set on a page refresh, it was actually a useful side effect.)

In general, I find bad advice or bad ideas fascinating. I love to understand how they came into being. In some cases, like my employer’s use of NOLOCK, it was actually a conscious choice; we understood the drawbacks and accepted them. I wouldn’t necessarily call that a bad idea once all the particulars were known. But on the face of it, it certainly looked like a bad idea, and that was on SQL 7.0 and later SQL 2000. With more modern versions of SQL Server, I would argue there are better solutions now.

In the case of my former client doing the RECOMPILE, that’s more subtle. Yes, their solution worked, but it was clearly the wrong solution because they didn’t understand what the problem was or how to fix it properly. So I’d argue this was a bad idea.

But when it comes to restarting SQL Server after a backup, I really still have no words. It’s not clear to me at all what problem the vendor thought they were solving or why this would solve it. This truly is a bad idea all around.

Fortunately, in the particular case of my client and their developer, the worst case was that we’d resend 25,000 rows of data to Salesforce to be updated. That would take 2 minutes and not break anything. He knew this and was joking, but it was funny to hear.

So my question to my readers is: what’s the worst idea or advice you’ve heard, and in retrospect, was there enough context to at least explain why someone came up with it, or was it simply so bad you’re still shaking your head? This doesn’t have to be SQL related.

P.S. – my next Redgate article should be published soon. Keep your eyes open for it!

Covid Challenges

There’s no doubt that Covid-19 has had a huge impact on our lives. Professionally for example it means a change to a virtual SQL Saturday and a Virtual PASS Summit. It means some of my fellow #SQLFamily have gone radio silent for various periods of time as they’ve dealt with this crisis in their own fashion.

I know personally there are days I just want to go outside and scream. There’s so much disruption in the world and in my life. I miss being able to travel freely, to see as many friends in person as I’d like and so much more. I mourn the loss of schooling as we know it for my kids and everyone else’s kids.

20200313_183236

Early on during the pandemic – bare shelves where toilet paper and paper towels should be

But, I’ve also been very fortunate. I’ve had a few friends who have contracted Covid-19, but all have survived (though as we’re learning, surviving may include long-term impacts such as irregular heart rhythms, changes to mental status and drug uptakes and more). I know of one former co-worker who succumbed to the disease when it was in the middle of the NYC spike. Other than that, fortunately, the only deaths I’m aware of have been friends of friends. This doesn’t make the disease any less tragic, but just a bit more remote for me.

But the above could be said of most of us, and rather than focus on the negatives, I wanted to talk about some of the changes in my life this year and how I’ve tried to rise to the challenge. Before I do so, I want to be clear that how one rises to the challenge is different for everyone, and this is not meant to be a brag sheet so much as a statement of what I see as some positive things in my life. I’d love to hear some positivity from YOUR life in my comments. We all know how bad this year has been; let’s talk about some good stuff for a change.

Backpacking

One of my goals for decades has been to hike the Appalachian Trail. No, I can’t say this was the year. But I had section hiked a portion from my Dad’s house in CT up through Dalton MA while in college, and then a year or two later from Bennington VT up through Manchester VT (I’m still grateful to the poor soul who picked up my friend and me after 2 days of sweaty exertion without clean clothes!). But this meant there was a gap between Dalton MA and Bennington VT.  Last month I was able to FINALLY put a backpack on and close part of that gap.  I’d love to post photos, but silly me left my cell phone in my car! So here’s one after I got back to my car.

Do I look tired?

After hiking 17 miles during a heat wave

I hope to get in the final 20 mile stretch in the coming weeks. This will mean that I can check Massachusetts off my list of states to hike for the AT.

Sourdough

Ok, show of hands, who here has dabbled in sourdough during this pandemic? I know I have. I kept a starter going for about 4 months before taking a break from it over the summer. I made a number of loaves of bread as well as some sourdough waffles (and I’ll admit I sacrificed the first batch more than once to the Waffle-Iron gods). I even added a bit to some homemade pizzas. Tasty stuff, and to be honest, I’ll probably start another starter come fall. I’ve always loved baking, so this was just an extension of it.

20200505_120934

Roast Beef Sandwich with homemade sourdough bread!

20200725_190323

Two sourdough pizzas with home grown herbs

More Time with Family

Of course making that much pizza and waffles means I need someone to help eat it. Fortunately we have a full house this summer (and will this fall). So an upside has been more time with the family. Among other things, this meant, especially early on during the pandemic, more family walks in the area.

20200411_145101

Family walk near the house with leaden skies

Bicycling

Of course all that delicious food I’ve been making needs to be burned off. I’ve literally biked more this year than the last few years combined. As of today, that means over 850 miles for the year. This includes a 55+ mile ride this past weekend. I’ve really been enjoying it. I’ve been a bicyclist my entire life but have missed riding this much. I’m hoping next year to upgrade my road bike (open to suggestions) to replace my 30 year old Trek 520.

20200809_134643

You mean I still have to bike to the top of THAT? 20 miles in and about 7 miles to go, but that’s where all the altitude gain is!

20200809_142741

“It looks like we made it!” – after the climb

And I’m still hoping to complete my first century ride in over 3 decades (that makes me feel old!)

Speaking

On one hand I’ve done far less speaking for SQL events than most years. I believe SQL Saturday Albany was my first SQL event this year. But I’ve been asked to present virtually at 3 NSS Grotto meetings on “So this is your first rescue.” I’ve also been selected to speak at PASS Summit for the first time, so even though it won’t be in person, I’m excited! I’ve also volunteered to speak at the Denver User Group meeting on September 17th.

Webinars

I’ve taken advantage of the fact that so much is now virtual and attended some Red-Gate Live webinars, a few SQL WIT webinars and others. One piece of advice I’ll give here: attend what you can. You no longer have to be physically present for most SQL User Group meetings; I know several #SQLFamily members who have attended 2-3 User Group presentations in the same week! It’s one advantage of everything going virtual!

Virtual Get-togethers

I, and in some cases the rest of my family, have Zoomed with my mom, my aunt and others and almost weekly, with members of my #SQLFamily. It’s been a great uplift to see so many folks.

What’s Next?

I’ll admit, it’s been a different year. We had to postpone our NCRC National Weeklong Training Seminar until 2021. But, I just got approval to host a Modular Level 1 this fall. I’m still not sure we can pull it off, but if we can, it’ll be great.

I’ve really missed seeing a lot of folks in person.

Covid still looms large in my planning of travel and events. I don’t know what the next 6-9 months will bring, but I know I’ll try to make the best of it!

What about you?

What’s something positive you’ve been able to accomplish during this pandemic? I want to hear it!

And Remember

20200809_161758

We could all use a little support!

Query Store Saves the Day

It’s never a good thing when you get an impromptu meeting invite on the weekend and the subject line is “Sync Error”. I honestly didn’t even see the invite until the meeting had been going on for over an hour.

I called in and was brought up to speed. A 3rd party tool one of my clients uses was having major timeout issues. Normally it’s fine, but my client was taking advantage of the weekend to do a very large import of data and the tool wasn’t keeping up.

I both love and hate being thrown into situations like this. I hate it because often I have very little information to go on, but also love it, because it can be a good challenge. So, I wanted to collect some data. Fortunately the database in question runs on SQL Server 2016. This blog post covers a bit of what we did and ends with why I am so grateful for Query Store.

Query Store and the First Graphs

I quickly enabled Query Store and grabbed a quick report. Based on help with the 3rd party support, I was able to focus on a particular query.
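Enabling Query Store is essentially a one-liner (the database name here is a stand-in, not the vendor's real one); I've included the capture-mode option I generally set as well:

```sql
-- Turn on Query Store for the database (available since SQL Server 2016):
ALTER DATABASE [VendorDb] SET QUERY_STORE = ON;

-- Make sure it's actively collecting, and only capture queries
-- worth tracking rather than every ad hoc statement:
ALTER DATABASE [VendorDb] SET QUERY_STORE
    (OPERATION_MODE = READ_WRITE, QUERY_CAPTURE_MODE = AUTO);
```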

Query Store first graph

Initial Query Store screen grab

So, right away, I knew that at times this query could flip to a pretty bad query plan. I was curious as to why. But while poking around, I noticed something else going on. The database was at the SQL Server 2008 compatibility level, despite running on SQL Server 2016. Now I know when we upgraded the server a year ago the 3rd party vendor didn’t guarantee compatibility with 2016, so we had left it in its old compatibility level. Since then apparently the vendor had qualified it and I confirmed with their support who was on the line that I could change the compatibility level to SQL Server 2016. Of course, I wanted to see if this would make a difference, so I grabbed another one of the problematic queries and looked at the query plan both before and after.
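The check and the change themselves are simple (again, the database name is hypothetical):

```sql
-- Confirm the current compatibility level (100 = SQL Server 2008):
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'VendorDb';

-- Bump it to SQL Server 2016. Among other things, moving off level 100
-- switches the database to the newer cardinality estimator:
ALTER DATABASE [VendorDb] SET COMPATIBILITY_LEVEL = 130;
```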

Compatibility level 100

Query Plan at SQL Server 2008 Level

Compatibility level 130

Query Plan at SQL Server 2016 Level

As you’ll note, the 2008 plan uses two hash matches while the 2016 plan uses two merge joins. That’s interesting by itself, but after collecting a bit of data, I saw the 2016 plan was running in an average of 45ms. The 2008 plan had been averaging 1434ms. That’s quite the improvement from a single change!

That said, I still wasn’t entirely comfortable with what was going on and dug a bit deeper.

Digging Deeper

The change to the compatibility level had essentially eliminated the green bar in the above graph. This was good. But the blue bar to the left of it was still an issue. It also had a similar issue with flipping between two different query plans, but this was even worse.

Query Store second graph

Better, but note that one query really stands out!

I find this particular chart to be the most useful. I set a custom time frame (in this case 3 hours) and looked at the total duration of the 25 queries that had accumulated the most time running. It’s pretty clear that one query dominates, and working on it is probably where I want to spend my efforts. It’s also very hard to pick out, but the query (#12) from the first graph that I had looked at has improved so much that it’s moved from 2nd position down to 12th on the list.  That’s quite an improvement, simply from changing the compatibility level! More on my thoughts on that below.

The more I thought about it, the more I started to focus on statistics. This was an educated guess based on the fact that my client was doing a LOT of inserts and updates into a particular table. There’s another issue I’ll also discuss. This one I couldn’t fix unfortunately, but if the 3rd party can, I think they’ll see a HUGE improvement in performance.
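A quick way to sanity-check that kind of guess is to look at how many modifications have piled up since statistics were last updated (the table name below is hypothetical; `sys.dm_db_stats_properties` exists in SQL Server 2008 R2 SP2 / 2012 SP1 and later):

```sql
-- How stale are the statistics on the hot table?
SELECT s.name                   AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.modification_counter  -- row changes since the last update
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.Documents');
```

A large modification counter relative to the row count on a recently updated stat is a strong hint that heavy insert/update activity is outrunning the auto-update threshold.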

Slow query plan

Slow version of the query

Fast version of the query

Fast version of the query

These look VERY similar, except the positions of the Key Lookups and the Index Seeks are swapped. That may not seem like much, but the slow version was taking about 93.95 ms on average and the fast version about .11 ms. That’s a HUGE difference, about 850x! It took me a bit to realize what was going on, but let’s talk about that Key Lookup. Even with the faster version, it’s obvious that if I can eliminate that, I could get things to be even faster!  The problem is that the query wants to return some columns not covered by the IX_FileID index. That’s generally easy to fix, and while I’m loath to make updates to 3rd party packages, I was willing to test this one out by making it a covering index. Unfortunately, this is where I was stymied. One of the columns is an IMAGE datatype and you can’t put those into an index. I’ve recommended to the 3rd party vendor that they try to change this. It wouldn’t be easy, but it could have dramatic performance improvements here and elsewhere (I had run into this problem last year while trying to tackle another performance issue).
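The shape of the fix I tested was roughly this (the table and column names are invented stand-ins for the vendor's actual schema):

```sql
-- Rebuild the seek index as a covering index so the Key Lookup disappears.
-- Table and column names are hypothetical.
CREATE NONCLUSTERED INDEX IX_FileID
    ON dbo.Documents (FileID)
    INCLUDE (Status, ModifiedDate)   -- the extra columns the query selects
    WITH (DROP_EXISTING = ON);

-- This is where the plan falls apart: text, ntext, and image columns
-- cannot be key OR included columns, so an IMAGE column in the select
-- list blocks the covering index entirely.
```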

I should note that even though this query is actually very fast, it is executed so often that its total time dominates the system. This is one reason why any improvement here would have a dramatic impact.

Statistics

In any case, looking at these two query plans and doing some further testing confirmed my hypothesis and also suggests why changing the compatibility level helped so much: statistics were very quickly getting out of whack.

I was able to confirm this by grabbing some data from the Query Store for just the last hour; it showed only the slow version of the query was running. I then forced an update of stats on the table in question and immediately saw the query flip over to the faster plan. This continued for a while before it flipped back to the slower version.
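The test itself was as simple as (table name hypothetical):

```sql
-- Force a full refresh of all statistics on the hot table, then watch
-- Query Store to see whether the plan flips back to the fast version:
UPDATE STATISTICS dbo.Documents WITH FULLSCAN;
```

FULLSCAN gives the most accurate histogram but is also the most expensive option, which, as it turned out, mattered a great deal here.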

We developed a plan, which I’ll admit upfront didn’t work. We decided that updating the stats on that particular table every hour might give us tremendous performance gains. And in fact it did initially. BUT, what we found was that after an hour of inserts, running the update stats for that table took about 45-60 seconds, and the vendor’s tool has a hard-coded 30 second timeout. And because of the way this particular tool works, after a failure you have to start from scratch on every run. Since the job can take 4-6 hours to run, we couldn’t update stats every hour, even though I would have liked to.

Query Store third graph

The graph that showed our plan wasn’t working

Above shows how, at the time the update stats was running (that particular column of the Query Store graphic is cut off), the query times jumped to 30 seconds.  So while updating the stats is a good thing overall, here it was definitely killing our process.

Above I mentioned that changing the compatibility level still had an impact here. What I didn’t show was that I was also looking at a bunch of statistics histograms and could see how badly out of date things had gotten in some cases. But this is an area where SQL Server 2016 makes a difference! It does more in the background to help keep statistics a bit more accurate (still not as good as a full update, but it can help dramatically). This, I believe, is a huge part of why the first query addressed above improved AND stayed improved.

Loving Query Store

They say a picture is worth a thousand words. Honestly, I probably could have figured out the above issues by running a bunch of queries and looking at some DMVs, statistics histograms and the like. But it would have taken longer. Note too that you can query the Query Store directly. But the ability to instantly look at a graph and see what’s taking the most time, or executing the most, or a variety of other parameters makes the graphical interface to Query Store EXTREMELY valuable. I was able to instantly zero in on a couple of key queries and focus my energies there. By varying the timeframes I was looking at, I could try changes and see the impact within minutes. I could also look at the stored query plans and make judgments based on what they showed.
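If you do prefer T-SQL to the GUI, the same "top queries by total duration over the last few hours" view the report gives you can be pulled from the Query Store catalog views. A rough sketch:

```sql
-- Top 25 queries by total duration over the last 3 hours.
-- avg_duration is reported in microseconds, hence the division to get ms.
SELECT TOP (25)
       q.query_id,
       SUM(rs.avg_duration * rs.count_executions) / 1000.0 AS total_duration_ms,
       SUM(rs.count_executions)                            AS executions
FROM sys.query_store_query AS q
JOIN sys.query_store_plan AS p
     ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
     ON rs.plan_id = p.plan_id
JOIN sys.query_store_runtime_stats_interval AS i
     ON i.runtime_stats_interval_id = rs.runtime_stats_interval_id
WHERE i.start_time >= DATEADD(HOUR, -3, SYSUTCDATETIME())
GROUP BY q.query_id
ORDER BY total_duration_ms DESC;
```

Joining through `sys.query_store_plan` also lets you spot queries with more than one plan, which is exactly the plan-flipping behavior described above.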

If you’re NOT using Query Store to debug performance issues, start doing it. To be honest, I haven’t used it much. I wouldn’t call myself an expert in it by any means. But, I was able to pull it up and almost instantly have insight into my client’s issues and was able to make actionable suggestions.

And to quote the product manager there after I fixed the first query simply by changing compatibility mode, “A good DBA is like having a good mechanic to work on your car.” That one made me smile.

Oh and I’ve been known to swap out the alternator on my old Subaru in under 10 minutes and have replaced the brakes a number of times. So if this DBA thing doesn’t work out, I guess I’ve got another career I can look into!

Final Note

Per my NDA, I obviously haven’t named my client. But also, simply out of respect, I haven’t named the third party tool. I don’t want folks thinking I’m trying to besmirch their name. Their product is a fine one and I’d recommend it if asked. But my client is one of their larger users and sometimes pushes it to the limits, so we sometimes find the edge cases. So nothing here is meant to disparage the 3rd party tool in any way (though they should replace that IMAGE field, since it really doesn’t need to be one!)

SQL Saturday Albany 2020

So, another SQL Saturday Albany is in the books. First, I want to thank Ed Pollack and his crew for doing a great job with a changing and challenging landscape.  While I handle the day to day and monthly operations of the Capital Area SQL Server User Group, Ed handles the planning and operations of the SQL Saturday event. While the event itself is only 1 day of the year, I suspect he has the harder job!

This year of course planning was complicated by the fact that the event had to become a virtual event. However, it’s a bit ironic we went virtual because in many ways, the Capital District of NY is probably one of the safer places in the country to have an in-person event. That said, virtual was still by far the right decision.

Lessons Learned

Since more and more SQL Saturdays will be virtual for the foreseeable future, I wanted to take the opportunity to pass on some lessons I learned and some thoughts I have about making them even more successful. Just like the #SQLFamily in general passing on knowledge about SQL Server, I wanted to pass on knowledge learned here.

For Presenters

The topic I presented on was So you want to Present: Tips and Tricks of the Trade. I think it’s important to nurture the next generation of speakers. Over the years I was given a great deal of encouragement and advice from the speakers who came before me and I feel it’s important to pass that on. Normally I give this presentation in person. One of the pieces of advice I really stress in it is to practice beforehand. I take that to heart. I knew going into this SQL Saturday that presenting this remotely would create new challenges. For example, on one slide I talk about moving around on the stage. That doesn’t really apply to virtual presentations. On the other hand, when presenting them in person, I generally don’t have to worry about a “green-screen”. (Turns out for this one I didn’t either, more on that in a moment.)

So I decided to make sure I did a remote run through of this presentation with a friend of mine. I can’t tell you how valuable that was. I found that slides I thought were fine when I practiced by myself didn’t work well when presented remotely. I found that the lack of feedback inhibited me at points (I actually do mention this in the original slide deck). With her feedback, I altered about a 1/2 dozen slides and ended up adding 3-4 more. I think this made for a much better and more cohesive presentation.

Tip #1: Practice your virtual presentation at least once with a remote audience

They don’t have to know the topic or, honestly, even have an interest in it. In fact I’d argue it might help if they don’t, since then they can focus more on the delivery and any technical issues than on the content itself. Even if you’ve given the talk 100 times in front of a live audience, doing it remotely is different enough that you need feedback.

Tip #2: Know your presentation tool

This one actually came back to bite me and I’m going to have another tip on this later. I did my practice run via Zoom, because that’s what I normally use. I’m used to the built-in Chroma Key (aka green-screen) feature and know how to turn it on and off and to play with it. It turns out that GotoWebinar handles it differently and I didn’t even think about it until I got to that part of my presentation and realized I had never turned it on, and had no idea how to! This meant that this part of my talk didn’t go as well as planned.

Tip #3: Have a friend watch the actual presentation

I actually lucked out here, both my kids got up early (well for them, considering it was a weekend) and watched me present. I’m actually glad I didn’t realize this until the very end or else I might have been more self-conscious. That said, even though I had followed Tip #1 above, they were able to give me more feedback. For example, (and this relates to Tip #2), the demo I did using Prezi was choppy and not great. In addition, my Magnify Screen example that apparently worked in Zoom, did not work in GotoWebinar! This feedback was useful. But even more so, if someone you know and trust is watching in real-time, they can give real-time feedback such as issues with bandwidth, volume levels, etc.

Tip #4: Revise your presentation

Unless your presentation was developed exclusively to be done remotely, I can almost guarantee it needs some changes to work better remotely. For example, since most folks will be watching from their computer or phone, you may NOT need to magnify the screen the way you would in a live presentation with folks sitting in the back of the room. During another speaker’s presentation, I realized they could have dialed back the magnification they had enabled in SSMS and it would still have been very readable while presenting more information.

You also can’t effectively use a laser pointer to highlight items on the slide-deck.

You might need to add a few slides to better explain a point, or even remove some since they’re no longer relevant. But in general, you can’t just lift and shift a live presentation to a remote one and have it be as good.

Tip #5: Know your physical setup

This is actually a problem I see at times with in-person presentations, but it’s even more true with virtual ones and it ties to Tip #2 above. If you have multiple screens, understand which one will be shown by the presentation tool. Most, if not all, let you select which screen or even which window is being shared. This can be very important. If you choose to share a particular program window (say PowerPoint) and then try to switch to another window (say SSMS) your audience may not see the new window. Or, and this is very common, if you run PowerPoint in presenter mode where you have the presented slides on one screen, and your thumbnails and notes on another, make sure you know which screen is being shared. I did get this right with GotoWebinar (in part because I knew to look for it) but it wasn’t obvious at first how to do this.

In addition, decide where to put your webcam! If you’re sharing your face (and I’m a fan of it; I think it makes it easier for others to connect to you as a presenter), understand which screen you’ll be looking at the most, otherwise your audience may get an awkward view of you always looking off to another screen. And, if you can, try to make “eye contact” through the camera from time to time. Also be aware, and this is an issue I’m still trying to address, that you may have glare coming off of your glasses. For example, I need to wear reading glasses at my computer, and even after adjusting the lighting in the room, it became apparent that the brightness of my screens alone was causing a glare problem. I’ll be working on this!

Also be aware of what may be in the background of your camera. You don’t want to have any embarrassing items showing up on your webcam!

For Organizers

Tip #6: Provide access to the presentation tool a week beforehand

Now, this is partly on me. I didn’t think to ask Ed if I could log into one of the GotoWebinar channels beforehand; I should have. But I’ll go a step further. A lesson I think we learned is that, as an organizer, you should make sure presenters can log in before the big day and practice with the tool. This lets them learn all the controls before they go live. For example, I didn’t realize until there were only 10 minutes left in my presentation how to see who the attendees were. At first I could only see folks who had been designated as a panelist or moderator, so I was annoyed I couldn’t see who was simply attending. Finally I realized what I thought was simply a label was in fact a tab I could click on. Had I played with the actual tool earlier in the week, I’d have known this far sooner.  So organizers, if you can, arrange time for presenters to log in days before the event.

Tip #7: Have plenty of “Operators”

Every tool may call them by a different name, but ensure that you have enough folks in each “room” or “channel” who can do things like mute/unmute people, ensure the presenter can be heard, etc. When I started my presentation, there was a hitch and there was no one around initially to unmute me. While I considered doing my presentation via interpretive dance or mime, I decided not to. Ed was able to jump in and solve the problem. I ended up losing about 10 minutes due to this glitch.

Tip #8: Train your “Operators”

This goes back to the two previous tips: make sure your operators have training before the big day. Set up an hour a week before, and have them all log in and practice how to mute or unmute presenters, how to pass control to the next operator, etc. Also, you may want to give them a script to read at the start and end of each session. “Good morning. Thank you for signing in. The presenter for this session will be John Doe and he will be talking about parameter sniffing in SQL Server. If you have a question, please enter it in the Q&A window and I will make sure the presenter is aware of it. This session is/isn’t being recorded.” At the end, a closing item like, “Thank you for attending. Please remember to join us in Room #1 at 4:45 for the raffle. Also, when this session ends, there will be a quick feedback survey. Please take the time to fill it out.”

Tip #9: If you can, have a feedback mechanism

While people often don’t fill out the written feedback forms at a SQL Saturday, when they do, they can often be valuable. Try to recreate this for virtual ones.

Tip #10: Have a speaker’s channel

I hadn’t given this much thought until later, when I was talking to fellow speaker Rie Irish and remarked how I missed the interaction with my fellow speakers. She was the one who suggested a speaker’s “channel” or “room” would be a good idea, and I have to agree. A private room where speakers can log in, chat with each other, and reach out to operators or organizers strikes me as a great idea. I’d highly suggest it.

Tip #11: Have a general open channel

Call this the “hallway” channel if you want, but try to recreate the hallway experience where folks can simply chat with each other. SQL Saturday is very much a social event, so try to leverage that! Let everyone chat together just like they would at an in-person SQL Saturday event.

For Attendees

Tip #12: Use social media

As a speaker or organizer, I love to see folks talking about my talk or event on Twitter and Facebook. Please, share the enthusiasm. Let others know what you’re doing and share your thoughts! This is actually a tip for everyone, but there’s far more attendees than organizers/speakers, so you can do the most!

Tip #13: Ask questions, provide feedback

Every platform used for remote presentations offers some sort of Q&A or feedback. Please, use this. As a virtual speaker, it’s impossible to know if my points are coming across. I want/welcome questions and feedback, both during and after. As great as my talks are, or at least I think they are, it’s impossible to tell without feedback if they’re making an impact. That said, let me apologize right now, if during my talk you tried to ask a question or give feedback, because of my lack of familiarity with the tool and not having the planned operator in the room, I may have missed it.

Tip #14: Attend!

Yes, this sounds obvious, but hey, without you, we’re just talking into a microphone! Just because we can’t be together in person doesn’t mean we should stop learning! Take advantage of this time to attend as many virtual events as you can! With so many being virtual, you can pick ones out of your timezone for example to better fit your schedule, or in different parts of the world! Being physically close is no longer a requirement!

In Closing

Again, I want to reiterate that Ed and his team did a bang-up job with our SQL Saturday and I had a blast and everyone I spoke to had a great time. But of course, doing events virtually is still a new thing and we’re learning. So this is an opportunity to take the lessons from a great event and make yours even better!

I had a really positive experience presenting virtually, look forward to my PASS Summit presentation, and am encouraged to put in for more virtual SQL Saturdays after this.

In addition, I’d love to hear what tips you might add.

A Summit To Remember

There’s been a lot of talk about the 2020 PASS Summit and the impact of making it virtual this year. I’ve even previously written about it. I’ll be clear: I would prefer an in-person Summit. But that said, I think having it virtual provides some fascinating and interesting possibilities, and I look forward to seeing how they’re handled.  It will certainly be different having the option, by default, to watch a session at a later time. And my understanding is that session schedules will no longer be constrained by the timezone the Summit is being held in.

That said, I also have to admit a certain bias here. I’ve wanted to speak at Summit for a couple of years now and have been turned down twice in the past two years. This year I was hoping again to speak, but alas, I procrastinated a bit too long and literally missed the original window to submit by a few hours.

Note I said original window. Because the Summit was moved to a virtual Summit the decision was made to re-open the call for speakers. This time I took advantage of that 2nd chance and submitted a bid.

And I’m so glad I did. Because if you didn’t have a reason to attend summit before, you do now! You get to hear me talk about PowerShell! So, I’ll admit to getting an unexpected benefit out of the move to a Virtual Summit.

I still recall one of my first attempts to use PowerShell at a client site, about 8 years ago. It did not go well. The security policy wouldn’t let me do what I wanted and the available knowledge on the Internet was sparse. Basically I wanted to loop through a list of servers and see if they had SQL Server installed. I eventually gave up on that project.

Since then though, I’ve been drawn to PowerShell and have come to love it. Now, when you hear a DBA talk about PowerShell, they will almost always mention dbatools. I want to go on record right now, I think it’s a GREAT addition, but I rarely use it. Not because there’s anything wrong with it, but mostly because my current usage is a bit different than what it provides. I do talk about it a bit here though.

For the talk I’ll be presenting, my plan is to start with a real simple PowerShell Script and slowly build on it until it’s a useful script for deploying SQL Scripts to multiple servers. For anyone who has read my articles at Red-Gate, much of this will be familiar territory, but I hope to cover in 75 minutes what I cover in 3-4 articles.

Checking this morning, I noticed that I’m among good company, and it’s humbling to see it, when it comes to speaking about PowerShell.

So, I hope you “come” and see me present on PowerShell at SQL Summit 2020. I’ll be in New York, where will you be?

“We want information…information… information!!!”

For anyone who has ever watched the classic British mini-series “The Prisoner”, this is a very recognizable line. But it applies to many parts of our lives.

This is a tale of hiking, a non-cave rescue, and yes, eventually Extended Events.

“I went to the woods…”

This past weekend I spent some time in the woods, hiking and getting away from it all. This was the first time in literally decades that I had done an overnight hike on the Appalachian Trail. My goal was to get in an overnight and work on closing one of the gaps in the sections I had not yet hiked.

The last time I hiked the trail, cell phones were a very rare item, carried by business people only and often weighing several pounds. They certainly weren’t something the average hiker could afford, and even if they could, the phones would have been too heavy to carry.

I mention this because I had fully intended to carry mine with me, so that I could take pictures and perhaps even, I’ll admit it, catch up on some Wikipedia reading if I had connectivity when I camped that night, or send a picture or two to friends and family. But alas, about 2 miles into the hike, at a gorgeous viewpoint (see older photo above), I stopped, tried to pull out my phone, and realized that the unsettled feeling I had at my car before locking it wasn’t “Am I sure I have my keys?” but should have been “Am I sure I have my phone?”

It turns out that, other than my inability to document my trip with some photos, and not being able to call my wife to let her know I’d be at the pick-up point much earlier than we had planned, not having access to information from the outside world was a refreshing change of pace. I’m almost glad I didn’t have my phone.

A Missed Call

As some of my readers know, besides being a DBA, I also teach and at times perform cave rescues. As I tell folks once they get past the “That’s so cool” phase, it’s not really all that glamorous. If I get called out to one actual rescue a year here in the Northeast, it’s a busy year. But on warm weekends in the summer, the odds are higher than, say, the middle of the week in the winter (though that has happened too).

So a concern I had about not having my phone was that I would miss a call for a potential rescue.

It turns out I was partially correct in my concern. On the way home, I saw my phone buzz. I didn’t answer it, but a few minutes later did glance down to see “Missed Call”. It was from my team co-captain. (To be transparent here, the terms team and co-captain are used loosely; it’s not a very formal setup.) She rarely calls, especially on a weekend, except in an emergency. I waited until I got home to call her back. And it wasn’t an actual call-out, yet. It was at this point a “potential missing caver.” What this meant in this case was that a vehicle had been spotted outside a popular cave, and it had been there for at least 18 hours. That is unusual for this cave; most trips are 2-3 hours in length. So, this was concerning. But we didn’t have enough information. Was someone in the cave? If so, where? Were they in need of assistance? We needed information, and by hook or by crook we were going to get it. Or at least some of it.

In general, one of the biggest issues we have when starting a cave rescue is the lack of information. In this case it was even more basic: “are they in the cave?” Had we determined they most likely were, the next question would have been, “where?” That shapes our search. “How long?” That might shape what equipment we bring on our initial search. “What injuries?” That would also shape our response. In any cave rescue we eventually get the information, but it can be frustrating to have to wait. Caves don’t have cell service inside. (We often do literally put our own phone system into caves during a rescue, however!) When we train folks, they often find it hard to believe at first that a patient could be 300 feet into a cave, and it would take a skilled, fresh caver 45 minutes simply to get to them, and another 45 minutes to get back. So as simple a request as “can you get me information about the patient” could easily take 90 minutes or more. And yes, that’s a real-life incident.

In this case, the authorities eventually ran the plates, and it appears the plates had expired before 1990, the VIN that could be found on the insurance card sitting on the dashboard was made up (or belonged to a vehicle decades older), and the address on the card was fake. We stood down. There wasn’t going to be a search that day. It was entirely a police matter.

#TeamExEvents

I said I’d get to Extended Events, and here we are. I’ve written about them before and I’m a huge fan of them. Simply put, if you’re not using them, you’re probably missing information that you could find very useful. I started in the days of SQL Server 4.21a, but really cut my teeth on SQL Server when 6.5 came out. Back then our problem sets were probably easier and smaller, but we still dealt with similar issues, the biggest often being performance related. In the early days there were some decent tricks and ways of diagnosing where your performance bottlenecks were, but to be honest, sometimes it was hit or miss. Over the years, Microsoft has added a lot of functionality to SQL Server, including DMVs and Extended Events. I now routinely use Extended Events to track down performance issues or other problems. Last night at our local user group meeting, Grant Fritchey did a lightning round where he highlighted one of the features of Extended Events that, honestly, I know about but don’t use enough: Causality Tracking.

Causality Tracking Checked

Causality Tracking extends the power of Extended Events to a new level!

Let’s just say this is a feature that makes picking out the specific events you want to follow much easier. The example Grant gave showed a ton of detail, more than you’d normally need, but it’s extremely useful if you do in fact need it. In other words, a simple checkbox can now give us a great deal of useful information.
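If you script your sessions rather than clicking through SSMS, that checkbox is simply the TRACK_CAUSALITY session option. Here’s a sketch (not Grant’s demo) of creating and starting such a session from PowerShell; the session name, file name, and server name are made up, and it assumes the SqlServer module plus ALTER ANY EVENT SESSION permission.

```powershell
# Requires the SqlServer module (Install-Module SqlServer).
Import-Module SqlServer

# T-SQL to define an Extended Events session with causality tracking on.
$xeSession = @"
CREATE EVENT SESSION [TrackSlowQueries] ON SERVER
ADD EVENT sqlserver.rpc_completed,
ADD EVENT sqlserver.sql_batch_completed
ADD TARGET package0.event_file (SET filename = N'TrackSlowQueries.xel')
WITH (TRACK_CAUSALITY = ON);  -- the checkbox in the SSMS session dialog

ALTER EVENT SESSION [TrackSlowQueries] ON SERVER STATE = START;
"@

Invoke-Sqlcmd -ServerInstance 'SQL01' -Query $xeSession   # hypothetical server
```

With causality tracking on, each event carries an activity GUID and sequence number, which is what lets you chain related events together instead of eyeballing timestamps.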

With the right information, you can often identify bottlenecks and make huge performance gains.

At times I feel like I’m Number Six, trying to get information about a database problem or a potential cave rescue:

Number Six: Where am I?

Number Two: In the village.

Six: What do you want?

Two: Information.

Six: Whose side are you on?

Two: That would be telling. We want information…information… information!!!

Six: You won’t get it!

Two: By hook or by crook, we will.

In conclusion, there are times when disconnecting from the information around us can make a weekend in the woods more enjoyable; a dearth of it is standard at the start of a cave rescue; and having ready access to it can make solving a database problem far easier.

Where do you stand on the information spectrum today? Do you have a lack of it, the right amount, or too much?