About Greg Moore

Founder and owner of Green Mountain Software, a consulting firm based in the Capital District of New York focusing on SQL Server. Lately I've been doing as much programming as DBA work, so I'm learning a lot more about C# and VB.Net than I knew a couple of years ago. When I'm not in front of a computer or with my family, I'm often out caving or teaching cave rescue skills.

Fixing a “Simple” Leak

I was reminded of the 80/20 rule yesterday as I was briefing my team on a project I needed them to work on.  Ok, actually I was telling my kids about this, but it was still for a project.

In this case, I was pointing out that 80% of the effort on a project is often spent completing only 20% of it. Researching this a bit further, there’s an actual name for this rule: the Pareto Principle. We’ve probably all come across this rule or some variation of it in our lives.

While not exactly the situation here, the point I was trying to make with them is that sometimes when fixing a simple problem, the issues and work that spin off easily take 10 times as much time and effort as the original problem.

In this specific case, I was asking them to do what was essentially an annoying and dirty job, clean up some edges to sheetrock before I finally re-enclosed a portion of the basement ceiling. They started later in the day than I would have preferred, but honestly, as this project has been sitting on hold for a few months, a couple more hours didn’t really matter. That said, I think the actual work took them longer than they thought it would.

So what prompted yesterday’s work was a small leak that I fixed over a year ago.

Over the years, we had noticed a slight leak in the downstairs bathroom. It wasn’t always apparent, but it was slowly getting worse. Essentially water was soaking into the sheetrock and walls of the finished portion of our basement.

Now, I’m a fairly handy guy; I spent several summers in high school and a bit of college working for my father in the construction trade. While he’d point out my finish work needed more work, in general, if it’s involved in residential construction, I’ve done it and I feel comfortable doing it. But besides learning how to use the tools, I also learned an important lesson: no project is ever as simple as it appears. This is actually one reason I hate starting home-improvement projects: I know it’s going to turn into a lot more work than it originally looks like. It isn’t exactly the 80/20 rule, but I’m reminded of it. So let me dive a bit into what was, and still is, involved in fixing this simple leak.

First, I had to identify where it was.  From lots of inspection, guesswork and experience, I guessed it was in the wall behind the tiles.  Ok, so that means: rip out the tiles and the plaster and lath underneath.  That’s simple enough, and honestly fun, albeit dusty.

Second, once the leak was found, replace the plumbing.  With modern PEX plumbing, that’s actually the quickest part of the work. So yes, actually FIXING the leak took maybe an hour. I will note I also took the opportunity to move the showerhead up by about 9″. One thing I hate is low showerheads!


New PEX plumbing to showerhead

But, hey, while I’m in here, I might as well run the wiring for a fan for the bathroom because it’s always needed one.  And given a weird quirk of construction in this house, the easiest way is to run it into the long wall of the bathtub/shower, then over to the outside wall and then up to the approximate location of where the fan will go.  So there’s another hour or two for a project that was started to fix a leak.

Oh and while I’m at it, let me take some photos with a tape measure in them of where the pipes are for future reference. So there’s a bit more work.

Great, the leak is fixed.

Except, obviously the shower can’t be used as is with open walls. So now it’s a matter of getting backerboard, putting that in, and sealing it.  What I used is waterproof as it is, so we left it at that. And I say we, because for much of this project both the kids were helping with it. And quite honestly, that’s where this part of the project sat for months. The shower was usable, though a bit ugly.

But, we still had the basement to deal with. That meant ripping out the damaged sheetrock and studs that had rotted. That was fun. Not! For that I actually used a full respirator, body suit, and sprayed anti-fungal stuff liberally. Some of the water damage here was actually older than the bathroom leak and was due to poor grading and then more recently, runoff from the roof of my addition (a problem that gutters finally solved).

But hey, if we’re putting in new walls, we might as well put in better insulation, try to seal the old concrete (hint: that went poorly), and oh, put in a couple of dimmable lights for a work area and some network jacks.


Basement wall in progress

Once that was all roughed in, it was time to at least sheetrock the walls.


Sheetrocked walls

And that is basically where the basement project sat until yesterday. I’ll come back to that in a minute.

As for the bathroom, there was only so long I wanted to look at the backerboard. It was time to finally tile it.  Oh, but before I could do that, I had to put that fan in. Besides finding just enough room in the outside wall between the framing for the 2nd floor and the window and other vagaries, it just fit. Of course that was just 1/2 the battle. The other 1/2 was then wiring up the switch. Oh and while I’m at it, might as well run a circuit for a GFCI outlet since the bathroom was lacking one.  Once all that was done, THEN, I could tile.


Tiled and Grouted

And yes, you may note the window does intrude into the shower space. Hey, I didn’t build the house!  What you can’t see is the replaced trim on the top edge and left edge of the tile that my daughter literally spent hours sanding and resealing. It looks great.

Oh and I still have to find replacement cones for behind the handles! So that’s another thing to do for the project.

But back to the basement. The area in front of the wall has become my son’s de facto computer space when he’s home from college. It was ‘good enough’.  But the ceiling still needed its sheetrock replaced and I needed to tape and paint the new wall.  This has waited until now.

The problem with the ceiling is that when I pulled down the old stuff, I didn’t have nice clean edges to butt the new sheetrock against.  It was ragged where it had broken, or had broken at awkward places, so I couldn’t easily put in new sheetrock.

But I also took advantage of this time to reroute all my network drops so they will be hidden in the ceiling and come out nicely to my rack.

So yesterday, the kids did the dirty work of trimming the edges, cleaning stuff up, etc. It looks great and will make my job of sheetrocking much easier.


Open basement ceiling

By the way, I should note that the board you see sticking out is what I had used in the past when I had to slide in here to do some wiring or other work. This was sort of my own private Jefferies Tube. This should now be relatively easy to sheetrock, right?

Well, except for one small detail.


Houston, we have a problem.

Yes, that is a piece of electrical cable that was run OUTSIDE the studs and essentially between the join of where two pieces of sheetrock met at an inside corner.  I absolutely HAVE to move this before I can sheetrock.

So that’s going to be a few more hours of work before I can even start to sheetrock. I have to identify which circuit this is, cut power, cut the wire, reroute it, put the ends in a junction box (which code says can’t be hidden!) and then make sure it’s safe.

After all that work, I can finally get around to sheetrocking the ceiling. Then I’ll have to mud and tape all the joints, sand, mud again, prime and then finally get the walls and ceiling painted.

But the good news is, the leak is fixed. That was the easy part!

Does this “simple” project of fixing a leak remind you of any projects at work? It does for me!

How Much We Know

Last night I had the privilege of introducing Grant Fritchey as our speaker to our local user group. He works for Redgate, which was a sponsor. The topic was 10 Steps Towards Global Data Compliance.  Between that and a discussion I had with several members during the informal food portion of our meeting, I was reminded of something that’s been on my mind for awhile.

As I’ve mentioned in the past, I’ve worked with SQL Server since the 4.21a days. In other words, I’ve worked with SQL Server for a very long time. As a result, I recall when SQL Server was just a database engine. There was a lot to it, but I think it’s safe to say that one could justifiably consider oneself an expert in it with a sufficient amount of effort. And as DBAs, our jobs were fairly simple: tune a query here, set up an index update job there, do a restore from backups once in a while. It wasn’t hard, but there was definitely enough to keep a DBA busy.

But, things have changed.  Yes, I still get called upon to tune a query now and then. Perhaps I’m making sure stats are updated instead of rerunning an index rebuild, and I still get called upon to restore a database now and then. But now my job includes so much more. Yesterday I was writing a PowerShell script for a client. This script calls an SFTP server, downloads a file, unzips it and then calls a DTSX package to load it into the database.  So now I’m expected to know enough PowerShell to get around. I need to know enough SSIS to write some simple ETL packages. And the reason I was rewriting the PowerShell script was to make it more robust and easier to deploy, so that when I build out the DR box for this client, I can more easily drop it in place and maintain it going forward.  Oh, did I mention that we’re looking at setting up an Availability Group using an asynchronous replica in a different data center? And I should mention that before we even build that out, I need to consult with the VMware team to get a couple of quick and dirty VMs set up so I can do some testing.

And that was just Monday.  Today with another client I need to check out the latest build of their application, deploy a new stored procedure, and go over testing it with their main user. Oh, and call another potential client about some possible work with them. And tomorrow, I’ll be putting the finishing touches on another PowerShell article.

So what does this have to do with last night’s meeting on Global Data Compliance? Grant made a point that in a sense Data Compliance (global or otherwise) is a business problem. But guess who will get charged with solving it, or at least portions of it?  Us DBAs.

As I started out saying, years ago it was relatively easy to be an expert in SQL Server. It was basically a single product and the lines tended to be fairly distinct and well drawn between it and other work. Today though, it’s no longer just a database engine. Microsoft correctly calls it a data platform.  Even PASS has gone from being an acronym for Professional Association of SQL Server to simply PASS.

Oh, there are still definitely experts in specific areas of the Microsoft Data Platform, but I’d say they’re probably more rare now than before.  Many of us are generalists.

I mentioned above that I’d probably be more likely to update stats than rebuild an index these days.  And while I still deal with backups, even just the addition of compression has made that less onerous, as I worry less about disk space, network speed and the like. In many ways, the more mundane tasks of SQL Server have become automated, or at least simpler, and take up less of my time. But that’s not a problem for me; I’m busier than ever.
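For what it’s worth, the kind of stats maintenance I’m talking about can be as simple as the following sketch (the table name dbo.Orders is purely hypothetical here):

```sql
-- Update statistics on one table with a full scan
-- (dbo.Orders is a hypothetical example table)
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Or sweep the whole database, updating only stats that need it
EXEC sp_updatestats;
```

Either of those is usually far cheaper than rebuilding indexes wholesale, which is part of why the mundane work takes less of my time now.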

So, long gone are the days where knowing how to install SQL Server and run a few queries is sufficient. If one wants to work in the data platform, one has to up their game. And personally, I think that’s a good thing. What do you think? How has your job changed over the past decade or more? I’d love to hear your input.

A Good Guy

I wrote previously about the dangers of calling yourself an ally. Two completely unrelated incidents in the last week reminded me of that post. Both on their own are rather small items, but I think worth considering.

The first basically happened to a friend at a recent rally in NYC to support the Jewish community. Apparently a young non-Jewish woman accosted an elderly Jewish immigrant at the march for comments he had made about the goal or purpose of the rally. Or to put it another way, a non-Jewish person was telling a Jewish person that the way he was expressing his support for Judaism was wrong. Let that sink in for a minute. Now, to be fair, as my Jewish friend commented, the young woman’s comments weren’t necessarily technically wrong, but they were out of place.

In the second incident, I replied to a comment a friend had made on Twitter. In reaction she sent me a pair of emojis that equated to, “seriously?” I was confused at first because my tweet had been intended to agree with and support her observation. However, because, as she put it, “I was one of the good guys” she wanted to explain how my reply could be perceived as a form of mansplaining. She realized I hadn’t intentionally tried to overshadow her comments or to be rude. She would have had no problem calling me out in public had that been the case. Instead, she took the time to privately explain to me why what I had done was problematic. I ended up, despite her saying it was unnecessary, removing my tweet because I was no longer comfortable with it. I realized there were better ways I could have replied.

The point of my two examples isn’t to say that the young woman was a bad person, or to self-flagellate. The point is that even as an ally, one will make mistakes. This is in part because, by not being an actual part of the group in question, one can’t fully internalize what it means to be part of that group and how comments and actions will impact its members. But one can ideally still listen and learn. My friend was under no obligation to take the time to explain to me why my tweet was problematic, and I appreciate that she did.

That said, two other quick items: I want to toss a shout out to the South Florida BI SQL Saturday. One can’t go 100% based on names as to how one identifies, but the organizers have tweeted about how they managed to have a 50/50 balance of men and women presenting. It is definitely possible to do this folks.

Finally, a shoutout for my latest Redgate article on Comments and More in PowerShell. This was a fun one to write. I hope you enjoy it.


2020 in Preview

Ok, time for the obligatory dad joke: I can’t see what’s coming in the next year, I genuinely do not have 20/20 vision!

But I suppose my vision looking back was better. So I will try to prognosticate for the coming year and set some goals. I said last year I’m not a fan of New Year’s Resolutions, but I suppose I may have to reassess that claim as this is the second year in a row I’ve gone out on a limb and set goals, and what are goals if not a form of a resolution?

  • I’m going to continue to blog at least once a week. While I hope my readers get something out of it, I also blog for my own personal reasons: it helps me keep my writing and creative juices flowing. If you had told me years ago that I would write a book and blog, I’d have laughed and not believed it. I also would have wondered what blogging was!
  • Related to that, I will continue writing for Redgate. This is a bit different from my blogging. It’s far more technical in nature, which requires more effort. Since I’ve set aside an hour a week (and in fact my calendar just reminded me it was time for that hour) I’ve found I’ve been more productive. It’s in part why I wrote 5 articles last year and got 4 published. All so far have been on PowerShell. Generally my approach has been either “here is a problem I had at a client and how I solved it with PowerShell” or, lately, a bit more of “hey, here’s a challenge, let’s see how to do it in PowerShell.” The best example of this last year was my article on using PowerShell to create a countdown timer with a GUI. It’s perhaps not the most productive way to do it, I think other languages and approaches would be easier, but it was a fun challenge and I learned a lot.
  • Extended Events! Or as Grant Fritchey would say #TeamExEvents! I’m a proud member and my goal is to learn more about them and to write more about them this year. It’s just a question of how much. But I’m a convert and a definite fan!
  • Read more blogs on a regular basis. I sporadically read Grant’s and also Monica Rathbun’s and would recommend both. I also sometimes read Cathrine Wilhelmsen’s, and she’s recently been on a tear with her guide to Azure Data Factory. I’ll admit I haven’t worked with it, but 25 posts in 25 days is an incredible feat, and she’s great and knowledgeable on the topic, so I can highly recommend it in any event. I also want to add a few non-technical blogs to the mix. We’ll see.
  • Keep speaking at SQL Saturdays. I have yet to put in for any, but I will. Perhaps I’ll be visiting a city near you!
  • Create a couple of new topics to speak on. I’ve suggested a collaboration with someone and now I have to get off my butt and put together notes and see if they’re still willing to speak with lil’ ol’ me.
  • Speak at SQL Summit. This is an ongoing goal. Someday I’ll achieve it.
  • Have a successful NCRC Weeklong Cave Rescue Seminar here in NY. I’m the site coordinator for it this year. I’ve got a great team backing me up, but as they say, the “Buck Stops Here”.  Registration is looking great, but until I hit my goals, I’ll be stressing.
  • Read more! – I received several books for the holidays, including:
    • The Power Broker, a biography of Robert Moses
    • Station Eleven, a fiction book (and if you’re the one that recommended it to me, please remind me who you are so I can thank you)
    • Headstrong, 52 Women Who Changed Science and the World

And finally some rather generic goals

  • Love more!
  • Cave more!
  • Hike more!
  • Bike more!
  • Travel!
  • Vote the bastard out!
  • Have fun!

And I’ll conclude with one more dad joke because… that’s the way I roll!

When does a joke become a dad joke?

When it becomes a-parent.

Hey, don’t blame me if you groaned. I warned you it was coming!

Have a great New Year!

2019 in Review

Last year I did a review of 2018 and then the next day I did a post of plans for 2019. I figured I would take time to look back on 2019 and see how well I did on some of my goals and then perhaps tomorrow set goals for 2020.

One of my first goals always is to make one more revolution around the Sun. I can safely say I successfully achieved that.

But what else? I vowed to blog once a week. I did miss a few this year, but pretty much succeeded on that one. But perhaps those misses were why I failed to break 2000 page views for 2019. That said, I don’t feel too bad. In 2018, I had one particular post that sort of went viral, and that alone really accounts for the higher number in 2018. So if I ignore that outlier, I did as well or better in 2019. That said, I think I’ll set a goal of 2020 page views for 2020. It’s a nice symmetry.

I’ve continued to speak at SQL Saturdays in 2019 and will do so in 2020. Still working on additional topics and may hint at one tomorrow.

But I again failed to get selected to speak at SQL Summit itself. That said, I was proud to again speak at the User Group Leadership meeting this year. My topic was on moving the needle and challenging user group leaders to bring more diversity to their selection of speakers (with a focus on more women, but that shouldn’t be the only focus).  It was mostly well received, but I could tell at least a few folks weren’t comfortable with the topic. I was ok with that.

I set a goal of at least 3 more articles for Redgate’s Simple Talk.  I’m pleased to say I not only succeeded, but exceeded that with 4 articles published. It would have been 5, but time conspired against that. That said, I should have another article coming out next month.

I never did take time to learn more about containers.

I continue to teach cave rescue.

I think I caved more.

I didn’t hike more, alas.

And there were a few personal goals I not only met, but exceeded. And one or two I failed at.

But, I definitely succeeded at my last goal, having fun. 2019 was a great year in many ways and I spent much of it surrounded with friends and family. For reasons I can’t quite put my finger on, I think I enjoyed SQL Summit this year far more than previous years. It really was like spending time with family.

I’ve been blessed with great friends and family and 2019 just reminded me of that more than ever.  Thank you to everyone who brought positive contributions to my life in 2019. I appreciate it.


‘Tis Better to Give than Receive

My family complains that I’m hard to buy gifts for, and I have to admit, I suppose they’re right. Things I want, I’m likely to buy for myself. And honestly, I’d rather give than receive.  But sometimes, it’s a two-way street:

CASSUG

This is the local Capital Area SQL Server User Group I head up. I haven’t added up the number of hours a year I spend on this, but I wouldn’t be surprised if it’s in the triple digits. And I don’t get paid; it’s all volunteer.  Now that’s not to say I don’t get something tangible out of it: I do get to attend PASS Summit every year at no cost. But that’s not the only reason I do it. I do it for #sqlfamily.  I’ve mentioned them before, but let’s just say that the help and advice I’ve received from them is amazing. It’s made me a far better DBA.  So I give a lot, but get a lot more in return. Thanks #sqlfamily.

NCRC

If I give a lot of time to CASSUG, I give even more time to the National Cave Rescue Commission. In a normal year, I will teach at least one 2-day OCR and a Weeklong. To be clear, a “week-long” for instructors typically means arriving sometime on a Thursday and working 14-15 hour days until the following Saturday. I’m planning the 2020 Weeklong, which means I will spend far more hours than usual doing work for the NCRC. I’m also a Regional Coordinator, which means meetings with my fellow coordinators as well as working with local resources.  Now I’ll admit, there’s an additional reason I do this: I figure if I ever get stuck, I want some trained folks out there.

RPI Outing Club

I still work with the RPI Outing Club, mostly on caving, because it gave me so much I want to give back. That and being around young people does make me feel younger.

Blood (and more)

This, and the holiday tomorrow, is what prompted this post. I give blood pretty much as often as I can.  It literally is the gift of life. I figure I’ve got plenty and I can make more. I’m partly inspired by a childhood friend who had a rare platelet disease and needed multiple transfusions. I was too young to give then, but I figure I’ve more than made up for it since.

I’m also a registered bone-marrow donor, and as a certain friend knows, if the time comes and I’m a match, she’s got dibs on one of my kidneys.

My Family

I’ll admit, I thought twice about putting this down. Not because I don’t love giving them things, but because I figure it’s sort of my job. But I’ll admit, I take enormous satisfaction at times at sitting back and seeing the smiles on their faces and knowing that I had a role in that. And ultimately, they’re the most important to me. And for everything I’ve given them, they’ve given back to me 10x.

What do I want?

Now, I know I’m not on the gift list of most of my readers. So I don’t expect anything, but I’ll say what I want. Be kind. Give time. Give your skills to another. To quote Whitman:

Oh me! Oh life! of the questions of these recurring,
Of the endless trains of the faithless, of cities fill’d with the foolish,
Of myself forever reproaching myself, (for who more foolish than I, and who more faithless?)
Of eyes that vainly crave the light, of the objects mean, of the struggle ever renew’d,
Of the poor results of all, of the plodding and sordid crowds I see around me,
Of the empty and useless years of the rest, with the rest me intertwined,
The question, O me! so sad, recurring—What good amid these, O me, O life?

Answer.
That you are here—that life exists and identity,
That the powerful play goes on, and you may contribute a verse.

What will your verse be in this holiday season?


Crossing the Threshold…

So it’s the usual story. You need to upgrade a machine, the IT group says, “no problem, we can virtualize it, it’ll be better! Don’t worry, yes, it’ll be fewer CPUs, but they’ll be much faster!”

So, you move forward with the upgrade. Twenty-three meetings later, 3 late nights, one OS upgrade, and two new machines forming one new cluster, you’re good. Things go live.  And then Monday happens. Monday of course is the first full day of business and just so happens to be the busiest day of the week.

Users are complaining. You look at the CPU and it’s hitting 100% routinely. Things are NOT distinctly better.

You look at the CPUs and you notice something striking:


CPU 8 is showing a problem

4 of the CPUs (several are missing in this graphic) are showing virtually no utilization while the other 8 are going like gangbusters.  Then it hits you: the way the IT group set up the virtual CPUs was not what you needed.  They set up 6 sockets with 2 cores each for a total of 12 cores. This shouldn’t be a problem, except that SQL Server Standard Edition uses the lesser of 4 sockets or 24 cores. Because your VM has 6 sockets, SQL Server refuses to use two of them.

You confirm the problem by running the following query:

SELECT scheduler_id, cpu_id, status, is_online FROM sys.dm_os_schedulers

This shows that only 8 of your 12 CPUs have a status of VISIBLE ONLINE.
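A quick way to summarize the same DMV, rather than eyeballing the row-by-row output, is to count schedulers by status (a sketch against the same sys.dm_os_schedulers view):

```sql
-- Count user schedulers by status; on the misconfigured VM this
-- showed 8 VISIBLE ONLINE and 4 VISIBLE OFFLINE schedulers
SELECT status, COUNT(*) AS scheduler_count
FROM sys.dm_os_schedulers
WHERE status IN (N'VISIBLE ONLINE', N'VISIBLE OFFLINE')
GROUP BY status;
```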

This is fortunately an easy fix.  A quick outage and your VM is reconfigured to 2 sockets with 6 cores apiece. Your CPU graphs now look like:


A better CPU distribution

This is closer to what you want to see, but of course since you’re doing your work at night, you’re not seeing a full load. But you’re happier.

Then Monday happens again.  Things are better, but you’re still not happy. The CPUs are running on average at about 80% utilization. This is definitely better than 100%. But your client’s product manager knows they’ll need more processing power in coming months and running at 80% doesn’t give you much growth potential. The product manager would rather not have to buy more licenses.

So, you go to work. And since I’m tired of writing in the 2nd person, I’ll start writing in 1st person moving forward.

There’s a lot of ways to approach a problem like this, but often when I see heavy CPU usage, I want to see what sort of wait stats I’m dealing with. It may not always give me the best answer, but I find them useful.

Here’s the results of one quick query.

Fortunately, this being a new box, it was running SQL Server 2016 with the latest service pack and CU.  This meant that I had some more useful data.
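The quick query itself was nothing fancy; something along these lines against sys.dm_os_wait_stats (a sketch, not necessarily the exact query I ran):

```sql
-- Top waits by cumulative wait time since the last restart or reset,
-- skipping a few common benign wait types
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'BROKER_TO_FLUSH', N'WAITFOR')
ORDER BY wait_time_ms DESC;
```

On this box, CXPACKET and CXCONSUMER dominated the list, which is what pointed me at parallelism. (The CXCONSUMER split from CXPACKET is one of the things a recent SP/CU gets you.)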


CXPackets and CXConsumer telling the tale

Note one of the suggestions: Changing the default Cost Threshold for Parallelism based on observed query cost for your entire workload.

Given the load I had observed, I guessed the Cost Threshold was way too low. It was in fact set to 10.  With that setting, during testing I saw a CPU graph that looked like this:


43.5% at Cost Threshold of 10

I decided to change the Cost Threshold to 100 and the graph quickly became:


25% at Cost Threshold of 100

Dropping from 43.5% to 25.6%. That’s a savings you can take to the bank!
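For reference, the change itself is just a couple of sp_configure calls (the value of 100 is what worked for this workload; your number will almost certainly differ, so test):

```sql
-- 'cost threshold for parallelism' is an advanced option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Raise the threshold (default is 5; this box had been set to 10)
EXEC sp_configure 'cost threshold for parallelism', 100;
RECONFIGURE;
```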

Of course that could have been a fluke, so I ran several 5-minute snapshots where I would set the threshold to 10, collect some data, and then set it to 100 for 5 minutes and collect data.

CXPacket_10      CXPacket_10_Waittime_MS
635533           5611743
684578           4093190
674500           4428671

CXConsumer_10    CXConsumer_10_Waittime_MS
563830           3551016
595943           2661527
588635           2853673

CXPacket_100     CXPacket_100_Waittime_MS
0                0
41               22
1159             8156

CXConsumer_100   CXConsumer_100_Waittime_MS
0                0
13               29443
847              4328

You can see that over 3 runs the difference between having a threshold of 10 versus 100 made a dramatic difference in the total time spent waiting in the 5 minute window.
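A sketch of how one such snapshot can be collected: clear the cumulative wait stats, let the workload run for the window, then read the CX* counters. (Note that DBCC SQLPERF with CLEAR resets the counters instance-wide, so only do this when that's acceptable.)

```sql
-- Reset the cumulative wait stats for a clean 5-minute window
DBCC SQLPERF (N'sys.dm_os_wait_stats', CLEAR);

-- Let the workload run for 5 minutes
WAITFOR DELAY '00:05:00';

-- Capture the parallelism-related waits for that window
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type IN (N'CXPACKET', N'CXCONSUMER');
```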

The other setting that can play a role in how parallelization impacts performance is MAXDOP. In this case, testing didn’t show any real performance differences when changing that value.
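If you do want to experiment with it, MAXDOP is changed the same way as the cost threshold (the value of 6 below is just an illustration; 0 means use all available schedulers):

```sql
-- 'max degree of parallelism' is also an advanced option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap any single query at 6 parallel schedulers (example value only)
EXEC sp_configure 'max degree of parallelism', 6;
RECONFIGURE;
```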

At the end of the day though, I call this a good day. A few hours of my consulting time saved the client $1,000s by avoiding the wrong and expensive road of adding more CPUs and SQL licenses. There’s still room for improvement, but going from a box where only 8 of the 12 CPUs were being used and were running at 100%, to a box where the average CPU usage is close to 25%, is a good start.

What’s your tuning success story?