Speaker’s Timeline Part II

This is a follow-up to my first part. But before I dig into it, I want to thank all the readers who checked in on my post last week. I had my best week ever in blogging. And I’ll admit, while my ego was pleased with the numbers, what really warmed my heart was the number of my fellow #SQLFamily members who retweeted, shared, or gave positive comments about it. Thanks.

So, back to my speaking timeline. On November 11th I’m giving my presentation on PowerShell for DBA Beginners at 2:00 PM EST. I’d be thrilled if you joined me.

So, last time I wrote about this, I had ended with what my next steps would be.

October 20th around 10:00 PM EDT

Upload the final version of my slide deck. Yes, I could probably improve upon it (and looking back now, there are one or two slides I’d like to fix, but oh well).

October 20th – 10:44 PM EDT

Confirmation email: slides were received. Excellent!

October 21st – Midday

One more run-through. I basically nail it at right around 0:58. But now I’m really worried: what happens if I finish early, before the live Q&A? Will there be 2 minutes of dead air?

October 22nd-24th

Do nothing. I deserve a break. Right? Right?

October 25th – Late Afternoon

Record my presentation with Zoom. It’s acceptable, but I made a mistake or two. Worst case, if I run out of time, I can use this, but honestly, I want a redo. Still, like a good DBA, I now have a contingency plan in place in case I don’t get time for one.

October 26th – Morning

Decide to use OBS to record, in part so I can include a window of me talking. I think it’ll be a bit more personal and interactive than simply having slides and a demo with a faceless voice talking.

October 26th – Morning 30 minutes later

What was I thinking? Why go through all this trouble? This is more work than I want to deal with today.

October 26th – Morning 45 minutes later

Ok, this just might work! I’ve figured out how to get the overlay the way I want. I gave up on green-screening myself against a background, but that’s ok because the thumbnail video is small enough that my background isn’t distracting.

October 26th – Morning 1:00:08 later

This recording is nearly perfect. I think it ran over by about 8 seconds, but if they cut that, it won’t hurt anything. Honestly, I’d like one more try, but I can’t stand the thought of listening to my own voice one more time.

October 26th – Late afternoon

Wait until the kids are done with their school Zooms and my wife has no more meetings, then start the upload.

October 26th – an hour later

Hmm, seems stuck. Do I wait or start over?

October 27th – an hour and 5 minutes later

PASS Virtual Summit 2020 – ‘Upload Video’ Upload Confirmation email arrives.

Excellent!

October 27th – an hour and 6 minutes later

Tweet about it!

Since Then

Several of my #SQLFamily members admit, some publicly, some in private, that they missed the deadlines, or at least that they feel better knowing they’re running as late as I am. I feel for them, and I’m glad my timeline and tweets made them feel better about their own.

Up until I finally submitted my video, I had put off watching any other presenters talk about PowerShell. Now that it’s in, I’ve relaxed that rule and watched at least one other introduction-to-PowerShell presentation, and I start to think, “Why didn’t I bring that up? Hmm, he’s got a good point there. Hmm, I should have covered that.” I start to have doubts about whether my presentation will hit the mark. Fortunately, upon further reflection, I realize the other presenter took a different tack than I did, and mine has a focus his doesn’t. Someone watching both will actually get useful information from each. Now I’m feeling better. In fact, I’m feeling great, because I think this is the way it should be: multiple paths to the same end point that broaden your horizons. And there’s only so much any presenter can cover in a limited amount of time.

That said, I realize that Rob Sewell is doing a full-day pre-con called Introduction to PowerShell. I’m curious what he’ll cover, and I’m both jealous he has a full day to do this and thankful I didn’t have to come up with a full day’s worth of slides and scripts! I know it will be a great one, so I highly recommend you attend. I’ve seen Rob present at least once before and it was great.

October 28th – 6:40 PM EDT

Get an email from Audrey at PASS Summit asking if I want to be part of a live Q&A panel with Rob Sewell, Hamish Watson, Brandon Leach, and Ben Miller at 8:00 AM on the 11th. I have to think about this? There are some big names on that panel and they want lil’ ol’ me?

October 28th – 6:41 PM EDT

Reply, “Hell yeah!”

October 29th – Over the course of the day

Folks at PASS realize the world is round, that we all live in different time zones, and that 8:00 AM may not be the best time for folks living Down Under. Of course, their first suggestion for a new time is even worse. Finally Hamish steps in, declares the entire world is in the Hamish Time Zone, says the original time is fine, and promises to let FutureHamish deal with the lack of sleep. Fair enough!

October 31st – Morning

My wife reminds me I’ll be out of the house at 8:00 AM on the 11th. I start to panic, but decide, “I can do it from the car with my cell phone.” So this is going to happen!

November 2nd – 2:00 PM

Tech check with Zoom and all to make sure things will work for next week. Learn a little more about how the recorded session will work. Still nervous about the “live from the car” presentation, but I do the tech check with my cell phone as the uplink and it works.

It’s getting real.

Today – November 3rd

It’s Election Day and just over a week until my presentations. I’m excited. I’ve made it clear to work that I won’t be available at all on the 11th and not much on the other days. This is going to be a summit unlike any other. I’m going to have to remind myself to actually “attend” it.

And now, finish up a few things and go vote.

I’m voting today for my kids and my friends and my family, blood or chosen. I’ll be voting for the future and for hope.

Voting and Apple Pie – Two American Traditions!

A Speaker’s Timeline

This post will be short, for reasons that are hopefully obvious by the end.

Sometime in February

Hmm, I should put together some ideas to submit to present at SQL Summit in Houston (not Dallas, as Mistress SQL pointed out to me) this year.

March 16th

An update, the call for speakers has been postponed. Darn.

March 23rd

Call for speakers is finally open!

March 30th

Submit 3 possible topics.

April 1st

Approach a fellow speaker about a possible joint session, but after discussion, decide not to go ahead with the idea.

June 3rd

Get an update, Summit will be virtual this year. Thankfully I didn’t book any tickets or hotel rooms in Dallas.

July 20th 6:49 PM EDT

Woohoo! I got the email! One of my submissions got selected to present!

July 20th 6:50 PM EDT

Crap, now I actually have to write the entire thing!

July 20th 6:51 PM EDT

Wait, and it’s going to be virtual too. That’s going to make it a bit more of a challenge to present. But I’m up to it!

Sometime in August

I really should get started. Hmm, here’s one of the scripts I want to present.

But honestly, I’m preparing to teach a bunch of cavers and medical students cave rescue, I need to concentrate on that first.

September 5th

I just biked over 100 miles. I’m certainly not working on my presentation THIS weekend.

Later in September

Ok, now I’m going to sit down and really work through this. Here’s a basic outline.

October 1st

Oh wait, it’s going to be virtual AND I have to prerecord it? How is that supposed to work? I had better read up at the speaker portal!

October 2nd

Huh, ok, that sorta makes sense: upload the slides, do a recording. But I still don’t get how it’ll work with a presentation like mine, with lots of demos. Well, I’ll figure it out.

October 6th around 11 PM EDT

Well the PowerPoint template deck they provided looks pretty slick. I should start prepping my slides.

October 6th, approximately 5 minutes later

There, got the first slide done. Of course it’s only my name and pronouns, etc. But it’s a start.

Oh and the 2nd slide is done, but that’s simply the default PASS slide talking about chapters, SQL Saturday etc, so technically I didn’t do anything there.

I’ll start working on the closing slides.

October 7th, sometime after midnight

Ok, about 5 slides done. I’ll lie to myself and say I’ve made great progress!

October 9th, approximately 10:00 PM EDT

Ok, I’ll at least start writing out the scripts I need.

October 9th, 20 minutes later

What the bloody hell? Why is this script failing? I’ve got to present this. If I can’t get this script working, how is anyone going to believe that I know PowerShell, let alone actually use it?

October 9th, 5 minutes later

Well, damn, that was an embarrassing mistake: I just had the comma in the wrong place.

October 10th around 9:00 PM EDT

Hmm, to properly demo this, I really need to run against 3-4 SQL Servers. I don’t want to spin up a bunch of VMs, and I can’t use my development server; there’s too much proprietary data there.

I know, NOW is a perfect time to start learning to use Docker! Why not? Besides, Cathrine Wilhelmsen has a great post on it. I’ll simply follow that.

2 hours and 1 reboot later

Hey, would you look at that? I’ve actually got a docker container running SQL. This is awesome!

Another minute later

But why can’t I actually connect? What network is it on? Why did I decide docker was easier? Why did I even submit this proposal? What the heck am I doing here? What is the meaning of life?

5 more minutes

That’s it, I’m going to bed.

October 11th, late night

Oh, I get it now: I didn’t set up a full separate network, it’s bridged, and that’s why it’s showing 0.0.0.0. I just need to change the port and I’m good to go!

A minute later

This is pretty awesome. Not what I’d do for a production setup, but it definitely works for my demos. Now, if I were really smart, I’d also set up persistent storage and the like, but this is good enough. And honestly, now I just set up a loop, increment a variable, and bam: I’ve got 4 instances of SQL Server running in Docker, 2 running 2017 and 2 running 2019. This is really incredible. I’m proud of myself.

Oh and even better, I’m doing all this in a PowerShell script, so I can actually make it PART of my presentation!
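
For the curious, the loop boils down to something like the sketch below. This isn’t my exact demo script: the container names, ports, and SA password are made up, and it assumes Docker is installed and that the mcr.microsoft.com/mssql/server images are the ones you want.

# Hypothetical sketch: spin up four SQL Server containers (two 2017, two 2019)
# on incrementing host ports. The password is for demo purposes only.
$saPassword = 'Dem0P@ssw0rd!'
$tags = '2017-latest', '2017-latest', '2019-latest', '2019-latest'

for ($i = 0; $i -lt $tags.Count; $i++) {
    $port = 14331 + $i        # host ports 14331-14334, each mapped to 1433 inside the container
    docker run -d --name "sqldemo$i" `
        -e 'ACCEPT_EULA=Y' `
        -e "SA_PASSWORD=$saPassword" `
        -p "${port}:1433" `
        "mcr.microsoft.com/mssql/server:$($tags[$i])"
}

# Then connect to any of them by port, e.g.:
# Invoke-Sqlcmd -ServerInstance 'localhost,14331' -Username sa -Password $saPassword -Query 'SELECT @@VERSION'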

October 12th 2:26 PM EDT

Send off an email to the Program folks at PASS asking about how the recording stuff works with demos. Eagerly awaiting a reply.

October 15th, another late night

Yes, there’s a theme here, much of my work is being done late at night. It seems to work for me. But dang that deadline is getting closer!

October 16th, late night, again

Watched some Schitt$ Creek with the family. “Why didn’t we start watching this sooner? It’s hilarious! But I need to work on my presentation some more.”

Get all the PowerShell scripts basically done. I’m happy with them, but I still need to work on my speaking script some.

October 19th 3:00 PM EDT

Get off the phone with a fellow Cave Rescue expert. Just before I get off, I mention my upcoming virtual, prerecorded session I have to finish. He says, “Oh, you know I just did 2-3 of those for a rescue conference, exact same format. It worked out really well. I can send you some details and feedback.”

I find that reassuring.

Also recheck email, still no answer from the folks at PASS on my questions about demos, etc.

October 19th, guess what time

I’ve finished everything, even updated the slides and scripts a bit more. I’m a bit worried I’m going to run too long, but decide to do the first of several practice run-throughs.

Do my first full run through. Stop and correct a few mistakes or rough edges here and there. I’m not too worried if I run over now since I know I’ve artificially added some time.

October 19th, 42 minutes later

I get done, look at the PowerPoint timer: 42 minutes. “CRAP! I need this to be 60 minutes!” I’m not too worried, I can add more, but I’m not sure where and I don’t want to simply add fluff for the sake of fluff. I need to give this some thought.

Later on October 19th

I talk to a friend of mine who, among other things, has a background in adult education. She doesn’t know SQL or PowerShell, but she’s a good sounding board, and she’s going to sit through my next run-through, not so much for the technical details as to give feedback on the flow and perhaps suggest where I may be making too many assumptions about what my listeners will know.

October 20th Early Morning

It’s a Tuesday, time to blog. As always I face that question, what should I blog about?

“I know, I’ll blog about how I’m getting my presentation together and the deadline is fast approaching. I can’t be the only speaker that often finds themselves up against the deadline and panicking.”

Next 36 hours

Add a bit more content and run through it 2-3 more times and then… RECORD! (Technically it looks like I have until the 26th to upload my recording, but I want to get done early.)

Conclusion

The above may or may not be a wholly accurate timeline or description of the process I’ve gone through trying to get my presentation ready for PASS Virtual Summit. I may have elided a few details and over-hyped a few others, but in general it’s close to true and accurate. Despite my best intentions, I often find myself working right up against submission deadlines. Since Summit wants NEW presentations, I can’t simply dust off one of my previous presentations and use that, so there’s definitely more work involved here.

And honestly, up until I learned it was going to be prerecorded, I thought I’d have most of October to work on it. The deadline to get the slides and recordings submitted threw my original timeline in the dumpster, so I’m actually a bit further behind than I expected to be.

On the other hand, I really did learn to use Docker, which I think is valuable, and I am making that part of my presentation. And when all is said and done, I think I’ll be happy with it. Like any good speaker, though, I’ll look back and think, “Well, next time I’ll have to improve this or that.” There’s always room for improvement. I’m not keen on giving it prerecorded; I value the instantaneous feedback I get from the audience, so that will be different. But I can at least take questions during the presentation, and there’s a live Q&A afterwards. I’ll still be nervous, though.

I’m in awe of speakers who get their presentations prepped and polished months in advance, but I suspect there are a number out there like me who don’t operate that way. And I suspect there are a few who are even more nervous than I am, thinking, “OMG, am I the only one in this spot?” Nope, you’re not. Or rather, “Please let me know I’m not the only one!”

See you all at Summit, at least virtually!

And in the meantime there’s another possible deadline coming up I need to think about…

A Summit To Remember

There’s been a lot of talk about the 2020 PASS Summit and the impact of making it virtual this year. I’ve even written about it previously. I’ll be clear: I would prefer an in-person summit. That said, I think having it virtual does provide some fascinating possibilities, and I look forward to seeing how they’re handled. It will certainly be different being able to watch a session at a later time as a default option. And my understanding is that session schedules will no longer be constrained by the time zone the Summit is being held in.

That said, I also have to admit a certain bias here. I’ve wanted to speak at Summit for a couple of years now and have been turned down twice in the past two years. This year I was hoping again to speak, but alas, I procrastinated a bit too long and literally missed the original window to submit by a few hours.

Note I said original window. Because the Summit was moved to a virtual format, the decision was made to re-open the call for speakers. This time I took advantage of that second chance and submitted.

And I’m so glad I did. Because if you didn’t have a reason to attend summit before, you do now! You get to hear me talk about PowerShell! So, I’ll admit to getting an unexpected benefit out of the move to a Virtual Summit.

I still recall one of my first attempts to use PowerShell at a client site, about 8 years ago. It did not go well. The security policy wouldn’t let me do what I wanted and the available knowledge on the Internet was sparse. Basically I wanted to loop through a list of servers and see if they had SQL Server installed. I eventually gave up on that project.
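
These days that original task would only take a few lines. Something like the sketch below, with hypothetical server names and assuming Windows PowerShell with remote access to the boxes, is roughly what I was after back then:

# Look for SQL Server services on each server in a list.
$servers = 'Server1', 'Server2', 'Server3'   # hypothetical names

foreach ($server in $servers) {
    $sqlServices = Get-Service -ComputerName $server -Name 'MSSQL*' -ErrorAction SilentlyContinue
    if ($sqlServices) {
        Write-Host "$server : SQL Server appears to be installed ($($sqlServices.Name -join ', '))"
    }
    else {
        Write-Host "$server : no SQL Server services found (or the server was unreachable)"
    }
}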

Since then, though, I’ve been drawn to PowerShell and have come to love it. Now, when you hear a DBA talk about PowerShell, they will almost always mention dbatools. I want to go on record right now: I think it’s a GREAT addition, but I rarely use it. Not because there’s anything wrong with it, but mostly because my current usage is a bit different from what it provides. I do talk about it a bit here though.

For the talk I’ll be presenting, my plan is to start with a really simple PowerShell script and slowly build on it until it’s a useful script for deploying SQL scripts to multiple servers. For anyone who has read my articles at Redgate, much of this will be familiar territory, but I hope to cover in 75 minutes what I cover in 3-4 articles.
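
To give a rough sense of that starting point (this isn’t the actual demo script; the server names and path are invented), the “really simple” version is little more than a loop around Invoke-Sqlcmd:

# Run a single T-SQL script against a list of servers.
$servers    = 'DEV01', 'QA01', 'QA02'        # hypothetical instance names
$scriptPath = 'C:\Deploy\CreateSproc.sql'    # hypothetical script

foreach ($server in $servers) {
    Write-Host "Deploying $scriptPath to $server"
    Invoke-Sqlcmd -ServerInstance $server -InputFile $scriptPath
}

The talk then builds on that core loop step by step.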

Checking this morning, I noticed that I’m in good company when it comes to speaking about PowerShell, and it’s humbling to see.

So, I hope you “come” and see me present on PowerShell at SQL Summit 2020. I’ll be in New York, where will you be?

Yesterday was “A Monday”

Yesterday was a Monday. I don’t just mean it was Monday; it was, in the Garfield-comic sense of things, A Monday.

As a consultant, I’ve come to expect certain patterns in my workload. For one client, I know that approximately every 2 months, over 2 weekends, I’m going to have to patch their SQL Servers. I know certain passwords will need to be updated quarterly or annually. And I know sometimes I’ll have A Monday.

Yesterday was one of those. I woke up, checked my email, and noticed two jobs had not run. So I logged in, and it appeared that the PowerShell script on each server had hung. I killed it and tried to rerun it, but got an error. This wasn’t entirely surprising: the first part of this script downloads a file from a 3rd-party vendor, and last week, for example, their SFTP server had been down. At first I expected this to be the problem again, but further testing showed I was getting inconsistent errors. Finally the script ran. But what normally took about 20 minutes to download took about 2 hours. We learned later the vendor had done an upgrade to their product over the weekend. This shouldn’t have impacted their SFTP server performance, but here we were. Today (Tuesday) the process took 20 minutes again and is back to normal. Chalk yesterday’s issue up to being A Monday.

Then I took a look at another job that had failed. This one is purely internal: basically, SFTP a file from a Linux server to a NAS for a backup. A quick check showed that the NAS share was inaccessible. Reporting this triggered an avalanche of emails back and forth. The most interesting line basically came down to “Yes, the internal IT team did a migration of the NAS, but the migration was supposed to be completely transparent to the users.” Famous last words in my book. Honestly, what I found more disturbing was that the failure was on the new NAS device, apparently due to a typo. To me, this means the old shares were most likely recreated on the new device by hand, rather than with a script that read out the old shares and recreated them. In any event, the problem was solved, the job was rerun, and the backup was created on the new NAS. Chalk that one up to being A Monday.
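
For what it’s worth, the kind of script I have in mind isn’t complicated. A rough sketch (the device names are made up, and it assumes the SmbShare cmdlets and admin rights on both boxes) might look like this:

# Read the shares off the old NAS and recreate them, by script, on the new one.
$oldNas = 'OLDNAS01'   # hypothetical
$newNas = 'NEWNAS01'   # hypothetical

$shares = Get-SmbShare -CimSession $oldNas |
    Where-Object { $_.Name -notmatch '\$$' }   # skip admin shares like C$ and ADMIN$

foreach ($share in $shares) {
    # Assumes the underlying folders already exist on the new device.
    New-SmbShare -CimSession $newNas -Name $share.Name -Path $share.Path
    Write-Host "Recreated share $($share.Name) -> $($share.Path) on $newNas"
}

No hand-typing of share names, and no typos. Which was rather the point.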

Then one of the developers for one of the platforms at this client emailed me and said, “Hey, database FOO is in recovery mode, what happened?” Fortunately, I knew exactly what the problem was. Unfortunately, I knew it was my fault. We had decided to reconfigure that database as a log-shipped copy of the main database, and I had set it up over the weekend. I had simply forgotten to configure it to place itself in Standby/Read-only mode after applying the most recent logs. I’ll chalk that one up to it being A Monday.
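
To illustrate the difference (this is just a sketch, not the client’s actual log shipping configuration; the server, database, and file paths are invented): restoring logs WITH NORECOVERY leaves the database showing “In Recovery,” while WITH STANDBY makes it readable between restores.

$query = @"
RESTORE LOG [FOO]
FROM DISK = N'\\backupshare\FOO\FOO_20201101.trn'
WITH STANDBY = N'D:\SQLData\FOO_undo.tuf';   -- readable between log restores
-- WITH NORECOVERY;                          -- what I had effectively left it doing
"@

Invoke-Sqlcmd -ServerInstance 'SECONDARY01' -Query $query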

All of the above was taken care of before 10:00 AM. The rest of the day was filled with a variety of other issues and items, including looking at a Hyper-V host machine with 16 physical CPUs, hyperthreading turned on, hosting 4 VMs: 1 with 4 vCPUs allocated and the other 3 with 8 each. They’re having performance issues. I’m still tackling that one. Looking at it happened on Monday, but it’s not A Monday issue; it’s been ongoing for months.

So what was it about this particular Monday, or Mondays in general?

Well, in this case, all 3 of my early-AM issues had one thing in common: upgrades or changes made over the weekend. I’m not going to debate the value or wisdom of the timing here, but just note that on this particular Monday it wasn’t just one issue, but three. It was definitely A Monday. But I survived, as did my customer.

Now back to my regularly scheduled workload.


Checking the Setup

A quick post outside of my usual posting schedule.

I was rewriting a T-SQL sproc I have that runs nightly to restore a database from one server to another. It had been failing for reasons beyond the scope of this article. But one of the issues we had was that we didn’t know it was failing. The error checking was not as good as I would have liked, so I decided to add a step that would email me on an error.

That’s easy enough to do. In this case I wanted to use the stored procedure sp_notify_operator. This is useful since I don’t have to worry about passing in an email address, or changing it if I need to update things; I can just update the operator. However, the various servers at this client had been installed over a period of several years, and I wasn’t sure that all of them had the same operator configured. I was also curious where the operators’ emails went on each machine. And I had a decent number of machines to check.
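
The call itself is trivial. Roughly this, wrapped into the sproc’s error handling (the operator, profile, and server names here are made up):

$query = @"
EXEC msdb.dbo.sp_notify_operator
    @profile_name = N'Standard Mail Profile',
    @name         = N'DBA Team',
    @subject      = N'Nightly restore failed',
    @body         = N'The nightly restore job failed; check the job history for details.';
"@

Invoke-Sqlcmd -ServerInstance 'SOMESERVER' -Query $query

If the person who should get the email ever changes, I update the operator, and nothing in the sproc needs to change.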

Fortunately, due to previous work (you can read more here), I have a JSON file on my box, so I can quickly loop through a list of servers (or, if need be, just the servers in a particular environment like DEV or QA).

# Load the list of servers from the JSON inventory file.
$serverobjlist = Get-Content -Raw -Path "$env:HomeDrive$env:HomePath\documents\WindowsPowerShell\Scripts\SQLServerObjectlist.json" | ConvertFrom-Json

foreach ($computername in $serverobjlist.computername)
{
    # Which operators exist on this server, and where do their emails go?
    $results = Invoke-Sqlcmd -ServerInstance $computername -Query "select name, email_address from msdb.dbo.sysoperators"
    Write-Host $computername $results.name $results.email_address

    # Which Database Mail profiles exist on this server?
    $results = Invoke-Sqlcmd -ServerInstance $computername -Query "select name from msdb.dbo.sysmail_profile"
    Write-Host $computername $results.name `n
}

This gave me a list of which operators were on which servers and where the emails went. Now, if this were a production script, I’d probably have made things neater, but it worked well enough for what I needed. Sure enough, one of the servers (ironically, one of the more recently installed ones) was missing the standard mail profile we set up. That was easy to fix because, of course, I have that scripted out. Open the T-SQL script on that server, run it, and all my servers now had the standard mail profile.

Once I had confirmed my new restore script could run on any of the servers and correctly send email if there was an error, it was time to roll it out.

Successful deploy to the UAT environment

So: one quick PowerShell script, an updated T-SQL script, and a PowerShell deploy script, and my new sproc has been deployed to UAT and the other environments.

And best of all, because it was logged, I knew exactly when I had done it and on what servers and that everything was consistent.
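
The deploy script itself is nothing fancy. Its general shape (not the actual script; the environment, paths, and server names are invented) is a loop plus a log entry:

# Deploy one T-SQL script to each server in an environment and log what went where, and when.
$environment = 'UAT'                          # hypothetical
$scriptPath  = 'C:\Deploy\RestoreSproc.sql'   # hypothetical
$logPath     = 'C:\Deploy\DeployLog.csv'
$servers     = 'UATSQL01', 'UATSQL02'

foreach ($server in $servers) {
    Invoke-Sqlcmd -ServerInstance $server -InputFile $scriptPath

    # Append a row to the deployment log.
    [pscustomobject]@{
        DeployedAt  = Get-Date
        Environment = $environment
        Server      = $server
        Script      = Split-Path $scriptPath -Leaf
    } | Export-Csv -Path $logPath -Append -NoTypeInformation
}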

I call that a win for a Monday. How is your week starting?


Getting My Hands Dirty…

… and my clothes cleaned. Or more importantly, dried.

Before I was a programmer I worked for my dad in construction over the summers of high school. It was good solid work. I enjoy working with my hands at times. For one thing, you see and feel the results of your accomplishments in a very tangible manner. For another, you generally can measure the impact of your effort.

After my dad died, I wrote about using a drill of his to work on the addition that was to become my office. I liked the heft and feel of it. I knew I was accomplishing something with it. Being a programmer, sometimes it’s hard to experience that. Currently, for example, I’m working on an ETL script using PowerShell to SFTP down a file, extract it to some tables, and then feed it into Salesforce. For me, it’s just a bunch of data. Yeah, there are some fun challenges: learning how to set up and deal with GPG, and designing a robust and secure process, since some of the data is sensitive. But once the project is finished, it’ll run silently, and other than an occasional email, I won’t think much about it. It won’t impact my day-to-day life in any way, and I won’t be able to point to it and say, “See, THERE is something I did.”

But my dryer, on the other hand, now that’s different. For a while now (say, at least a year or two), whenever I’ve run a load it’s made a fearsome rumbling sound. It’s been annoying, but we managed to live with it up until 6 or 7 weeks ago. Generally I’d do most of the laundry on Sundays and, if there was a load or two left, on Monday while everyone else was out of the house or at school. But obviously things changed. My wife’s office is in the room next to the laundry room. Whereas for me the rumbling was faint and simply background noise, for her it was quite noticeable. I tried to work the loads around her work schedule, especially since she’s on so many conference calls for her job, but it was getting less and less practical.

It was finally time for me to do something about it. Now, had I been smart, I’d have started the project on a Monday. But, I’m not always that smart. So, Saturday came around and I disassembled the dryer.

I was fairly confident I knew what the problem was. I assumed that either something had wrapped around one of the rollers for the drum, or a bearing in a roller had seized. If it was the former, the fix would be trivial and I’d have the whole thing back together before dinner. If it was the latter, I figured a shot of silicone or other lubricant and I could at least get a few more weeks out of it while I ordered the parts. And since the tight screws were now loosened and I knew how to take it apart, the final fix would go quickly.

Well, as they say, you know what happens when one assumes. I was wrong about the first guess; it was not something as simple as something wrapped around the roller. And I was even more wrong about it being a seized or flattened bearing. See, for that assumption to be valid, the bearing assembly inside the wheel has to actually exist.

Bearing Assembly? What Bearing Assembly?

It’s a bit hard to make out, but inside the blue part of the wheel above, and behind the plastic triangle, there is supposed to be a nice little bearing assembly.  There is none.

Better view of the roller

You can see the wear on the inner hub.  This is what in the trade is called “less than optimal.”

More seriously though, it unfortunately meant that this was not going to be a quick fix. I had been planning on ordering the parts, but this made it a bit more of a rush. The dryer contains four of these rollers, so I ordered a four-pack, since my general assumption on items like this is that if one has worn, all four have worn. Now, none of the other three showed nearly the same damage, but I figure, since I’m in there, I might as well make it right.

What’s most interesting to me is that there’s literally NO sign of the bearing assembly left in the dryer. However it got destroyed, it was pretty cataclysmic.

I also took the time to clean out the rest of the interior space and correctly deduced that the moisture sensor was covered with lint. Now that I know where it is, I can keep that clean in the future.

In any case, sometime later this week, I’ll get my package, swap out the rollers and reassemble the dryer and start doing laundry again. Quietly.

But, unlike the ETL I’m writing above, this change will have a direct noticeable impact on my life I’ll be aware of every time I do a load of clothes. I like that.

This week’s takeaway? I do enjoy my job and the challenges that come with it, but there’s something to be said for doing work you can touch and feel and experience the tangible impact.

My best sourdough yet!

And perhaps I shouldn’t be posting pictures of homemade bread after talking about dirty hands. Don’t worry, I washed my hands!

The Year So Far

Today happens to be the last day of the month and the last day of the quarter. And according to my calendar, it’s the 4th Blursberyday of the month of Holiecouw.

I decided to take a look back at my first post of the year: 2020 in Preview. Wow, a lot has changed in a scant three months. I mentioned I was reading Station Eleven, which is set in a post-apocalyptic world after a worldwide flu pandemic. Little did I know at the time that I’d be living a version of that reality a scant 3 months later. Ok, this is not nearly as bad as in the book, but it does give one pause. We are living in a time of upheaval, and it will be interesting to see how this pandemic changes social structures in the coming years.

I wanted to speak at SQL Saturdays. Well, almost every one I’ve put in for, or was planning on putting in for, has been cancelled or delayed. So much for that goal. On the other hand, members of the #SQLFamily have been holding Friday afternoon (and other) Zoom hangouts as a sort of morale boost, so I’ve actually gotten to know a number of my fellow DBAs and speakers better. That’s a silver lining.

Fortunately, I’m still working. As a consultant, you realize every meal may be your last meal, so you keep working at it and hoping more meals are coming your way. So far my biggest client shows no sign of slowing down, nor does my second largest client. I’ve been fortunate, I know a number of folks across many industries who have been hit with a temporary or even permanent job loss. This is going to be hard for many.

But I’ve also been taking the time to do more webinars. Last week I sat in on a Redgate webinar on the state of DevOps. The next day, Kendra Little (also of Redgate) gave the WIT webinar and also talked about DevOps. Both were quite informative and I learned a lot. I look forward to the upcoming Redgate Streamed event.

I’ve been using git more and more. I started using it integrated with Visual Studio about two years ago, I think. But after seeing my son using it at the command line on a project, I decided it was time to start doing that too, and now, for one client, that’s my de facto way of checking in and out the changes I’ve been making to the PowerShell scripts I write for them. Next up: more version control for the SQL scripts. I’ve already written a small deploy script I use to deploy scripts and changes and, more importantly, to log them. So while that client hasn’t really adopted DevOps, I’m doing my part for my small corner of the work.

My next goal is probably to start learning how to use Docker more. Cathrine Wilhelmsen’s blog post on it has convinced me it’s time.

And I finally finished binge-watching Haven.

So, the last few weeks haven’t been exactly what I planned for, and the upcoming months won’t be what I planned on either, but it hasn’t been a terrible time. What about you?

P.S. While out biking the other day, a thought dawned on me. Many post-apocalyptic books (such as Station Eleven) have characters using cars, but more like carts, either pulling them themselves or with horses because once the gas runs out, you can’t make more. But I got wondering how having a large number of electric vehicles would play out in such a world. Yes, much of the infrastructure would be gone, but even if you had to carry panels with you (much like Mark Watney in The Martian) you could probably be far more mobile. Hmm…

It’s Easier the Other Way!

This is what someone said to me while biking up a hill the other day.

I have a certain bike route I do that includes stopping at the supermarket along the way. It’s just over 5 miles. Obviously over the course of the entire route I return to the same place I started, but the topography varies along the way. But the simple fact of the matter is that the supermarket is at a lower altitude than the house, and about 2/3rds of the way along the route in the direction I go.

So when I tell people about biking this route, I point out that it’s sort of a double whammy. When I get to the low point in the route, I go shopping. To finish my loop, my bike is often heavier with groceries, and I have all the altitude to gain back (which admittedly isn’t much, a bit over 100′) in a short distance (maybe 2000′). It was along this section that my erstwhile adviser offered his advice. My reply, of course, was “True, but home is that way!”, said while pointing up the hill.

Unfortunately, sometimes one can’t do it the easier way. One has to do it the hard way.

But this leads into something else I wrote on Quora.com: Do Bad Programmers Know they write bad code? I only partly addressed the question, but to sum it up: I think the worst don’t, while the best programmers do know, and sometimes write it that way intentionally.

The truth is, we probably should always be writing the best code we can. It should trap and handle errors. It should validate its inputs. It should fail gracefully.
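
As a small, hypothetical illustration of the difference (the parameter and query here are invented), the “best code we can” version of even a trivial server-check script validates its input, traps errors, and fails gracefully instead of dying mid-loop:

param (
    [Parameter(Mandatory)]      # validate input: the caller must supply at least one server
    [string[]]$ServerInstance
)

foreach ($server in $ServerInstance) {
    try {
        $result = Invoke-Sqlcmd -ServerInstance $server -Query 'SELECT @@SERVERNAME AS Name' -ErrorAction Stop
        Write-Host "Connected to $($result.Name)"
    }
    catch {
        # Fail gracefully: report the problem and move on to the next server.
        Write-Warning "Could not reach $server : $($_.Exception.Message)"
    }
}

The quick-and-dirty version is the same loop with the param block and the try/catch stripped out.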

But often, we need a one-off script. Something that gets the job done here and now.

I recently did a 5-minute lightning round for my SQL user group on the benefits of using PowerShell. I took two quick and dirty scripts I had written and rewrote them a bit for the presentation. Afterwards, one of the attendees asked me a few questions about my stylistic choices in the code. He was right in general, but I pointed out what my goal was: to show what PowerShell could do rather than how to write good PowerShell code. That said, I probably should have written slightly better code, but it got the job done. It definitely didn’t need error handling and the like. It was good enough.

And ironically, this post is sort of like that. (I love it when I can get meta on my own posts.) I have about a dozen drafts saved in WordPress. Most have just a title and a quick set of notes on what I think I should write about. This post was mostly written and just needed a bit more to flesh it out. It was easier this way than trying to come up with a new topic for the week. Hope you enjoyed it. (And to keep things even easier, I’m going to let WordPress use a random photo for it!)

How Much We Know

Last night I had the privilege of introducing Grant Fritchey as the speaker at our local user group. He works for Redgate, which was a sponsor. The topic was 10 Steps Towards Global Data Compliance. Between that and a discussion I had with several members during the informal food portion of our meeting, I was reminded of something that’s been on my mind for a while.

As I’ve mentioned in the past, I’ve worked with SQL Server since the 4.21a days. In other words, I’ve worked with SQL Server for a very long time. As a result, I recall when SQL Server was just a database engine. There was a lot to it, but I think it’s safe to say that one could justifiably consider themselves an expert in it with a sufficient amount of effort. And as DBAs, our jobs were fairly simple: tune a query here, set up an index update job there, do a restore from backups once in a while. It wasn’t hard, but there was definitely enough to keep a DBA busy.

But things have changed. Yes, I still get called upon to tune a query now and then. Perhaps I’m making sure stats are updated instead of rerunning an index rebuild, and I still get called upon to restore a database now and then. But now my job includes so much more. Yesterday I was writing a PowerShell script for a client. This script calls an SFTP server, downloads a file, unzips it, and then calls a DTSX package to load it into the database. So now I’m expected to know enough PowerShell to get around. I need to know enough SSIS to write some simple ETL packages. And the reason I was rewriting the PowerShell script was to make it more robust and easier to deploy, so that when I build out the DR box for this client, I can more easily drop it in place and maintain it going forward. Oh, did I mention that we’re looking at setting up an Availability Group using an asynchronous replica in a different data center? And I should mention that before we even build that out, I need to consult with the VMware team to get a couple of quick and dirty VMs set up so I can do some testing.
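
For the curious, the bones of that script look something like the sketch below. The paths and names are invented, and I’ve left the SFTP step as a placeholder since it depends on the client’s tooling (WinSCP, Posh-SSH, or similar):

$downloadDir = 'D:\Feeds\VendorX'                         # hypothetical paths
$zipFile     = Join-Path $downloadDir 'daily_extract.zip'
$packagePath = 'D:\SSIS\LoadVendorX.dtsx'

# 1. SFTP download of $zipFile happens here (tool-specific, so omitted).

# 2. Unzip the downloaded file.
Expand-Archive -Path $zipFile -DestinationPath $downloadDir -Force

# 3. Run the SSIS package that loads the extracted file into the database
#    (assumes dtexec.exe is on the PATH).
& dtexec /File $packagePath
if ($LASTEXITCODE -ne 0) {
    throw "dtexec returned $LASTEXITCODE for $packagePath"
}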

And that was just Monday. Today, with another client, I need to check out the latest build of their application, deploy a new stored procedure, and go over testing it with their main user. Oh, and call another potential client about some possible work with them. And tomorrow, I’ll be putting the finishing touches on another PowerShell article.

So what does this have to do with last night’s meeting on Global Data Compliance? Grant made a point that in a sense Data Compliance (global or otherwise) is a business problem. But guess who will get charged with solving it, or at least portions of it?  Us DBAs.

As I started out saying, years ago it was relatively easy to be an expert in SQL Server. It was basically a single product, and the lines between it and other work tended to be fairly distinct and well drawn. Today, though, it’s no longer just a database engine; Microsoft correctly calls it a data platform. Even PASS has gone from being an acronym for the Professional Association for SQL Server to simply PASS.

Oh, there are still definitely experts in specific areas of the Microsoft Data Platform, but I’d say they’re probably rarer now than before. Many of us are generalists.

I mentioned above, too, that these days I’d be more likely to update stats than rebuild an index. And while I still deal with backups, even just the addition of backup compression has made that less onerous, as I worry less about disk space, network speed, and the like. In many ways, the more mundane tasks of SQL Server have become automated, or at least simpler, and take up less of my time. But that’s not a problem for me; I’m busier than ever.

So, long gone are the days when knowing how to install SQL Server and run a few queries was sufficient. If one wants to work in the data platform, one has to up one’s game. And personally, I think that’s a good thing. What do you think? How has your job changed over the past decade or more? I’d love to hear your input.

A Good Guy

I wrote previously about the dangers of calling yourself an ally. Two completely unrelated incidents in the last week reminded me of that post. Both on their own are rather small items, but I think worth considering.

The first basically happened to a friend at a recent rally in NYC to support the Jewish community. Apparently a young non-Jewish woman accosted an elderly Jewish immigrant at the march for comments he had made about the goal or purpose of the rally. Or to put it another way, a non-Jewish person was telling a Jewish person that the way he was expressing his support for Judaism was wrong. Let that sink in for a minute. Now, to be fair, as my Jewish friend commented, the young woman’s comments weren’t necessarily technically wrong, but they were out of place.

In the second incident, I replied to a comment a friend had made on Twitter. In reaction she sent me a pair of emojis that equated to, “seriously?” I was confused at first because my tweet had been intended to agree with and support her observation. However, because, as she put it, I was “one of the good guys,” she wanted to explain how my reply could be perceived as a form of mansplaining. She realized I hadn’t intentionally tried to overshadow her comments or to be rude; she would have had no problem calling me out in public had that been the case. Instead, she took the time to privately explain to me why what I had done was problematic. I ended up, despite her saying it was unnecessary, removing my tweet because I was no longer comfortable with it. I realized there were better ways I could have replied.

The point of my two examples isn’t to say that the young woman was a bad person, or to flagellate myself. The point is that even as an ally, one will make mistakes. This is in part because, by not actually being part of the group in question, one can’t fully internalize what it means to be part of that group and how comments and actions will impact its members. But one can ideally still listen and learn. I appreciate that my friend took the time to explain why my tweet was problematic. She was under no obligation to do so, and I appreciate that she did.

That said, two other quick items. First, I want to toss a shout-out to the South Florida BI SQL Saturday. One can’t go 100% by names as to how people identify, but the organizers have tweeted about how they managed to have a 50/50 balance of men and women presenting. It is definitely possible to do this, folks.

Finally, a shoutout for my latest Redgate article on Comments and More in PowerShell. This was a fun one to write. I hope you enjoy it.