Challenge Accepted!

Monica Rathbun in a recent blog post commented on how hard it is to write a blog post in under 5 minutes and challenged her readers to try to do it.

The only thing I can say is… challenge accepted.

But what to write about?

How about how I write, or rather how an idea turns into a blog post?

I have to admit, some Tuesdays my mind is blank. I sit at the screen, sometimes for 5 minutes or longer, and draw a blank. That's rare, though. More often, sometime in the previous 6 days or so an idea gets into my head and I start to think about what I should write on it. It might be a particular issue at work I had to solve, which pushes me toward a more technical SQL or PowerShell post. Or it might be something I've seen that amused me. Either way, I mull the thoughts over in my head and often have a basic outline before I put fingers to keyboard. That can help me cut down on the time I spend blogging.

I've also got about a dozen drafts saved in WordPress where I've jotted down a few lines of an idea for a future post. These are my saving graces. When I really can't think of an idea I'll go back, pull one of those up, and finish it, such as this one, which lay in draft status for months.

So, looking back, I think I failed. This one took just over 5 minutes. And to save time, I'm skipping adding a picture, so you get the default. For now.


Janus 2 – 2019

“All my life’s a circle” – Harry Chapin

The New Year is now upon us; it's now January around the world. For those wondering why my previous blog post and today's are named as they are, the titles come from the Roman god Janus, who looked both backwards and forwards. That seemed appropriate for posts bracketing the New Year. The name of the month January is often said to come from Janus as well, though that appears to be a false etymology.

Yesterday I looked back. Today, I’ll look forward.  I’m not necessarily a fan of New Year’s Resolutions (other than resolving to live one more year, which I’ve been successful at so far every time) so call these goals:

  • Continue to blog at least once a week. Last year I think I missed a week while on vacation, but otherwise I pretty much succeeded.
  • Hit 2000 page views. Last year I hit 1907.  I think I can exceed that this year. Of course I’ll need your help!
  • Continue speaking at SQL Saturdays. I haven’t set my schedule, but I already have 3-5 in mind. I’m not sure I’ll do 6 again, but we’ll see.
    • Expand my “SQL Server for under $200” session
    • Expand my “SQL Server Backups” (perhaps into a full precon)
    • Add one more topic to my list of sessions (see current ones here)
    • Shoot for at least one overseas engagement
  • Shoot for speaking at SQL Summit!
  • Figure out how to get an MVP!
  • Publish at least 3 more articles for Redgate’s Simple Talk
  • Continue to promote and support Women in Tech as well as other minority groups
  • Continue to learn PowerShell
  • Continue to learn about SQL Server on Linux
  • Play with containers, just a bit. This is really a minor goal given all the others I have, but I figure I should learn a little.
  • Pick up at least 1-2 more decent sized customers
  • Continue teaching cave rescue
  • Cave more!
  • Hike more!
  • Bike more!
  • Travel
  • Have fun!

That last goal is important to me. If I'm not enjoying what I'm doing, why do it? Life is too short to hate what you do with it. If you can find a way to enjoy life, do it!

Most of the goals above are SQL related, but that doesn’t mean that’s the major focus of my life. It’s just the place this blog touches upon the most these days.

I have a number of personal goals too, but those are for me and I won't be sharing them here.

In any event, I wish everyone in my biological family, #SQLFamily, Caving family, and other chosen families a wonderful and amazing New Year and hope that the new year brings you peace and happiness.

Janus 1 – 2018

As the year draws to a close, I thought I’d look back on the year a bit.

This blog's stated goal, as laid out in My Goal Here, is to give me a place to reflect on how we think and what drives certain decisions. And I suppose at times that's true. At times it's to give actual SQL or IT related advice. But at times, if I'm honest, it's simply an exercise in putting fingers to the keyboard and words on the screen, and being a bit self-indulgent.

My most popular post this year was a bit of a mixture of things: The Streisand Effect. It was partly an activism piece about events at my alma mater and a chance to broaden my blog to more readers. But it also served one of my primary goals, reflecting on how we think and make decisions: sometimes, by trying to tamp down an issue, we only draw more attention to it and inflame things further.

My second most viewed piece this year was one of several on sexism, especially in the IT industry: Math is hard, Let’s Go Shopping. I still haven’t finished the book mentioned in the post, but it’s on my list to finish. The issue of sexism in my primary industry is one that has grown in importance to me and I expect to write more about it in the coming year and to try to do more about it.

Reviewing my SQL Saturdays in 2018: I had the honor of speaking at Colorado Springs, or at least trying to, which I wrote about here; SQL Saturday Philadelphia; SQL Saturday Atlanta; SQL Saturday Manchester UK (my first overseas SQL Saturday, where I had a blast!); SQL Saturday Albany; and finally SQL Saturday DC. I also presented at the DC SQL User Group in September. All great times: I learned a lot, met new people, and reconnected with old friends.

I put in to speak at SQL Pass Summit, but again didn’t make it. But I still attended and had a great time.

I was also pleased to be asked to write for Redgate's Simple Talk, where I now have two articles published on using PowerShell for SQL: my first and second. I'll be submitting my third article in the coming weeks.

But not everything I did or wrote about was SQL related or even IT related. In late June, 13 people became trapped in the Tham Luang Nang Non cave in Thailand. This became a world-wide media event that a few weeks later I found myself part of. Besides at least four blog posts of my own that touched upon it, in my role as a regional coordinator of the National Cave Rescue Commission I did close to a half-dozen media engagements, including one for The Takeaway NPR program.

Oh, and one more interview I did this past year: a podcast with Carlos Chacon and Steve Stedman of SQL Data Partners. You can read about my thoughts here and listen to the podcast here. And definitely go to Amazon and buy my book!

Anyway, it's been a great and eventful year, and I appreciate everyone who has read my blog, and even more so those who have commented on it, shared it, or somehow given me feedback.

I’m looking forward to 2019. I hope you are too.

SQL Pass 2018

Next week I'm off to the SQL Pass conference in Seattle. This will be my 4th peregrination to Seattle in 4 years; it has become an annual trip for me. There's one very obvious reason for going, and a second, also important, one. SQL Pass is one of the top events for folks who work with SQL Server. It's a 3 day conference (plus up to 2 days of pre-con events, including at least one meeting I'll be attending as our local group leader) full of technical sessions covering a wide range of topics related to SQL Server and related technologies.

Four years ago, when I first attended, I was a newbie and wasn't sure what to expect. My father had recently passed and I wasn't entirely sure I still wanted to make the trip. But the tickets had been bought and the price to attend had been paid, so I decided to go. One of the first sessions I attended (perhaps the first) was by Kathi Kellenberger on how to get published as an author. I had toyed with an idea for a book for years and figured it couldn't hurt to attend and perhaps learn something. Her session was quite helpful, and when I approached her afterwards for more input she introduced me to one of the editors at Apress. I pitched my idea and a few months later the contracts were signed. All I had to do now was actually write the thing. So I ended up writing IT Disaster Response: Lessons Learned in the Field. (BTW, I do obviously recommend it; it covers IT disasters, plane crashes, and cave rescues. It's not your standard cut-and-dried boring book on disasters.)

A friend of mine who owns a book shop once said, "anyone can write a book; it's harder to actually publish a book." I had now done both. It was a bit bittersweet because my dad had been an English major and had always wanted to write a book and be published. Admittedly, he wanted to write fiction, which I think is far harder, and in his day the idea of "print on demand", like what Apress tends to do, didn't really exist. And to be honest, at the end of the day, as Kathi warned me, if I were in it for the money I'd be better off, in terms of hours spent, getting a job at McDonald's.

But I digress. That book ended up being my first foray into actually getting paid to write. As I mentioned in an earlier blog post, I've now contributed to Red Gate's Simple Talk program with my post on an Intro to PowerShell. My second post has been submitted and accepted and will hopefully go up in a few weeks or so.

So, to say my first PASS event changed my life would probably be accurate.

Beyond that one session four years ago, I've attended many other sessions, gained a wealth of knowledge, and leveraged it in my job and in finding speakers for the local SQL Server User Group I now lead. One of my favorite speakers this past year was Bob Ward, who did a remote presentation for us on SQL Server on Linux. And this despite me being a Patriots fan and him being a *cough* Cowboys fan.

So again, I look forward to seeing a lot of my #sqlfamily out in Seattle next week. But I still won’t be doing karaoke, sorry Aunt Kathi!

But I also mentioned a second reason for visiting: my non-sqlfamily, what I might call my #rocfamily. The Rensselaer Outing Club has a number of alumni who live in the area, and we've started a yearly tradition of getting together for take-out Thai food at the house I stay in. ROC in its own way changed my life, among other things teaching me how to be a leader and an effective decision maker.

In addition to all my fellow ROCcers, there's at least one friend from my days on sci.space.* on Usenet (where I can still be found, btw) and a few others I've made over the years. I'm quite looking forward to seeing them all.

So see you all next week in Seattle!

Sharing and Building

I’ve mentioned in the past that I think it’s important to share and give back knowledge.

This week’s blog post will be short (sorry, they can’t all be great works of art.) But first I want to mention an event that just happened. I’m the leader of the local SQL Server User Group: CASSUG. We had our monthly meeting last night and I was grateful that Hilary Cotter was willing and able to drive up from New Jersey to present on Service Broker.

When I arrange for speakers, I always hope my group gets something out of it. Well, last night we had a new member visiting from out of town, so he probably won't make many future meetings. And yet today I read this from him: "Hilary's presentation was very informative and interesting," and "Now it has piqued my interest and I've started a Pluralsight course to learn more." To me, that's success.

At our July meeting we had lightning rounds. Instead of a single presenter, four of our local members each presented for about 15 minutes on a topic of their choice. One of them presented on using XML results in a SQL query to help build an HTML based email; he adapted the idea from, I believe, this blog post. Twice now in the last month I've used it to help clean up emails a system of mine was sending out. Yesterday I finally decided to clean up an old, ugly, hard-to-read text based email that showed the status of several scheduled jobs we run overnight. A few hours later, after some tweaking, I had a beautiful, easy-to-read email. Excellent work, and all based on an idea I never would have come up with if my colleague hadn't shared it from his source.

And that leads me to a bit of self-promotion. When I created this blog, my goal was not to have lots of posts about SQL Server. Several months ago, a mentor of mine (I don't know if she considers herself that, but I do, since she's the one who planted the seed in my head for my first book: IT Disaster Response: Lessons Learned in the Field) approached me at SQL Saturday Atlanta, mentioned she was now an editor for Red-Gate's Simple-Talk blog section, and asked if I'd be interested in writing. I was.

So I'm proud to say that the first of my blog posts at the Red-Gate Simple-Talk site is up. Go read it. I'm excited; as of today it's had over 2000 views! Far more than I get here. And there's more to come.

And here’s the kicker. Just today I had a client say, “Hey, I need to get this data from this SQL 2014 database to a SQL 2008 Database.”  I was able to say, “I’ve got JUST the answer for that!”

Sharing knowledge is a good thing. It makes us all far more capable and smarter.


Debugging – Powerhell (sic)

Generally I don't plan to talk too much about specific programming problems. There are too many blogs out there about how to solve particular problems, and most of them are generally good and, even better, accurate.

I’m going to bore most of you, so if you want to jump to the summary, that’s fine. I won’t be insulted (heck I won’t even know!)

That said, last night I spent a lot of time debugging a particular PowerShell script I had written for a client.

I want to talk a bit about what was going on and how I approached the problem.

So, first the setup. It's really rather simple (a rough sketch of the whole flow follows the list):

  1. Log into an SFTP site
  2. Download a zip file
  3. Expand the zip file
  4. Run an SSIS package to import the files in the zip package to a SQL Server
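To make the moving parts concrete, here's a rough sketch of what that flow looks like. This is a simplified stand-in, not the production script: it assumes the Posh-SSH module for the SFTP work and dtexec for the SSIS step, and the server name, file names, and paths are placeholders.

# Rough sketch only; Posh-SSH module assumed, names and paths are placeholders.
Import-Module Posh-SSH

$sftpURL   = 'secureftp.example.com'
$localPath = 'D:\foo\import'

# Prompt for the SFTP credential here; the real script reads a stored,
# encrypted password instead (more on that below).
$Credential = Get-Credential -UserName 'ftp_example' -Message 'SFTP password'

# 1. Log into the SFTP site
$session = New-SFTPSession -ComputerName $sftpURL -Credential $Credential

# 2. Download the zip file
Get-SFTPFile -SessionId $session.SessionId -RemoteFile '/outbound/data.zip' -LocalPath $localPath

# 3. Expand the zip file
Expand-Archive -Path "$localPath\data.zip" -DestinationPath $localPath -Force

# 4. Run the SSIS package to import the files into SQL Server
& dtexec /F "$localPath\ImportData.dtsx"

Remove-SFTPSession -SessionId $session.SessionId | Out-Null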

I had written the script several months ago and it’s been in production since then.  It’s worked great except for one detail. When it was put into production, we didn’t yet have the service account to run it under, so I set it up to run as a scheduled task using my account information. Not ideal, but workable for now.

Since then, we've received the service account, and I had made several attempts to move the task over to it, but it wasn't working. Yesterday was the first time in a while I had time to work on it, so I started looking at it in more detail.

Permissions Perhaps?

So, the first problem I had was, I’d run the scheduled task and nothing would happen. Ok, that’s not entirely accurate. The script would start, but never run to completion. It just sort of sat there. And it certainly was NOT downloading the zip file.

My first thought was that it might be a file permission or other security issue. So the first thing I did was write a really simple script.

# Simplest possible test: can a scheduled task under this account write a file at all?
get-date | out-file "D:\foo\timetest.txt"

Then I set up a scheduled task and ran it manually. And sure enough, timetest.txt showed up exactly where it was supposed to.

So much for security in the OS.
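(As an aside, if you'd rather script that test task than click through Task Scheduler, a minimal sketch using the built-in ScheduledTasks module might look like the following. The task name, script path, account, and password are placeholders; in practice you'd supply the service account's password more carefully than as a literal string.)

# Sketch only: task name, script path, account, and password are placeholders.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
          -Argument '-NoProfile -ExecutionPolicy Bypass -File D:\foo\timetest.ps1'

# Register the task to run under the service account rather than my own account,
# then kick it off manually to see whether that account can write the file.
Register-ScheduledTask -TaskName 'TimeTest' -Action $action `
    -User 'DOMAIN\svc_import' -Password 'PlaceholderOnly'
Start-ScheduledTask -TaskName 'TimeTest'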

out-file to the Rescue!

So, my next step was probably one of the oldest debugging techniques ever: I put statements in the code to write to a log file as I hit various steps. Not quite as cool as a debugger, but hey, it's simple and it works.

It was simple stuff like:

"Path $SftpPath Set" | out-file ftp_log.txt -append

BTW, if you're not familiar with PowerShell, one of the nice things is that in the above statement it will automatically insert the value of the variable $SftpPath into the string it's writing out.
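If that's new to you, here's a quick illustration (the value is made up): double quotes expand variables, single quotes don't.

$SftpPath = '/outbound/reports'   # made-up value just for illustration
"Path $SftpPath Set"              # double quotes expand it:  Path /outbound/reports Set
'Path $SftpPath Set'              # single quotes do not:     Path $SftpPath Set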

This started to work like a charm. And then… jackpot!

"Setting Session using $sftpURL and Password: $password and credentials: $Credential" | out-file ftp_log.txt -append

When I ran it, I’d get a line in the log like:

Setting Session using secureftp.example.com and Password: System.Security.SecureString and Credentials: System.Management.Automation.PSCredentials

Not the most informative, but useful. And it’s nice to know that one can’t easily just print out the password!

But, when I ran it as the service I was getting something VERY different.

Setting Session using secureftp.example.com and Password:  and Credentials:

HUGE difference. What the heck was happening to the password and credentials?

That’s when the first lightbulb hit.

Secure credentials in PowerShell

Let's take a slight side trip here. We all know (right, of course you do) that we should never store passwords in cleartext in our scripts. So I did some digging and realized that PowerShell actually has a nice way to handle this. You can take the password in as a secure string, convert it to an encrypted standard string, and write that out to a file. Then when you need credentials, you read the file back, convert it into a SecureString, and hand it to whatever needs it.

# Read the encrypted string from the file and turn it back into a SecureString
$password = get-content "$LocalFilePath\cred.txt" | ConvertTo-SecureString

Of course first you have to get the password INTO that cred.txt file.

That’s easy!

# If the encrypted password file doesn't exist yet, prompt for the password once
# and save it as an encrypted standard string.
if (-not (test-path "$LocalFilePath\cred.txt"))
{
    read-host -AsSecureString | ConvertFrom-SecureString | Out-File "$LocalFilePath\cred.txt"
}

So, the first time the program is run, if the cred.txt file isn't found, the user is prompted for the password, it gets converted to a secure string, and it's written out encrypted. From then on, the cred.txt file is used and the password stays nice and secure.

And I confirmed that the service account could actually see the cred.txt file.

So what was happening?

The Key!

It took me a few minutes, and when I hit a problem like this I do what I often do: step away from the keyboard. I had to stop and think for a bit. Then it dawned on me: "How does PowerShell know how to encrypt and decrypt the password in a secure fashion? I mean, at SOME point it needs a clear-text version to pass to the various cmdlets. It wouldn't be very secure if just anyone could decrypt it." That's when it struck me: the encryption key (and decryption key) has to be based on the USER running the cmdlet!
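That is, as I understand it (worth double-checking against the docs): with no -Key parameter, ConvertFrom-SecureString uses the Windows Data Protection API (DPAPI), which ties the encryption to the account, and machine, that ran it. Roughly:

$LocalFilePath = 'D:\foo'   # placeholder path for illustration

# As my own account: encrypt and save the password. With no -Key parameter,
# ConvertFrom-SecureString uses DPAPI, keyed to the current user (and machine).
Read-Host -AsSecureString | ConvertFrom-SecureString | Out-File "$LocalFilePath\cred.txt"

# Run as the same account, this decrypts just fine:
$password = Get-Content "$LocalFilePath\cred.txt" | ConvertTo-SecureString

# Run as a different account (say, the service account), that same ConvertTo-SecureString
# fails to decrypt the file, which is why my debug line showed an empty password and
# credential. (ConvertFrom-SecureString does take a -Key parameter for a portable key,
# but then you have to protect the key itself.)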

Let me show some more code

# Decrypt the stored password and build a credential object for the SFTP user
$password = get-content "$LocalFilePath\cred.txt" | ConvertTo-SecureString
$Credential = New-Object System.Management.Automation.PSCredential ('ftp_example', $password)

$sftpURL = 'secureftp.example.com'

# Open the SFTP session with that credential
$session = New-SFTPSession -ComputerName $sftpURL -Credential $Credential

So first I’d get the password out of my cred.txt file and then create a Credential Object with a username (ftp_example) and the aforementioned password.

Then I'd try to open a session on the SFTP server using that credential. And that's exactly where it was hanging when I ran this under the service account. Now I was on to something. Obviously the $password and $Credential weren't getting set, as we saw from my debug statements. It wasn't able to decrypt cred.txt!

Great, now I just need to have the service create its own cred.txt.  But, I can’t use the same technique where the first time the service runs it prompts the user for the password.  This bothers me. I still don’t have a perfect solution.

For now I fell back on:

# Each account gets its own credential file (cred_<username>.txt).
if (-not (test-path "$LocalFilePath\cred_$env:UserName.txt"))
{
    # After running this once under each account, remove the plaintext password from this line!
    'password_here' | ConvertTo-SecureString -AsPlainText -Force | ConvertFrom-SecureString | Out-File "$LocalFilePath\cred_$env:UserName.txt"
}

Note I changed the name of the cred.txt file to include the username, so I could have one when I ran it and another when the service account ran it. This makes testing far easier AND avoids conflicts when debugging (i.e. "which credentials are currently being used?").

So for now the documentation will be "after running the first time, remove the password". More likely, I think, I'll write a one-time script that runs to set up a few things and then deletes itself. We'll see.

Anyway, this worked much better. Now my debug lines were getting the password and all. BUT… things were STILL hanging. Talk about frustrating. I finally tracked it down to this line:

$session = New-SFTPSession -ComputerName $sftpURL -Credential $Credential

Caving Blind

As you know, I love caving. And for caving, a headlamp is a critical piece of equipment. Without it you have no idea where you’re going.  That’s how I felt here. Things were simply hanging and I was getting NO feedback.

I tried

$session = New-SFTPSession -ComputerName $sftpURL -Credential $Credential | out-file ftp_log.txt -append

But I still wasn’t getting anything. I tried a few other things, including setting parameters on how New-SFTPSession would handle errors and warnings. But still nothing. It was like caving blind. The cmdlet was being called, but I couldn’t see ANY errors.  Of course when I ran it as myself, it ran fine. But when I ran it as a service, it simply hung.

I was getting frustrated I couldn’t get any feedback. I figured once I had feedback, the rest would be simple. I was mostly right.

I needed my headlamp! I needed to see what was going on!

Finally it dawned on me: wrap it in a try/catch and look at the exception. So now the code became:

try
{
    $session = New-SFTPSession -ComputerName $sftpURL -Credential $Credential
}
catch
{
    # Grab the underlying exception message so we can finally see why it's failing
    $exception = $_.Exception.Message
    "Error in New-SFTPSession - $exception - " | out-file ftp_log.txt -append
}

Now THIS got me somewhere. I was getting a timeout message!  Ok, that confused me. I mean I knew I had the right username and password, and it never timed out when I ran it. Why was it timing out when the service ran it?

So again, I did what I do in cases like this. Got up, went to the fridge, got some semi-dark chocolate chips and munched on them and ran through my head, “why was the secure FTP session timing out in one case, but not the other?”  At times like this I feel like Jack Ryan in Hunt for Red October, “How do you get a crew to want to get off a nuclear sub…”

Eureka!

DOH, it's SECURE FTP. The first time it connects it needs to do a key exchange, and I need to accept the host key! At first I panicked: how was I supposed to get the service to accept the key? Fortunately it turned out to be trivial. There's actually a parameter, -AcceptKey. I added that… and everything ran perfectly.
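For reference, the fixed call ended up looking something like this (same placeholder server name as before):

# -AcceptKey accepts the host key automatically on the first connection,
# so no interactive prompt is needed when the service account runs this.
$session = New-SFTPSession -ComputerName $sftpURL -Credential $Credential -AcceptKey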

Except the SSIS package

That’s a separate issue and again probably related to security and that’s my problem to debug today. But, I had made progress!

Summary

Now, quite honestly, the above is probably more detail than most care to read. But let me highlight a couple of things I think are important when debugging or troubleshooting in general.

First, simplify the problem and eliminate the obvious. This is what I did with my first script. I wanted to make sure it wasn't something simple and obvious. Once I knew I could write to a file using the service account, that ruled out a whole line of questions, and I could move on to the next step.

Often I use an example from Sesame Street: "one of these things is not like the others." In this case, I had to keep figuring out what was different between when I ran things and when the service account ran them. I knew up front that anything requiring keyboard input would be a problem, but I thought I had that covered. Comparing the results of decrypting the cred.txt file showed there was a problem there, and the initial acceptance of the SFTP key turned out to be another.

So, gather information and compare what works to what doesn’t work. Change one thing at a time until you narrow down the problem.

The other issue is being able to get accurate and useful information. Using debugging statements is often considered old school, and I know some developers look down on them, but quick and dirty often works, so I had no problem using them. BUT, there are still cases where they won't help, for example when a cmdlet hangs or doesn't send anything to standard output. Catching the exception definitely solved that here.

The biggest problem I really ran into here is that I'm still a beginner with PowerShell. I'm loving it, but I often still have to look things up. The basic troubleshooting skills, though, are ones I've developed and honed over the years.

And quite seriously, sometimes don’t be afraid to walk away and do something else for a bit. It almost always brings a fresh perspective. I’ve been known to take a shower to think about a problem. Working from home that is easier for me than it might be for someone in an office. But even then, go take a walk. Or do something.  Approach the problem with a fresh mind. All too often we start down the wrong path and just keep blindly going without turning around and reevaluating our position.

When we teach crack and crevice rescue in cave rescue we tell our students, “If you’re not making progress after 15 minutes, let someone else try. The fresh approach will generally help.”

Looking back on this, the problems seem pretty obvious. But it took a bit of work to get there. And honestly, I love the process of troubleshooting and/or debugging. It’s like being a detective, compiling one clue after another until you get the full picture.

So now, on to commenting some of the code, and then figuring out why the SSIS package isn't executing (I have ideas; we'll see where I get to!)