SQL Server Scheduled Email Queries

Every once in a while I come across a problem that surprises me both that it exists and that the solution turns out to be trivial. This is one of those. A developer at a client wanted to set up a scheduled task that would execute a query and email the results. Everything appeared to work, but the emails never went out. There were no obvious errors; the emails simply never arrived. Some digging via Profiler showed that the SQL Agent user was having permission issues, but granting it every permission possible didn’t solve it.

So let me walk you through it. First, let’s create a real simple stored procedure. This assumes you have AdventureWorks2014 installed (yeah, it’s old, but it’s what I had handy).

Use adventureworks2014
GO
Create or Alter Procedure Send_Test_Email
as
exec msdb.dbo.sp_send_dbmail @recipients='test@example.com', @body='This is the body of the email', @subject='Test Email w query embedded',
@query='select * from adventureworks2014.sales.SalesOrderHeader where subtotal > 150000';

Now, here’s an important detail I discovered during testing: if my query didn’t actually touch a table, but instead was something like @query='select getdate()', things worked fine.
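For reference, here’s a minimal sketch of the kind of call I mean (same mail parameters as above, recipient swapped for your own); in my testing this sort of call went through without any fuss:

exec msdb.dbo.sp_send_dbmail @recipients='test@example.com',
@body='This is the body of the email',
@subject='Test Email w no table access',
@query='select getdate()';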

That said, if you simply execute the original Send_Test_Email procedure from SSMS, it should work just fine. (I’d recommend you put your own email in it for testing; that way you’ll know whether the email is actually being sent.)

Before you do that, also create the following stored procedure:

Use adventureworks2014
GO
Create or Alter Procedure Send_Test_Email_Query_attached
as
exec msdb.dbo.sp_send_dbmail @recipients='test@example.com', @body='This is the body of the email', @subject='Test Email w query embedded',
@query='select * from adventureworks2014.sales.SalesOrderHeader where subtotal > 150000',
@attach_query_result_as_file = 1,
@query_result_no_padding = 1,
@query_attachment_filename = 'Sales Report.csv',
@query_result_separator = ';',
@exclude_query_output = 1,
@append_query_error = 0,
@query_result_header = 0

This will execute the same query as the original stored procedure, but place the results in an attached file (named .csv here, though with @query_result_separator = ';' it’s actually semicolon-delimited).

Again, if you execute the above directly from SSMS, you should receive the email without an issue. This is basically what the client was attempting to do.
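For completeness, the SSMS test is simply calling the two procedures you just created:

use adventureworks2014
GO
exec dbo.Send_Test_Email;
GO
exec dbo.Send_Test_Email_Query_attached;
GO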

Now create the following job:

USE [msdb]
GO

/****** Object:  Job [Send AdventureWorks Email]    Script Date: 7/27/2021 9:12:49 AM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 7/27/2021 9:12:49 AM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'Send AdventureWorks Email',
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=0,
@notify_level_netsend=0,
@notify_level_page=0,
@delete_level=0,
@description=N'No description available.',
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Send email]    Script Date: 7/27/2021 9:12:50 AM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Send email',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0, @subsystem=N'TSQL',
@command=N'exec Send_test_email_Query_attached',
@database_name=N'adventureworks2014',
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO

If you execute this job as is, you should see the following:

Success! Or is it?

But you will never get the email!
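If you want to see what Database Mail itself thinks happened, the standard msdb views are a good first stop. Something like this (a quick diagnostic sketch) will show whether the message was ever queued and whether any errors were logged:

-- Most recent mail items Database Mail was asked to send
select top 10 mailitem_id, recipients, subject, sent_status, send_request_date
from msdb.dbo.sysmail_allitems
order by send_request_date desc;

-- Most recent errors logged by Database Mail
select top 10 log_date, event_type, description
from msdb.dbo.sysmail_event_log
order by log_date desc;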

Modify the above job and replace the line:

@command=N'exec Send_test_email_Query_attached',

with

@command=N'exec Send_test_email',

Now if you run the job, it will fail!

This time it clearly failed!

But as is usual with sp_send_dbmail, the error message isn’t overly helpful:

Executed as user: NT AUTHORITY\SYSTEM. Failed to initialize sqlcmd library with error number -2147467259. [SQLSTATE 42000] (Error 22050). The step failed.

But it does give a clue: the problem seems to be some sort of permissions issue. I’ll admit that at this point I tried all sorts of solutions, including setting up a proxy user, giving unfettered rights to my SQL Agent user, and other things (thinking once I got it working I could then lock things back down). Instead, I found a much easier solution buried in a thread on the Microsoft site.

You’ll note that when I wrote the original stored procedure I fully qualified the table name. That’s generally good practice anyway, but here I did it because I was cheating: I knew the fix would involve running the job step from msdb, so the query passed to sp_send_dbmail has to resolve the table regardless of which database the step runs in.

The solution is actually VERY simple and I’ll show it both graphically and via a script.

First: change the database to MSDB and then fully qualify your call to the stored procedure as below:

Execute it from the MSDB database

However, there’s one more critical step: under the ADVANCED tab in the step change the Run As User to dbo:

Seems simple, but it’s critical
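If you’d rather apply the same fix to the existing job via script instead of the GUI, something like this should do it (a sketch that assumes the job and step names used above):

exec msdb.dbo.sp_update_jobstep
@job_name = N'Send AdventureWorks Email',
@step_id = 1,
@command = N'exec adventureworks2014.dbo.Send_Test_Email_Query_attached', -- fully qualified call
@database_name = N'msdb',        -- run the step from msdb
@database_user_name = N'dbo';    -- and run it as dbo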

Now if you script out the scheduled task it should look like:

USE [msdb]
GO

/****** Object:  Job [Send AdventureWorks Email]    Script Date: 7/27/2021 9:26:37 AM ******/
EXEC msdb.dbo.sp_delete_job @job_id=N'08a55e18-eecf-4d4c-8197-8135a0d7520b', @delete_unused_schedule=1
GO

/****** Object:  Job [Send AdventureWorks Email]    Script Date: 7/27/2021 9:26:37 AM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 7/27/2021 9:26:37 AM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'Send AdventureWorks Email',
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=0,
@notify_level_netsend=0,
@notify_level_page=0,
@delete_level=0,
@description=N'No description available.',
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Send email]    Script Date: 7/27/2021 9:26:37 AM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Send email',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0, @subsystem=N'TSQL',
@command=N'exec adventureworks2014.dbo.Send_Test_Email_Query_attached',
@database_name=N'msdb',
@database_user_name=N'dbo',
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO

If you then execute THIS scheduled task you’ll get both a success message:

And you should actually receive the email with the query results attached.

So, an issue that surprised me that it even existed ended up having a fairly simple solution.

#YesAllMen

For over two years I’ve been wanting to put together a speaking topic using the above as a title or a subtitle. The reason is that the alternative, #NotAllMen, is NOT a good thing to use.

Yesterday I came across an article that said it better than I could. I would recommend you read it first: Buzzfeed Article.

It brings up several great points, including ones I had not given much thought to. Look, I know that when a friend of yours is upset because a guy just made a pass at her in a professional setting, one of your first instincts is to think “I’d never do that” and you’re tempted to say “#notallmen.” But that’s not helpful. In fact, you’ve just made it about yourself. Just don’t. That’s not what is needed.

Look, I’m pretty confident that all my male readers are pretty decent guys. I work with many of you professionally, some of you in volunteer positions, some are simply friends, and some readers, I suspect I don’t know, but you’re probably ok too. None of you are outright sexist or racist. If you were, I wouldn’t be associating with you. But all of us are still a product of our environments. We make the off-handed comment without thinking about it. Or someone around us makes a comment and we don’t react. This is also why I suggest when it comes to calling yourself an ally, just don’t.

Several years ago I helped organize and then participated in a Women in Tech panel for our local SQL Server User Group. I was the only man on the panel and expected to be asked what was the best thing I thought we, as men could do. The answer was of course somewhat ironic: “Sit down and shut up.” I of course expanded upon this. No, we can’t nor should we ever completely shut up. That wasn’t my point of course. My point was to make sure not to center the discussion about us. When a friend complains about a sexist incident, replying #NotAllMen is doing just that. Centering the discussion on us. Sorry, that’s a time to shut up and listen.

But when others make sexist comments, that’s a time when it may be appropriate to say something. And if someone calls you out, take it in stride. We make mistakes. And if someone takes you aside and says something (I’ve now heard that called a “call-in”) thank them. It means they think well enough of you to help you be a better person.

At the end of the day guys, we’re all still part of the problem, even when we do our best. That doesn’t make us evil. It simply means we have space to grow into. Let’s do that. Let’s grow.

… Other Duties as Assigned

I’ve mentioned once before that at one of my clients I describe my job as “DBA and other duties as assigned.”

This phrase has really been on my mind this week, especially during a phone call with another client yesterday. This second client is a local consulting company that has hired me a few times to back them with my skills in SQL Server and MS Access. This time around the work they’re looking for is definitely SQL Server related. It was refreshing.

But it reminded me of my last two weeks with two of my other clients. One is having an issue with their app (which they always call “the database”) that is most likely a design issue I need to dig into. This is a perfect example of what I call “software archeology,” where at times I have to sift through “pot shards” to determine what the original developer was thinking. At times it can be fun and interesting, at other times frustrating. I’ll be sifting through more pot shards in the near future to get to the bottom of this problem.

For my largest client, I spent most of my hours with them last week trying to true up a file with some financial data in it. In this case it’s part of an ETL process where I receive data, compile it and send it to a vendor. The process uses a combination of PowerShell and Pentaho. So while they interact with the database, the work I was doing wasn’t in T-SQL or directly on the database server.

The numbers weren’t adding up. There was an undercurrent of “Greg, your numbers are wrong” or “You’re filtering on the wrong criteria.” I kept pointing out that “I simply add up the numbers you give me.” Eventually the problem was narrowed down to the fact that in the source system, which is the system of record, rows had been deleted. Arguably, one should never delete rows in such a system, but rather issue a second row (a credit to reverse a debit, or a debit to reverse a credit), and that is typically what was done. But in this case the maintainers of the system of record decided to wholesale delete these rows. I explained that from day one, since deletions are never supposed to happen (and, given the way the system works, are extremely hard to detect), all I do is either insert new rows or update existing ones. In any event, with one minor schema change, some updates to the rows in question, and an updated PowerShell script, I was able to make my numbers match theirs. So, is that really DBA work? Not in the traditional sense. But it’s definitely other duties as assigned.

Now that’s not to say I didn’t do what some might consider actual DBA work. On Saturday morning I patched one of their servers. And at one point during the week, I deployed a script to production. So, out of 18 hours of work for the customer last week, I think I can say maybe 1-2 hours total were “DBA work,” somewhere in the 5-10% range.

Now, I want to be clear. This is not a rant or a complaint. I’ll admit I tend to prefer to work directly with SQL Server, but I was reminded of a quick discussion I had with a fellow DBA over the weekend about how they probably needed to start to learn PowerShell for their job.

I’ve been arguing for years that the role of a DBA has changed, and will continue to change dramatically over the next few years. Where once we might spend days heads-down slinging T-SQL code, setting up backups and restores, tuning indices, and the like, now much of that is automated or at least far easier to do. Which is a good thing. In years past, a DBA might be responsible for a dozen machines or so at the most; if it was more than that, we’d feel sorry for them. That’s no longer uniformly true. I know a DBA who is responsible for over 100 machines. They’re the sole DBA. But, through PowerShell and other modern tools, it’s generally not an overwhelming job.

However, as the online presentation on SQL Database Edge from the Atlanta Azure Data User Group that I attended last night reminded me, there is a growing list of things DBAs need to learn. Steve Jones recently posted about whether DBAs need to learn Linux. The short takeaway is not necessarily, but it’s probably a good idea, and we definitely need to learn about containers.

I have heard for years, “Microsoft will automate everything and the DBA’s job will go away.” Not only is that not true in my experience, the exact opposite is. I think being a successful DBA is in some ways harder than it was a decade ago. There’s so much more to be aware of and to learn.

Off the top of my head, without any real priority, I came up with the list below of technologies that a modern DBA might find useful to know. This is not to say I know them all, or that one has to be an expert in all of them. And I will note, this is far from an exhaustive list. I also left out third-party tools, which are so commonplace. But I think it illustrates just how broad the required skillset of a good DBA is these days.

  • T-SQL
  • PowerShell
  • Query Store
  • Linux – at least at the most basic level
  • Containers
  • SSIS
  • SSAS
  • SSRS
  • Storage – (at least how different types can impact performance and the advantages and disadvantages of each)
  • Azure
  • SQL Database Edge
  • git or some form of version control

In conclusion, I’m not going to make any predictions about where the Microsoft data platform will be a decade from now, but I can tell you that DBAs will still be needed, and their skillset will be as different from today’s as today’s is from a decade ago.

And post conclusion, I’ll add I’ll continue to rely on #sqlfamily and all my fellow DBAs to help me out. And continue to help them.

Let me Email that!

This week’s post is a short one, but hopefully a practical and useful one. I had a recent need to send an email to about 3 dozen folks. Prior to this particular email, I had been updating them with general information and was simply using the BCC function of my email client to send out the email. But in this particular case I needed to email them login information specific to them. I didn’t relish the idea of crafting 3 dozen separate emails and making sure I got the correct information into each. So I turned to the tool closest at hand, PowerShell, and in about 3 minutes of writing and 10 more minutes of debugging and tweaking, I had something that worked great.

# Requires the SqlServer module (for Invoke-Sqlcmd) and access to an SMTP relay.
param([parameter(mandatory)] $course_id, [parameter(mandatory)] $course_description)

# Look up the course type and location details for this course
$course_info = invoke-sqlcmd -server "sql_server" -query "select ct.Course_Type_Name, c.Location, c.City,c.State from Courses c
inner join course_types ct on ct.Course_Type_ID=c.Course_Type_ID
where course_id=$course_id"

# Get the roster: everyone who took this course, along with their email address
$class_list=invoke-sqlcmd -server 'sql_server' -query "select p.persno, first_name,LAST_NAME, email1 from courses_taken ct
inner join people p on p.PERSNO=ct.PERSNO
and ct.Course_ID=$course_id"

# Build a personalized body and send one message per student
foreach ($person in $class_list)
{
    $body = "Thank you for participating in the recent $($course_info.Course_type_name) $course_description course at $($course_info.location) in $($course_info.City), $($course_info.state). Your class information has been entered into the Generic Class database. 
If you wish to review your unofficial transcript, you may log in at https://www.example.com/Account/Login.aspx. 
If you have not created an account before you can create an account at https://www.example.com/Account/Register.aspx. 
Please be sure to use the following email: $($person.email1) (you can update it after creating your account) and your Generic Class ID: $($person.persno). Please note this is NOT your OTT ID. 
If you have issues or questions, please contact Joe Smith at mailto:jsmith@example.com. Thank you and we look forward to seeing you at future training!"

    Send-MailMessage -From "jsmith@example.com" -SmtpServer Mail_Server -Subject "Generic Class: Your Generic Class Login and unofficial transcript" -Body $body -to $person.email1
}

Every course in this particular database has an ID, and each course has associated information such as its type and location. The first query gathers that information.

The second query then gets the personal information for each student who took that particular course and puts it into a class_list object.

I then simply iterate over the object and send an email with the personalized details to each student.

It literally took me more time to tweak the text and formatting of the body of the message than it did to write the rest of the script.

I suppose I could have figured out my email client’s mail merge feature, or done something in Word, which I know also has such a feature, but that would have taken far longer. I also could have spent more time making things more generic (perhaps passing in the body and having it automatically formatted, or sending a nice HTML message with appropriate formatting). But sometimes quick and dirty is the way to go. And now I have a script I can customize for future use.

Not bad for about 15 minutes worth of work.