SQL Server Transactional replication error

This week, replication of a subscription on one of our most important databases stopped working: the distribution agent job had stopped. In the job history, it had logged the following error:

The distribution agent failed to create temporary files in ‘C:\Program Files\Microsoft SQL Server\130\COM’ directory. System returned errorcode 5.

The same error was logged in the table [MSdistribution_agents] in the distribution database. Granting Modify permission on this folder to the replication agent account solved the issue (note: in our case, the replication agent account is not the same as the SQL Server Agent account). But we’ve had this setup for some years now, and all other subscriptions in production use the same replication agent account, and the same folder on the same server, and all those subscriptions continued working fine.

My initial Google search pointed to this Microsoft article:


Even though this article has the correct solution, it does not describe our situation accurately. It applies to the agent profile Distribution Profile for OLEDB streaming, but we use the default agent profile.

Recently, we migrated all replicated databases from Windows 2012 and SQL Server 2014 to Windows 2016 and SQL Server 2016. However, I doubted that the migration had anything to do with it; online, I found reports of this error for various SQL Server versions (just with a different version number in the path, since 130 stands for SQL Server 2016).

So, I wanted to find out exactly why this error suddenly occurred, and why the solution was to set permissions that were never needed in the years before, or on the other subscriptions.

There are two subscriptions on this database, each with different articles (and therefore its own distribution job). Looking at the failing subscription, I noticed two new tables. These tables contain an XML column. Since none of the other tables have XML columns, this might just be the difference that caused the error.

Luckily, these tables also have a timestamp column, so I could determine that 4 records had been added to one of these tables between the last successful run of the distribution agent and the crash. So I tried to reproduce the issue in our test environment. Since our test environment runs on different servers with a different replication agent account, the permissions on the COM folder there were still the default.

The first attempt to reproduce the issue failed: the XML value I used was too small, with a data length of 2,900 bytes. The largest XML value in production had a data length of roughly 5,900 bytes, so I tried that. With this bigger record, I was able to reproduce the issue: immediately after inserting it on the publisher, the distribution agent job on the distributor failed with the same error. After adding the Modify permission for the replication agent account, I could restart the distribution agent job, and replication continued as usual.

So if you need to replicate XML data, and an XML value grows beyond a certain threshold (somewhere between roughly 2,900 and 5,900 bytes in my tests), the distribution agent needs to create temporary files in the COM folder to buffer this data; with the default permissions, it cannot.
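The repro can be sketched as follows. All table and column names here are hypothetical (the table must, of course, already be an article in a transactional publication); the point is the XML column, the timestamp column, and a payload big enough to cross the threshold:

```sql
-- Hypothetical repro table; in practice this table is an article
-- in an existing transactional publication.
CREATE TABLE dbo.ReplXmlTest
(
    Id      int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Payload xml NULL,
    RowVer  rowversion NOT NULL   -- timestamp column, as in the production table
);

-- Build an XML value of roughly 6,000 bytes; values under ~3,000 bytes
-- did not trigger the error in my tests.
DECLARE @x xml =
    (SELECT REPLICATE(CAST('x' AS varchar(max)), 6000) AS [data]
     FOR XML PATH('row'));

INSERT INTO dbo.ReplXmlTest (Payload) VALUES (@x);

-- Check the size the distribution agent will have to move:
SELECT Id, DATALENGTH(Payload) AS PayloadBytes
FROM   dbo.ReplXmlTest;
```

On the publisher, the insert succeeds either way; the failure only shows up moments later on the distributor, when the distribution agent tries to create its temporary LOB file in the COM folder.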

To further analyze what temporary files are created, I used the tool Process Monitor to see exactly what actions are triggered by the replication process inside this folder.

Process Monitor is part of the Sysinternals suite. So first, I downloaded the latest version of the Sysinternals Suite and started procmon on the distributor. I added a filter to include only events on the COM folder:

After that, I cleared the procmon screen and performed another update on the XML column. This triggered a lot of events in the COM folder. In particular, the replication process created a LOB file, and performed several writes to this file.

LOB stands for Large OBject. Therefore, this same issue might occur with other data types. There are several large object data types: text, ntext, image, varbinary(max), and xml. For more info on these data types, see:


After demonstrating that temporary files are indeed created in the COM folder, I removed the extra permissions for the replication agent account on this folder, and updated the XML value once more. And as expected, the replication failed, and Process Monitor captured the error:

After granting the additional permissions for the replication agent account on the COM folder, replication started working properly once more.

Posted in Uncategorized

Running processes including job step name

The normal output of sys.dm_exec_sessions (or sysprocesses) will tell you that a job is running, but it hides the name of the job behind a hexadecimal representation of the job_id in the program_name column, not the job_id as displayed in sysjobs. This script matches the job name and job step with the process.
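A sketch of such a script (not necessarily the exact one this post originally contained): it parses the program_name of Agent job sessions, which looks like `SQLAgent - TSQL JobStep (Job 0x... : Step 2)`, and joins that back to msdb:

```sql
-- Match running Agent job step sessions to their job and step name.
-- program_name format: SQLAgent - TSQL JobStep (Job 0x<32 hex chars> : Step <n>)
SELECT  s.session_id,
        j.name       AS job_name,
        js.step_name,
        s.program_name
FROM    sys.dm_exec_sessions AS s
JOIN    msdb.dbo.sysjobs AS j
        -- The hex string starts at position 30 and is 34 characters
        -- long ('0x' plus 32 hex digits); style 1 converts it to binary.
        ON CONVERT(varbinary(16), j.job_id) =
           CONVERT(varbinary(16), SUBSTRING(s.program_name, 30, 34), 1)
JOIN    msdb.dbo.sysjobsteps AS js
        ON  js.job_id  = j.job_id
        AND js.step_id = CAST(SUBSTRING(s.program_name,
                               CHARINDEX(': Step ', s.program_name) + 7,
                               LEN(s.program_name)
                               - CHARINDEX(': Step ', s.program_name) - 7) AS int)
WHERE   s.program_name LIKE 'SQLAgent - TSQL JobStep (Job 0x%';
```

The CONVERT with style 1 is what turns the `0x...` text from program_name back into the binary form of the job_id so it can be compared against sysjobs.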

Posted in T-SQL

Get-AzDataFactoryPipeline: a misleading error

A few weeks back, I wanted to start an Azure Data Factory pipeline using PowerShell. It was the first time I tried that, and I thought it would be pretty simple: get the correct pipeline using Get-AzDataFactoryPipeline, supply all the parameters and pipe it to Invoke-AzDataFactoryPipeline. Something like:
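Reconstructed from memory, the attempt looked roughly like this (resource group, factory, pipeline name, and parameters are all placeholders):

```powershell
# First attempt, using the (version 1) AzDataFactory cmdlets:
Get-AzDataFactoryPipeline -ResourceGroupName 'rg-demo' `
                          -DataFactoryName 'adf-demo' `
                          -Name 'CopyPipeline' |
    Invoke-AzDataFactoryPipeline -Parameter @{ 'WindowStart' = '2019-01-01' }
```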

That, however, did not work. I got an error message stating that the resource was not found (HTTP Status Code NotFound, Error Code ResourceNotFound):

After double-checking the commands and all parameters for syntax errors and typos, I turned to the internet for help, but searching online didn’t provide an answer either. It turned out that the answer was pretty simple: for ADF version 2, you need to use the AzDataFactoryV2 commands:
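With the V2 cmdlets from the Az.DataFactory module, the same call works (names and parameters are placeholders again):

```powershell
# Same pipeline, but addressed with the V2 cmdlets:
Get-AzDataFactoryV2Pipeline -ResourceGroupName 'rg-demo' `
                            -DataFactoryName 'adf-demo' `
                            -Name 'CopyPipeline' |
    Invoke-AzDataFactoryV2Pipeline -Parameter @{ 'WindowStart' = '2019-01-01' }
```

Invoke-AzDataFactoryV2Pipeline returns the run ID of the pipeline run it started, which you can then feed to Get-AzDataFactoryV2PipelineRun to monitor progress.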

Posted in Azure Data Factory, Powershell

Code samples for my book on Querying data with Transact-SQL (MCSA 70-761), chapter 3

Posted in Code samples, MCSA SQL 2016 70-761

Slow delete with immediate output
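A batched delete that reports its progress immediately can be sketched as follows (table name, batch size, and filter are hypothetical). The trick is that PRINT output is buffered until the batch finishes, while RAISERROR with severity 0 and the NOWAIT option flushes each message to the client right away:

```sql
-- Delete in small batches, printing progress after every batch.
DECLARE @rows  int = 1,
        @total int = 0,
        @msg   nvarchar(200);

WHILE @rows > 0
BEGIN
    DELETE TOP (10000)
    FROM   dbo.BigTable
    WHERE  CreatedDate < '2018-01-01';

    SET @rows  = @@ROWCOUNT;
    SET @total = @total + @rows;

    SET @msg = CONCAT(N'Deleted ', @total, N' rows so far...');
    RAISERROR (@msg, 0, 1) WITH NOWAIT;   -- severity 0: informational, flushed immediately
END;
```

Batching also keeps the transaction log and lock footprint of the delete small, which is usually why you write it this way in the first place.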

Posted in T-SQL

How I passed the exam AZ-900 Microsoft Azure Fundamentals

This week, I passed my first Azure exam: AZ-900 Microsoft Azure Fundamentals. As the name suggests, this is the most basic Azure exam, and therefore a nice place to start if you want to get certified in one of the hottest topics in IT at the moment. This exam covers the basics of cloud computing, such as what is different between the cloud and on-premises environments, and how to get started in the cloud. You can find a complete description of the exam topics on the Microsoft web site.

If you pass this exam, you will be awarded a certification: Microsoft Certified: Azure Fundamentals.

Microsoft is also kind enough to provide study material for free. On the Microsoft Learn portal, you can find a complete course for this exam. This course should take you a little over nine hours to complete. I had never tried Microsoft Learn before, but I found it quite enjoyable and easy to follow along. It contains text, video, labs, and practice questions. There are also some links to related material.

You are probably wondering whether this free training is enough to pass the exam. In my experience, it is. Personally, I did not have a lot of hands-on experience working with Azure. I took an Azure boot camp course some six years ago, but in the cloud, things change fast, so practically everything is different now. Since that boot camp, I have not worked with Azure directly, but some of my clients are moving (or planning to move) databases to Azure, so I have been watching these migrations from the sidelines.

As far as the level of the course is concerned: it is not that difficult, exactly what you would expect from a fundamentals course. If you are new to Azure, but not new to IT, you should be able to follow along quite nicely. If concepts like virtual machines, high availability, and firewalls are completely new to you, you will probably need a bit more study time.

After following the course once, I scheduled the exam. I did plan to redo the course once more before actually taking the exam, and just focus on the topics that were new to me, and the topics that were more likely to be asked on the exam. If you’ve ever done a Microsoft exam, you will recognize those topics while studying. In most exams, there are bound to be questions that focus more on lists of options and factoids than on understanding of the material. However, life intervened, and I did not have time to completely go through the course a second time.

Regardless, I passed the exam easily. I will not give any details on the exam itself, because I don’t want to violate the non-disclosure agreement. But if you are interested in taking this exam, I can tell you that the Microsoft Learn course adequately prepares you for this exam. Good luck studying!

Posted in Uncategorized

Dynamic PIVOT statement

My attempt to make a dynamic PIVOT statement. The table used is the same as the example in my 70-761 book.

First, the @list_of_columns parameter is filled with a list of columns, using COALESCE. Next, a common table expression is used. In this example, it is not useful, but since more complicated PIVOT statements will usually work with tables that have more than the required three columns, I left the CTE in there.

As a possible improvement, you could add a second parameter to replace the NULL values in the result set.
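A sketch along those lines, using a hypothetical dbo.Sales table with Year, Product, and Amount columns (the table from the book may differ):

```sql
DECLARE @list_of_columns nvarchar(max),
        @sql             nvarchar(max);

-- Build the column list dynamically, using COALESCE:
-- on the first row @list_of_columns is NULL, so no leading comma is added.
SELECT @list_of_columns = COALESCE(@list_of_columns + N', ', N'')
                          + QUOTENAME([Year])
FROM   (SELECT DISTINCT [Year] FROM dbo.Sales) AS y
ORDER BY [Year];

SET @sql = N'
WITH SourceCte AS
(
    -- Reduce the source to exactly the three columns PIVOT needs.
    SELECT [Year], Product, Amount
    FROM   dbo.Sales
)
SELECT Product, ' + @list_of_columns + N'
FROM   SourceCte
PIVOT  (SUM(Amount) FOR [Year] IN (' + @list_of_columns + N')) AS p;';

EXEC sys.sp_executesql @sql;
```

QUOTENAME around each column value guards against SQL injection via the data, and the CTE is the place to trim the source down to the three columns PIVOT requires.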





Posted in Uncategorized

Convert MSDB job history to datetime

This script will get the job history for a particular job, with the start time converted from integer to datetime. It works for job durations under 1,000 hours.

Since I’ve still got clients on SQL 2008, I can’t use the DATETIMEFROMPARTS function, which makes the code a little longer than on newer SQL versions.
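A sketch of the conversion (the job name is a placeholder). In sysjobhistory, run_date is an integer in YYYYMMDD format and run_time an integer in HHMMSS format, so the time part is zero-padded and colons are stuffed in before adding it to the date:

```sql
SELECT  j.name,
        -- '20200101' casts to a datetime; adding an 'hh:mm:ss' string
        -- adds that time of day to it.
        CAST(CAST(h.run_date AS char(8)) AS datetime)
        + STUFF(STUFF(RIGHT('000000' + CAST(h.run_time AS varchar(6)), 6),
                      5, 0, ':'),
                3, 0, ':') AS start_time,
        h.run_duration        -- also HHMMSS-encoded
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs AS j
        ON j.job_id = h.job_id
WHERE   j.name = N'MyJob'
ORDER BY h.run_date, h.run_time;
```

On SQL Server 2005 and later, the undocumented function msdb.dbo.agent_datetime(run_date, run_time) performs the same conversion in one call; the manual version above has the advantage of being fully supported.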



Posted in Uncategorized

Bonus questions for MCSA 70-761

The practice questions have been moved to a free course on my Teachable site:


Posted in Books, MCSA SQL 2016 70-761

Code samples for my book on Querying data with Transact-SQL (MCSA 70-761), chapter 2