The Case Of The Giant SQL Server Log Files

Posted in SQL Server at Wednesday, January 30, 2008 7:10 PM W. Europe Standard Time

The day before yesterday I had to look into a SQL Server 2005 (Service Pack 2) instance where the daily backups of a certain database had amounted to several gigabytes over the last 14 days. The database in question is only about 10 MB in size and its recovery model is set to "Simple", so the backups should have stayed small. The culprit was a giant log file, around 6 GB in size by the time I started the remote session.
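Both facts are easy to check up front; a quick sketch, with 'database' standing in for the affected database's name:

```sql
-- Confirm the recovery model of the affected database.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'database';

-- Show log size and the percentage in use for every database.
DBCC SQLPERF(LOGSPACE);
```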

First I tried shrinking the database and the log via the context menu (Tasks, Shrink, Database), but shrinking the log file failed miserably: it would not decrease below a magic threshold of 5.5 GB. Shrinking should not even be necessary to free that much disk space, because SQL Server truncates the log automatically at checkpoints when the recovery model is set to "Simple".
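The same shrink can be attempted in T-SQL; a minimal sketch, assuming the logical log file name is database_Log (check sys.database_files for the actual name):

```sql
USE [database];

-- Try to shrink the log file down to a 10 MB target. This is what the
-- Shrink Database dialog does under the hood for a single file.
DBCC SHRINKFILE ('database_Log', 10);
```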

After some trial and error I realized that there had to be an open transaction that had not been committed or otherwise marked as done. But how to finally get rid of it?

First I tried changing the recovery model to "Full" and "Bulk-logged" and backing up the database and the log separately.

BACKUP DATABASE database TO DISK='C:\Temp\file_data.bak' WITH NO_LOG
BACKUP LOG database TO DISK='C:\Temp\file_log.bak'

Processed 1000 pages for database 'database', file 'database_Data' on file 2.
BACKUP DATABASE...FILE= successfully processed 1000 pages in 0.434 seconds (18.875 MB/sec).

Processed 807822 pages for database 'database', file 'database_Log' on file 3.
The log was not truncated because records at the beginning of the log are pending replication. Ensure the Log Reader Agent is running or use sp_repldone to mark transactions as distributed.
BACKUP LOG successfully processed 807822 pages in 15.077 seconds (48.119 MB/sec).

As you can see from the messages above, the log backup could not truncate the whole log file. The hint to use sp_repldone is a bit misleading because the database isn't replicated. In fact, the replication components aren't even installed on the server. As expected, executing sp_repldone fails:

EXEC sp_repldone @xactid = NULL, @xact_segno = NULL, @numtrans = 0, @time = 0, @reset = 1

Msg 18757, Level 16, State 1, Procedure sp_repldone, Line 1
Unable to execute procedure. The database is not published. Execute the procedure in a database that is published for replication.

I dug through SQL Server Books Online for more information. First of all, there is a command that reports open transactions: DBCC OPENTRAN. As you can see from its output, there is at least one transaction that hasn't been replicated:


Replicated Transaction Information:
   Oldest distributed LSN     : (0:0:0)
   Oldest non-distributed LSN : (3822:331:1)
   DBCC execution completed. If DBCC printed error messages, contact your system administrator.
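For completeness, DBCC OPENTRAN can be pointed at a specific database; the output above came from a call along these lines:

```sql
-- Report the oldest active transaction and, if any, the oldest
-- non-distributed (replication) transaction for the given database.
DBCC OPENTRAN ('database');
```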

That was the hint that finally solved the issue: SQL Server thinks there is a transaction that should be distributed to a replication subscriber, although none is configured.

Fortunately, there's a stored procedure that can be used to disable replication, even if it was never enabled in the first place. I suspect sp_removedbreplication cleans up some bogus settings; at least it was able to remove the transaction that kept the log file steady at 5.5 GB.
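A minimal sketch of that cleanup, again assuming the database is named 'database' as in the backup commands above:

```sql
-- Remove all replication metadata from the database (safe to run even
-- though the database was never actually published).
EXEC sp_removedbreplication 'database';

-- With the pending "replicated" transaction gone, the log can be shrunk.
USE [database];
DBCC SHRINKFILE ('database_Log', 10);
```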

Two hours later: Problem solved!

Now Playing [?]: Ocker – Public Transport – Last night the secret service saved my life
