Scenario / Questions

I have an application server, running Windows 2012 R2, which generates a high volume of log files, to the point that it runs the application volume out of free space on a semi-regular basis. Due to restrictions from the application itself, I can’t move or rename the log files or enable NTFS data deduplication, and because it isn’t ten years ago anymore, I don’t want to use a batch file or VBScript to do this for me.

The log files are all in various subfolders of the application install directory, with different extensions (one component uses the date as the log file extension), and the application install directory has a space in it, because the application developers are malevolent. At least the subfolders the logs are written to are used exclusively for logging. This is also a heavily CPU-bound application, so I don’t want to compress the log folders themselves and incur the CPU penalty of writing new log files compressed.

How can I use PowerShell to enable NTFS compression, in-place, on log files older than x days?

Find below all possible solutions or suggestions for the above question.

Suggestion: 1

The easiest solution, since PowerShell’s native support for NTFS file compression is still rather lacking, is to create a PowerShell script that calls the compact.exe utility and to set it up as a scheduled task. Because of the space in the path name, you want to call compact.exe directly instead of using Invoke-WmiMethod with the CIM_DataFile class, which takes a lot of extra escaping effort to deal with the space in the path.

Assuming a value of 3 days for x, your PowerShell script would look something like:

$logfolder="[location of the first logging subfolder]"
$age=(get-date).AddDays(-3)

Get-ChildItem $logfolder | where-object {$_.LastWriteTime -le $age -AND $_.Attributes -notlike "*Compressed*"} | 
ForEach-Object {
compact /C $_.FullName
}

$logfolder="[location of the next logging subfolder]"

Get-ChildItem $logfolder | where-object {$_.LastWriteTime -le $age -AND $_.Attributes -notlike "*Compressed*"} | 
ForEach-Object {
compact /C $_.FullName
}

...

The second condition is there to speed up script execution by skipping over files that are already compressed (which will be present after the first run of the script). If you wanted to, or had a lot of different logging subfolders, it would make sense to wrap that repeated PowerShell code in a function, which is a fairly trivial exercise; a sketch follows.
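A minimal sketch of that refactoring (the function name and parameter names are hypothetical):

# Hypothetical helper: compress old, not-yet-compressed files in one folder.
function Compress-OldLogFiles {
    param(
        [string]$Folder,
        [int]$Days = 3
    )
    $age = (Get-Date).AddDays(-$Days)
    Get-ChildItem $Folder -File |
        Where-Object { $_.LastWriteTime -le $age -and $_.Attributes -notlike "*Compressed*" } |
        ForEach-Object { compact /C $_.FullName }
}

Compress-OldLogFiles -Folder "[location of the first logging subfolder]"
Compress-OldLogFiles -Folder "[location of the next logging subfolder]"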

Suggestion: 2

The repeated code can be avoided by using an array and a foreach loop:

$logfolders=("D:\Folder\One","D:\Folder\Two")
$age=(get-date).AddDays(-3)

foreach ($logfolder in $logfolders) {
    Get-ChildItem $logfolder | where-object {$_.LastWriteTime -le $age -AND $_.Attributes -notlike "*Compressed*"} | 
    ForEach-Object {
    compact /C $_.FullName
    }
}
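Either version can then be registered as the scheduled task mentioned in Suggestion 1. A minimal sketch using the built-in ScheduledTasks cmdlets; the script path, task name, and run time are assumptions:

# Hypothetical script location; runs daily at 02:00 as SYSTEM.
$action = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "D:\Scripts\Compress-Logs.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "Compress old log files" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest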


Suggestion: 3

Invoke-WmiMethod -Path "Win32_Directory.Name='C:\FolderToCompress'" -Name Compress
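For context: this calls the Compress method of the WMI Win32_Directory class, which sets the NTFS compression attribute on the folder itself, so files written there afterwards are compressed at write time; note that this is exactly the write-time CPU cost the question wants to avoid. A sketch that applies the same call to several folders (the folder list is hypothetical, and the backslash doubling reflects the escaping WMI object paths generally expect):

# Hypothetical folder list.
$folders = @("C:\FolderToCompress", "C:\AnotherLogFolder")
foreach ($folder in $folders) {
    # Double the backslashes so the WMI object path parses reliably.
    $wmiPath = "Win32_Directory.Name='{0}'" -f ($folder -replace '\\', '\\')
    Invoke-WmiMethod -Path $wmiPath -Name Compress
}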

Suggestion: 4

If those log files are not on C:, use the Server 2012 R2 deduplication feature. You can then configure it to only process files which are at least three days old (three days is the default).

The second way to get this under control, or if the logs are on C:, is to move the log directory to a different drive and point a junction at the new location, easiest to create with the Link Shell Extension from https://schinagl.priv.at/nt/hardlinkshellext/linkshellextension.html, and then use 2012 R2 deduplication on top.

I’ve seen deduplication rates way above 90% on log files and on SQL-dump-for-backup drives.
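A minimal sketch of that setup, assuming the logs live on (or have been junctioned to) D:; the drive letter and paths are assumptions:

# Install the deduplication feature and enable it on the log volume.
Install-WindowsFeature -Name FS-Data-Deduplication
Enable-DedupVolume -Volume "D:"

# Only process files at least 3 days old (3 is also the default).
Set-DedupVolume -Volume "D:" -MinimumFileAgeDays 3

# Optionally run an optimization pass right away instead of waiting
# for the background schedule.
Start-DedupJob -Volume "D:" -Type Optimization

If the application insists on its original path, the junction can also be created with plain mklink instead of the shell extension (paths are hypothetical):

cmd /c mklink /J "C:\Program Files\App Name\Logs" "D:\AppLogs"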