
In my recent adventures with trying to create and maintain Windows Scheduled Tasks that execute Powershell, I’ve discovered a need for logging exactly what happens in the scripts. Mostly this is because getting the output from a scheduled task (beyond the exit code) is a massive pain, but I’ve wanted a good Powershell logging system/framework for a while now. I mean, it’s basically code, and I would never write code without good logging, so this shouldn’t be an exception.

Luckily, Powershell offers a mechanism for logging, usable via the X-Transcript cmdlets.

Unluckily, it’s a bit crap.

Transcribing

Powershell offers two cmdlets for dealing with transcription. Start-Transcript and Stop-Transcript.

When you start a transcript, you can optionally supply a filepath, and the transcription will take place there. If not, it will create a file for you and tell you where it is.

As for what is actually transcribed, you get all of the Output of the script plus errors and other exceptions. You also get some meta information, including times and the script being executed.

All of this sounds pretty perfect for logging on the surface. Just start a transcript at the beginning of your script and stop it at the end. So I tried it.
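The basic usage looks something like this (the path is hypothetical, and as I discovered, what actually ends up in the transcript depends on the stream):

Start-Transcript -Path "C:\logs\my-task.log"

Write-Output "This will show up in the transcript."
Write-Verbose "This, at least in my experience, will not."

Stop-Transcript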

It was…okay. Once I figured out that it doesn’t automatically create missing directories in the path supplied (it doesn’t fail, it just does everything really, really slowly), that permissions can really ruin its day (same sort of thing, no failures, just super slow while it fails all of its writes), and that if you have square braces in your path it fails miserably with the least helpful error message ever, it at least created a record of what was happening. Sort of.

Okay, it was terrible and it made me sad.

Apart from the weird behaviour with directories/permissions/braces, transcribing does not record the Verbose or Debug streams. This is kind of annoying, because that’s where I put the majority of my log statements (Output is bad because it messes up stream processing and return values and Host is just generally not advised). I believe later versions of transcribing can record additional streams, but I didn’t look into it any further, because I was already annoyed with the feature.

I also hated the output it created. If I’m looking at a log file, at minimum I want timestamps, severity and messages. For a language/framework with a mature logging tool (like .NET with log4net or Serilog) I expect a lot more, but for a simple log file, those three things I listed are enough.

Time to try an alternate approach. I hesitate to say better, but honestly, I was pretty unimpressed with transcription, so it seemed very possible.

Redirection

Powershell offers the ability to redirect the various available streams, using the standard Windows command line redirection, which seems like it would be the easiest way to create a log file. Simply redirect the streams you’re interested in to a file (pretty much all of them) and Bob’s your uncle.
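As a sketch, redirection within Powershell looks something like this (streams 1 through 5 are Output, Error, Warning, Verbose and Debug respectively; the paths and script name are hypothetical):

# Redirect just the Warning stream (3) to a file, appending.
Write-Warning "something looks off" 3>> "C:\logs\warnings.log"

# Redirect everything a command produces to a single file.
.\Do-Stuff.ps1 *>> "C:\logs\everything.log"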

In my case, I had a lot of issues getting this to work from a Windows Scheduled Task. It turns out that one of my mistakes was assuming you could redirect the output from Powershell.exe itself, which you don’t seem to be able to do. Instead, the redirection appears to be primarily for commands within Powershell.

I tried wrapping my Powershell in more Powershell (which actually did the redirection, as well as setting up the file name via the current timestamp and ensuring the appropriate directories existed), and while it seemed to work, it still wasn’t good enough. No timestamps on the lines in the file left me with no real information about the time that elapsed during the script. Additionally, the syntax for getting the redirection to work properly was a bit strange, with the redirect-all syntax seemingly not working as expected (9>&1 didn’t seem to work, and neither did *>&1).
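A minimal sketch of that wrapper idea, with hypothetical paths, looks something like this:

# Outer script: build a timestamped log file name, make sure the directory
# exists, then run the real script with everything redirected to the file.
$logsDirectory = "C:\logs\my-task"
if (-not (Test-Path $logsDirectory)) { New-Item -Type Directory -Path $logsDirectory | Out-Null }

$timestamp = [DateTime]::Now.ToString("yyyyMMddHHmmss")
& "C:\scripts\actual-work.ps1" *> "$logsDirectory\$timestamp.log"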

It was still better than transcribing though.

It was time to go deeper.

Interception

Powershell is an interesting language. Coming from a heavy .NET/C# background, I was initially confused by Powershell and its untyped variables and extremely loose, interpreted syntax. I still definitely prefer C#, but Powershell has its place in my environment, though only when it’s supported by tests written using Pester.

One of the wonderful things Powershell lets you do is function overriding.

You can create an implementation of a function, and have that implementation be used within a limited scope, instead of the default implementation.
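As a trivial, hypothetical sketch:

# Within the scope this is defined in, calls to Get-Date hit our function
# instead of the built-in cmdlet.
function Get-Date
{
    return [DateTime]::MinValue
}

Get-Date # Returns 0001-01-01 00:00:00 instead of the current time.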

You might be able to see where I’m going with this.

All of my logging was already in place via a combination of Write-Verbose, Write-Debug and Write-Warning (and the other write functions as well, just more rarely). All I needed to do was intercept those calls and write the messages to some file instead of whatever the default did.

It was easy enough to implement a file containing implementations of all of the Write-X cmdlets, and redirect the output to a file. I could easily add timestamps and any other information I wanted, and could control the output format, allowing it to be easily parsed by something else (like Logstash). To use the feature you simply set up a global variable containing the directory to put the log files into and then dot-source the file.
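Usage ends up looking something like this (the script name and paths are hypothetical):

# Set the directory the log files should go into, then dot-source the overrides.
$logsDirectoryPath = "C:\logs\my-task"
. "C:\scripts\write-overrides.ps1"

# From here on, this ends up in the log file, stamped and tagged with severity.
Write-Debug "Beginning the thing."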

The only downside to this was that any function I overwrote in this way no longer worked normally (which is kind of obvious), so during my testing I broke my IDE a number of times, which made development a bit harder than expected. Luckily, I ran across a StackOverflow answer from someone much smarter than me, which showed how to wrap the existing implementations instead of simply replacing them.

function Get-TimestampForLogFileName
{
    return [DateTime]::Now.ToString('yyyyMMdd');
}

function Get-CurrentLogFile
{
    if ([string]::IsNullOrEmpty($logsDirectoryPath)) { throw "logsDirectoryPath variable not set. That's bad; it's used to create log files for Powershell output." }

    $timestamp = Get-TimestampForLogFileName

    # One log file per day, named for the date.
    $logFilePath = "$logsDirectoryPath\$timestamp.log";
    if (-not (Test-Path $logFilePath))
    {
        $logFile = New-Item -Type File -Path $logFilePath -Force;
        # File creation can lag (slow disks, network shares), so wait for the
        # file to actually exist, giving up after a few seconds.
        $max = 5
        $current = 0
        while (-not ($logFile.Exists))
        {
            Start-Sleep -Seconds 1
            $logFile.Refresh()
            $current++
            if ($current -gt $max) { break }
        }
    }

    return $logFilePath
}

function Get-TimestampForLogContent
{
    return [DateTime]::Now.ToString('yyyyMMddHHmmss');
}

function Write-Debug 
{
    [CmdletBinding()]
    param
    (
       [Parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true)]
       [AllowEmptyString()]
       [System.String]${Message}
    )

    begin 
    {
       try 
       {
           $outBuffer = $null
           if ($PSBoundParameters.TryGetValue('OutBuffer', [ref]$outBuffer))
           {
               $PSBoundParameters['OutBuffer'] = 1
           }
           # Resolve the real Write-Debug cmdlet (not this wrapper) and set up
           # a steppable pipeline so every call still flows through to it.
           $wrappedCmd = $ExecutionContext.InvokeCommand.GetCommand('Write-Debug', [System.Management.Automation.CommandTypes]::Cmdlet)
           $scriptCmd = {& $wrappedCmd @PSBoundParameters }
           $steppablePipeline = $scriptCmd.GetSteppablePipeline($myInvocation.CommandOrigin)
           $steppablePipeline.Begin($PSCmdlet)
       } 
       catch 
       {
           throw
       }
    }

    process 
    {
       try 
       {
            # Write our own log line (timestamp:severity:message), then hand the
            # message off to the real cmdlet so normal behaviour is preserved.
            $logFilePath = Get-CurrentLogFile;
            Add-Content -Path $logFilePath "$(Get-TimestampForLogContent):DEBUG:$message" -Force;
            $steppablePipeline.Process($_)
       } 
       catch 
       {
           throw
       }
    }

    end 
    {
       try 
       {
           $steppablePipeline.End()
       } catch 
       {
           throw
       }
    }
}

I’ve only included the Write-Debug implementation, but all of the other ones look the same.
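The only things that change between them are the cmdlet passed to GetCommand and the severity tag in the log line; the Write-Warning version, for example, differs only in these two lines:

$wrappedCmd = $ExecutionContext.InvokeCommand.GetCommand('Write-Warning', [System.Management.Automation.CommandTypes]::Cmdlet)
Add-Content -Path $logFilePath "$(Get-TimestampForLogContent):WARNING:$message" -Force;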

Finally I had a logging implementation that didn’t make me sad. It wrote reliably to a file, and I could easily use it from within a Windows Scheduled Task (once I started creating an actual .ps1 file for the scheduled task to execute, that is).

Conclusion

The logging solution that I ended up putting into place isn’t perfect, but it seems to be a hell of a lot better than the other two approaches that I tried.

It outputs in a format that people are used to seeing, and is easily parsable by both machines and people. If I wanted to, I could import the log information into ELK using Logstash, which would let me keep track of the execution of the scheduled tasks remotely. Honestly that’s probably something I’ll put into place eventually, just not right now.

It was nice to be able to wrap functionality provided by something else without the original code having been written to support that in the first place. Coming from C#, I typically code to interfaces, so I’m used to being able to replace things as necessary, but that requires effort up front. Here I simply replaced some system-provided cmdlets and made them do something slightly different based on my requirements. I can’t count the number of times I’ve wanted the ability to do that sort of thing with System.Diagnostics.Trace or Console or DateTime (usually for testing purposes).

Having that sort of power is both amazing and dangerous (amazingly dangerous?). It’s very easy to completely break other, unrelated things when you are essentially replacing the implementation of a global static function.

Ah well, with great power, comes great responsibility I suppose.

Wouldn’t want Uncle Bob to be killed by some rogue code in a tragically ironic accident.