Export a list of Exchange Contacts via PowerShell

Today, I needed to export a list of all of our contacts’ email addresses. The easiest way to do this was with PowerShell from the Exchange Management Shell:

Get-MailContact | Select-Object -Property PrimarySmtpAddress | Out-File C:\temp\contacts.txt
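
Note that Out-File writes the formatted table, header included. If you only want the raw addresses, one per line, a variation like the following should also work (a sketch; -ResultSize Unlimited is only needed if you have more than the default 1000 contacts):

```powershell
Get-MailContact -ResultSize Unlimited |
    Select-Object -ExpandProperty PrimarySmtpAddress |
    Out-File C:\temp\contacts.txt
```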

PowerShell: Determine the external IP of a remote system

I needed a quick way to determine the external-facing IP of a remote system. As long as you’re not going through a proxy, you can use sites like whatismyip.com or type “my IP address” into Google. I wanted to achieve this without having to get onto the machine and fire up the web browser.

The following one-liner will query the realip.info API from a remote system, and return just the IP:

Invoke-Command -ComputerName <remote PC name> -ScriptBlock {(Invoke-RestMethod -Uri "http://www.realip.info/api/p/realip.php").IP}

PowerShell: Replacing Invalid Characters in a Filename

I’m currently writing a script that requires me to create folders based on the contents of a CSV-dump from SQL. One thing I need to do is check that each item I’m pulling out of the CSV will translate to a valid Windows folder name.

One easy way to do it is by using the .NET System.IO.Path.GetInvalidFileNameChars() method. This method returns a System.Char array containing all of the characters that can’t be used in a file or folder name. Here’s a sample of the output:

"
<
>
|
:
*
?
\
/

(plus a run of non-printable control characters, which show up as blank lines in the console)

Assuming that the string we wish to interrogate for invalid characters is stored in a variable called $text, we’d use the following one-liner to loop through all of the invalid characters and replace them with another of our choosing. In this case, I’m replacing them with a period.

[System.IO.Path]::GetInvalidFileNameChars() | % {$text = $text.replace($_,'.')}
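
For example, running the one-liner over a string that contains a few of the invalid characters (a quick sketch):

```powershell
$text = 'report: 2014|05?19'
[System.IO.Path]::GetInvalidFileNameChars() | ForEach-Object { $text = $text.Replace($_, '.') }
# $text is now 'report. 2014.05.19'
```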


Determine the logged-on User’s AD group membership in PowerShell

I came across this great little gold nugget of a 1-liner while reading this blog post on automating Outlook Profile creation, so all props go to Travis Runyard for this one.

([ADSISEARCHER]"samaccountname=$($env:USERNAME)").FindOne().Properties.memberof

To break it down, this is using the [ADSISEARCHER] type accelerator to create an instance of the DirectorySearcher class.

The string specified directly after the accelerator denotes the search filter, so in this case, we’ll only be searching for objects with a samaccountname attribute that matches the current user’s logon name.

There’s only ever going to be one object returned, so we use the FindOne method to return a single System.DirectoryServices.SearchResult object.

All that’s left after that is to get the contents of the “memberof” property on that object.

In his blog post, Travis goes one step further and uses a regex to remove the LDAP path elements like “CN=”, which leaves us with just the group names. Very smart!

([ADSISEARCHER]"samaccountname=$($env:USERNAME)").Findone().Properties.memberof -replace '^CN=([^,]+).+$','$1'

If we store the results of this search in a variable, for example $userGroups, we can then check if the user is a member of a certain group:

$userGroups = ([ADSISEARCHER]"samaccountname=$($env:USERNAME)").FindOne().Properties.memberof -replace '^CN=([^,]+).+$','$1'
$userGroups -contains "Colour Printing"

Alternatively, you could use comparison operators like -contains, -ccontains for a case-sensitive comparison, or even -notcontains.

([ADSISEARCHER]"samaccountname=$($env:USERNAME)").Findone().Properties.memberof -replace '^CN=([^,]+).+$','$1' -ccontains "Colour Printing"
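
To avoid repeating the query, the lookup can be wrapped in a small helper function. This is my own sketch (the function name is made up; it reuses the same ADSI query and regex as above):

```powershell
function Test-CurrentUserInGroup {
    param ([string]$GroupName)

    # Pull the current user's group names via ADSI, stripping the LDAP path elements
    $groups = ([ADSISEARCHER]"samaccountname=$($env:USERNAME)").FindOne().Properties.memberof -replace '^CN=([^,]+).+$','$1'
    return $groups -contains $GroupName
}

Test-CurrentUserInGroup "Colour Printing"
```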

Log to Loggly from your PowerShell Scripts

I’ve been pondering how to keep track of the results of the various PowerShell scripts on my network. I first considered setting up Windows Event Log forwarding, but I deemed that a bit too complex just to log some information from my scripts.

I then found Loggly, a cloud-based logging service that has an easy-to-use JSON API. At the time of writing, they have a “Lite” plan that’s free, and includes 200MB of logged data per day, with 7 days of retention.

Here’s a quick and dirty function that I can include in my scripts to log errors and other information to Loggly, where I can then search and filter based on hostname and the source script.

Note that if you use a proxy on your network, you can specify one via the -Proxy parameter of Invoke-WebRequest.

Here’s the function, which needs to be put at the top of your PowerShell script. Just remember to replace “CUSTOMER_TOKEN” with your actual customer token:

function LogToLoggly {
    param ($Message,$Type,$Exception)

    $logURI = "https://logs-01.loggly.com/bulk/CUSTOMER_TOKEN/tag/powershell"

    # If we don't specify a type via parameter, assume it's information
    if ($Type -eq $null) { $Type = "Information" }

    $jsonstream = @{
        "timestamp" = (Get-Date -Format s);
        "type" = $Type;
        "source" = $MyInvocation.ScriptName.Replace((Split-Path $MyInvocation.ScriptName),'').TrimStart('\');
        "hostname" = $env:COMPUTERNAME;
        "message" = $Message;
        "exception" = $Exception
    }

    $jsonstream | ConvertTo-Json | Invoke-WebRequest -Method Post -Uri $logURI
}

Here’s how you’d log an informational message:

LogToLoggly "This is a test log message"

And an error:

LogToLoggly "This is a test error message" "Error"

This is the JSON that gets sent to Loggly:

    "message":  "This is a test error message",
    "source":  "testing-1.ps1",
    "timestamp":  "2014-05-19T19:08:59",
    "exception":  null,
    "hostname":  "COMPUTER.LOCAL",
    "type":  "Error"


Reporting on Spiceworks tickets via SQLite and PowerShell

We’ve been using Spiceworks as our helpdesk ticketing solution for years, as it was way better than any commercial options back then, and it’s still doing the job well.

As we’re currently attempting to put more emphasis on support through the helpdesk, I thought I’d have a look at the built-in reports. While they’re good, none of them suited my requirement of emailing each helpdesk operator their open tickets on a schedule. I also have several other requirements to do with SLAs, but I’ve yet to implement solutions for those.
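
The general approach looks something like the sketch below: load the System.Data.SQLite ADO.NET provider, then query the Spiceworks database directly. The DLL path, database path, and the table/column names here are assumptions for illustration only:

```powershell
# Assumed location of the System.Data.SQLite ADO.NET provider DLL
Add-Type -Path "C:\tools\System.Data.SQLite.dll"

$connection = New-Object System.Data.SQLite.SQLiteConnection
# Assumed default location of the Spiceworks database
$connection.ConnectionString = "Data Source=C:\Program Files\Spiceworks\db\spiceworks_prod.db"
$connection.Open()

$command = $connection.CreateCommand()
# Table and column names are assumptions
$command.CommandText = "SELECT summary, assigned_to FROM tickets WHERE status = 'open'"

# Loop through the result set and print each open ticket
$reader = $command.ExecuteReader()
while ($reader.Read()) {
    Write-Host ("{0}: {1}" -f $reader["assigned_to"], $reader["summary"])
}
$connection.Close()
```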


Capturing standard EXE output in PowerShell

I’m sure there’s a better way of doing this, but here’s how I captured the output of SQL’s bcp.exe in order to email it and the CSV that we were creating automatically on a schedule overnight.

I ended up piping the executable to a temp file, and then grabbing the contents of that same temp file to populate the email body. Here are the lines in question:

&$exe $arg1 $arg2 $arg3 $arg4 $arg5 $arg6 > C:\temp\out.txt

$bcpOutput = Get-Content C:\temp\out.txt | Out-String
Remove-Item C:\temp\out.txt
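
As an aside, the temp file can be avoided entirely by capturing the command's output straight into a variable. Redirecting stream 2 into 1 also catches anything written to STDERR (a sketch; I haven't re-tested this against bcp itself):

```powershell
$bcpOutput = (& $exe $arg1 $arg2 $arg3 $arg4 $arg5 $arg6 2>&1) | Out-String
```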

Here’s the script in its entirety:

# This needs to be set, otherwise Send-MailMessage doesn't have a server to send through
$PSEmailServer = "mailserver.local"

$emailRecipient = "Daniel <daniel@contoso.com>"

# Some date and filename related stuff. This part's not important
$d = Get-Date
$yesterday = $d.AddDays(-1)
$filepath = "C:\Exports\SQL Export\"
$filename = "SQL-DailyExport-{0}.{1}.{2}.csv" -f $yesterday.Day,$yesterday.Month,$yesterday.Year
$fullpath = $filepath + $filename

# Path to the executable
$exe = "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\bcp.exe"

# arguments, as required
$arg1 = "DBName..ViewName"
$arg2 = "out"
$arg3 = $fullpath
$arg4 = "-c"
$arg5 = "-t,"
$arg6 = "-T"

# Run the executable with the arguments, and pipe the STDOut output to a text file
&$exe $arg1 $arg2 $arg3 $arg4 $arg5 $arg6 > C:\temp\out.txt

# An intro line for the body of the email
$bcpOutput = "SQL daily export process information: `r`n"

# Get the contents of the temporary file that we created earlier
$bcpOutput += Get-Content C:\temp\out.txt | Out-String

# Remove the temporary file
Remove-Item C:\temp\out.txt

# Build up our subject line. This part's not important
$subjectText = "SQL export for {0}/{1}/{2}" -f $yesterday.Day,$yesterday.Month,$yesterday.Year

# Append some more information to the end of the body text. Again, not important
$bcpOutput += "`r`nThe output file is attached, and can also be found in sqlExports"

# Send the email with the file attached and the body text as we've built it up
Send-MailMessage -From "SQL Export <Process@sql.server>" -To $emailRecipient -Subject $subjectText -Body $bcpOutput -Attachments $fullpath

This results in an email that contains the output from bcp.exe as well as the actual SQL export file attached to the email.

Using bcp.exe from a PowerShell script

Recently had to convert a batch file that calls bcp.exe to a more involved PowerShell script. The script exports the contents of a view to CSV format.

I originally had some issues getting the command line arguments to work, but here’s how I got it working:

$fullpath = "C:\temp\out.csv"

$exe = "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\bcp.exe"
$arg1 = "DBName..ViewName"
$arg2 = "out"
$arg3 = $fullpath
$arg4 = "-c"
$arg5 = "-t,"
$arg6 = "-T"

&$exe $arg1 $arg2 $arg3 $arg4 $arg5 $arg6
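
An alternative that avoids numbering each argument is to collect them in an array; when an array is passed to a native executable via the call operator, PowerShell hands each element across as a separate argument. A sketch of the same bcp call:

```powershell
$exe = "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\bcp.exe"

# Each array element becomes one command-line argument
$bcpArgs = @("DBName..ViewName", "out", $fullpath, "-c", "-t,", "-T")

& $exe $bcpArgs
```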

Batch convert images to B&W with PowerShell and ImageMagick

I knocked together this PowerShell script today to batch-convert 600+ staff photo images to B&W.

You’ll require the following items installed and in your path before this script will work:

  1. ImageMagick. I used the normal version, not one of the alternate ones.
  2. jhead: exif jpeg header manipulation tool. This requires ImageMagick in order to work.

ImageMagick does the actual conversion to B&W. I played with converting to pure grayscale, but it didn’t look good. This method instead strips out all colour information by setting saturation to zero:

convert.exe {sourcefile} -modulate 100,0 {destinationfile}

An issue I came across then is that ImageMagick doesn’t update the embedded JPG thumbnail. This issue almost stumped me, but then I came across this great little tool called jhead. Amongst other things, jhead can regenerate the JPG thumbnail (only if one existed originally):

jhead.exe -rgt {filename}

Tying it all together is PowerShell:

<#
.SYNOPSIS
    Generate B&W versions of images
    Daniel Streefkerk
    Version 1.0, 02/08/2012

.DESCRIPTION
    This script runs through a folder full of images and creates B&W versions of said images
    IMPORTANT NOTE: This script requires ImageMagick and jhead. For more information, see my blog

.PARAMETER RootFolder
    The folder within which to process images

.PARAMETER Recursive
    Also scan subfolders? This is enabled by default

.EXAMPLE
    Process all images in c:\temp
    .\Generate-BWImages.ps1 -RootFolder c:\temp
#>

param(
  [Parameter(Position=0,Mandatory=$true,ValueFromPipeline=$false,HelpMessage='The folder that contains the photos')]
  #[ValidateScript({Test-Path $_ -PathType 'Container'})]
  [string]$RootFolder,

  [Parameter(Position=1,Mandatory=$false,ValueFromPipeline=$false,HelpMessage='Recurse through all subfolders?')]
  [bool]$Recursive = $true
)


# Change these if necessary
$fileExtensions = "*.jpg"
$fileNameSuffix = "_bw" # the text to be appended to the file name to indicate that it has been modified

$files = $null;
$fileCount = 0

# Check if the root folder is a valid folder. If not, try again.
if ((Test-Path $RootFolder -PathType 'Container') -eq $false) {
    Write-Host "'$RootFolder' doesn't seem to be a valid folder. Please try again" -ForegroundColor Red
    return
}

# Get all image files in the folder
if ($Recursive) {
    $files = gci $RootFolder -Filter $fileExtensions -File -Recurse
} else {
    $files = gci $RootFolder -Filter $fileExtensions -File
}

# If there are no image files found, write out a message and quit
if ($files.Count -lt 1) {
    Write-Host "No image files with extension '$fileExtensions' were found in the folder '$RootFolder'" -ForegroundColor Red
    return
}

# Loop through each of the files and process it
foreach ($image in $files) {
    $newFilename = $image.DirectoryName + "\" + $image.BaseName + $fileNameSuffix + $image.Extension
    $imageFullname = $image.FullName

    Write-Host "Processing image: $imageFullname" -ForegroundColor Green
    & convert.exe $image.FullName -modulate "100,0" $newFilename

    Write-Host "Updating embedded thumbnail for: $newFilename" -ForegroundColor Green
    & jhead.exe -rgt $newFilename

    $fileCount++
}


Write-Host "$fileCount images processed" -ForegroundColor Yellow