Using Azure Log Analytics to retrieve logs for Report-Only Conditional Access Policies

I’ve recently been working on reviewing conditional access policies in Azure AD. Thankfully, this process has become much easier than it was in the early days, with the introduction of Azure Monitor and report-only conditional access policies, which let you properly pilot a configuration before going live.

I needed to grab an export of all sign-ins that were failing a particular report-only policy that was set up to block legacy authentication. This led me down the path of Azure Monitor and writing my first KQL query.

Note that this process depends on having set up streaming of Azure AD logs into Azure Monitor.

This KQL query grabs all sign-ins that have failed a report-only conditional access policy, and outputs the sign-in data alongside information about the policy in question:



Here’s the KQL query code:
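The embedded query didn’t survive here, so below is a minimal reconstruction of the kind of query involved. It assumes your Azure AD sign-in logs stream into the standard SigninLogs table; column names may differ slightly in your workspace:

```kql
SigninLogs
// Each sign-in carries an array with one entry per CA policy in the tenant
| mv-expand ConditionalAccessPolicies
// Keep only sign-ins that failed a report-only policy
| where ConditionalAccessPolicies.result == "reportOnlyFailure"
// Keep just the columns we care about
| project TimeGenerated, UserPrincipalName, AppDisplayName, ClientAppUsed,
          PolicyName = tostring(ConditionalAccessPolicies.displayName)
```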

To explain what the query does:

  1. Retrieves all sign-in logs
  2. Uses the mv-expand operator to expand the ConditionalAccessPolicies collection that’s included along with each sign-in’s data. The collection contains one object per conditional access policy in the Azure AD environment
  3. Narrows down the list to only sign-ins where the result of a policy was a “reportOnlyFailure”
  4. Uses the ‘project’ operator to retrieve only the data we’re interested in

From here, you can export the data to CSV and work your magic with it.
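If you’d rather skip the portal, the same query can be run and exported from PowerShell. This is only a sketch using the Az.OperationalInsights module; the workspace ID is a placeholder, and you’ll need an authenticated session (Connect-AzAccount) first:

```powershell
# Sketch: run a Log Analytics query and export the results to CSV.
# Requires the Az.OperationalInsights module; '<workspace-guid>' is a placeholder.
$query = @'
SigninLogs
| mv-expand ConditionalAccessPolicies
| where ConditionalAccessPolicies.result == "reportOnlyFailure"
'@

$result = Invoke-AzOperationalInsightsQuery -WorkspaceId '<workspace-guid>' -Query $query
$result.Results | Export-Csv -Path .\report-only-failures.csv -NoTypeInformation
```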

Default Domain Policy GPOs

Every now and then I find myself wishing I had a documented copy of a clean Default Domain Policy GPO and Default Domain Controllers GPO lying around for reference.

I was setting up a Server 2016 AD lab in Azure today and thought I’d take the opportunity to save a copy of the GPO reports in HTML and PDF format while I was at it. Here they are, in case anybody’s interested:

  • Default Domain Policy [HTML] [PDF]
  • Default Domain Controllers Policy [HTML] [PDF]


Fix Calculator in Windows 10: “You’ll need a new app to open this calculator”

Microsoft had to go and reinvent the wheel, replacing good ol’ calc.exe in Windows 10. I can see why they did it – to make it touch-friendly.

I’ve seen an error that prevents the new calculator app from even loading in the first place. I experienced that error on my own machine after clearing out my Windows profile and logging on fresh:


I can’t work without a calculator app (I use it all the time), so I had to set off and find a solution. There are all sorts of involved solutions out there, but what worked for me was as simple as a few lines of PowerShell (run as an admin):

Get-AppxPackage -Name Microsoft.WindowsCalculator | Remove-AppxPackage

# Re-register from the package’s install location (piping straight to Add-AppxPackage won’t reinstall a removed package)
Get-AppxPackage -AllUsers -Name Microsoft.WindowsCalculator |
    ForEach-Object { Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppxManifest.xml" }

After that, my calculator app started working again:

Screenshot of the Calculator app opening successfully

Note: This post has been sitting in my drafts folder for almost a year, waiting for an additional screenshot. I decided to publish it today, but there may now be better solutions to this problem.

Using Azure Blob Storage as a highly-available CDP and AIA location for your internal PKI

I inherited a Windows PKI setup that had the Root CA installed on a Windows Server 2008 R2 Domain Controller, with the root certificate signed with a SHA1 hash. That DC was in the process of being decommissioned, and I also wanted to move to a better PKI design.

I’d previously set up 2-tier Windows PKI infrastructures with offline Root CAs, so I knew that this was the route I was going to take again (note that this is for an SMB environment).

There are plenty of good guides on configuring a 2-tier Windows PKI. In my opinion the best of the crop at the time of writing is probably Timothy Gruber’s 7-part guide to deploying a PKI on Windows Server 2016.

I would, however, highly recommend reading up on the topic before blindly following a guide. PKI is a complex topic, and you want to make the correct decisions up-front to avoid issues later on. Some additional recommended reading:

There are many recommendations around where to publish/advertise the AIA and CDP. Some of these include:

  • In the default location – LDAP and locally via HTTP on the CA server
  • To an internally-hosted web server, and then reverse-proxy connections from the Internet
  • To an externally-hosted web server

I’d already used Azure Blob Storage to store some other small files, so I thought I’d see whether it could also be used for AIA and CDP storage. As it turns out, it’s quite easy to do, and you don’t even need to mess around with double-escaping as you would if you hosted on IIS or an Azure Web App:

TL;DR: The CA saves the CRL files to the default location of C:\Windows\System32\CertSrv\CertEnroll, and AzCopy then copies them up to an Azure Blob Storage account that’s configured with a custom domain of

Here are the requirements to get this all set up:

  1. CDP and AIA on Enterprise/issuing CA configured to save to the default C: location, and also advertise availability at
  2. AzCopy installed on the Enterprise CA
  3. Allow outbound HTTPS/443 from the Enterprise CA to Azure Blob Storage
  4. An Azure Storage Account with blob storage configured for HTTP access. I’d recommend at least Zone Redundant Storage for availability.
  5. A custom domain name for the above storage account
  6. A folder in the blob storage named ‘pki’ (not necessary, but you’ll need to adjust the script if you don’t use this folder)
  7. A SAS key with read/write/change access to blob storage only (don’t assign more access than necessary)
  8. A scheduled task running hourly as NETWORK SERVICE to call the below PowerShell script
  9. Ensure that NETWORK SERVICE has modify permissions to the log location (default is %ProgramData%\ScriptLogs\Invoke-UpdateAzureBlobPKIStorage.log)

You’ll need to manually copy your offline root CA certificate and CRL to the blob storage location. This script is designed to copy the much more frequent CRLs and Delta CRLs from your Enterprise CA to blob storage.

As it turns out, AzCopy is perfect for this because it supports the /XO (exclude older) parameter to copy only new or changed files. That allows us to schedule the script to run hourly without incurring additional data transfer costs for files that already exist in the storage account.

I wrote a PowerShell script that does the following:

  1. Checks that AzCopy is installed
  2. Determines if the C:\Windows\System32\CertSrv\CertEnroll folder exists
  3. Copies only changed files with extension .CRL to the blob storage account
  4. Logs successful and failed transfers to %ProgramData%\ScriptLogs\Invoke-UpdateAzureBlobPKIStorage.log
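The full script is in the repo; the heart of it looks roughly like the sketch below. The AzCopy syntax shown is the older 8.x style, and the storage URL and SAS token are placeholders you’d substitute with your own:

```powershell
# Sketch only: placeholder values and minimal error handling
$azCopy = "${env:ProgramFiles(x86)}\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"
$source = 'C:\Windows\System32\CertSrv\CertEnroll'
$dest   = 'https://<your-account>.blob.core.windows.net/pki'   # placeholder
$sas    = '<sas-token-with-rwc-blob-access>'                   # placeholder
$log    = Join-Path $env:ProgramData 'ScriptLogs\Invoke-UpdateAzureBlobPKIStorage.log'

if (-not (Test-Path $azCopy)) { throw 'AzCopy does not appear to be installed.' }
if (-not (Test-Path $source)) { throw "CertEnroll folder not found at $source." }

# /XO = exclude older; only new or changed .crl files are uploaded
& $azCopy /Source:$source /Dest:$dest /DestSAS:$sas /Pattern:'*.crl' /XO /Y /V:$log
```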

You can find the script in my GitHub repo here:


Use pkiview.msc on a domain-joined machine to check the status of your CDP and AIA


Generating a SAS with least-privilege for AzCopy to use. Note that you’ll need to set Allowed Protocols to HTTPS and HTTP, not HTTPS only


The script’s archive log, showing the successful transfer of the CRL and Delta CRL

As always, use this at your own risk and your mileage may vary. Please drop me a comment below if you have any questions, feedback, or run into issues with the script.

Using your service desk system to track and schedule important & security-related tasks

Most IT departments have some type of service desk system in place, but are they using it for more than just the basic support scenarios and change control?

Any modern service desk system should also be able to schedule tickets and change requests, and perhaps even perform more advanced workflow functions.

I’m using the excellent Freshservice SaaS app, and I’ve recently been taking advantage of the scheduling and workflow features to automatically generate tickets to:

Moving these types of tasks out of the minds and calendars of individual staff is important. It ensures that these sometimes critical actions continue regardless of staff turnover.


Another benefit is that within each scheduled ticket you can include clear written instructions on how to carry out the task. You also gain a long-term audit trail and notes for each time the task was carried out.

One final related note – you could also look into pointing your email security and other notifications to the service desk if you aren’t already doing so. Again, you’ll get a clear owner for each outstanding task, an audit trail of what was done, and you can assign priorities and SLAs. For example:

  • Email administrator notifications (quarantine notifications, content notifications, etc)
  • Print device consumable alerts

Let me know what you think in the comments below. Do you have any additional useful tips?

Fix: Can’t install iManage FileSite 64-bit due to installer complaining about mismatched ‘bitness’

Testing FileSite 64-bit, I ran into an issue on my own PC. I had 64-bit Office 2016 installed, but the FileSite installer refused to continue and presented me with the following message:

Dialog box: iManage Work FileSite (x64) requires that your computer has matching bitness with all Microsoft Office products as well as any other iManage Desktop clients. Aborting Installation...

In an attempt to locate the cause of the issue, I fired up the trusty Sysinternals Process Monitor, and set up a filter to capture activity from msiexec.exe. I then further refined that filter to capture only RegQueryValue operations, and re-ran the installer.

Sure enough, Process Monitor picked up some instances of the installer reading from the registry to determine the ‘bitness’ of Office and other iManage products. In my case, there was a lingering registry entry that led the installer to conclude that I still had the 32-bit version of FileSite installed:

A screenshot of Process Monitor, showing a registry value at HKLM\Software\WOW6432Node\Interwoven\Worksite\ClientCommon\InstallRoot\bitness with a value of "X86"

Because I didn’t have any iManage products installed at the time, it was safe for me to delete the entire HKLM\SOFTWARE\WOW6432Node\Interwoven reg key.

The installer then ran successfully after this. Thank goodness for Sysinternals by Mark Russinovich.

Internet Explorer’s dangerous default behaviour when a PAC/WPAD file directs the browser to BYPASS the proxy

Today I became aware of this interesting/potentially dangerous default behaviour in Internet Explorer when you use a proxy configuration PAC/WPAD file. Yes, I know that WPAD is a bad idea for other reasons, too.

To quote the IEInternals blog: “One sometimes surprising aspect of proxy scripts is that they impact the Internet Explorer Security Zone determination… if a proxy script is in use and returns DIRECT, the target site will be mapped to the Local Intranet Zone.”

This is a non-issue if your PAC file only bypasses the proxy server for internal sites, but if you need to bypass the proxy for an external site for some reason, that site suddenly runs outside of Protected Mode, without the protections that the default Internet Zone settings offer.

Screenshot of a PAC/WPAD file showing the FindProxyForURL function with a single example condition to bypass the proxy for a particular site. In this case, the code returns the string "DIRECT" if the URL matches the site’s pattern.
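To make the screenshot’s logic concrete, here’s a minimal illustrative PAC file. The hostnames are placeholders of my own, and real PAC files often use helpers like shExpMatch() for wildcard matching:

```javascript
// Illustrative only: hostnames are placeholders.
function FindProxyForURL(url, host) {
    // Bypass the proxy for one (hypothetically external) site...
    if (host === "partner.example.com") {
        return "DIRECT"; // ...which IE then maps to the Local Intranet zone!
    }
    // Everything else goes via the proxy
    return "PROXY proxy.example.com:8080";
}
```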

Here’s a test with the settings in the default state, and the PAC file instructing all HTTPS traffic to BYPASS the proxy:

Screenshot of Internet Explorer, with File > Properties showing that the current zone is "Local Intranet"

The solution to this is to ensure that the following box is un-checked.

Screenshot of the dialog box that appears in Internet Explorer when you go to Internet Options > Security (tab) > Local Intranet > Sites (button). Showing the "Include all sites that bypass the proxy server" option is currently checked/ticked

This setting can be found in Internet Explorer under Internet Options > Security (tab) > Local Intranet > Sites (button)

In a corporate environment, you can disable this “feature” via GPO, under Computer/User Configuration > Policies > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page > Intranet Sites: Include all sites that bypass the proxy server

Disabling via GPO will result in the checkbox being greyed out:

Screenshot of the dialog box that appears in Internet Explorer when you go to Internet Options > Security (tab) > Local Intranet > Sites (button). Showing the "Include all sites that bypass the proxy server" option is currently greyed out due to the GPO that has been put in place

Another test run after making the above changes, showing the correct zone assignment:

Screenshot of Internet Explorer, with File > Properties showing that the current zone is "Internet", and Protected Mode is ON

Post-publishing footnote:

I discovered that you also need to ensure that Automatically detect intranet network is not checked.

Screenshot of the dialog box that appears in Internet Explorer when you go to Internet Options > Security (tab) > Local Intranet > Sites (button). Showing that 'Automatically detect intranet network' and 'Include all sites that bypass the proxy server' are greyed out and un-checked

This can be achieved via GPO under Computer/User Configuration > Policies > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page > Turn on automatic detection of intranet (set to disabled)
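If you want to verify or set these outside of GPO, both checkboxes appear to be backed by values under the ZoneMap policy key. This is an assumption based on how these IE zone policies are typically stored, so confirm the value names against a GPO-applied machine before relying on it:

```powershell
# Assumption: ProxyBypass and AutoDetect are the registry values behind the
# two checkboxes; verify in your own environment first.
$zoneMap = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap'
if (-not (Test-Path $zoneMap)) { New-Item -Path $zoneMap -Force | Out-Null }

# 0 = don't include proxy-bypassed sites in the Local Intranet zone
Set-ItemProperty -Path $zoneMap -Name ProxyBypass -Value 0 -Type DWord
# 0 = don't automatically detect the intranet network
Set-ItemProperty -Path $zoneMap -Name AutoDetect -Value 0 -Type DWord
```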

Resolving all Group Policy Preferences Variables

On the odd occasion that I need to use variables within Group Policy Preferences, I sometimes find myself wishing that there was a blog post that lists out exactly what the variables resolve to.

For example, does the %ProgramFilesDir% value include a trailing backslash? Or do I need to include one myself?

Sure, you can press F3 to bring up the list of variables, but it doesn’t provide example values:

Group Policy Preferences

I decided to use Group Policy Preferences itself to generate a list of the variables and their values. This was achieved through the INI file extension:

Screenshot of the Group Policy Editor, showing the rows of preference items in the INI Files section

I’ve exported these preference items to XML, so you can import them into a fresh GPO and test for yourself. Get the files here.

Screenshot of the Group Policy Management Console, showing where to drag the XML files in order to import them into the INI Files GPP area

I couldn’t get the User preferences extension to generate an INI file, and ran out of time to troubleshoot, but here are all the variables pertaining to a Computer policy (I’ve obfuscated some values):

A table showing all of the GPP variables, and their values

Apologies for the image-based table; my blogging platform doesn’t make inserting tables particularly easy.

Automatically Create 40 Event Viewer Custom Views

I still find Custom Views useful when troubleshooting on individual workstations, and I’d recently been wondering if it was possible to push them out via GPP or similar. I started creating some views manually, as a test, but it was taking too long.

I’d recently been working on implementing Palantir’s WEF/WEC setup, and wondered whether I could leverage their legwork to automate the creation of these custom views.

Writing the script took a fraction of the time that the manual method would have. It does the following:

  1. Downloads the Palantir ‘windows-event-forwarding’ repo in ZIP format into a temporary folder
  2. Extracts the Event Log query out of each file in the ‘wef-subscriptions’ folder, and
    turns it into an appropriately-named custom Event Viewer view (XML) file in %PROGRAMDATA%\Microsoft\Event Viewer\Views
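Step 2 boils down to wrapping each extracted query in Event Viewer’s saved-view XML format. Here’s a rough, standalone sketch; the view name and event query are my own examples, not Palantir’s:

```powershell
# Sketch: write one custom view file; run elevated to write to %ProgramData%
$viewDir = Join-Path $env:ProgramData 'Microsoft\Event Viewer\Views'
$name    = 'Example - Account Lockouts (4740)'   # hypothetical view name
$query   = '<QueryList><Query Id="0"><Select Path="Security">*[System[(EventID=4740)]]</Select></Query></QueryList>'

# ViewerConfig is the wrapper format Event Viewer expects for saved views
$xml = @"
<ViewerConfig>
  <QueryConfig>
    <QueryNode>
      <Name>$name</Name>
      $query
    </QueryNode>
  </QueryConfig>
</ViewerConfig>
"@

if (-not (Test-Path $viewDir)) { New-Item -Path $viewDir -ItemType Directory -Force | Out-Null }
Set-Content -Path (Join-Path $viewDir 'Example-AccountLockouts.xml') -Value $xml -Encoding UTF8
```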

Screenshot of Event Viewer, showing the automatically-created custom views

I love how simple PowerShell makes it to work with XML.

The script needs to be run as an admin in order to create the view files in %PROGRAMDATA%, unless you change the output path in the $templateStoragePath variable. It’ll also need to be able to connect to the Internet to download the ZIP file from GitHub.

I’ve started storing my scripts in my PowerShell GitHub repo rather than as GitHub Gists, and repo files are harder to embed here. View the code via the link below:

Mitigate commodity malware attacks with Windows Firewall rules

There’s so much that can be done with the built-in Windows tools to prevent commodity malware or ransomware attacks before you even spend a cent on third-party tools. All of these things can (and should) be combined to create a good multi-layered strategy:

The last point has been on my to-do list for some time now. I was again reminded of it the other day while watching Sami Laiho’s recent Microsoft Ignite session about PAWs.

A lot of email-delivered malware begins with a macro or a DDE attack, and then attempts to connect to the Internet to pull down more nasties.

Today I came across this great blog post by Branden, in which he describes a handy method to prevent applications from communicating with hosts out on the Internet, while still allowing them to communicate within the internal network.

I set about manually creating a list of outbound firewall rules, including a whole bunch to mitigate the application whitelisting bypasses highlighted by the brilliant Casey Smith here. Doing this via the GUI is painful, and I wouldn’t wish it on anybody:

A listing of outbound firewall rules created in Windows Firewall with Advanced Security

Here’s a screenshot of PowerShell connecting to the web, before putting the firewall rule in place:

A PowerShell prompt, running Invoke-WebRequest to, and showing a successful request

And here’s one taken after I enabled the firewall rule, with the same request now being blocked:

But PowerShell can still connect to an internal web server:

A PowerShell prompt, running Invoke-WebRequest against an internal HTTP server. Showing a successful response

There are obviously going to be exceptions to these rules, for example to enable your IT staff to access Azure AD or other cloud-based services via PowerShell, but those things should be done from dedicated administrative hosts anyway. This ruleset is more for the general user population.

When the time came to think about sharing this ruleset here on my blog, I discovered that it’s possible to export the rules from the registry and re-import them elsewhere; however, that approach has its own potential issues.

I instead created the following PowerShell script that will generate all of the appropriate rules using the New-NetFirewallRule cmdlet. It’s also much easier to review this script to see what it does, rather than read a registry export file.
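The script is linked below; each rule it creates boils down to a call like the one in this sketch. It assumes the built-in 'Internet' remote-address keyword is accepted on your build – if not, you’d substitute explicit address ranges that exclude your internal subnets. The binary list here is illustrative, not the full ruleset:

```powershell
# Sketch: block outbound Internet access for commonly-abused binaries,
# while leaving internal/local-subnet traffic alone. Paths are illustrative.
$binaries = @(
    "$env:SystemRoot\System32\WindowsPowerShell\v1.0\powershell.exe",
    "$env:SystemRoot\SysWOW64\WindowsPowerShell\v1.0\powershell.exe",
    "$env:SystemRoot\System32\mshta.exe"
)

foreach ($exe in $binaries) {
    New-NetFirewallRule -DisplayName "Block Internet - $(Split-Path $exe -Leaf)" `
        -Direction Outbound -Action Block -Program $exe `
        -RemoteAddress Internet -Profile Any -Enabled True
}
```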

You could extend this script to apply the rules directly to the appropriate GPO by using the -GPOSession parameter on the New-NetFirewallRule cmdlet.

As usual, run at your own risk, and test thoroughly before deploying:

The embedded GitHub Gist doesn’t show up on mobile devices. Here’s a direct link to the raw script file: