When Google Met WikiLeaks

Very short post, this one, just to recommend a book I recently read: When Google Met WikiLeaks.

OR Book Going Rouge

I don’t read nearly as much as I should, or would like to.  I blame the fact that my commute to work is by car rather than public transport.  When I used to catch a bus to work I got through a fair few books but that abruptly stopped when my company bus service stopped and I was forced to drive in each day.  Now I get through a fair few BBC Radio 4 programmes each week instead, which isn’t necessarily a terrible trade-off!

What that means is that after getting this book as a Christmas present from my wife, it sat unread on my bookshelf for 18 months.  Recently though, I finally got around to reading it and would thoroughly recommend it to anyone else.  Essentially the book is a transcript of a conversation between Julian Assange and Eric Schmidt that, whilst illuminating in itself, is punctuated with commentary/opinion from Assange on what lies below the glossy marketing surface at Google – which, quite frankly, is pretty scary stuff.

Infosecurity Europe 2017

It’s been a while since I’ve managed to update with a post due to a combination of holidays and being busy both at home and work, but things have been happening all the same.

One of those things was the annual Infosecurity conference in London.  It ran from the 5th-7th June at Olympia and I was lucky enough to go for two days (5th-6th).  I hadn’t been to an InfoSec conference before but I would go again: the presentations tend to be quite light on technical detail, but the whole event is pretty decent for seeing current trends and opening your eyes to opportunities to do better.

I tend to walk around these events with a notebook, furiously jotting down notes (I prefer to stay analogue for my note taking), and here are those notes…

Main Themes

There always seem to be some main themes that permeate an event like this and, for me, it felt like these were the main contenders:

  • GDPR
  • A.I.-based threat detection and prevention
  • Security teams as a managed service
  • The need for in-house security skills (red/blue team mentality)

The bottom two go hand in hand with the concept being that you outsource the day-to-day security team operations of tracking alerts and events from IPS/IDS or SIEM solutions but make sure you also have security trained staff internally to build and architect your code/infrastructure.  The outsourced team will monitor what you have but it is your responsibility to make sure that what you have is decent in the first place.

The Sessions/Presentations

It was a packed two days where, in between meeting my suppliers, I also tried my best to go to as many sessions as I could.  Here’s a summary of my notes from those sessions.

Barbarians in the Throne Room

  • Presenter: Dave Lewis (Akamai)

This talk was about data breaches based upon analysis Dave had carried out over two data sets:

  1. Data breach disclosure notices in the public domain
  2. Akamai’s own data of patterns they’ve seen across their infrastructure

On the first item, Dave suggested that there wasn’t really a standard generic pattern to the breaches but the most common reasons were nothing more complex than:

  • Missing patches leading to compromise
  • People simply walking out of the door with data on a USB stick

If you’re cynical you could look at those and think it suggests that all the expensive products on display at the event aren’t really needed – you just need good practice on the fundamentals.  Of course that’s a fallacy based on such a narrow analysis, but interesting nonetheless.

On the second item, it was interesting to hear that Akamai’s own data seemed to suggest that 51% of web based attacks against them were good old SQL injection.  That lines up nicely with the OWASP Top 10 but does give an unexpectedly high weighting to that particular attack.

Dave gave some tips regarding protections throughout the talk and whilst I hope they would be generally obvious to most people, I know they often aren’t implemented in reality despite that understanding:

  • Use WAFs
  • Always encrypt data
  • Ensure that egress filtering is used to see what is going out of your network.  A common method he mentioned was the use of GRE tunnels to encapsulate IP traffic using common ports that many people allow outbound
  • It is important to have a strategy in place to protect DNS as it is key for a lot of malware to function (Cisco were pushing a product/service called Umbrella for this)

All in all, Dave was an interesting and charismatic speaker to listen to.

Securing the User

  • Presenters:
    • Jessica Barker (Independent Security Consultant)
    • Jonathan Kidd (Hargreaves Lansdown CISO)
    • Angela Sasse (RISCS)
    • Stephen Bonner (Deloitte)

This was a panel discussion about security culture within an organisation; some key messages from NCSC were:

  • A fundamental change of approach to passwords (see here).
  • The industry must move away from demonising users as the weakest link and think of them as the strongest link.
  • If security doesn’t work for people, it doesn’t work. (there’s an NCSC video here on this topic)

It was an interesting discussion, with Angela Sasse being by far the speaker with the most to say.  I liked the focus on user psychology with everyone referencing a book called Nudge, which is now on my wishlist, as well as some mention of the SANS Security Awareness Report.  I did also have a good chuckle when this xkcd comic was rolled out to illustrate the dangers of systems that alert users too much:


Risks, Threats & Adversaries: What (or Who) Should You Be Worried About?

  • Presenters:
    • Peter Wood (ISACA)
    • James Lyne (Sophos)
    • Rik Ferguson (Trend Micro)
    • Ian Levy from NCSC couldn’t attend due to election purdah rules

This was another panel discussion and I went along mainly for the draw of James Lyne & Rik Ferguson who I’ve seen speak before and found to be both informative and entertaining.

The discussion was very audience driven and focused on the current threat landscape. Of course, WannaCry was given a healthy amount of time for discussion.

There was an interesting note that the World Economic Forum’s global risk report now lists cyber-attacks and data theft as both likely and high-impact. The report can be seen here and it is interesting to see how these issues now line up alongside risks such as terror attacks and natural disasters.

On the topic of ransomware the key points were:

WannaCry has essentially broken the trust model that is needed for ransomware attacks so the expectation is that this type of attack will decrease.

Recommended protections were the old standards of:

  1. Have good backups:
    3 copies, 2 formats, and 1 backup copy should be air-gapped (i.e. on a tape in a safe)
  2. Limit access:
    Through user admin privilege management and network segmentation
  3. Patch!
    The emphasis shouldn’t be on zero-day vulnerabilities as it simply isn’t possible to patch instantly. The focus should instead be on vulnerability management: knowing your estate and what vulnerabilities are out there so you can manage your risk during your patch window.

James then went on to speak a bit about IoT with the key message being that IoT is not the future, it is now.  The gist of the point was that IoT is getting to the point where it is “on by default” which, in effect, is taking the security choice away from users and companies.  You can’t choose to avoid using IoT devices as rapidly everything will become IoT whether you are aware of it or not.

MFA and Beyond

  • Presenters:
    • Wendy Nather (Duo)
    • Sam Rigelsford (Dyson)

Sam went through his experiences implementing MFA at Dyson which, whilst interesting, wasn’t really that useful for me as I’m well aware of the process & pitfalls having implemented it for my company.  I’ll do a lab series at some point to show how to set that up in Azure.

What Wendy Nather had to say though, I found very interesting.  Wendy spoke about how increasing de-perimeterisation means that companies need to move towards a “zero-trust” model.

By this she suggested that companies need to move away from the concept of whitelisting internal users and run MFA as an always-on protection, regardless of location.  Coupled with this, she also suggested that companies focus on applying device hygiene restrictions before letting devices join the network.

On this concept of de-perimeterisation it was recommended to read about Google’s BeyondCorp strategy.

ZScaler & Nuage

This wasn’t a talk/presentation but just a sales pitch that caught my eye as interesting/useful, but as it was a sales pitch I’ll keep my notes brief.

Simply put, Nuage offer an SD-WAN (Software Defined WAN) product/service which you can then combine with ZScaler to provide a cloud based security overlay.  All pretty cool stuff that, I suppose, can help businesses move away from the old hub & spoke model to:

  • Commodity Internet breakout at each branch location.
  • Getting rid of the need for firewalls and proxy servers at branch sites as that will all be covered by the cloud based security overlay.

One for me to read up a bit more about.

Malware Red Alert: The First 24 Hours

  • Presenter: Steve Shepherd (7Safe)

This was another sales pitch kind of talk from Steve so I’ll keep these notes brief too.  The session essentially stepped through the CREST incident response process (which is very good):  Prepare, Respond, and Follow-up.

Steve’s headline tips/advice seemed to be:

  • You need to have a response partner lined up and ready to help you through an attack
  • Regardless of IPS/IDS or other tools you still need to go through a server inspection process following an incident, so it is important that you have a clear grasp of what you have out there
  • You need to consider your company’s risk profile:  Are you a worthy target?

Hand-to-Hand Combat With An Advanced Attacker

  • Presenters:
    • Zeki Turedi (Crowdstrike)
    • Dan Larson (Crowdstrike)

Whilst obviously this pushed Crowdstrike’s service it was the most technically in-depth of all the presentations I saw at the event.  It was also popular, the room was jam-packed to the level that I’m sure breached fire regulations – extra chairs were brought in to fill all the aisles 🙂

The session was essentially a summary of the attack trends Crowdstrike are seeing across the environments they manage based upon their data pool of around 30 billion events per day.

  • Their data shows 8/10 breaches were from a fileless attack (a Verizon report puts the figure more at 50/50).
  • In 2017 they have seen a large rise in malware to mine crypto currencies such as Monero.
    • Often not picked up by AV as it’s not really a virus
    • More profitable and reliable than ransomware – a slow steady income of $132/yr per machine
    • SMBv1 worm versions have been found recently which grind networks to a halt as CPU is drained.

The Monero piece particularly caught my interest as I wonder how many AV clients do pick it up?

They then went through the traditional cyber kill-chain that an attack follows whilst offering some counter-measure advice for each step – all a bit too much detail for me to do it justice so I’ll leave it at that.

Splunk 101

I went along to a Splunk 101 session based on curiosity having read about the product and also because it was name dropped so often in other presentations when people gave examples of network event visibility.

I don’t have any experience with Splunk but I kind of wish I could get some.  I do think the product looks really good but my only reservation is that if you don’t have a dedicated security team then I question how much it will actually be used.

Top tip from this presentation was that for anyone interested you can sign-up for a free trial and they also offer a free basic training course online.  I’m planning to partake so, if I do, I’ll write up what I learn on this blog.

How to be Employed at the SOC of Tomorrow… Today

  • Presenter: Ryan Kovar (Splunk)

Sticking with the Splunk theme the final session I went along to was very much an informal discussion where Ryan spoke of his experience in the industry to date before going on to give his opinion of the key skills teams will need in the future.

To summarise:

  • Ryan’s premise was that A.I. will soon eliminate the traditional tier 1 jobs such as log checkers or those who write forwarders.
  • His thought was that most people working in IT security who are in their 30s or above won’t have started in that sort of role, whereas those who are younger will have.  In some ways this is a disadvantage but in other ways breadth of experience can be an advantage if used properly.
  • Ryan’s assertion was that the explosion in the amount of available data means that the key skill is being able to use that data to influence at the exec level.  Therefore the skills needed to complement security awareness are:
    • BI/Data visualisation
    • Statistical analysis
  • Additionally he emphasised the need to know how to code, how to script in particular, with languages such as bash, Perl, and Python.  (He recommended Scipy lectures as a good online resource)

All just a matter of opinion, but very interesting nonetheless.

Azure AD Connect – Filtering

This post is part of a series, for the series contents see:

Nothing fancy in this post, I’ll just be following along with a Microsoft article to set up OU filtering when synchronising my on-premise directory up to AAD.

The default install of AAD Connect, which is exactly what I have, simply synchronises everything up to AAD.  From a functionality point of view that’s great – things just work – but in reality everyone is going to want to apply some filters to things that have no place, or use, in the cloud:

  • Filtering out local service accounts
  • Some admin accounts
  • Legacy accounts
  • Accounts for third-parties to whom you’re not going to provide any cloud related services

There are also different ways you can filter:

  • AD Attributes:  You might have a starters and leavers process that tags new users, via an AD attribute, as requiring a cloud directory presence.
  • Groups: This is probably more appropriate for filtering out local (by that I mean: on-premise AD) admin groups that have no use in the cloud.
  • OU: To me, this is by far the most sensible method.  It’s always good practice to have a well organised OU structure anyway, and this method takes advantage of that. Also, in comparison with the other methods it’s nice and transparent. When troubleshooting why an account isn’t synchronising up to AAD you’re much more likely to spot a difference of OU than you are that an account is missing an AD attribute you’d only spot via ADSI Edit.

High-level Process

  1. Disable the scheduled synchronisation job before you start work
  2. Make sure you have the correct rights
  3. Edit the filters (I’ll be using OU filtering)
  4. Run a full import on the on-premise AD connector
  5. Run a delta synchronisation to see the changes the filters will cause
  6. Run an export on the AAD connector to apply those changes to AAD
  7. Re-enable the scheduled synchronisation job

Step 1 – Disable Scheduled Synchronisation

This is a simple PowerShell one-liner run on the AAD Connect box:

#Disable the sync job
Set-ADSyncScheduler -SyncCycleEnabled $False
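Before moving on, it’s worth reading the scheduler state back to make sure the change actually took.  A minimal sketch using the same ADSync module that ships with AAD Connect:

```powershell
# Confirm the scheduler is paused before editing any filters
Get-ADSyncScheduler | Select-Object SyncCycleEnabled, NextSyncCycleStartTimeInUTC
```

SyncCycleEnabled should now read False; if it doesn’t, check you ran the Set cmdlet from an elevated prompt on the AAD Connect box.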

Step 2 – Permissions

Before you can edit or set up filters, your account will need the rights to do so.  This is as simple as making sure it’s a member of the “ADSyncAdmins” local group on the AAD Connect box:

AAD Connect Synch Admins
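If you’d rather not click through Computer Management, the same group change can be made from an elevated prompt.  A sketch only – swap in your own domain and username (“IRANKON\irankon” here is just my lab account):

```powershell
# Add your account to the local ADSyncAdmins group on the AAD Connect box
net localgroup ADSyncAdmins "IRANKON\irankon" /add

# Check the membership took
net localgroup ADSyncAdmins
```

You’ll need to log off and back on before the new group membership applies to your session.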

Step 3 – Editing the Filters

First up you need to open up the Synchronisation Service Manager on the AAD Connect box and go to “Connectors”.  From there select the local AD connector and then go to its properties.

AAD Connect Synch Manager.PNG

Click “Configure Directory Partitions” from the left-hand menu and then the “Containers” button.  You’ll then be prompted to authenticate using an AD account.  That account just needs permission to read AD, so any old account will do; it doesn’t need any special rights.  I used my “irankon” domain admin account:

AAD Connect Containers

A menu will then open to allow you to choose the OUs that you want to synch up to AAD. A blue tick on an OU means that all sub-OUs are included and any new ones added in the future will be too.  A grey tick means that any new child OU additions in the future will need to be added in manually or they’ll be filtered out of the synchronisation process.

Trying to stick with some kind of tenuous real world link I imagined a scenario whereby I’ve filtered on the following criteria:

  • The default “Users” container isn’t synchronised as I want some control over whether or not new accounts are synchronised by default.
  • Partners and company departments are all synchronised.
  • Of the contractors, the maintenance team need access to company systems to log tickets so they are synchronised but the caterers and cleaners have no need to use those systems so they aren’t.

The end picture looks like this:

AAD Connect OU Choice

Notice that if I add a child OU for a new contractor then I’ll need to go in and edit these filters again if I want those accounts to synchronise with AAD.

Step 4 – Run a Full Import

To run a Full Import you need to go back to the Connectors screen from the first part of step 3 and, instead of choosing “Properties”, select “Run”.  A menu will pop up from where you can choose “Full Import”:

AAD Connect Full Import

Step 5 – Run a Delta Synchronisation

Repeat the steps above to run a Delta Synchronisation:

AAD Connect Delta Synch

The Microsoft article then gives a couple of handy commands to export a copy of what those deltas between on-premise and AAD are:

#Export changes expected
csexport "irankon.onmicrosoft.com - AAD" %temp%\export.xml /f:x

#View those changes in csv format
CSExportAnalyzer %temp%\export.xml > %temp%\export.csv

The output is something that looks like this:

AAD Connect CSV Export

Not the most readable but I guess if you were really jittery you could power through and use it to double check.

Step 6 – Run an Export

Assuming that you’re happy with what the filters are going to do to AAD, the next step is to run an export on the AAD connector to make those changes a reality:

This time I’m running from the perspective of the Azure AD connector (the steps above were run on the on-premise AD connector):

AAD Connect Full Import

When the job finishes the summary will let you know that there have been some deletions as our Cleaners and Caterers will have been removed from AAD:

AAD Connect Export Deletes.PNG

I then went and verified this by browsing my AAD users in the management portal. Following the export I could still see my maintenance users as expected:

AAD Connect Verify Maintenance

The catering team, however, were no longer there; they’d been filtered:

AAD Connect Verify Catering

Step 7 – Re-enable the Scheduled Synchronisation

With all the changes complete it’s important to remember to re-enable the sync job or you’ll soon start having odd problems.  Again, this is just a simple PowerShell one-liner run from the AAD Connect box:

#Re-enable the sync job
Set-ADSyncScheduler -SyncCycleEnabled $True
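With the scheduler back on, the next cycle will run at the usual interval.  If you’re impatient, you can also kick one off by hand rather than waiting – a quick sketch, again run on the AAD Connect box:

```powershell
# Trigger an immediate delta synchronisation instead of waiting for the schedule
Start-ADSyncSyncCycle -PolicyType Delta
```

A Delta run is enough here as the full import/export work was already done in the earlier steps.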

Azure AD Connect – Installation

This post is part of a series, for the series contents see:

After all the prep, the actual installation of Azure AD Connect is the easy bit.

You can get the download from within the Azure portal or simply go here.

Azure AD Connect Download

Simply kick off the installer and agree to the terms and conditions:

Azure AD Connect Install Step 1

Next up, you get the choice of customising the install or just going with the express settings.  For most cases the express settings will be fine, but if you do want to customise it there are some details here.

Essentially, you get some choices on:

  • Install location – it’s so minimal that I wasn’t that bothered.
  • Using a SQL Server as opposed to SQL Express – No point unless you’re dealing with a really large AD forest.
  • Service accounts and custom sync groups – This might be useful, I guess, depending on the standards in your environment.  You might want to specify groups and a service account with your own naming standards.
  • Auto-upgrade – the express settings enable this by default to make sure that the product stays “evergreen”.  This is pretty cool and seems to be a model that Microsoft are moving towards recently.
  • Synch settings – by default it synchronises all attributes which was fine for me.  Saying that, though, I think I’ll do a post later to show how to setup some filters.

The screen looks like this:

Azure AD Connect Install Step 2

Next up you need to specify an account with Global Admin rights on the Azure AD side of things.  Glad I prepared one of those in the previous post:

Azure AD Connect Install Step 3

With the Azure side of the equation sorted, the next step is to provide details of an account with Enterprise Admin rights over the on-premise directory.  Once again, in good old Blue Peter style, I’d prepared one of those earlier:

Azure AD Connect Install Step 4

And that’s pretty much it.  It’ll then start the sync process:

Azure AD Connect Install Step 5

Before finally letting you know that everything is complete:

Azure AD Connect Install Step 6

The thing is that, even though it says it’s complete, the directory synchronisation process will probably still be running in the background.  I verified this by logging into Azure and checking my sync status:

AAD Status No Sync

Notice the “Sync has never run” status?  Well once I gave it a few minutes that sorted itself out and I had a cloud directory with some users.  Job’s a good’un!

AAD Status Synchronised.PNG


Azure AD Connect – Preparation Tasks

This post is part of a series, for the series contents see:

Microsoft publish some prerequisites for Azure AD Connect but most of them sort themselves out; the ones I’ve gone through for this lab are detailed below.

DNS Settings

I’d already built an aadconnect-vm (see the Azure IaaS Lab series) but as it wasn’t part of the domain yet it wasn’t that much use to me.

Adding IaaS VMs to a domain always brings up a minor DNS conundrum:

IaaS VMs get their DNS settings via DHCP from the VNET and to join a domain the VM must be able to use DNS to contact the domain controller.

You could just log in to the VM and manually change the DNS servers via the NIC settings – it would work for a while but after a reboot or a change on the Azure side you’d soon find that the config would revert and you’d lose your settings.  So what you need to do is add the IP of your DC to the list of VNET DNS servers.  It’s simple enough to do with PowerShell and there’s a nice blog post about it here.  Rather than reinvent the wheel I’ve modified and reused siddsachar’s code as follows:

#Login to Azure and resource manager

#Set out base variables
$RGName = "internal-rg"
$VnetName = "internal-vnet"
$DNSIP = ""

#Get our internal vnet
$vnet = Get-AzureRmVirtualNetwork `
-ResourceGroupName $RGName `
-Name $VnetName

$vnet.DhcpOptions.DnsServers = $DNSIP
#Set and save the config
Set-AzureRmVirtualNetwork -VirtualNetwork $vnet
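To confirm the change stuck, you can read the setting back afterwards with the matching “get” cmdlet (same AzureRM module and resource names as above):

```powershell
# Read back the DNS servers now configured on the VNET
$RGName = "internal-rg"
$VnetName = "internal-vnet"

(Get-AzureRmVirtualNetwork -ResourceGroupName $RGName -Name $VnetName).DhcpOptions.DnsServers
```

The output should list your DC’s IP; remember that VMs only pick the new servers up on their next DHCP renewal or reboot.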

With DNS ready to go I needed to reboot aadconnect-vm before it picked up the new settings, and then I was able to add it to my domain without any issues.

I did bump into an interesting issue with the DNS settings though.  At first I added a list of DNS servers that included Google’s public boxes (using the iteration method from siddsachar’s post) but with those added my aadconnect-vm seemed to struggle to find the SRV record for my domain.  Once I removed the Google boxes everything worked fine, but that does leave aadconnect-vm with no recourse to public DNS.  The solution to that is to simply add the Google boxes as forwarders in the DNS config on ad-vm – by that I mean the actual DNS role config rather than the network settings, for the reasons mentioned above.
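As a sketch of that last point, adding forwarders to the DNS Server role on ad-vm looks something like this (assuming Google’s well-known public resolvers, 8.8.8.8 and 8.8.4.4, are what you want to forward to):

```powershell
# On ad-vm: add Google's public DNS servers as forwarders on the DNS role
Add-DnsServerForwarder -IPAddress 8.8.8.8, 8.8.4.4

# Review the resulting forwarder list
Get-DnsServerForwarder
```

That way domain members only ever talk to the DC for DNS, and the DC handles any public lookups on their behalf.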

Account Settings

Azure AD Connect is essentially connecting two ends of a pipe:  Azure AD and an On-premise AD.  So, in order to do that you need accounts with permissions on each side, specifically:

  1. An Azure AD account with the Global Administrator role.
  2. And an on-premise AD account that is a member of the Enterprise Admins group.

In prep I went ahead and created an Enterprise Admin account called “irankon” on-premise, and up in Azure I created a Global Admin cloud identity called aadsyncsvc@irankon.onmicrosoft.com:

AAD Add User

I did this through the GUI which is a bit of a pain as it creates a temporary password for you that you then have to reset by logging in as that account, but not a major problem:

AAD Add User Change Pass
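For anyone who’d rather skip the GUI password dance, the same account can be created with the MSOnline PowerShell module.  A hedged sketch – the cmdlets are real but the display name and password below are placeholders of my own:

```powershell
# Connect to Azure AD (MSOnline module)
Connect-MsolService

# Create the sync service account with a password of our choosing
New-MsolUser -UserPrincipalName "aadsyncsvc@irankon.onmicrosoft.com" `
 -DisplayName "AAD Sync Service" `
 -Password "Placeholder-ChangeMe1" `
 -ForceChangePassword $false

# Grant it Global Administrator (MSOnline calls the role "Company Administrator")
Add-MsolRoleMember -RoleName "Company Administrator" `
 -RoleMemberEmailAddress "aadsyncsvc@irankon.onmicrosoft.com"
```

Setting -ForceChangePassword to $false sidesteps the temporary-password reset that the portal forces on you.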

Directory Health with IdFix

Microsoft recommend that prior to synchronising a directory you should use their IdFix tool to check its health, the logic being that problems are much harder to fix after the initial synch than they are before.

The tool is a pretty lightweight install (get it here) and you don’t have to run it on a DC.  I ran it on aadconnect-vm and the results came back good:

Azure AD Connect IdFix

No errors which can’t be bad!

AD Recycle Bin

Microsoft’s guidance also recommends enabling the AD Recycle Bin on the on-premise domain.  It doesn’t really say why this is recommended but if I were to hazard a guess, I would say it probably relates to the writeback feature.  With that enabled you could conceivably delete a load of users from the Azure directory and those changes would be reflected down to your on-premise setup.  If you did that by accident then the Recycle Bin would give you a lifeline to get those accounts back.

Anyway, enabling it is easy enough; I just ran the following on ad-vm:

Enable-ADOptionalFeature `
-Identity 'CN=Recycle Bin Feature,CN=Optional Features,CN=Directory Service,CN=Windows NT,CN=Services,CN=Configuration,DC=irankon,DC=tk' `
-Scope ForestOrConfigurationSet `
-Target 'irankon.tk'
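To verify it took, you can query the feature straight back – once enabled, EnabledScopes should list the forest’s configuration partition:

```powershell
# Check that the Recycle Bin feature now shows an enabled scope
Get-ADOptionalFeature -Filter 'Name -like "Recycle Bin*"' |
 Select-Object Name, EnabledScopes
```

Worth knowing: enabling the Recycle Bin is a one-way operation, so there’s no matching “disable” to undo it.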


Azure AD Connect – AAD Custom Domain & Azure DNS

This post is part of a series, for the series contents see:

To setup a custom domain in Azure AD there were a couple of basic things I needed to get out of the way first:

  1. I needed to own a domain to add
  2. And, I needed somewhere to host that domain

Since this is all about Azure then the solution to point two was easy: I decided to host my domain in Azure DNS.  As an added bonus it’s nice and cheap, literally pence per month.

Add a Zone to Azure DNS

Adding a domain into Azure DNS is a simple one-liner (you need to specify a resource group so I stuck it into my hub resource group from the Azure IaaS Lab series):

#Login to Resource Manager

$RGName = "hub-rg"
New-AzureRmDnsZone -Name irankon.tk -ResourceGroupName $RGName

Then, so that I knew the correct nameservers to tell my domain registrar I ran this simple “get” cmdlet:

$RGName = "hub-rg"
Get-AzureRmDnsZone -Name irankon.tk -ResourceGroupName $RGName

The output of this command gave me the following nameservers to use:

Azure Custom Domain DNS Servers
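If you only want the name servers rather than the whole zone object, the zone’s NameServers property gives you just that – a small convenience sketch using the same cmdlet:

```powershell
# Pull out just the name servers to hand over to the registrar
$RGName = "hub-rg"
(Get-AzureRmDnsZone -Name irankon.tk -ResourceGroupName $RGName).NameServers
```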

Setup a Free Domain

Buying your own domain name is generally cheap enough, especially if you stay away from .com domains and choose one of the more obscure suffixes.  But no matter how cheap it is, I’m cheaper!  This is a test lab so I’ve got no interest in using my hard earned cash for it, which led me to www.freenom.com.

It turns out that there are a few countries out there that allow people and small businesses to register domain names with their country code for free and sites like freenom.com can facilitate that.  I decided to go for irankon.tk and benefit from the kindness of the good people of Tokelau in the South Pacific (see here for details).

First up I went to good old outlook.com and set myself up with a free test account:

AAD Create Email

Then, using that account, I registered with freenom to set myself up with irankon.tk:

Freenom Sign-up

Once I’d completed the registration process the next step was to tell the registrar (freenom) that I wanted to host that zone from Azure DNS by specifying the name servers I got from my “get” cmdlet previously:

Freenom Nameservers

And that was all that was needed.  It took about half an hour for everything to update and the SOA record to reflect what I wanted, but it got there eventually and I was able to check with a simple nslookup:

Freenom SOA Check
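For reference, the same check can be done natively in PowerShell with Resolve-DnsName (assuming a Windows box; plain nslookup works anywhere):

```powershell
# Check that the SOA record now points at Azure DNS
Resolve-DnsName -Name irankon.tk -Type SOA
```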

Verifying My Azure AD Custom Domain

In the previous post, when I added my custom domain to Azure AD, I left it in an unverified state so now that I’d sorted out my domain and DNS hosting I needed to correct that.

AAD Custom Domain Unverified

Anyone who’s had to add a custom domain into O365 before will be familiar with this process but essentially it boils down to this:

  1. Before you can start using a custom domain you need to prove that you own it. Otherwise we could all add any old domain to our setup without permission (such as someone adding “bbc.co.uk”, for example).
  2. To do this, Microsoft will generate a code and ask you to add it to your DNS zone as a TXT record.
  3. This is then used to verify that you own the domain as, unless you’ve been hacked, under normal circumstances the only person with the power to add a TXT record to a domain’s DNS zone file should be the owner of that domain.

Well in my case I do own irankon.tk and I also have control of DNS via Azure DNS.

So, in the Azure portal I browsed to AAD and clicked the verify button to kick off the whole process:

AAD Add Custom Domain Verify

A screen then popped up with instructions for the TXT record I needed to add into my DNS zone in order to prove that I own it:

AAD Custom Domain Verify

At this point I left the screen open and added the record to DNS with the following PowerShell (change the TXT record value to whatever is appropriate for you):

$RGName = "hub-rg"

$Records = @()
$Records += New-AzureRmDnsRecordConfig -Value "MS=ms00000000"
$RecordSet = New-AzureRmDnsRecordSet `
 -Name "@" `
 -RecordType TXT `
 -ResourceGroupName $RGName `
 -TTL 3600 -ZoneName "irankon.tk" `
 -DnsRecords $Records
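Before hitting Verify it’s worth confirming the TXT record is actually resolvable, as it can take a little while to become visible.  A quick sketch:

```powershell
# Confirm the verification TXT record is visible before clicking Verify
Resolve-DnsName -Name irankon.tk -Type TXT | Select-Object Name, Strings
```

If the MS=… value doesn’t show up yet, give it a few minutes and query again before trying the portal.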

Then, once the record had been added, I simply clicked the “Verify” button to complete the whole process.

Azure AD Connect – Setup an Azure AD Directory

This post is part of a series, for the series contents see:

At this point I’ve got a test AD setup on my IaaS ad-vm which kind of represents an “on-premise” environment.  Next up, I need to setup a directory Azure side which I can then synchronise to.

Setting Up An Azure AD

I was hoping to do this with PowerShell but couldn’t find the cmdlets and I reckon that’s because they don’t exist.  If I was going to take a guess then it would be that MS don’t want people programmatically creating directories in AAD: one errant loop and someone could overload a cornerstone of Microsoft’s cloud platform.  Also, to be fair to MS, creating a directory is kind of a one-off event for most people so there’s probably not a lot of demand/incentive for developing cmdlets to do it.

So to the GUI I went – and not even the new one; instead this had to be done from:


First up, I created a new directory via the standard “New” button:

AAD Create Directory

As standard, all directories are created in the .onmicrosoft.com domain and need to have a unique name.  Luckily for me, irankon hadn’t been taken already:

AAD Directory Options

Add a Custom Domain

An onmicrosoft.com domain is all fine and well but in reality everyone is going to want to add their own domain and I want my test lab to mirror reality as much as possible.

So for that I need to add a custom domain, which is another GUI operation I’m afraid.


For my domain name I specified irankon.tk for no other reason than that I know I can get that domain for free for 12 months (more about that in the next post).

AAD Custom Domain

I left the ADFS box unticked for now but I’m hoping I can go back and change that later. We’ll have to wait and see…

And that’s as simple as it is, although at this point my custom irankon.tk domain is unverified and doesn’t actually mean much, but I’ll fix that in my next post.
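As another aside, if you’ve got the MSOnline PowerShell module installed you can check a domain’s verification status from the command line too.  A rough sketch (the MSOnline module and a Connect-MsolService login are assumed):

```powershell
#Check the verification status of the directory's domains
#(assumes the MSOnline module is installed)
Connect-MsolService
Get-MsolDomain | Select-Object Name, Status, Authentication
```

Until the DNS check passes, the custom domain should show up with a status of “Unverified”.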

Azure AD Connect – Populate a Test AD

This post is part of a series, for the series contents see:

With AD DS setup on my IaaS machine (ad-vm), I now need to populate it with some test users that I can eventually synch up to Azure AD.

I thought I’d be able to just find an example script somewhere online and re-use that for this bit but I couldn’t really find any that I liked.  Generally, I thought most of the scripts out there over-complicated things a bit and I was looking to keep things simple.

Not really sure that my script ended up any better, though.  It’s messy but does the job.

Create Test AD OU Structure

One of the things I didn’t like with the example scripts that I found online was that they generally seem to create users in the default “Users” container.  That’s all fine and well, but I want a bit more of an OU structure in place so that later I can set up some filters on my AAD Connect box!

The script below creates a simple OU hierarchy of the sort you might see in the real world, with parent OUs for Departments, Contractors, and Partners, and then more specific OUs sitting below those.

#First setup an array to specify the AD OU structure
$CompanyStructure = `
 @("IT", "Departments"), `
 @("HR", "Departments"), `
 @("Legal", "Departments"), `
 @("Finance", "Departments"), `
 @("Sales", "Departments"), `
 @("Marketing", "Departments"), `
 @("Maintenance", "Contractors"), `
 @("Cleaning", "Contractors"), `
 @("Catering", "Contractors"), `
 @("Contoso", "Partners"), `
 @("Fabrikam", "Partners"), `
 @("Tailspin", "Partners")

#Set the Base DN
$BaseDN = "DC=irankon,DC=tk"

#Create the root OUs
#Get all the values from the parent OU part of my array
$ParentOU = $CompanyStructure | ForEach-Object { $_[1] }

#Remove duplicates (Select-Object -Unique doesn't need sorted input,
#unlike Get-Unique which only removes adjacent duplicates)
$ParentOU = $ParentOU | Select-Object -Unique

#Create OUs for those values
$ParentOU | ForEach-Object {New-ADOrganizationalUnit -Name $_ -Path $BaseDN}

#Create the child OUs
#Messy, but string concatenation was never my strong point
$CompanyStructure | ForEach-Object {New-ADOrganizationalUnit -Name $_[0] -Path ("OU=" + $_[1] + "," + $BaseDN)}
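To sanity-check the result, you can list out the OUs that were created.  A quick one-liner along these lines (assuming the ActiveDirectory module is loaded, which it will be on a DC):

```powershell
#List all OUs under the domain root to confirm the hierarchy
Get-ADOrganizationalUnit -Filter * -SearchBase "DC=irankon,DC=tk" | `
 Select-Object -ExpandProperty DistinguishedName
```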

Populate with Test User Accounts

With my structure in place I can now simply set up a loop to go through and create a bunch of generic user accounts for me.

Each account will need a generic password value so rather than doing anything fancy to include one in the script I simply set it to prompt me for a value:

#First up, prompt for a default password to give to each user
$UserPassword = Read-Host 'Set a default user password' -AsSecureString

Then for the loop, I used the code below.  It really is quite poor but it does the job!

#Now create a hundred users for each OU
$CompanyStructure | `
ForEach-Object {

    $UserPath = ("OU=" + $_[0] + "," + "OU=" + $_[1] + "," + $BaseDN)
    $UserTotal = 100
    $UserType = $_[0]

    For ($UserCount=1; $UserCount -le $UserTotal; $UserCount++) {

        $UserName = ($UserType + "User" + $UserCount)

        New-ADUser `
        -Name $UserName `
        -GivenName $UserType `
        -Surname ("User" + $UserCount) `
        -Path $UserPath `
        -SamAccountName $UserName `
        -UserPrincipalName "$UserName@irankon.tk" `
        -AccountPassword $UserPassword `
        -ChangePasswordAtLogon $False `
        -PasswordNeverExpires $True `
        -Enabled $True `
        -Division $UserType
    }
}

Ta Da!  It wasn’t pretty but the upshot is that I now have something that I can synchronise up into Azure AD.
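As a quick check that the loop behaved itself, you can count the accounts in one of the OUs.  Something like this (assuming the OU structure from the earlier script):

```powershell
#Count the users created under the IT OU - 100 is the hoped-for answer
(Get-ADUser -Filter * -SearchBase "OU=IT,OU=Departments,DC=irankon,DC=tk").Count
```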

Azure AD Connect – Setup a Test AD

This post is part of a series, for the series contents see:

There’s no point setting up Azure AD Connect if I haven’t even got a directory to synch up to Azure, so that’s step one.

This series of posts will make use of the VMs I’ve already setup as part of the Azure IaaS Lab posts:

  1. ad-vm
  2. aadconnect-vm

NTDS Disk Setup

First up, when I created ad-vm earlier I didn’t give it a data disk so I need to do that now so that I have somewhere for my AD NTDS files etc. to live.

If you were doing this on-premise you might simply ignore best practices and stick everything on your C: drive, but if you do that on an Azure IaaS VM you’ll soon run into some corruption issues.  Microsoft have published some helpful guidelines for setting up AD on an IaaS box and the key bits are:

  • Make sure you’ve got a reserved/static IP address (I already did this when setting up ad-vm)
  • Make sure you add a separate data disk for your NTDS files.  The caching setting on this disk should also be set to “None” (the default is Read/Write).

To add a data disk I used this code:

# The Basics #

#Login to Azure and resource manager
Login-AzureRmAccount

#Just in case you have multiple subscriptions check which one you're working in
Get-AzureRmContext

#If you need to select your test subscription use:
#Select-AzureRmSubscription -SubscriptionName <name>

# Add an NTDS Disk to AD-VM #

#First setup the variables
$RGName = "internal-rg"
$VMName = "ad-vm"
$NTDSDiskName = $VMName + "NTDSDisk"
$StorageAccount = Get-AzureRmStorageAccount -ResourceGroupName $RGName -Name internalvmstr
$NTDSDiskUri = $StorageAccount.PrimaryEndpoints.Blob.ToString() + "vhds/" + $NTDSDiskName + ".vhd"

#Add an NTDS disk to the VM
$vm = Get-AzureRmVM -ResourceGroupName $RGName -Name $VMName

Add-AzureRmVMDataDisk -VM $vm `
 -Name $NTDSDiskName `
 -VhdUri $NTDSDiskUri `
 -LUN 0 `
 -Caching None `
 -DiskSizeinGB 10 `
 -CreateOption Empty

Update-AzureRmVM -ResourceGroupName $RGName -VM $vm
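To confirm the disk actually got attached with the right settings, you can inspect the VM’s storage profile.  A quick check along these lines (re-using the variables from the script above):

```powershell
#Verify the data disk is attached with caching disabled
(Get-AzureRmVM -ResourceGroupName $RGName -Name $VMName).StorageProfile.DataDisks | `
 Select-Object Name, Lun, Caching, DiskSizeGB
```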

Install and Setup AD DS

The PowerShell for the next few steps is all run directly on ad-vm itself.  I suppose, technically, you could get a remote PS session set up to ad-vm and run these commands using “Invoke-Command” but for something as one-off as this I didn’t really see the need to go to all that hassle.

If you fancy giving it a go, there’s a good blog post here with some instructions for setting up WinRM.
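For the curious, the remote approach would look roughly like this (a sketch only: it assumes WinRM is already enabled on ad-vm and that you have valid credentials for it):

```powershell
#Run commands on ad-vm remotely rather than logging on to it
#(assumes WinRM is enabled on ad-vm and the credentials are valid)
$Cred = Get-Credential
Invoke-Command -ComputerName "ad-vm" -Credential $Cred -ScriptBlock {
 Get-Disk | Where-Object PartitionStyle -eq "RAW"
}
```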

In the previous section I added a data disk to ad-vm to host the NTDS files, but before I can use it I need to configure that disk within Windows:

#Within Windows setup the NTDS disk we've added to AD-VM
Get-Disk | `
Where-Object PartitionStyle -eq "RAW" | `
Initialize-Disk -PartitionStyle GPT -PassThru | `
New-Partition -AssignDriveLetter -UseMaximumSize | `
Format-Volume -FileSystem NTFS `
-NewFileSystemLabel "NTDS"

Because of the various extra scratch disks that exist on an IaaS VM this new disk will end up as an “F:” drive.

With that in place, all that’s left is to install the AD DS role and set up a new forest.  Make sure you substitute your own domain details below, otherwise you’ll run into problems later when we set up AAD Connect.

#Still on the ad-vm
#Install AD DS
Install-WindowsFeature -Name AD-Domain-Services -IncludeManagementTools

#You might be prompted to reboot here but you don't have to

#Setup AD DS
Import-Module ADDSDeployment
Install-ADDSForest `
 -CreateDnsDelegation:$false `
 -DatabasePath "F:\NTDS" `
 -DomainMode "Win2012R2" `
 -DomainName "irankon.tk" `
 -DomainNetbiosName "irankon" `
 -ForestMode "Win2012R2" `
 -InstallDns:$true `
 -LogPath "F:\NTDS" `
 -NoRebootOnCompletion:$false `
 -SysvolPath "F:\SYSVOL"

#Enter a recovery password when prompted

At the end of the process you’ll be prompted to enter a directory services restore mode password, so type something in there and then give the box a final reboot.

Of course, if you were doing this using “Invoke-Command” and wanted to pre-populate the password, you could simply use the “-SafeModeAdministratorPassword” switch with a SecureString value.
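As a rough sketch of what that might look like (lab use only: hard-coding a password like this is obviously not something to copy into production):

```powershell
#Pre-populate the DSRM password so the forest install runs unattended
#(lab only: ConvertTo-SecureString with -AsPlainText is not for production)
$DSRMPassword = ConvertTo-SecureString "P@ssw0rd123!" -AsPlainText -Force

Install-ADDSForest `
 -DomainName "irankon.tk" `
 -SafeModeAdministratorPassword $DSRMPassword `
 -InstallDns:$true `
 -Force
```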

Azure AD Connect – High-level

This post is part of a series, for the series contents see:

Taking a brief detour from the Azure IaaS lab, these next few posts will be a mini-lab to setup Azure AD and directory synchronisation using Azure AD Connect.

At a high-level the lab will consist of the following:

  1. Populating an on-premise directory with test data ready to be synchronised.
  2. Creating an Azure Active Directory and setting up a custom domain using Azure DNS.
  3. Setting up Azure AD Connect.
  4. Exploring Azure AD Connect’s options.