dazfuller

When in doubt, shell out

Often I see people clicking around all over UIs trying to get the information they want, loading data into tools like Excel to filter out the bits they're interested in, or muttering under their breath something like "this used to show more information". All too often though, people ignore one of the most powerful features bundled with almost every desktop environment: the command line.


It's easy to understand why the command line can seem intimidating. There are no helpful buttons or menus telling you what actions you can take or what to do. It's not as colourful as the desktop (though this is something you can change), and if you browse the web you'll find there are loads of command line interfaces, tools, and commands, which can seem daunting. So I'm going to give a quick intro to two things: PowerShell (core) and the Azure CLI.


So why these two? Well, they're both cross-platform, so you can follow along whether you're on Windows, Mac, or Linux. This is also why I love them for CI/CD work, as I often build the scripts on Windows but build and release using Linux agents. Another reason I like PowerShell is that it treats values as objects, which means I can query a value's properties and sub-properties, and perform actions based on its type. There is an Azure PowerShell module as well, but I often find the CLI tool easier to work with and less prone to versioning issues.


So what can we do? Well, the first thing you'll need to do is log in using the Azure CLI:

az login

On platforms such as Windows this will open your browser and use that authentication flow to log in. Once it's done, the CLI will check which tenants and subscriptions you have access to. If you have access to more than one and want to switch, you can do the following.

az account list --output table
# copy the subscription id of the one you want to work in
az account set --subscription <subscription id>

And you'll now be in the right subscription.


The "--output table" argument in the command above tells the CLI to limit the output to only the most important values and display the results in a table. It makes things much easier to read, but you won't see all of the details. If you want more detail then you can just type "az account list" and you'll see the full output.


What you'll notice is that the output is in JSON format; this is the default for the Azure CLI tool, though you can choose others such as table and tsv. PowerShell has a built-in command for taking that JSON and converting it into objects. This is the "ConvertFrom-Json" command, and it's incredibly useful when working with tools or APIs that return data in JSON. Let's list those accounts again (even if there's only 1) and, using the pipe operator "|", pass the output from the CLI into the ConvertFrom-Json command (the GUIDs have been changed to protect the innocent).

> az account list | ConvertFrom-Json

cloudName        : AzureCloud
homeTenantId     : a1b2c3d4-abcd-1234-a1b2-a1b2c3d4e5f6
id               : a1b2c3d4-abcd-1234-a1b2-a1b2c3d4e5f6
isDefault        : True
managedByTenants : {@{tenantId=a1b2c3d4-abcd-1234-a1b2-a1b2c3d4e5f6}}
name             : Darren Fuller - MPN
state            : Enabled
tenantId         : a1b2c3d4-abcd-1234-a1b2-a1b2c3d4e5f6
user             : @{name=darren@el#######ud.com; type=user}

We now have an output showing a list of objects. This is great, but we don't really want to keep running the command every time we want to get a value. So let's store the result in a variable for later use.

> $accounts = az account list | ConvertFrom-Json

This time you won't see the output as before, because it's been saved to $accounts (the dollar symbol denotes a variable in PowerShell). But now we can do things like the following; note that your output will probably be different to the output shown here.

> $accounts.Length
27
> $accounts[0].name
Darren Fuller - MPN

So, this is fun, but what can we really do with this?


Well, in a script recently I needed to find the external IP address ranges for Azure Data Factory; this was to add firewall rules allowing the Azure-hosted integration runtimes access as part of a data ingestion flow. So how can we do that? Well, we could use the Azure documentation, but that's a bit of a manual process if we wanted to run this again later, or on a regular basis. So let's script it instead.


The Azure CLI has a command that lets us list the service tags for a region. This returns everything though, so we'll need to filter it down and then get just the IP address CIDR blocks. We're also only interested in the IPv4 addresses, not the IPv6 ones. Substitute your own region in place of North Europe if you want to try this out.

# Show all service tags for North Europe
> az network list-service-tags --location northeurope

# Let's save them as a variable
> $tags = az network list-service-tags --location northeurope | ConvertFrom-Json

# Have a look at what there is
> $tags.values

# Let's filter this down now to just Data Factory; the system service is "DataFactory" and we have to put in the region again
> $adfTags = $tags.values | Where-Object -FilterScript { $_.properties.systemService -eq "DataFactory" -and $_.properties.region -eq "northeurope" }

# And let's have a look at the properties of the item we've found
> $adfTags.properties
addressPrefixes : {13.69.230.96/28, 13.74.108.224/28, 20.38.80.192/26, 20.38.82.0/23}
changeNumber    : 1
networkFeatures : {API, NSG}
region          : northeurope
state           : GA
systemService   : DataFactory

There's a lot happening in that filter line, but the short version is that we're taking the values collection and piping it into the Where-Object command, which allows us to filter the input. The FilterScript parameter lets us specify the code used to test each object being passed in. Any that meet the criteria are kept; the others are discarded. PowerShell uses operators like "-eq" and "-lt" instead of "==" and "<", which takes a little getting used to, but they are very well documented.
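As a quick aside, those operators and Where-Object work on any values, not just Azure output. Here's a small self-contained example you can try in any PowerShell session:

```powershell
# -gt and -lt work like > and <, -ne like !=
1..10 | Where-Object -FilterScript { $_ -gt 3 -and $_ -lt 7 }
# 4
# 5
# 6

"dev", "test", "prod" | Where-Object -FilterScript { $_ -ne "test" }
# dev
# prod
```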


Because we were operating on objects (thanks to the ConvertFrom-Json command), we get an object out, which means that we can look at its properties using dot notation. So if we want to see just the IP addresses we can do this.

> $adfTags.properties.addressPrefixes
13.69.230.96/28
13.74.108.224/28
20.38.80.192/26
20.38.82.0/23
20.50.68.56/29
52.138.229.32/28
2603:1020:5:1::480/121
2603:1020:5:1::500/122
2603:1020:5:1::700/121
2603:1020:5:1::780/122

# Let's filter out the IPv6 addresses
> $adfTags.properties.addressPrefixes | Where-Object -FilterScript { $_ -notlike "*::*" }
13.69.230.96/28
13.74.108.224/28
20.38.80.192/26
20.38.82.0/23
20.50.68.56/29
52.138.229.32/28

And from this we can then use the list of IP addresses to create firewall rules, perform lookups, or provide them out for reporting purposes.
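As a sketch of that first option, assuming a storage account we want these ranges allowed through to ("my-rg" and "mystorage" below are placeholders for your own resource group and account name), we could loop over the IPv4 ranges and add each one as a network rule:

```powershell
# Grab just the IPv4 CIDR blocks, as before
$ipv4 = $adfTags.properties.addressPrefixes | Where-Object -FilterScript { $_ -notlike "*::*" }

# Add each range as a storage account network rule
foreach ($prefix in $ipv4) {
    az storage account network-rule add --resource-group "my-rg" --account-name "mystorage" --ip-address $prefix
}
```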


When you get more confident you can start using other tools like ripgrep to scan through text or files, parse CSV files and filter out data, or use more of the built-in features of the Azure CLI, such as the query argument. This takes a JMESPath string (think XPath for JSON), so you can perform all or most of the filtering in the CLI tool before you start processing the JSON. It takes a little more effort, but it's useful if you just want to run a single command.

> az network list-service-tags --location northeurope --query "values[?properties.systemService == 'DataFactory' && properties.region == 'northeurope'] | [].properties.addressPrefixes[]"
[
  "13.69.230.96/28",
  "13.74.108.224/28",
  "20.38.80.192/26",
  "20.38.82.0/23",
  "20.50.68.56/29",
  "52.138.229.32/28",
  "2603:1020:5:1::480/121",
  "2603:1020:5:1::500/122",
  "2603:1020:5:1::700/121",
  "2603:1020:5:1::780/122"
]

Importantly, it makes what you're doing repeatable for yourself and anyone you share the script with. It also means that you can run the same scripts within your CI/CD process.


You can also then start creating your own tools, or just PowerShell functions to do common bits of work for you and pipe values through those as well.
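For example, the service tag lookup above could be wrapped into a function (the function name here is just my own invention) so it can sit at the start of a pipeline:

```powershell
# A sketch of a reusable function wrapping the earlier steps
function Get-ServiceTagIPv4Prefix {
    param (
        [string]$Location,
        [string]$Service
    )

    # List all service tags for the region and convert the JSON into objects
    $tags = az network list-service-tags --location $Location | ConvertFrom-Json

    # Filter to the requested service and region, flatten the prefixes, and drop IPv6
    $tags.values |
        Where-Object -FilterScript { $_.properties.systemService -eq $Service -and $_.properties.region -eq $Location } |
        ForEach-Object { $_.properties.addressPrefixes } |
        Where-Object -FilterScript { $_ -notlike "*::*" }
}

# Usage, piping the results on for further processing
Get-ServiceTagIPv4Prefix -Location "northeurope" -Service "DataFactory" | Sort-Object
```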


The command line is a powerful environment that lets you do a lot of work quickly, easily, and in a repeatable way, without having to document processes such as "Open tool A, click on the File menu and then...". You don't have to become an expert overnight, but by getting familiar with it and automating little bits as you go, you'll soon become proficient and be breaking out the command line whenever you need to get stuff done.


I mentioned that you can change how colourful the command line is, and there are loads of options for different platforms. If you're on Windows then check out the Windows Terminal, which offers tabbed consoles, choices of fonts, background effects, colour schemes, and more.
