Wednesday, 28 December 2016

Globally Unique Identifiers - Creating GUIDs in PowerShell

PowerShell Pea...

New-Guid

I've been working more and more with PowerShell, specifically with Azure Automation and scripting, and wanted to share a useful cmdlet for creating GUIDs. This link provides more detail, but ultimately this command will give you a unique ID that can be assigned to arbitrary objects that require one.

Bear in mind that this doesn't protect you from duplicates, but the probability of one occurring is vanishingly small. So if you need GUIDs for relatively small record/object sets and your application can handle the (remote) prospect of duplicates, then this cmdlet is a quick way to generate a "unique" ID.
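As a quick illustration (New-Guid is available in PowerShell 5.0 and later; on earlier versions [guid]::NewGuid() does the same job):

# Generate a GUID and use its string form as an identifier
$guid = New-Guid
$id = $guid.ToString()          # e.g. "3f2504e0-4f89-41d3-9a0c-0305e82c3301"

# Tag an arbitrary object with a unique ID
$record = [pscustomobject]@{ Id = (New-Guid).ToString(); Name = 'example' }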

See this link for more details on GUIDs/UUIDs.

Wednesday, 21 December 2016

Azure Premium Storage Blob Snapshot Error - 409 Conflict

I've recently been working in more detail with Azure blob snapshots and (by trial and error) discovered that Premium storage imposes limitations on the number and frequency of snapshots that can be performed on a single blob.

If you create a snapshot of a Premium storage blob and then another in quick succession (within 60 seconds of each other), then you might get a 409/Conflict error message.

This article suggests that there are two possibilities for this error, either

  • SnapshotCountExceeded - You've exceeded the limit of 100 snapshots per blob, or
  • SnaphotOperationRateExceeded - You've exceeded the limit of snapshots within a time window (stated as 10 minutes, but I observed this as closer to 1 minute)
Whilst it's unlikely that these limits would cause a problem in practice, it's something to keep in mind when developing/testing solutions that make use of the blob snapshot facility.
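For reference, here's a rough sketch of taking a blob snapshot from PowerShell, assuming the older Azure.Storage module and using illustrative account, container and blob names. Running it twice within about a minute against a Premium blob is the scenario that triggered the 409 above:

$ctx  = New-AzureStorageContext -StorageAccountName "mypremiumacct" -StorageAccountKey $key
$blob = Get-AzureStorageBlob -Container "vhds" -Blob "disk0.vhd" -Context $ctx

try {
    # CreateSnapshot is a method on the underlying CloudPageBlob/CloudBlockBlob object
    $snapshot = $blob.ICloudBlob.CreateSnapshot()
}
catch {
    # A second snapshot in quick succession can surface here as a 409/Conflict
    Write-Warning "Snapshot failed: $($_.Exception.Message)"
}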

Wednesday, 14 December 2016

Azure Automation Troubles

A recent venture into Azure Automation threw up some unexpected problems.

A simple script, run locally, that called Get-AzureRmVM was working against my subscription, but the same script did not work within an Azure runbook.

I'd created the AzureRunAsConnection by default with the Azure Automation account and followed the same code as given in the AzureAutomationTutorialScript PowerShell runbook to log in to the subscription and execute cmdlets.
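For reference, the login section of that runbook looks roughly like this (a from-memory sketch rather than the exact tutorial code):

# Retrieve the Run As connection and authenticate with its service principal
$connection = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint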

The problem came when executing the Get-AzureRmVM Cmdlet in the runbook, receiving the error:
Get-AzureRmVM : Run Login-AzureRmAccount to login.
+     $VM = Get-AzureRmVM | ? {$_.Name -eq $VMName}
+           ~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [Get-AzureRmVM], PSInvalidOperationException
    + FullyQualifiedErrorId : InvalidOperation,Microsoft.Azure.Commands.Compute.GetAzureVMCommand

After several googles I found this post...
https://feedback.azure.com/forums/246290-automation/suggestions/16590229-get-azurermvm-failing-in-azure-automation

Therefore, if anyone else sees errors when running basic Cmdlets in a runbook (that work fine from PowerShell), try updating all the modules in the Automation account using a script like this one...
https://www.powershellgallery.com/packages/Update-ModulesInAutomationToLatestVersion/1.03/DisplayScript
(Note that the deploy to Azure button didn't work for me, so I had to import the script and run it manually.)

Note also that (depending on when you created your Automation account) the above may or may not work, and you may have to create a new Automation account and update its modules before starting to use it. Microsoft suggests setting the script to run on a schedule in the Automation account to keep the modules up to date.

Sunday, 27 November 2016

Azure Pea of the day

For those of you who care about the aesthetics of your Azure cloud experience, I accidentally discovered how to quickly change your portal theme...

Simply double-click on the dashboard background and it will cycle through the available themes.

Wednesday, 23 November 2016

Run your own Gitlab server on Azure

For those of you out there wanting to host and control your own Git server in the cloud, the Azure Marketplace has a GitLab Community Edition VM ready to deploy, see here. Of course, gitlab.com removes the need to host (and the cost involved), but if you want control over where your repository is located and over the underlying VM, then this looks like a good option.

I've spun this up using the newly available A1_V2 VM size, which handles light load comfortably. So for less than £20 you can have control over all those features that the Community Edition offers, including SSH and HTTPS access.

The GitLab CE documentation is a little outdated, but still good enough to help you configure the instance and get going with your own Git repo.

Wednesday, 21 September 2016

Using environment variables to create a generic WebJob script in Azure

I recently had to write a WebJob script for an Azure App Service Web App, one that could be reused across multiple copies of the same Web App.

The script was simple: it just needed to make a request to a scripted action (in my case PHP) within the website hosted by the Web App. The request was via a URL, which depends on the name of the web app, e.g. if the web app name is "mycoolwebapp" then the URL of the scripted action would be
https://mycoolwebapp.azurewebsites.net/bin/mytask.php
Instead of writing a different script each time, with the web app name hardcoded, it's possible to refer to the Web App's environment variables from within the WebJob script. In this case, "mycoolwebapp.azurewebsites.net" is available via the WEBSITE_HOSTNAME environment variable, which a CMD script can use as follows
curl https://%WEBSITE_HOSTNAME%/bin/mytask.php
This single script can then be uploaded to multiple Web Jobs for your websites, without needing a different version of the script for every website.
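The same idea works if you'd rather write the WebJob in PowerShell (a run.ps1 script). This is just a sketch, assuming the same hypothetical /bin/mytask.php endpoint:

# Build the URL from the Web App's own hostname and call the scripted action
$url = "https://$($env:WEBSITE_HOSTNAME)/bin/mytask.php"
Invoke-WebRequest -Uri $url -UseBasicParsing | Out-Null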

Monday, 12 September 2016

How to enable PHP extensions by adding modules to an Azure Web App

Using Azure, you can easily deploy a Web App (a website) that uses PHP.

An additional step you may need to perform is enabling extra PHP extensions that contain modules your PHP code uses.

For example, if you need to add php_sockets.dll to the Web App, you would follow these steps:

  1. Obtain the extension DLL; one way to do this is to download the matching PHP build for Windows from here
  2. Extract the PHP zip file, open the ext directory and locate the required DLL file(s)
  3. Set FTP credentials for your Web App and log into the Web App using an FTP client
  4. Add a bin directory to the site root here: D:\home\site\wwwroot
  5. Copy the DLL file(s) into the bin directory
  6. In the Azure portal, click on App Services->"MyPHPWebApp"->Settings->Application Settings
  7. Under App Settings, create a PHP_EXTENSIONS key whose value is the path to the extension DLL relative to the site root, e.g. bin\php_sockets.dll


Once you save the application settings, you can run phpinfo() to check that the new extension has been loaded.
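If you prefer to script the last step rather than use the portal, the same app setting can be applied with the AzureRM PowerShell module. This is only a sketch (the resource group and app names are illustrative), and note that Set-AzureRmWebApp replaces the whole app-settings collection, so the existing settings are merged in first:

# Fetch the current app settings and add PHP_EXTENSIONS to them
$app = Get-AzureRmWebApp -ResourceGroupName "MyRG" -Name "MyPHPWebApp"
$settings = @{}
foreach ($s in $app.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }
$settings["PHP_EXTENSIONS"] = "bin\php_sockets.dll"

# Write the merged collection back to the Web App
Set-AzureRmWebApp -ResourceGroupName "MyRG" -Name "MyPHPWebApp" -AppSettings $settings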

See this post for more details and an alternative method.

Monday, 5 September 2016

Buying an SSL certificate for a single custom sub-domain in Azure

The process of buying an SSL certificate in Azure is relatively simple and well-documented via the App Service Certificates option in Azure.

However, I recently had to buy an SSL certificate for a specific custom sub-domain and found the process and advice to be somewhat confusing prior to committing to the purchase. This article states:
Make sure to enter correct host name (custom domain) that you want to protect with this certificate. DO NOT append the Host name with WWW. For example, if your custom domain name is www.contoso.com then just enter contoso.com in the Host Name field, the certificate in question will protect both www and root domains.
This advice is specifically aimed at those creating a website certificate for the www sub-domain. For a custom sub-domain, you shouldn't follow it.

Therefore, this post will help you if you're in a similar situation as follows:

  • You've deployed an Azure App Service Web App to Azure, e.g. mywebapp.azurewebsites.net
  • You own a custom domain, e.g. necloud.uk
  • You've associated a sub-domain such as mywebapp.necloud.uk to mywebapp.azurewebsites.net
Then purchase the SSL Certificate as follows:
  • Log into Azure->App Service Certificates->Add
  • Here's the catch... under the "Naked Domain Host Name" field, enter the full sub-domain, e.g. mywebapp.necloud.uk, i.e. DO NOT enter just necloud.uk unless you specifically want a wildcard certificate.
  • Once purchased, you can add the certificate to your sub-domain via Azure->App Service->mywebapp->SSL certificates->Import App Service Certificates

Sunday, 28 August 2016

Running SQL Server 2016 Express in an Azure VM


While working through the provisioning guide for SQL Server in Azure VMs I hit a problem.

I chose the 2016 Express Edition, followed all the steps to create a server that was available publicly and then attempted to connect using the SQL Server extension for Visual Studio 2015.

Try as I might, I couldn't connect, getting back this error:
Cannot connect to xyz.eastus.cloudapp.azure.com
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) (Microsoft SQL Server, Error: 53 The network path was not found
Eventually I discovered that the problem was not due to the VM firewall, Azure NSGs or the SQL Server configuration. I loaded up SQL Server Configuration Manager and checked the enabled protocols: only Shared Memory was enabled, so I enabled both the Named Pipes and TCP/IP protocols.
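The same change can be scripted; here's a rough sketch using SQL Server's WMI provider from PowerShell (this assumes a SQLEXPRESS named instance, so adjust the instance and service names to suit):

# Load the SQL Server WMI management assembly and enable the TCP/IP protocol
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement")
$wmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer
$tcp = $wmi.ServerInstances["SQLEXPRESS"].ServerProtocols["Tcp"]
$tcp.IsEnabled = $true
$tcp.Alter()

# Restart the instance so the protocol change takes effect
Restart-Service 'MSSQL$SQLEXPRESS'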

Finally, success! I was able to connect to my SQL server instance over the internet.

Wednesday, 24 August 2016

Buying a .uk domain and WHOIS Address Opt Out

I've known for some time that domain names you purchase (such as .co.uk) can be protected against identity theft via privacy services such as DomainsByProxy.com, for example when purchased through godaddy.com.

What this means is that a WHOIS lookup on a protected domain returns details along the lines of
    Registrant:
        Name withheld. This Registrant is using a privacy service.

    Registrant's address:
        Address withheld. This Registrant is using a privacy service.

    Privacy service:
        Domains By Proxy, LLC

However, I hadn't realised that .uk domains cannot be protected in this manner as they are governed by different rules.

So, during my recent purchase of a new .co.uk domain, when I was asked if I wanted to purchase the .uk as well, I thought, "Yeah, OK", without realising how these domain extensions differ when it comes to privacy.

After some head-scratching (googling) I found that it was still possible to protect some of the registrant's data (the address) via an Opt Out. This meant logging into Nominet and requesting the Opt Out for the new domain.

Wednesday, 17 August 2016

Date shifting with PowerBI and the DateAdd DAX function

I was recently playing with PowerBI Desktop to build a report which compared one year's figures to another's.

So for example, if you have a column of dates such as
01 August 2016
02 August 2016
03 August 2016
I wanted to create a new column that time shifts these dates to the previous year to give:
01 August 2015
02 August 2015
03 August 2015
Simple... or so I thought. The DateAdd function seemed a likely contender, so I tried
Column = DATEADD(MyTable[Date],-1,YEAR)
What stumped me was that this returned nothing for each entry, just empty cells.

After re-reading the MSDN function reference and reaching out via this PowerBI blog post I found that DateAdd only returns dates that are in the original data set, i.e.
"The result table includes only dates that exist in the dates column."
Ultimately, the most elegant solution for me was to create the modified dates in the Query Editor using M and the Date.AddYears function (rather than a DAX formula within PowerBI):
Custom = Date.AddYears([Date], -1)
Credit goes to KGrice for helping find this solution.

Friday, 12 August 2016

Debugging the PowerBI Enterprise Gateway

I recently configured the PowerBI Enterprise Gateway to connect to two different types of data source:
  • A local File on the same server as the gateway, and
  • A Web data source, a URL reachable from the gateway server
I added the data sources to PowerBI Online via "Manage Gateways". I could see the data sources refreshing and available within PowerBI Online, so I built a report that used them and published it to PowerBI Online. I then configured the datasets to use the gateway connection and also scheduled a daily refresh.

A day later, I revisited my reports to check that the data sources had successfully refreshed overnight, but instead found a failure as follows:

The data gateway abcdef is offline, so refresh scheduling is currently disabled. Last refresh failed: Fri Aug 12 2016 17:09:06 GMT+0100 (GMT Summer Time)
Cannot connect to the mashup data source. See error details for more information.

Concerned that the gateway service had gone down, I logged onto the server running the gateway and checked that the service was running, along with the log files here:

C:\Users\PBIEgwService\AppData\Local\Microsoft\on-premises data gateway\Gateway*.log
See this link for more troubleshooting tips.
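For a quick check from the gateway server itself, something like the following does the job (this assumes the gateway's Windows service is named PBIEgwService, which matched my install):

# Confirm the gateway service is actually running
Get-Service PBIEgwService

# Tail the most recent gateway log file
Get-ChildItem "C:\Users\PBIEgwService\AppData\Local\Microsoft\on-premises data gateway\Gateway*.log" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 |
    Get-Content -Tail 50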

It turned out that even though the File data source was still available, the Web data source had become unreachable.
So rather than indicating that one data source was unavailable, PowerBI Online appears to report the entire data gateway as offline. That isn't entirely accurate: the gateway service was running, but one of its data sources was unavailable.

Tuesday, 21 June 2016

How to delete Azure Active Directories that won't delete

As a result of studying for Azure certifications I'd created several Active Directories under my MS Azure subscription. Wanting to clean these up, I attempted to delete them from within the classic portal, only to be greeted with the following message:

Directory contains one or more applications that were added by a user or administrator.
Clicking through to the applications menu, I expected to be able to delete the offending applications. No such option exists, so I had a google and found this blog post.

There are a couple of tricks here:

  1. Create a new user with the Global Admin role in the Active Directory (you must delete this user later, as users also prevent deletion of the directory)
  2. Install the Azure Active Directory PowerShell module
  3. Connect to the directory using Connect-MsolService and the account you just created
  4. Run Get-MsolServicePrincipal | Remove-MsolServicePrincipal to delete all the application service principals that will allow it (expect some errors for those that won't) - see the sketch below
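A minimal sketch of steps 3 and 4, assuming the MSOnline module is installed and you sign in with the temporary admin from step 1:

# Sign in to the directory with the temporary global admin account
Import-Module MSOnline
Connect-MsolService

# Remove every application service principal that can be removed
Get-MsolServicePrincipal | Remove-MsolServicePrincipal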
I then deleted the new admin user created in step 1 and retried the Active Directory delete, and hey presto, it worked for me.

Hope this helps.

Thursday, 28 April 2016

Microsoft Azure: Differentiating IaaS, PaaS and Cloud Services

I recently realised that there can be some confusion over how an Azure "Cloud Service" can be interpreted in the as-a-service world we now live in.

When you create a VM in the Azure classic deployment model, it is added to a cloud service. A cloud service “acts as a container and provides a unique public DNS name, a public IP address, and a set of endpoints to access the virtual machine over the Internet”.
A VM in this sense can be considered IaaS: the VM is a server in Azure that you have complete control over; it just happens to sit in a container called a “cloud service”.

However, you can also create a “Cloud Service” in Azure, which results in something different to the above. An Azure Cloud Service consists of two components, configuration files and application files, which together result in Web or Worker roles being spun up to execute an application. These roles are effectively VMs: a Web role is an Azure VM pre-configured to run IIS, while Worker roles carry out other processing tasks. A Cloud Service in this sense is PaaS, as Azure manages the VMs for you and you only have limited control over them.
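To make the two-component point concrete, here's a rough sketch of deploying a classic Cloud Service with the old Service Management PowerShell module. The package (.cspkg) holds the application/role binaries and the configuration (.cscfg) holds the settings; the service and file names here are purely illustrative:

# Create the cloud service container, then deploy the package + configuration into it
# (assumes the subscription has a current storage account set for uploading the package)
New-AzureService -ServiceName "mycloudsvc" -Location "North Europe"
New-AzureDeployment -ServiceName "mycloudsvc" `
    -Package ".\MyApp.cspkg" `
    -Configuration ".\ServiceConfiguration.Cloud.cscfg" `
    -Slot Production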