In December 2017 I attended this free online IoT bootcamp, presented by Microsoft and Hackster.io.
One of the best courses I've seen to date, with some notable guest speakers, including Raspberry Pi founder Eben Upton.
If you're considering it, I'd recommend following the pre-setup exactly in order to minimise issues with the labs. The day 1 and 2 labs were good, but the day 3 labs are a bit buggy (covered in the recorded sessions).
There are 9 hands-on labs covering things like connecting a microcontroller/Raspberry Pi and temperature sensors to Azure IoT Hub, Azure Stream Analytics, Power BI and Azure Functions.
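Most of the labs revolve around an Azure IoT Hub. If you just want to stand one up to experiment with before the course, a rough PowerShell sketch is below (assuming the AzureRM.IotHub module is installed; the resource group, hub name and location are placeholders of my own, not values from the labs):
# Create a resource group and a free-tier IoT Hub to experiment with
# (resource group, hub name and location are placeholders - change to suit)
New-AzureRmResourceGroup -Name "iot-bootcamp-rg" -Location "West Europe"
New-AzureRmIotHub -ResourceGroupName "iot-bootcamp-rg" `
    -Name "my-bootcamp-hub" `
    -SkuName "F1" `
    -Units 1 `
    -Location "West Europe"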
It also introduced me to Windows 10 IoT Core (an OS compatible with the Raspberry Pi).
The day 3 labs are code (copy/paste) heavy, focused on Cognitive Services (facial recognition), the Bot Framework and machine learning.
There's no need to buy the full kit they suggest; the bare minimum is (available elsewhere, but these are Adafruit links):
Assembled Adafruit Feather HUZZAH ESP8266 WiFi - https://www.adafruit.com/product/3046
AM2302 (wired DHT22) temperature-humidity sensor - https://www.adafruit.com/product/393
Breadboard - https://www.adafruit.com/product/239
Male/Male Jumper wires - https://www.adafruit.com/product/1956
Female/Male Jumper wires - https://www.adafruit.com/product/1954
1x 10mm LED
1x 560 Ohm 5% 1/4W resistor
8GB Micro SD card - https://www.adafruit.com/product/1294
MicroSD card reader/writer - https://www.adafruit.com/product/939
Raspberry Pi 3 Model B - https://www.adafruit.com/product/3055
Enjoy! I certainly did :-)
Thursday, 11 January 2018
Thursday, 21 December 2017
Test SQL Server connectivity from a Windows Server without installing another package
Recently I wanted to test whether a Windows Server 2016 machine could connect to a SQL Server endpoint (it happened to be an Azure SQL Database), but I was on a customer's production web server and didn't want to install tools (e.g. SSMS) just to test. A quick google found this great solution, so I wanted to re-post it. Many thanks to steverac for this gem:
https://blogs.msdn.microsoft.com/steverac/2010/12/13/test-remote-sql-connectivity-easily/
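As an alternative that also avoids installing anything, you can run a quick connectivity check from PowerShell using the .NET SqlClient that ships with Windows. A minimal sketch (the server, database and credentials below are placeholders):
# Placeholder connection details - replace with your own server, database and credentials
$connStr = "Server=tcp:<SERVER>.database.windows.net,1433;Database=<DATABASE>;User ID=<USER>;Password=<PASSWORD>;Encrypt=True;Connection Timeout=30"
$conn = New-Object System.Data.SqlClient.SqlConnection $connStr
try {
    $conn.Open()
    Write-Output "Connection succeeded"
}
catch {
    Write-Output "Connection failed: $($_.Exception.Message)"
}
finally {
    $conn.Dispose()
}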
Thursday, 19 October 2017
Troubles with PowerShell Modules
Working with Azure, I regularly have to run PowerShell CmdLets and scripts to get things done.
Unfortunately this means regularly installing and updating Azure PowerShell modules to make use of the latest and greatest features. Sometimes, too, I have to downgrade to earlier module versions just to "get stuff done".
The result can be a nasty mess of my PowerShell environment, leaving behind modules that cannot be removed. For example, I regularly uninstall the AzureRM module with:
Uninstall-Module AzureRM
This completes without error, but a subsequent
Get-Module -ListAvailable
still reports that the module and its sub-modules are installed. The situation doesn't improve when you try to uninstall the AzureRM module again; this time you receive an error stating:
No match was found for the specified search criteria and module names "AzureRM"
So, how do you get rid of modules that "seemingly" can't be uninstalled? One solution I dreamt up is as follows:
Get-Module -ListAvailable | where { $_.Name -like "Azure*" } | Select Name | Write-Output -PipelineVariable modname | ForEach { Uninstall-Module $modname.Name }
This utilises the PipelineVariable common parameter to capture the list of "installed" Azure modules and then forcefully uninstall them one-by-one. Be patient, this can take some time to run, but gets the job done.
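It may also be worth trying Uninstall-Module's -AllVersions and -Force switches first, since multiple side-by-side versions are often the culprit (a quick sketch; in my experience this alone doesn't always clear everything, hence the pipeline above):
# Attempt to remove every installed version of AzureRM in one go
Uninstall-Module -Name AzureRM -AllVersions -Force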
Monday, 14 August 2017
Stuck! How to change both the local admin account password and account name for an Azure Windows VM
I recently had to change both the account name and the account password for a Windows VM running in Azure. Luckily, help is at hand thanks to some PowerShell and the VMAccess virtual machine extension. There's no point in re-inventing the wheel; the details are in the articles below...
https://blogs.technet.microsoft.com/askpfeplat/2016/02/08/resetting-the-local-admin-name-and-password-for-azure-arm-virtual-machines-with-powershell/
https://docs.microsoft.com/en-us/powershell/module/azurerm.compute/set-azurermvmaccessextension?view=azurermps-4.2.0
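In short, the fix boils down to a single cmdlet call along these lines (a rough sketch based on the linked articles; the resource group, VM name, location and credentials here are placeholders):
# Placeholder values - substitute your own resource group, VM name, location and credentials
$rg   = "MyResourceGroup"
$vm   = "MyWindowsVM"
$cred = Get-Credential -Message "New local admin username and password"
Set-AzureRmVMAccessExtension -ResourceGroupName $rg `
    -VMName $vm `
    -Name "VMAccessAgent" `
    -Location "West Europe" `
    -UserName $cred.UserName `
    -Password $cred.GetNetworkCredential().Password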
Friday, 31 March 2017
Enterprise Application Migration To Azure
The Azure cloud is moving fast in so many areas. Understandably, migration tools that can assist and automate the process of migrating applications to Azure are of great importance in order to encourage a move away from existing infrastructure.
Two such tools that I’ve discovered recently are the Web Apps Migration Assistant, which is detailed here, and the Data Migration Assistant, detailed here.
The Web Apps Migration Assistant will analyse your hosted IIS (and even Linux) web apps, and can migrate them (if compatible) to the Azure App Service as Web Apps.
The Data Migration Assistant targets SQL Server and analyses databases for migration to Azure SQL Database. This tool does not perform the migration itself; instead it highlights the incompatibilities that exist and suggests remedial action, which can include preparing fixes as Transact-SQL scripts. Tools such as SSMS or SqlPackage.exe can then be used to migrate the database to Azure; a good overview of the process is here and a deeper discussion is in this SQL CAT blog. Of course, you may want to determine whether your database is even suitable for running in Azure, and tools such as this one can help size the database prior to migration.
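For the migration step itself, the SqlPackage.exe route is typically an export to a BACPAC followed by an import into Azure SQL Database, roughly as below (a sketch only; the install path, server names, database names and credentials are placeholders, not values from the tools above):
# Export the on-premises database to a BACPAC, then import it into Azure SQL Database
# (install path, server/database names and credentials below are placeholders)
$sqlpackage = "C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe"
& $sqlpackage /Action:Export /SourceServerName:"onprem-sql01" /SourceDatabaseName:"MyAppDb" /TargetFile:"C:\temp\MyAppDb.bacpac"
& $sqlpackage /Action:Import /SourceFile:"C:\temp\MyAppDb.bacpac" /TargetServerName:"myserver.database.windows.net" /TargetDatabaseName:"MyAppDb" /TargetUser:"sqladmin" /TargetPassword:"<PASSWORD>"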
These two tools could help fast-track your move to the cloud, and whilst some effort will still be required, they remove much of the manual effort that was previously required in shifting your applications and databases to Azure.
Friday, 24 March 2017
Vertica on Azure
For some time now you've been able to deploy a Vertica cluster to Azure via a manual process, but now the Azure Marketplace offers the HPE Vertica Analytics Platform, which deploys a cluster of up to 5 nodes via an ARM template.
If you're not familiar with Vertica, then in brief: it's a column-store relational database that offers rapid data loading (ingesting to memory first) and then optimises the storage of that data on disk through different encoding schemes.
Having worked with Vertica for over 5 years I can recommend it for certain scenarios, particularly when your data is very tabular, denormalised, compresses/encodes well, and you want to run analytic workloads (queries) over it. It works less well where you need to join table data or run transactional workloads with update or delete operations.
If you're looking for an alternative to the likes of AWS Redshift or Azure SQL Data Warehouse and are content to consider IaaS alternatives, then why not give Vertica a go and spin up a cluster in your cloud of choice; it may just give you better performance, offer a deeper feature set for your analytics, and could work out cheaper than PaaS offerings.
If you’re not familiar with Vertica, then in brief, it’s a column store relational database with rapid data load features (ingesting to memory) and then optimises the storage of that data on disk through different encoding schemes.
Having worked with Vertica for over 5 years I can recommend it for use in particular scenarios, particularly when your data is very tabular, denormalised, compresses/encodes well and is such that you may want to execute analytic workloads (queries) over that data. It works less well in situations where you need to join table data or perform transactional workloads with update or delete operations.
If you’re looking for an alternative to the likes of AWS Redshift or Azure Data Warehouse and are content to consider IaaS alternatives, then why not give Vertica a go, and spin up a cluster in your cloud of choice, it may just give you better performance, have a deeper feature set for your analytics and could work out cheaper than PaaS offerings.
Tuesday, 7 March 2017
Azure Logic Apps - A blog re-post
For those of you wanting to get started with Azure Logic Apps, I've put together a simple guide to get you started. My example effectively creates an API app in Azure App Service using Logic App callable endpoints...
Thursday, 23 February 2017
Microsoft Certification 70-475 : Designing and Implementing Big Data Analytics Solutions
Having recently sat and passed Microsoft’s exam 70-475, I thought I’d publish the list of references I built up whilst studying. This is still a relatively new exam, so study materials are hard to come by, just as for exam 70-473. As usual, I also made use of the Mindhub practice exam.
I found it difficult to pin-down specific resources for some of the objective areas, so it’s by no means extensive, but covers a good chunk of the exam content.
I also recommend having some prior knowledge of MS SQL, Hadoop and Azure ecosystems before tackling this exam.
Hope this helps!
1. Design big data batch processing and interactive solutions
- Ingest data for batch and interactive processing
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-copy-activity-performance
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-overview-load
- Ingest from cloud-born or on-premises data,
- store data in Microsoft Azure Data Lake,
- store data in Azure BLOB Storage,
- perform a one-time bulk data transfer,
- perform routine small writes on a continuous basis
- Design and provision compute clusters
- Select compute cluster type,
https://www.blue-granite.com/blog/how-to-choose-the-right-hdinsight-cluster
- estimate cluster size based on workload
- Design for data security
- Protect personally identifiable information (PII) data in Azure
- encrypt and mask data,
- implement role-based security
- Design for batch processing
- Select appropriate language and tool,
- identify formats,
- define metadata,
- configure output
- Design interactive queries for big data
- Provision Spark cluster,
- set the right resources in Spark cluster,
- execute queries using Spark SQL,
- select the right data format (Parquet),
- cache data in memory (make sure cluster is of the right size),
- visualize using business intelligence (BI) tools (for example, Power BI, Tableau),
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-integrate-power-bi
- select the right tool for business analysis
2. Design big data real-time processing solutions
- Ingest data for real-time processing
http://download.microsoft.com/download/6/2/3/623924DE-B083-4561-9624-C1AB62B5F82B/real-time-event-processing-with-microsoft-azure-stream-analytics.pdf
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-storm-sensor-data-analysis - hands-on tutorial
- Select data ingestion technology,
- design partitioning scheme,
- design row key of event tables in Hbase
http://www.dummies.com/programming/big-data/hadoop/row-keys-in-the-hbase-data-model/
http://hbase.apache.org/0.94/book/rowkey.design.html
- Design and provision compute resources
- Select streaming technology in Azure,
- select real-time event processing technology,
- select real-time event storage technology,
- select streaming units,
https://docs.microsoft.com/en-gb/azure/stream-analytics/stream-analytics-scale-jobs
- configure cluster size,
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-provision-clusters#cluster-types
- assign appropriate resources for Spark clusters,
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-apache-spark-resource-manager#how-do-i-know-if-i-am-running-out-of-resource
- assign appropriate resources for HBase clusters,
- utilize Visual Studio to write and debug Storm topologies
- Design for Lambda architecture
https://social.technet.microsoft.com/wiki/contents/articles/33626.lambda-architecture-implementation-using-microsoft-azure.aspx
http://lambda-architecture.net/
- Identify application of Lambda architecture,
- utilize streaming data to draw business insights in real time,
- utilize streaming data to show trends in data in real time,
- utilize streaming data and convert into batch data to get historical view,
- design such that batch data doesn’t introduce latency,
- utilize batch data for deeper data analysis
- Design for real-time processing
- Design for latency and throughput,
- design reference data streams,
- design business logic,
- design visualization output
3. Design Machine Learning solutions
- Create and manage experiments
https://docs.microsoft.com/en-gb/azure/machine-learning/machine-learning-studio-overview-diagram
- Create, manage, and share workspaces;
https://docs.microsoft.com/en-gb/azure/machine-learning/machine-learning-create-workspace
- create training experiment;
- select template experiment from Machine Learning gallery
- Determine when to pre-process or train inside Machine Learning Studio
- Select model type based on desired algorithm,
- select technique based on data size
- Select input/output types
- Select appropriate SQL parameters,
- select BLOB storage parameters,
- identify data sources,
- select HiveQL queries
- Apply custom processing steps with R and Python
https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-extend-your-experiment-with-r
https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-custom-r-modules
- Visualize custom graphs,
https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-execute-python-scripts#working-with-visualizations
- estimate custom algorithms,
http://download.microsoft.com/download/A/6/1/A613E11E-8F9C-424A-B99D-65344785C288/microsoft-machine-learning-algorithm-cheat-sheet-v6.pdf
- select custom parameters,
https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-execute-python-scripts#basic-usage-scenarios-in-machine-learning-for-python-scripts
- interact with datasets through notebooks (Jupyter Notebook)
https://gallery.cortanaintelligence.com/notebooks
https://gallery.cortanaintelligence.com/Notebook/Tutorial-on-Azure-Machine-Learning-Notebook-1
- Publish web services
- Operationalize Azure Machine Learning models,
- operationalize Spark models using Azure Machine Learning,
https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-data-science-spark-model-consumption#consume-spark-models-through-a-web-interface
- operationalize custom models
4. Operationalize end-to-end cloud analytics solutions
- Create a data factory
- Identify data sources,
- identify and provision data processing infrastructure,
- utilize Visual Studio to design and deploy pipelines
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-build-your-first-pipeline-using-vsm
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-build-your-first-pipeline
- Orchestrate data processing activities in a data-driven workflow
- Leverage data-slicing concepts,
- identify data dependencies and chaining multiple activities,
- model complex schedules based on data dependencies,
- provision and run data pipelines
- Monitor and manage the data factory
- Identify failures and root causes,
https://docs.microsoft.com/en-gb/azure/data-factory/data-factory-monitor-manage-pipelines
- create alerts for specified conditions,
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-monitor-manage-pipelines#create-alerts
- perform a restatement
- Move, transform, and analyze data
- Leverage Pig, Hive, MapReduce for data processing;
https://docs.microsoft.com/en-gb/azure/data-factory/data-factory-hive-activity
https://docs.microsoft.com/en-gb/azure/data-factory/data-factory-map-reduce
- copy data between on-premises and cloud;
https://docs.microsoft.com/en-gb/azure/data-factory/data-factory-data-management-gateway
- copy data between cloud data sources;
- leverage stored procedures;
- leverage Machine Learning batch execution for scoring, retraining, and update resource;
- extend the data factory with custom processing steps;
- load data into a relational store
- visualize using Power BI
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-get-started-visualize-with-power-bi
- Design a deployment strategy for an end-to-end solution
- Leverage PowerShell for deployment,
- automate deployment programmatically
https://msdn.microsoft.com/library/mt415893.aspx
https://msdn.microsoft.com/library/dn906738.aspx
Wednesday, 8 February 2017
Copying an Azure Blob snapshot to another storage account using PowerShell
I've been working with Azure Blobs and Snapshots recently. One gripe is that the Azure portal and the various storage explorers don't give you the power to copy a blob snapshot. Blobs, yes, but their snapshots, no.
Depending on your requirements there are many approaches. The PowerShell below gives an example where the snapshot timestamp is extracted from the snapshot URI and then used to obtain a reference to the specific snapshot before copying it with the Start-AzureStorageBlobCopy cmdlet.
# Define the source snapshot blob URI
$SrcBlobURI="https://<SOURCE_STORAGE_ACCOUNT>.blob.core.windows.net/vhds/<SOURCE_VHD_PREFIX>.vhd?snapshot=2017-01-24T21:08:36.9371576Z"
# Define the destination storage account and context.
$DestStorageAccountName = "<DESTINATION_STORAGE_ACCOUNT>"
$DestStorageAccountKey = "<DESTINATION_STORAGE_ACCOUNT_KEY>"
$DestContainerName = "<DESTINATION_CONTAINER_NAME>"
$DestContext = New-AzureStorageContext -StorageAccountName $DestStorageAccountName -StorageAccountKey $DestStorageAccountKey
# Determine source snapshot blob container name and context
$SrcDiskInfo = "" | Select StorageAccountName, VHDName, ContainerName
$SrcDiskInfo.StorageAccountName = ($SrcBlobURI -split "https://")[1].Split(".")[0]
$SrcDiskInfo.VHDName = $SrcBlobURI.Split("/")[-1]
$SrcDiskInfo.ContainerName = $SrcBlobURI.Split("/")[3]
$SrcContainerName = $SrcDiskInfo.ContainerName
$SrcStorageAccount = Find-AzureRmResource `
-ResourceNameContains $SrcDiskInfo.StorageAccountName `
-WarningAction Ignore
$SrcStorageKey = Get-AzureRmStorageAccountKey `
-Name $SrcStorageAccount.Name `
-ResourceGroupName $SrcStorageAccount.ResourceGroupName
$SrcContext = New-AzureStorageContext `
-StorageAccountName $SrcStorageAccount.Name `
-StorageAccountKey $SrcStorageKey[0].Value
# Extract timestamp from the Snapshot filename and convert to a datetime
$pre, $post = $SrcBlobURI.split("=")
$snapdt = [uri]::UnescapeDataString($post)
$snapdt = [datetime]::ParseExact( `
$snapdt,'yyyy-MM-dd"T"HH:mm:ss"."fffffff"Z"',$null `
)
# Get a reference to blobs in the source container.
$blob = Get-AzureStorageBlob -Container $SrcContainerName `
-Context $SrcContext `
| Where-Object {`
$_.ICloudBlob.IsSnapshot `
-and $_.SnapshotTime -eq $snapdt `
}
# Copy blobs from one container to another.
$blob | Start-AzureStorageBlobCopy `
-DestContainer $DestContainerName `
-DestContext $DestContext `
-DestBlob "acopy_$($blob.Name)"
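Note that Start-AzureStorageBlobCopy is asynchronous, so you may want to check the copy has finished before relying on the destination blob; something along these lines should work (using the same destination names as above):
# Optionally, poll the status of the asynchronous copy at the destination
Get-AzureStorageBlobCopyState -Container $DestContainerName `
    -Blob "acopy_$($blob.Name)" `
    -Context $DestContext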
Labels: Azure, Blob, Containers, copy, ICloudBlob, Powershell, Snapshot, Storage Accounts, timestamp, URI
Wednesday, 1 February 2017
Microsoft Certification 70-473 : Designing and Implementing Cloud Data Platform Solutions
I recently passed the Microsoft exam 70-473 "Designing and Implementing Cloud Data Platform Solutions", so here's my insight into the certification for anyone else thinking of studying for and sitting this exam. I've also included a selection of the most useful links I found whilst looking for relevant online content.
Generally, if you have experience with SQL Server (2008+) then focus your study on hybrid scenarios, HA/DR, auditing, monitoring, security and the differences between SQL Server (on-prem vs Azure) and Azure SQL Database.
Having some Azure certs already will also give you a small head-start. Also, knowing the different MS tools available and when to use them was key.
In the absence of any MS Exam References and certification videos, my preparation material was based on MSDN, Azure and TechNet articles, so lots and lots of online reading. I did find various online videos that dealt with specific aspects of the exam, but in the end I stuck to the docs and took the Mindhub practice exam I had as part of the (no longer available) Microsoft Exam Booster Pack.
I hope these prove useful (note: not all objective areas have a link; if you find a good one then let me know)...
- Design a hybrid SQL Server solution
  - Design Geo/DR topology,
  - design a data storage architecture,
  - design a security architecture,
  - design a data load strategy
- Implement SQL Server on Azure Virtual Machines (VMs)
  - Provision SQL Server in an Azure VM,
  - configure firewall rules,
  - configure and optimise storage,
  - migrate an on-premises database to Microsoft Azure,
  - configure and optimise VM sizes by workload
- Design a SQL Database solution
  - Design a solution architecture,
  - design Geo/DR topology,
  - design a security architecture,
  - design a data load strategy,
  - determine the appropriate service tier
- Implement SQL Database
  - Provision SQL Database,
  - configure firewall rules,
  - configure active geo-replication,
  - migrate an on-premises database to SQL Database,
  - configure for scale and performance
- Design and implement data warehousing on Azure
  - Design a data warehousing solution on Azure,
  - design a data load strategy and topology,
  - configure SQL Data Warehouse,
  - migrate an on-premises database to SQL Data Warehouse
- Design and implement SQL Server Database security
  - Configure firewalls;
  - manage logins, users and roles;
  - assign permissions;
  - configure auditing;
  - configure transparent database encryption
- Implement Azure SQL Database security
  - Configure firewalls;
  - manage logins, users, and roles;
  - assign permissions;
  - configure auditing;
  - configure row-level security;
  - configure data encryption;
  - configure data masking;
  - configure Always Encrypted
- Design and implement high availability solutions
  - Design a high availability solution topology,
  - implement high availability solutions between on-premises and Azure,
  - design cloud-based backup solutions,
  - implement backup and recovery strategies
- Design and implement scalable solutions
  - Design a scale-out solution,
  - implement multi-master scenarios with database replication,
    https://msdn.microsoft.com/en-us/library/ms151176.aspx - Transactional
    https://technet.microsoft.com/en-us/library/ms151196.aspx - Peer-to-peer
  - implement elastic scale for SQL Database
- Design and implement SQL Database data recovery
  - Design a backup solution for SQL Database,
  - implement self-service restore,
  - copy and export databases
- Monitor and troubleshoot SQL Server VMs on Azure
  - Monitor database and instance activity,
  - monitor using dynamic management views (DMVs) and dynamic management functions (DMFs),
  - monitor performance and scalability
- Monitor and troubleshoot SQL Database
  - Monitor and troubleshoot SQL Database,
  - monitor database activity,
  - monitor using DMVs and DMFs,
  - monitor performance and scalability
- Automate and manage database implementations on Azure
  - Manage SQL Server in Azure VMs with PowerShell,
  - manage Azure SQL Database with PowerShell,
  - configure Automation and Runbooks
Related Videos
- Ignite Exam Prep Session - Channel 9
- Exam Boot Camp - MAPA
- 1.1/3.1 SQL Server HA/DR with Azure
- 1.2 Migrate SQL Server to Azure
- 1.2 SQL Server in Azure VMs
- 1.4/4.2 Hybrid Cloud workloads SQL
- 1.5 Elastic Data Warehouse as-a-service
- 2.2 Securing data in SQL Database
- 3.1 Design a scale-out solution
- 3.2 Elastic Database with Azure SQL Database
- 3.3 Implementing SQL Database Data Recovery
  https://mva.microsoft.com/en-US/training-courses/implementing-azure-sql-database-data-recovery-10341
- 4.1 Monitor and Troubleshoot SQL Server
- 4.3 SQL Server and SQL Database Powershell
Other Links
- General Introduction
- SQL Server on Azure VMs Learning Path
- SQL Database Intro
- TechNet SQL Server Reference
- MSDN SQL Server Reference
- SQL Server (2008) Data Loading White Paper
- SQL Server Customer Advisory Team (SQLCAT)
- SQL Server Solution Design
- Hybrid SQL Server Scenarios
- Migrate On-prem to Azure VMs
- Training Course