Docker For Windows – Step By Step (Solutions to known Issues)

For a while now, in an endeavor to skill up on Azure DevOps, I have been playing around with Docker, and I went through a lot of ups and downs in the process of getting started.

In the absence of proper material on where exactly to get started, I thought of coming up with blogs (as I keep learning, I keep sharing) so that anyone trying to learn from scratch can benefit.

Intent of the blog post –

The basic intent behind the blog is to help beginners with a step-by-step approach for installing Docker for Windows.

Prerequisites –

  • Basic prior knowledge of Docker will help.
  • A Windows 10, 64-bit machine.

Caution –

  • If you are a beginner following this blog and trying it out in real time, I would suggest trying it out in a test or development environment.
  • It requires certain changes to the BIOS settings, so be super cautious about them.
  • Follow the blog very carefully.


Let’s begin then !!

Basics –

Now when I say we are going to install Docker for Windows, we will still be running Docker on Linux. You may scratch your head at this and say that it doesn’t make any sense.

It will!! Let me take you through some pictorial material, and you will see clearly what we are trying to do here.


As you can see in the picture above, the machine (client) is still running the Windows OS, but we will enable the Hyper-V feature on this client, which will run a Linux VM.

The Docker engine that we install will actually run on this Linux VM (MobyLinux).

That brings us to the first step of the installation: turning on the virtualization feature on our Windows machine.

Step 1 – Turning On the Virtualization feature on Windows Machine

  • Navigate to Control Panel > Programs and Features > Turn Windows features on or off.
  • Locate the Hyper-V feature in the list of options and tick it.


  • It will start installing the feature on your machine. When this completes, you will be asked to perform a restart, and this is essential (normally needed after you install any feature in Windows).


Step 2 – Resolving the error

When the restart is done, you may see something like this on your screen stating that Hyper-V is not running.

But you just installed that feature, right?! Yes, but did you activate it?


No need to panic. Let’s activate this as well.

The solution here is to navigate to your BIOS settings and enable the hardware virtualization assist feature.

As already cautioned, be super cautious here.

These are the BIOS settings that you are changing on your machine. It’s like playing with the wires in an active bomb.

There is only one setting that needs to be changed, and you are going to do exactly that. Nothing else.

There is a nice step-by-step blog to follow for this. I have personally verified it, which is why I am sharing it.

I don’t want a lengthy blog, so I am just going to paste the link over here; please follow it.

The major part of your work is done here.

Step 3 – Downloading/Installing the Setup

  • Navigate to the Get Docker page.
  • Click on “Get Docker“.
  • Scroll down and you will see a nice list of Docker products for different operating systems.
  • Choose the “Windows“ option.
  • It will download the setup file for you. Run this setup.
  • Accept the license agreement and click on Install.
  • You will see a screen that says the setup has been installed. It’s all going smoothly, right?! Easy peasy, basic installation steps.


Step 4 – Verification

  • Once you click on Finish, you will see a small black screen embedded in your Windows screen. Well, it’s Docker up and running for you.
  • Check the image that I have inserted below. You can see it says “Docker is running”.
  • Now for some verification steps, right?!


  • Go to the command prompt (as an administrator) and type docker version.
  • You will get details in the console, and that’s basically a sanity check.
  • The output will show you what we discussed earlier (about running Docker on Linux).
  • I am not attaching the snip because I want you to try it out.

The output will be something like this,

  • Client-
    • Operating system – Windows
  • Server-
    • Operating system – Linux
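The sanity check above can be made concrete with a small Python sketch that pulls the OS/Arch values out of docker version style output. The sample transcript below is illustrative (the version numbers are made up), not captured from a real run:

```python
# Sketch: extracting the OS/Arch values that "docker version" reports.
# SAMPLE_OUTPUT is an illustrative, trimmed transcript, not a real capture.
SAMPLE_OUTPUT = """\
Client:
 Version:  18.09.2
 OS/Arch:  windows/amd64

Server: Docker Engine - Community
 Version:  18.09.2
 OS/Arch:  linux/amd64
"""

def os_arch_by_section(text):
    """Map each top-level section (Client/Server) to its OS/Arch value."""
    result, section = {}, None
    for line in text.splitlines():
        if line and not line[0].isspace():
            # An unindented line starts a new section, e.g. "Client:"
            section = line.split(":")[0]
        elif "OS/Arch" in line:
            result[section] = line.split(":", 1)[1].strip()
    return result

info = os_arch_by_section(SAMPLE_OUTPUT)
print(info)  # {'Client': 'windows/amd64', 'Server': 'linux/amd64'}
```

The Windows client talking to a Linux server is exactly the Hyper-V/MobyLinux split described at the start of this post.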


You can access the settings page as well to play around with the settings.

Well, that was pretty simple!!!

To all the beginners out there, “Try it out”.


Feel free to ask any questions by dropping a comment. Any suggestions or improvements are highly appreciated.

I will be coming out with the next part of this soon (currently working on Part 2 of the Azure Migration blog), so stay tuned.

Please check out my other blogs as well, if interested, and let me know.

Until then , Bye Bye !!!!!!!









Major differences – “Create an approval” vs. “Start and wait for an approval”

The other day I was working on an “Approval” action in a Flow, and I came across some new additional options. This put me in a fix, as I was left undecided on which option to choose going forward.


This blog is intended to clear up the confusion that may arise from the additional options introduced by Microsoft for Approvals in Flow.

Confusion :-

A lot of power users who have been using the “Approvals” actions in Microsoft Flow for a long time may have noticed that the traditional “Responses” value is absent when you add a condition, if you have used just the “Create an approval” action.

Create an approval

But if you use the “Start and wait for an approval” action, the “Responses” value becomes available.


Why this difference, and what exactly might have pushed Microsoft to add complexity to a simple-looking and easy-to-use Approval action?

Now, this (the difference and the resulting confusion) is not documented anywhere, and it is something that I just noticed. So I decided to dig a little deeper.

Approval Actions (What’s deprecated) !

On searching the Microsoft documentation, it seems that Microsoft deprecated the traditional two actions that we had been using for long.


Microsoft smartly created an action, “Create an approval”, which does not wait for the approval process to complete and is cancellable.


And that is the sole reason why the “Response” value is not available against this action.

Solution !!

So how do you check the response for this? Quite simple: all you need to do is initialize a Boolean variable that is set to “true” only if the value is “Approved”.

How it’s done is shown below :-


Function expression :- equals(body('Create an approval')?['response'],'Approved')
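If it helps to see the logic outside of Flow, here is a tiny Python sketch of the same check. The field name and values mirror the expression above; the sample bodies are made up:

```python
# Sketch of the condition: the approval counts as granted only when the
# "response" field of the action's body equals "Approved" (case-sensitive,
# mirroring the equals() expression in the flow).
def is_approved(body):
    return (body or {}).get("response") == "Approved"

print(is_approved({"response": "Approved"}))  # True
print(is_approved({"response": "Rejected"}))  # False
```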

And this final step is to be configured in the “Condition”.


Done and dusted !!!

Part 1 – Migrating from On-Premises Infra to Azure using ASR (Azure Site Recovery)


A lot of enterprises are moving to Azure for obvious reasons like global availability, security, scalability, disaster recovery, cost savings and flexible expenditure, compliance, and a development-focused integrated delivery pipeline.

But this transition is not simple, is it? Migrating an end-to-end application with the entire infrastructure in place, without any downtime (what Microsoft calls BC/DR, Business Continuity and Disaster Recovery), is a big challenge, especially when we are talking about migrating on-premises infrastructure to the Azure cloud.

Well, most people don’t know the right way to do this, even though Azure has a full-fledged system in place.

I know everything is documented, but if you go through the documentation you will find that each document has links to hundreds of other documents.

So to make things easier, here is Part 1 of the 5 blogs that I am going to write, which will help you understand the process and the functionality smoothly.

What we are going to use here is Azure Site Recovery. Now you may get the feeling after reading this that it seems like a disaster recovery technique. Well, you are absolutely right!!! But Microsoft, which likes to bring out the best in things, has also provided functionality for migration using it.

Site Recovery is used to manage and orchestrate disaster recovery of on-premises machines and Azure VMs. However, it can also be used for migration. Migration uses the same steps as disaster recovery, with one exception: in a migration, failing machines over from your on-premises site is the final step. Unlike disaster recovery, you can’t fail back to on-premises in a migration scenario.

The high level steps involved in migrating are as follows :-

  • Set up the source and target environment for migration
  • Set up a replication policy
  • Enable replication
  • Run a test migration to make sure everything’s working as expected
  • Run a one-time fail over to Azure

Let’s go through these one by one in detail, shall we?

1. Set up the source and target environment for migration :- 

So you need to set up the Azure environment first.

This is the target onto which your entire infrastructure will be migrated.

First, and most important, are the permissions. The table below summarizes the permissions required for setting up Azure for Site Recovery.

1.1 Permissions :-

Permission required                             Role
All required permissions for ASR                Subscription Administrator
Create a VM in the selected resource group      Virtual Machine Contributor (built-in)
Create a VM in the selected virtual network     Virtual Machine Contributor (built-in)
Write to the Azure storage account              Storage Blob Data Contributor
Write to an Azure managed disk                  RBAC (or Cloud Shell) to propagate the permission
Manage Site Recovery operations in a vault      Site Recovery Contributor (built-in)

1.2 Set up a recovery services vault :-

  • In the Azure portal, click +Create a resource, and search the Marketplace for Recovery.
  • Click Backup and Site Recovery (OMS), and in the Backup and Site Recovery page, click Create.
  • In Recovery Services vault > Name, enter a friendly name to identify the vault.
  • In Resource group, select an existing resource group or create a new one.
  • In Location, select the region in which the vault should be located.



1.3 Set up an Azure network :-

  • In the Azure portal, select Create a resource > Networking > Virtual network.
  • Keep Resource Manager selected as the deployment model.
  • In Name, enter a network name. The name must be unique within the Azure resource group.
  • Specify the resource group in which the network will be created.
  • In Address range, enter the range for the network.
  • In Subscription, select the subscription in which to create the network.
  • In Location, select the same region as that in which the Recovery Services vault was created. In our tutorial it’s West Europe. The network must be in the same region as the vault.
  • We’re leaving the default options of basic DDoS protection, with no service endpoint on the network.
  • Click Create.


That’s it, guys!!! We have configured the prerequisites that are necessary on the (target) Azure side of the migration.

The next part, Part 2, is coming soon. We will be studying a lot of interesting things in it.

Just as an overview, here is what is going to be covered in the next part.

  • Set up a recovery services vault (covered in Part 1)
  • Go to the resource menu of the recovery vault and start preparing the infrastructure
  • The protection goal
  • Where do you want to migrate your infrastructure to?
  • Are your VMs virtualized?
  • Set up the source environment (you will need a configuration server for this)
  • Download the Site Recovery Unified Setup installation file
  • Download the vault registration key
  • Run the Azure Site Recovery Unified Setup

Until then, keep reading. Always open to suggestions and improvements on all my blogs!!!!!!

Part 1 – SharePoint: Moving from Windows to Claims Authentication – How to Plan This?

Many farms are moving from Windows authentication (NTLM or Kerberos) to SAML. This migration and change requires a lot of planning.

But how do you start setting this up? What are the configurations that will get impacted? How are my third-party applications going to behave?

All these questions have been answered below.

Web Application configuration for Claims authentication :-

The first thing that comes to your mind is: how am I going to start planning this?

Well, the best way to do this is to extend your existing web application and configure the extended zone for claims authentication.

The thinking behind this is that you cannot completely stop using Windows authentication, because the Search Service Application is going to need Windows authentication in your “Default” zone to work properly.

So this is how you are going to design your authentication configuration :-

Zone        Authentication
Default     Windows
Intranet    ADFS (Claims)

So this is what you do: basically, your web application is configured in the default zone with Windows authentication. You extend your web application to the intranet zone and then configure it for claims authentication.

There is a nice blog written on how to configure ADFS. We are not going to deep-dive into that here. Please follow the blog below for this.

User Profiles 

One of the next things that you need to think about is the user profiles. Well, now you are going to have two user profiles for the same farm.


One with Windows and the other with Claims. So how are you going to handle that?

Well, the best way to do this is to filter the User Profile Service Application to only show claims user profiles. Because, face it, your end users are going to hit the sites hosted under the web application’s intranet zone, right? So there is no need for you to show users the Windows profiles.
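A quick way to tell the two kinds of identities apart is by their SharePoint claims-encoding prefix. Here is a small Python sketch; the account names are hypothetical:

```python
# Sketch: distinguishing Windows-claims and trusted-provider (ADFS) identities
# by their SharePoint claims-encoding prefix. The sample logins are hypothetical.
def is_trusted_provider_identity(login):
    # "i:0#.w|" marks a Windows claims identity; "i:05.t|" marks a trusted
    # identity provider such as ADFS.
    return login.startswith("i:05.t|")

print(is_trusted_provider_identity("i:05.t|adfs|user@contoso.com"))  # True
print(is_trusted_provider_identity("i:0#.w|CONTOSO\\user"))          # False
```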

Well, we are using MIM to manage the synchronization between Active Directory profiles and the SharePoint User Profile Service Application.

How we configured this is mentioned in the blog below, thanks to Adam Sorenson!!!


Third Party Applications 

Well, we were using Nintex, and certain actions, like the Web Request action in Nintex, need to authenticate to the site.

Now if you are planning to route your requests to the ADFS site, it is going to require development work, like generating an authentication cookie and passing it with each and every request to authenticate successfully against an ADFS site.

We, as administrators, found a bypass for this: why not let the Nintex web requests always authenticate to the default zone, configured with Windows authentication, at the back end?

Here is what we did .

We created a workflow constant in Nintex with the name GlobalURL.

We configured the string for this workflow constant to the URL of the web application in the default zone.

Now what you have to do next is pretty simple: in the Nintex “Web URL” option, just choose from the workflow constants and insert this GlobalURL.

Done and dusted: all your requests will route to the default zone with Windows authentication. No need for any custom development.




The snips above show how this is done.
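Conceptually, the GlobalURL constant just acts as the base URL against which every request path is resolved, so the calls always land on the default (Windows) zone. A small Python sketch of the idea, where both URLs are placeholders:

```python
# Sketch: the GlobalURL constant acts as the base against which every request
# path is resolved, so calls always land on the default (Windows-auth) zone.
# Both URLs below are placeholders.
from urllib.parse import urljoin

GLOBAL_URL = "http://sharepoint-default/"  # default zone, Windows auth
request_path = "sites/hr/_api/web/lists"

print(urljoin(GLOBAL_URL, request_path))  # http://sharepoint-default/sites/hr/_api/web/lists
```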


Well, this blog is growing in length, so I will be sharing Part 2 of this one.

Meanwhile, guys, let me know if this helped you.

Always open to improvements, suggestions, and thoughts on all my blogs!!!




ShareGate Desktop and PnP to customize migration using CSVs

We have been extensively using migration tools like ShareGate, Metalogix, Dell, and AvePoint for migrating content (specifically talking about SharePoint migration) for years now.

Well, one of the needs that has evolved over the years is the customized need to migrate items based on the item IDs logged in a CSV.

Well, let me share the script I have written, which combines both PnP and the ShareGate shell.

The prerequisite here is that you need a licensed version of ShareGate to access ShareGate Desktop.

This script loads the module (“dll”) that ShareGate exposes, so that you can use it to migrate items from PowerShell.

Set up instructions before running the script :-

  1. You must have the PnP PowerShell module installed on the server running this script.
  2. The CSV inside which the list item IDs are placed should be named after the list. For example, if you want to migrate items by ID from a list “Office”, the CSV name must be “Office”.
  3. The service account used for connecting to the source and destination sites must have “Full Control” access on both sites.
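Point 2 above can be sketched in a couple of lines: the CSV file name, minus its extension, becomes the list name. The path below is hypothetical:

```python
# Sketch of the naming convention: the CSV file name minus its extension is
# treated as the name of the list to migrate. The path is hypothetical.
from pathlib import PureWindowsPath

csv_path = PureWindowsPath("C:/Migration/Office.csv")
list_name = csv_path.stem
print(list_name)  # Office
```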

$date = Get-Date -Format "yyyy-MM-dd"

Import-Module ShareGate

#Generate a transcript for every action
Start-Transcript -Path "DirectoryPath\$date\Log.txt" -NoClobber -Append

#Source and destination site URLs
$srcsiteUrl = "https://sitecollection/sites/srcSite"
$dstsiteUrl = "https://sitecollection/sites/dstSite"
$password = ConvertTo-SecureString "YourPassword" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("Domain\ServiceAccount", $password)

#Connect to the source site
$srcSite = Connect-Site -Url $srcsiteUrl -Username "Domain\ServiceAccount" -Password $password

#Connect to the destination site
$dstSite = Connect-Site -Url $dstsiteUrl -Username "Domain\ServiceAccount" -Password $password

#Get all child CSVs from the parent folder
$csvfilearray = Get-ChildItem -Path "Path where csv's are kept"

#Loop over every CSV in the array
foreach ($csvfile in $csvfilearray)
{
    $name = $csvfile.Name
    Write-Host "Starting migration of $name"
    $listName = [System.IO.Path]::GetFileNameWithoutExtension($csvfile.Name)

    #Get the source list
    $srcList = Get-List -Site $srcSite -Name $listName

    #Get the destination list from the destination site
    $dstList = Get-List -Site $dstSite -Name $listName

    #Import the CSV into a table with a custom header
    $table = Import-Csv -Path $csvfile.FullName -Delimiter "," -Header Col1 | ForEach-Object { $_.Col1 }

    #Loop to process each item by ID
    foreach ($row in $table)
    {
        $ID = $row
        $Content = Get-ListItem -List $srcList -Id $ID -ErrorAction SilentlyContinue

        #Copy the item from the source to the destination list
        Copy-Content -SourceList $srcList -DestinationList $dstList -SourceItemId $ID -InsaneMode -OutVariable out

        #Verify that the item has been migrated successfully.
        #If yes, delete the item from the source list.
        $result = $out.Result

        if ($result -match "Operation completed successfully" -or $result -match "Operation completed with warnings")
        {
            Write-Host "Migrated successfully"
            Connect-PnPOnline -Url $srcsiteUrl -Credentials $cred
            Remove-PnPListItem -List $listName -Identity $ID -Force
        }
        else
        {
            Write-Host "Item hasn't been migrated"
        }
    }

    Write-Host "Completed operation on csv file $name" -ForegroundColor Red
}

Stop-Transcript



Try this out, guys, if you have a licensed version of ShareGate, and let me know how this script works out for you.

Open to suggestions, improvements, and thoughts on all my blogs.





Sharegate Blog | Azure Solutions Architect vs. Azure Administrator: what’s the difference, and which certification is right for you? — Updates for Office 365


PSConfig terminates with “Command is incorrect” error – Microsoft.SharePoint.PostSetupConfigurationTaskException thrown

Now this is an error I have faced multiple times. The SharePoint patch gets installed and you have rebooted the server.

But the moment you try to run the PSConfig command, it takes a long time to start executing and finally fails with an “incorrect command” error.

Well, there are various ways to solve it.

Stop and restart the following services:

  • SharePoint Timer Service (from Services.msc)
  • World Wide Web Publishing Service (from Services.msc)

Run PSConfig again using the following command:

  • psconfig.exe -cmd upgrade -inplace b2b -wait -force

This should help you out.

Please find the blog below for the best practices and steps to be followed while patching a SharePoint farm.

Let me know, guys, if you face any issues with the above blog.

Always open to suggestions, improvements, and thoughts on all my blogs.

Cheers !!!!