Part 1 - SharePoint: Moving from Windows to Claims Authentication - How Do You Plan It?

Many farms are moving from Windows authentication (NTLM or Kerberos) to SAML claims. This migration requires a lot of planning.

But how do you start setting this up? Which configurations will be impacted? How will your third-party applications behave?

All of these questions are answered below.

Web Application Configuration for Claims Authentication :-

The first question that comes to mind is how to start planning this.

The best way is to extend your existing web application and configure the extended zone for claims authentication.

The thinking behind this is that you cannot completely stop using Windows authentication, because the Search Service Application needs Windows authentication in your Default zone to crawl properly.

So this is how you are going to design your authentication configuration :-

Web Application                        Zone        Authentication
https://hostheader.example.com         Default     Windows
https://adfshostheader.example.com     Intranet    ADFS (Claims)

So this is what you do: your web application stays in the Default zone with Windows authentication, and you extend it to the Intranet zone and configure that zone for claims authentication.
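If you prefer to script the extension instead of clicking through Central Administration, here is a minimal PowerShell sketch. It assumes the trusted identity token issuer already exists (the name "ADFS Provider" is an assumption) and reuses the placeholder host headers from the table above:

#Get the existing web application (Default zone, Windows authentication)
$webApp = Get-SPWebApplication "https://hostheader.example.com"

#Wrap the existing trusted identity token issuer in an authentication provider
#("ADFS Provider" is a placeholder name; use the issuer you created for ADFS)
$issuer = Get-SPTrustedIdentityTokenIssuer "ADFS Provider"
$provider = New-SPAuthenticationProvider -TrustedIdentityTokenIssuer $issuer

#Extend the web application into the Intranet zone with claims (ADFS) authentication
New-SPWebApplicationExtension -Identity $webApp -Name "SharePoint Intranet (ADFS)" `
    -Zone "Intranet" -URL "https://adfshostheader.example.com" `
    -HostHeader "adfshostheader.example.com" -Port 443 -SecureSocketsLayer `
    -AuthenticationProvider $provider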

There is a nice blog on how to configure ADFS itself; we are not going to deep dive into that here. Please follow the blog below.

https://blogs.technet.microsoft.com/adamsorenson/2018/01/17/sharepoint-20132016-migrate-from-windows-claims-to-adfs/

User Profiles 

The next thing you need to think about is user profiles. You are now going to have two user profiles per user in the same farm.


One with the Windows identity and the other with the claims identity. So how are you going to handle that?

The best way is to filter the User Profile Service Application so that it only shows claims user profiles. Face it: your end users are going to hit the sites hosted under the Intranet zone of the web application, so there is no need to show them Windows profiles.

We are using MIM to manage synchronization between Active Directory and the SharePoint User Profile Service Application.

How we configured this is covered in the blog below, thanks to Adam Sorenson!

https://blogs.technet.microsoft.com/adamsorenson/2018/01/31/sharepoint-2016-mim-and-samlfba-user-profiles/

 

Third-Party Applications

We were using Nintex, and certain actions like Web Request in Nintex need to authenticate to the site.

Now if you route those requests to the ADFS zone, you are looking at development work: generating an authentication cookie and passing it with each and every request to authenticate successfully against the ADFS-secured site.

As administrators, we found a workaround: why not let the Nintex web requests always authenticate against the Default zone, which still runs Windows authentication at the back end?

Here is what we did .

We created a Workflow Constant in Nintex named GlobalURL.

We set its string value to the URL of the web application in the Default zone, i.e. "https://hostheader.example.com".

What you do next is pretty simple: in the Nintex "Web URL" option, just pick the workflow constant and insert GlobalURL.

Done and dusted: all your requests route to the Default zone with Windows authentication, with no custom development needed.


 

This blog is getting long, so I will share part 2 separately.

Meanwhile, let me know if this helped you.

Always open to improvements, suggestions, and thoughts on all my blogs!

 

 

 


ShareGate Desktop and PnP to Customize Migration Using CSVs

We have been extensively using migration tools like ShareGate, Metalogix, Dell, and AvePoint for migrating content (specifically SharePoint migrations) for years now.

One need that has evolved over the years is migrating items selectively, based on item IDs logged in a CSV.

Let me share the script I have written, which combines PnP PowerShell and the ShareGate shell.

The prerequisite is a licensed version of ShareGate Desktop.

The script loads the PowerShell module that ShareGate Desktop exposes, so you can drive item migrations from the shell.

Setup instructions before running the script :-

  1. You must have the PnP PowerShell module installed on the server running this script.
  2. Each CSV of list item IDs must be named after its list. For example, to migrate items by ID from a list named "Office", the CSV must be named "Office.csv" (a sample layout follows this list).
  3. The service account used to connect to the source and destination sites must have "Full Control" on both sites.
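For reference, here is what one of these CSVs might look like. The script imports it with a custom header, so the file has no header row, just one item ID per line (the IDs below are made up):

Office.csv:
4
17
256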

#Prerequisites: load the ShareGate module exposed by ShareGate Desktop
Import-Module ShareGate

#Date stamp for the transcript folder ($date was undefined in the original draft)
$date = Get-Date -Format "yyyy-MM-dd"

#Generate a transcript for every action
Start-Transcript -Path "DirectoryPath\$date\Log.txt" -NoClobber -Append

#Source and destination site URLs
$srcsiteUrl = "https://sitecollection/sites/srcSite"
$dstsiteUrl = "https://sitecollection/sites/dstSite"
$password = ConvertTo-SecureString "YourPassword" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ("Domain\ServiceAccount", $password)

#Connect the source site
$srcSite = Connect-Site -Url $srcsiteUrl -Username "Domain\ServiceAccount" -Password $password

#Connect the destination site
$dstSite = Connect-Site -Url $dstsiteUrl -Username "Domain\ServiceAccount" -Password $password

#Connect PnP to the source site once, for the post-migration deletes
Connect-PnPOnline -Url $srcsiteUrl -Credentials $credential

#Get all child CSVs from the parent folder
$csvfilearray = Get-ChildItem -Path "Path where csv's are kept" -Filter *.csv

#Loop through every CSV in the folder
foreach ($csvfile in $csvfilearray)
{
    $name = $csvfile.Name
    Write-Host "Starting migration of $name"

    #The CSV file name (without extension) is the name of the list
    $listName = [System.IO.Path]::GetFileNameWithoutExtension($name)

    #Get the source list
    $srcList = Get-List -Site $srcSite -Name $listName

    #Get the destination list from the destination site
    $dstList = Get-List -Site $dstSite -Name $listName

    #Import the CSV as a single column of IDs (the file has no header row)
    $table = Import-Csv -Path $csvfile.FullName -Delimiter "," -Header Col1 | ForEach-Object { $_.Col1 }

    #Copy each item by ID from the source list to the destination list
    foreach ($ID in $table)
    {
        Copy-Content -SourceList $srcList -DestinationList $dstList -SourceItemId $ID -InsaneMode -OutVariable out

        #Verify the item migrated successfully; if yes, delete it from the source list
        $result = $out.Result

        if ($result -match "Operation completed successfully" -or $result -match "Operation completed with warnings")
        {
            Write-Host "Item $ID migrated successfully"
            Remove-PnPListItem -List $listName -Identity $ID -Force
        }
        else
        {
            Write-Host "Item $ID has not been migrated"
        }
    }
    Write-Host "Completed operation on csv file $name" -ForegroundColor Green
}

Stop-Transcript

——————————————————————————————————————-

Try this out if you have a licensed version of ShareGate, and let me know how the script works out for you.

Open to suggestions, improvements, and thoughts on all my blogs.

Cheers!!!!!

 

 

 

Sharegate Blog | Azure Solutions Architect vs. Azure Administrator: what’s the difference, and which certification is right for you? — Updates for Office 365

https://sharegate.com/blog/azure-solutions-architect-vs-azure-administrator-whats-the-difference-and-which-certification-is-right-for-you


PSConfig terminates with a "Command is incorrect" error - Microsoft.SharePoint.PostSetupConfigurationTaskException thrown

This is an error I have faced multiple times. The SharePoint patch gets installed and you have rebooted the server.

But the moment you run the PSConfig command, it takes a long time to start executing and finally fails, reporting that the command is incorrect.

Here is what usually solves it.

Stop and restart the following services:

  • SharePoint Timer Service (SPTimerV4) from Services.msc
  • World Wide Web Publishing Service (W3SVC) from Services.msc

Then run psconfig again using the following command (a consolidated PowerShell sketch of both steps follows the list):

  • psconfig.exe -cmd upgrade -inplace b2b -wait -force
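If you would rather do the whole sequence from an elevated PowerShell prompt, here is a minimal sketch; the service names (SPTimerV4, W3SVC) are the Windows defaults, and the hive number in the path is an assumption you should adjust for your version:

#Restart the SharePoint Timer Service and the World Wide Web Publishing Service
Restart-Service -Name SPTimerV4 -Force
Restart-Service -Name W3SVC -Force

#Re-run the upgrade (change "16" to "15" for SharePoint 2013)
& "$env:CommonProgramFiles\microsoft shared\Web Server Extensions\16\BIN\psconfig.exe" -cmd upgrade -inplace b2b -wait -force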

This should help you out.

See the blog below for best practices and the step-by-step process for patching a SharePoint farm.

https://vigneshsharepointthoughts.com/2016/11/14/patching-a-sharepoint-2013-farm-step-by-step-installation-guide/

Let me know if you face any issue with the above.

Always open to suggestions, improvements, and thoughts on all my blogs.

Cheers !!!!

 

 

 

PSConfig fails while trying to patch SharePoint – Failed to upgrade Timer job

While running the "psconfig" command after installing SharePoint patches, I encountered an error.

Error :- Failed to configure SharePoint patches. Upgrade Timer Job is exiting due to exception: User does not have permission to perform this action.

Solution :- The error is self-explanatory: the issue is with permissions.

But which permissions are appropriate, and which account needs them? Those are the real questions.

Check the permissions of the service account under which you are running the psconfig command.

For the error above, the issue is the SQL Server role of the service account.

The SQL Server role required for the service account to safely execute the upgrade during psconfig is "sysadmin".

Add the sysadmin server role to the service account executing psconfig and run it again (a scripted version is sketched below).
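If you prefer granting the role from PowerShell instead of SQL Server Management Studio, here is a minimal sketch. It assumes the SqlServer module is installed, and "SQLServerName" and "DOMAIN\svc_sharepoint" are placeholders for your instance and service account:

#Requires the SqlServer PowerShell module (Install-Module SqlServer)
Import-Module SqlServer

#Add the service account to the sysadmin server role
#(assumes the login already exists on the instance)
Invoke-Sqlcmd -ServerInstance "SQLServerName" -Query "ALTER SERVER ROLE [sysadmin] ADD MEMBER [DOMAIN\svc_sharepoint];"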

The above error should vanish.

Let me know in case of any issues with the above.
Open to suggestions, improvements and thoughts for all my blogs.

Cheers!!!!!!

Workflow with this name already exists – Nintex Workflow

I was working with Nintex workflows and trying to publish one when an error suddenly popped up.

Error :- A workflow with this name already exists on this site.

I checked again and again but could not find any Nintex workflow with that name associated with the site, list, or library.

It seems Nintex is a bit cunning, or to put it the other way, a bit smarter than us.

Jokes apart, what Nintex does is keep a hidden library where it temporarily saves your workflows.

So all I had to do was remove the workflow from that hidden library and redeploy it.

The location of that hidden library is:

https://sitecollection/sites/sitename/NINTEXWORKFLOW

Navigate to the above location, delete the existing workflow from there, and redeploy your workflow (a scripted cleanup is sketched below).
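If you would rather script the cleanup, here is a minimal server-side sketch to run from the SharePoint Management Shell. The NINTEXWORKFLOW URL segment matches the location above, but the workflow file name (MyWorkflow.xoml) is a hypothetical placeholder; verify the actual file name in the library before deleting anything:

#Open the web that hosts the hidden Nintex library
$web = Get-SPWeb "https://sitecollection/sites/sitename"

#Resolve the hidden library by the URL segment shown above
$list = $web.GetList($web.ServerRelativeUrl.TrimEnd('/') + "/NINTEXWORKFLOW")

#Find and delete the conflicting workflow file (name is hypothetical)
$item = $list.Items | Where-Object { $_.Name -eq "MyWorkflow.xoml" } | Select-Object -First 1
if ($item) { $item.Delete() }

$web.Dispose()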

It will get published successfully.

Let me know guys in case of any issues here.

Post your comments for improvements, suggestions, and thoughts about my blogs.

Cheers !!!!!!

Not able to connect to SQL Server 2014 - A network-related or instance-specific error occurred while establishing a connection to SQL Server

This happened to me last year while patching a SQL Server box with Windows updates.

Everything went well until the patching completed and I logged in to SQL Server to check that everything was okay, and boom!

I could not find a single database. I got really scared, and it took me some time to understand the real issue.

If you are facing the same issue, no need to worry: your databases are still intact.

It is just that some services SQL Server depends on need to be checked.

Follow the steps below (a scripted version comes after the list) :-

  • Open SQL Server Configuration Manager.
  • Click on Services.
  • Check whether the "SQL Server" service is started.
  • If it is stopped, restart it.
  • Check whether the "SQL Server Browser" service is started.
  • If it is stopped, open Administrative Tools > Services, right-click the "SQL Server Browser" service, and click "Properties".
  • Set the startup type to "Automatic".
  • Then restart it.
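If you want to script these checks, here is a minimal sketch for a default SQL Server instance; the service names (MSSQLSERVER, SQLBrowser) are the defaults, so adjust them for a named instance:

#Check the current state of the SQL Server and SQL Server Browser services
Get-Service -Name MSSQLSERVER, SQLBrowser | Format-Table Name, Status, StartType

#Make sure the browser starts automatically, then start both services
Set-Service -Name SQLBrowser -StartupType Automatic
Start-Service -Name SQLBrowser
Start-Service -Name MSSQLSERVER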

Bang! You will see all the databases in your SQL Server again.

It was just a matter of a few services that are important to SQL Server.

Let me know in the comments section if you are having any issues.

As always, open to suggestions, improvements, and thoughts on all of my blogs.

Cheers!!!!!!