Tuesday, January 23, 2018

Office Web App 2013 Server Error

I ran into an issue the other day, after deploying a new WAC farm, where SharePoint stopped displaying documents in the WOPI frame.
I ran the same New-OfficeWebAppsFarm script that I have been running for the installation/upgrades for years, so at first I thought it was the latest CU.
After installing the upgrade, I ran the following script to get the WAC server version and make sure I was on the right build:
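(A minimal sketch of that check, reading the installed product entry rather than using any farm cmdlet:)
# Report the installed Office Web Apps Server product and build number
Get-WmiObject -Class Win32_Product |
    Where-Object { $_.Name -like "*Office Web Apps*" } |
    Select-Object Name, Version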

However, the version number did not return. From within SharePoint, I received a Server Error message saying, "We're sorry. An error has occurred. We've logged the error for the server administrator." This message appeared in the WOPI frame, but it could also be seen directly on the WAC farm by going to https://wac.domain.com/op/servicebusy.htm
The interesting part was that I was still able to get content back from WOPI discovery (http://wac.domain.com/hosting/discovery).
Looking at the Event Viewer, I was seeing Application Error and .NET Runtime errors for the Watchdog.exe application. At this point, I enabled verbose logging on the server.
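That is a one-liner with the farm cmdlet (double-check the verbosity value against the Set-OfficeWebAppsFarm documentation):
# Turn WAC diagnostic logging up to verbose
Set-OfficeWebAppsFarm -LogVerbosity "Verbose"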

I once again tried to open the servicebusy.htm page, then went through the ULS logs, where I was able to find an Unexpected error:
ServiceInstanceFinderAdapter.FindAllServiceInstances() threw an exception: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
   at Microsoft.Office.Web.Apps.Environment.WacServer.AFarmTopology.GetMachine(String machineName)
   at Microsoft.Office.Web.Apps.Environment.WacServer.WSServiceInstanceFinderAdapter..ctor()
   --- End of inner exception stack trace ---
   at System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean noCheck, Boolean& canBeCached, RuntimeMethodHandleInternal& ctor, Boolean& bNeedSecurityCheck)
   at System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
   at System.RuntimeType.CreateInstanceDefaultCtor(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
   at System.Activator.CreateInstance(Type type, Boolean nonPublic)
   at Microsoft.Office.Web.Common.EnvironmentAdapters.HostEnvironment.LoadAdapterInstance(AdapterLoadInformation adapterInfo, Boolean readAppConfigOnly)
   at Microsoft.Office.Web.Common.EnvironmentAdapters.HostEnvironment.AdapterLoadInformation`1.<>c__DisplayClass17.b__16()
   at System.Lazy`1.CreateValue()
   at System.Lazy`1.LazyInitValue()
   at Microsoft.Office.Web.Common.EnvironmentAdapters.HostEnvironment.get_ServiceInstanceFinderAdapter()
   at Microsoft.Office.Web.Common.ServiceInstanceFinder.RefreshList(Object state)
ServiceInstanceFinder has no data because of an adapter exception, throwing exception to terminate process

I did a quick search and found a forum thread on OOS with the same issue (https://social.technet.microsoft.com/Forums/office/en-US/96dd1ac1-1173-47c3-bdc5-29ac0fd9f722/oos-2016-server-error-were-sorry-an-error-has-occurred-weve-logged-the-error-for-the-server?forum=OfficeOnlineServer). Basically, if the server name is not in all capital letters, it doesn't work. To fix this, I ran a script to update three registry values so that the computer name is in all CAPS.
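The fix looks roughly like this; the three value names below are my best recollection of the ones the thread calls out, so verify them before running, and plan on a reboot afterwards:
# Force the stored computer name values to upper case (run elevated, then reboot)
$name = $env:COMPUTERNAME.ToUpper()
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\ComputerName\ComputerName' -Name 'ComputerName' -Value $name
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters' -Name 'Hostname' -Value $name
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters' -Name 'NV Hostname' -Value $name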

I opened up the servicebusy.htm page again and the expected error was returned:
I then stopped my verbose logging by running the following:
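(Roughly; clearing LogVerbosity puts the farm back at its default logging level.)
# Return WAC diagnostic logging to the default level
Set-OfficeWebAppsFarm -LogVerbosity ""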


Now, when I run the version check script again:

I get the expected results:
At least it is an easy fix...
I also tested this on a base image of WAC plus Service Pack 1, and the same issue persisted.

Friday, December 15, 2017

How to Maintain an Azure Site-to-Site (S2S) Connection with a Dynamic IP Address

This post should not be needed for your production environment. It is for those of us who test and build development environments at home and have created hybrid environments into Azure. If you have a dynamic IP address for your business, please spend the extra bit of money for a static IP...
That being said, I have a dynamic IP address for my house, and when my IP address changes, it used to break my S2S connection with Azure. This post is about how I fixed the problem.
The first step is to create the service account that is going to be logging in to Azure to check and update the IP address. I will be creating an unlicensed user on the .onmicrosoft domain for this purpose.
As the Microsoft Online Services (MSOnline, a.k.a. MSOL) module did not come pre-installed, I ran the following to get started:
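(A sketch; this assumes PowerShell 5 / PowerShellGet is available.)
# Install and load the MSOnline module from the PowerShell Gallery
Install-Module -Name MSOnline
Import-Module MSOnline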


Next, we are going to log in and create the unlicensed service account. You will want to update the UPN and other variables accordingly:
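(The UPN, display name, and password below are placeholders.)
# Log in to Azure AD and create the unlicensed service account
Connect-MsolService
$upn      = "svc-s2s@yourtenant.onmicrosoft.com"
$password = "P@ssw0rd-ChangeMe"
New-MsolUser -UserPrincipalName $upn `
             -DisplayName "S2S Gateway Updater" `
             -Password $password `
             -PasswordNeverExpires $true `
             -ForceChangePassword $false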


Now that we have our service account created (an account that does not have access to our domain, O365, or Azure), it will need to be added to Access Control (IAM) on the Local Network Gateway in Azure.
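From PowerShell, a sketch of that assignment looks like this (the gateway, resource group, and role names are placeholders):
# Grant the service account rights on the Local Network Gateway resource
Login-AzureRmAccount
$gateway = Get-AzureRmLocalNetworkGateway -Name "MyLocalNetworkGateway" -ResourceGroupName "MyResourceGroup"
New-AzureRmRoleAssignment -SignInName "svc-s2s@yourtenant.onmicrosoft.com" `
                          -RoleDefinitionName "Contributor" `
                          -Scope $gateway.Id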


With permissions set on the Local Network Gateway, it is time to compare the IP address stored on the gateway with the current public IP address of the local endpoint. If the two IP addresses do not match, it is time to update your Local Network Gateway (in Azure).
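A sketch of that comparison (the public IP lookup service is an assumption, and Set-AzureRmLocalNetworkGateway parameter support varies a bit between AzureRM versions):
# Compare the gateway's stored endpoint IP with the site's current public IP
$gateway  = Get-AzureRmLocalNetworkGateway -Name "MyLocalNetworkGateway" -ResourceGroupName "MyResourceGroup"
$publicIp = Invoke-RestMethod -Uri "https://api.ipify.org"
if ($gateway.GatewayIpAddress -ne $publicIp) {
    # Update the Local Network Gateway with the new public IP address
    $gateway.GatewayIpAddress = $publicIp
    Set-AzureRmLocalNetworkGateway -LocalNetworkGateway $gateway `
                                   -AddressPrefix $gateway.LocalNetworkAddressSpace.AddressPrefixes
}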

Next we create some logging and logging clean-up:
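(A transcript plus a 30-day retention sweep is all I mean by that; the folder is arbitrary.)
# Start a timestamped transcript and prune logs older than 30 days
$logFolder = "C:\Scripts\Logs"
if (-not (Test-Path $logFolder)) { New-Item -Path $logFolder -ItemType Directory | Out-Null }
Start-Transcript -Path (Join-Path $logFolder ("S2S-Update-{0:yyyyMMdd-HHmmss}.log" -f (Get-Date)))
Get-ChildItem -Path $logFolder -Filter "S2S-Update-*.log" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Force
# Remember to call Stop-Transcript at the end of the script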

And to finish off, we will connect all of our RRAS VpnS2SInterface connections.
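On the RRAS server itself, that part is short; something like:
# Reconnect every site-to-site VPN interface defined in RRAS
Import-Module RemoteAccess
Get-VpnS2SInterface | ForEach-Object { Connect-VpnS2SInterface -Name $_.Name }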

Now let's put the whole thing together. First we create the service account and add their permissions:

Next we create the Update S2S file, and save the file to: 'C:\Scripts\Update S2S and RRAS.ps1'

Now that we are checking and updating our Local Network Gateway Connection IP address, we need to create a timer job that will check and update on a regular basis. Below is a script that will check every hour on the hour. Make sure that the Update S2S file path is set correctly.
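A sketch of that registration (the task name and account are placeholders; the script path matches the one used above):
# Register a task that runs the update script every hour on the hour
$scriptPath = 'C:\Scripts\Update S2S and RRAS.ps1'
$action   = New-ScheduledTaskAction -Execute "powershell.exe" `
                                    -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$scriptPath`""
$nextHour = (Get-Date -Minute 0 -Second 0).AddHours(1)
$trigger  = New-ScheduledTaskTrigger -Once -At $nextHour `
                                     -RepetitionInterval (New-TimeSpan -Hours 1) `
                                     -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName "Update S2S and RRAS" `
                       -Action $action `
                       -Trigger $trigger `
                       -User "SYSTEM" `
                       -RunLevel Highest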

Monday, April 3, 2017

Creating File Shares in Azure using PowerShell

The other day I was helping migrate a client from one cloud provider into Azure when they ran into a problem with their file share server. It made me think about the Azure File Service that is available and how to implement it from a corporate perspective, and then my mind wandered and I wanted to see how to create a file share just for my Surface Pro. There is an excellent post called Get started with Azure File storage on Windows that got me started, but I was not too happy with the implementation: I want my file share to still be available after I reboot my Surface Pro, and I do not want everyone who logs into my Surface to have access to MY file share.
The beginning of the script is basic: creating the Resource Group and the Storage Account. Since it is a file share, I am using geo-redundant storage on HDDs, not SSDs.
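A sketch with placeholder names (Standard_GRS is the geo-redundant, HDD-backed SKU):
# Create the resource group and a geo-redundant (HDD) storage account
Login-AzureRmAccount
$resourceGroup  = "FileShareRG"
$storageAccount = "pcfilesharestorage"
$location       = "eastus"
New-AzureRmResourceGroup -Name $resourceGroup -Location $location
New-AzureRmStorageAccount -ResourceGroupName $resourceGroup `
                          -Name $storageAccount `
                          -Location $location `
                          -SkuName Standard_GRS `
                          -Kind Storage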

Storage within Azure is context based, so to create the Azure File Share we will first need to create a storage context. Once we have the context, we can create the file share.
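Roughly (the share name is a placeholder):
# Build a storage context from the account key, then create the file share
$storageKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount)[0].Value
$context    = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
New-AzureStorageShare -Name "surfaceshare" -Context $context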

Now that we have created the Azure File Share, we want to store the login credentials locally to make mounting the drive easier.
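Windows will cache the share credentials via cmdkey so the mount never prompts; something like:
# Persist the storage account credentials for the file share endpoint
cmdkey /add:"$storageAccount.file.core.windows.net" /user:"AZURE\$storageAccount" /pass:"$storageKey"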

The next step is to mount the file share as a local drive (the X: drive in this example). I tried using the New-PSDrive cmdlet but could not get it to work consistently, so I ended up with the script below. One thing to notice is that in line 9 I am creating a script block from a string variable, instead of creating a script block with parameters and passing in values. I found this to be a very easy way to deal with passing variables into a script block, plus I need the script block in a later part of the script, so it was a win-win.
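The shape of it, with the script-block-from-a-string trick called out (the share name and drive letter are assumptions):
# Build the mount command as a string, then turn it into a script block
$shareName   = "surfaceshare"
$drive       = "X:"
$mountCmd    = "net use $drive \\$storageAccount.file.core.windows.net\$shareName /persistent:yes"
$scriptBlock = [ScriptBlock]::Create($mountCmd)   # script block created from a string variable
& $scriptBlock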

Now at this point I have a mounted X: drive pointing to my Azure File Share, but it is not persisted. To have the drive come back after a reboot, I create a Scheduled Job that runs at logon for the person running this script. Notice in line 15 that I grab the Scheduled Job's path so that I can retrieve the underlying Scheduled Task and update the task's principal user.
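A rough version of that persistence piece (the job name is made up; \Microsoft\Windows\PowerShell\ScheduledJobs is where PowerShell scheduled jobs are registered):
# Re-run the mount script block at every logon, then run the task as the current user
$trigger = New-JobTrigger -AtLogOn -User "$env:USERDOMAIN\$env:USERNAME"
Register-ScheduledJob -Name "MountAzureFileShare" -ScriptBlock $scriptBlock -Trigger $trigger
$taskPath  = "\Microsoft\Windows\PowerShell\ScheduledJobs\"
$principal = New-ScheduledTaskPrincipal -UserId "$env:USERDOMAIN\$env:USERNAME" -LogonType Interactive
Set-ScheduledTask -TaskName "MountAzureFileShare" -TaskPath $taskPath -Principal $principal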
All in all, a very cool and quick way to add 5TB of storage to my Surface Pro.
I could not think of many other reasons why an enterprise would want a file share in Azure (from a corporate perspective) with the availability of SharePoint Online and OneDrive for Business, but then a client asked me how to send me their bloated SQL database... You would send your client something like this:
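(What follows is one way to do it, reusing the cmdkey / net use pattern from above with a share created just for the client; it is a sketch, not the exact note I sent.)
# Create a share for the client and hand them the commands to mount it
New-AzureStorageShare -Name "clientupload" -Context $context
@"
cmdkey /add:$storageAccount.file.core.windows.net /user:AZURE\$storageAccount /pass:<storage account key>
net use Z: \\$storageAccount.file.core.windows.net\clientupload
"@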

Once they upload the file, you would then create a new key for the storage account.
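Rotating the key is a one-liner, which immediately invalidates the credentials you handed out:
# Regenerate key1 so the shared credentials stop working
New-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount -KeyName key1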
How would/are you using Azure File Service?
Here is the code in its entirety:

Monday, June 20, 2016

Download All Files in a Yammer Group!

I was lucky enough to take a week-long Azure Cloud Solution Architect training class hosted by Microsoft. Unluckily, they uploaded all the documents for the class into Yammer, approximately 80 files. Now, I could just go to each file and individually download it, but where is the fun in that? So I did a quick search and found a couple of blog posts.
I found a GitHub post Download all files in a Yammer.com group and a blog post from Sahil Malik (https://twitter.com/sahilmalik) Download Multiple Files from Yammer - easily.
The first problem that I ran into was that neither script worked any longer due to the URL structure change, as Sahil predicted. The second problem was that the code opened up a new tab for each file downloaded. I figured there had to be a better way.
Now, I am fortunate enough to be able to reach out to someone I consider to be one of the best JavaScript developers around, Matthew Bramer (https://twitter.com/iOnline247) for a bit of help.
In this post, I am going to show you a couple of ways to download your files. First is the full length developer version, while the other is a quick and easily repeatable version.
We decided to test this in Chrome only. The developer version should work in Edge, but was only tested in Chrome. If you want to complain that you cannot get it to work, TRY CHROME FIRST!

Do This First
1) Within Chrome, open up Settings, and select Show advanced settings...
2) Under Downloads, set a download location and make sure that the Ask where to save each file before downloading check box is NOT selected.

Full Length Developer Version
1) Open up Yammer, and go to the files location.
     a) Make sure that you scroll down and click the More button to show all of the files.
2) Hit F12 to open the Developer Tools
3) Under Sources, select Snippets.
4) Insert the following code into the Script snippet window:
5) Click the run snippet button (Ctrl + Enter) to start your downloads

Easily Repeatable Version
1) Within Chrome, open the Bookmark Manager (Ctrl + Shift + O)
2) Under Folders, select (or create) the appropriate folder
3) Under Organize, click the Organize drop-down and select Add page...
4) Give the Page an appropriate name like, Download All Yammer Files on Page
5) For the URL, paste the following code.
6) Open up Yammer, and go to the files location.
     a) Make sure that you scroll down and click the More button to show all of the files.
7) Open the bookmark that you just created to start downloading all the files.

Thank you Matthew for helping me get this up and running in time for Ignite...

Thursday, January 28, 2016

Backup and Restore SQL User Databases Using PowerShell

There are several ways to back up and restore SQL databases. Over time, the way I back up and restore databases has changed.
Originally my backup database code looked like this:
The problem is that you need SQL Server Management Studio (SSMS) installed for the code to run correctly. Having SSMS installed on a production server is not the best of ideas, so luckily PowerShell gives us the ability to back up all user databases very easily:
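(A sketch of that approach; the backup folder is a placeholder.)
# Back up every user database on the local default instance
Import-Module SQLPS -DisableNameChecking
$server     = $env:COMPUTERNAME
$backupPath = "E:\Backups"
Get-ChildItem "SQLSERVER:\SQL\$server\DEFAULT\Databases" | ForEach-Object {
    $backupFile = Join-Path $backupPath ("{0}_{1:yyyyMMdd}.bak" -f $_.Name, (Get-Date))
    Backup-SqlDatabase -ServerInstance $server -Database $_.Name -BackupFile $backupFile
}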
Now that we have our databases backed up, let's take a look at the old way that I used to restore databases:
Below is the newer, strictly-PowerShell way that I use to restore the databases we just backed up:
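(Again a sketch; -ReplaceDatabase overwrites the existing database, so use it deliberately.)
# Restore each .bak in the backup folder over the database of the same name
Import-Module SQLPS -DisableNameChecking
$server     = $env:COMPUTERNAME
$backupPath = "E:\Backups"
Get-ChildItem -Path $backupPath -Filter "*.bak" | ForEach-Object {
    $dbName = ($_.BaseName -split "_")[0]   # strip the date suffix added at backup time (assumes no underscore in the database name)
    Restore-SqlDatabase -ServerInstance $server -Database $dbName -BackupFile $_.FullName -ReplaceDatabase
}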
Hopefully this will give you a couple of good solutions for backing up and restoring your SQL Server Databases through PowerShell.
-PC

Wednesday, December 30, 2015

Moving User SQL Databases Using PowerShell

I grew tired of manually moving databases around using a combination of SQL and "Copy / Paste," so I wrote out a bit of PowerShell to save me some time and effort.
Notice that I am using Copy-Item and then deleting the original object, rather than just moving the item. This is because of how permissions on objects are handled with copy vs. move, plus I am paranoid about not having my original database handy if the move fails or the moved database gets corrupted in transit.

Let's take a look at the code

In the first section we will be setting the variables.
The next step is to get the database information:
Once we have the database information, the database will need to be taken OFFLINE.
Once the database is offline, we can copy the files, set ACLs, and update the database with the new .mdf and .ldf file locations.
Then we can bring the DB back ONLINE.
Once the DB is back ONLINE, wait for 10 seconds and delete the original database.
Here is a look at the code once it is all put together:
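(The sketch below is a simplified version of that flow rather than the full script; the database name and destination path are placeholders.)
# Move a user database's files: offline, copy, ACL, re-point, online, clean up
Import-Module SQLPS -DisableNameChecking
$server      = $env:COMPUTERNAME
$dbName      = "MyDatabase"
$destination = "E:\SQLData"
# Get the current physical file locations for the database
$files = Invoke-Sqlcmd -ServerInstance $server -Query "SELECT name, physical_name FROM sys.master_files WHERE database_id = DB_ID('$dbName')"
# Take the database OFFLINE
Invoke-Sqlcmd -ServerInstance $server -Query "ALTER DATABASE [$dbName] SET OFFLINE WITH ROLLBACK IMMEDIATE"
foreach ($file in $files) {
    $newPath = Join-Path $destination (Split-Path $file.physical_name -Leaf)
    # Copy (not move) the file, then mirror the ACL from the original
    Copy-Item -Path $file.physical_name -Destination $newPath
    Get-Acl -Path $file.physical_name | Set-Acl -Path $newPath
    # Point the database at the new file location
    Invoke-Sqlcmd -ServerInstance $server -Query "ALTER DATABASE [$dbName] MODIFY FILE (NAME = N'$($file.name)', FILENAME = N'$newPath')"
}
# Bring the database back ONLINE
Invoke-Sqlcmd -ServerInstance $server -Query "ALTER DATABASE [$dbName] SET ONLINE"
# Wait, then remove the original files
Start-Sleep -Seconds 10
foreach ($file in $files) { Remove-Item -Path $file.physical_name -Force }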
Updates
01/01/2016: Fixed issues with ACL for moved files by converting the file location to a UNC-based path format.
01/02/2016: Updated to include snippets and comments
01/05/2016: Major update to fix $destination to a UNC path, added copyItem function, item extension switch, updated outputs with Write-Output and Write-Verbose.

Monday, August 24, 2015

Manually Download and Install the Prerequisites for SharePoint 2016

At some point within your career of deploying SharePoint, you will hopefully come across a scenario where your SharePoint servers are not allowed Internet access. Most of the server farms that I work on are not allowed to access the Internet, or are firewalled out of the ability to browse the web or download items directly. This brings up the need to download the required files to a specific location and point SharePoint's PrerequisiteInstaller.exe at that location to complete its installation.
If you need to do an offline installation of SharePoint 2016, you will need to have the prerequisite files downloaded ahead of time. You will also need the SharePoint 2016 .iso (download here). These scripts are based off of the scripts provided by Craig Lussier (@craiglussier).
These scripts will work for SharePoint 2016 on Windows Server 2012R2 or on Windows Server 2016.
The only change that needs to be made is the location of the SharePoint prerequisiteinstaller.exe file in script #3, line #3, so update the $sp2016Location variable before running. If the SP2016 .iso is mounted to the "D:\" drive, you have nothing to change and, in theory, this should just work for you out of the box.
The first thing that I like to do is add the Windows features:
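(The feature list below is a representative subset, not the complete set the prerequisite installer expects; -Source points at the Windows Server media's sources\sxs folder for a fully offline install, and that path is an assumption.)
# Install the Windows features SharePoint needs, using local media as the source
Import-Module ServerManager
$features = @(
    "NET-Framework-Features",
    "NET-Framework-45-ASPNET",
    "Web-Server",
    "Web-Windows-Auth",
    "Windows-Identity-Foundation"
)
Install-WindowsFeature -Name $features -IncludeManagementTools -Source "E:\sources\sxs"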
The second step is to download the items required for the prerequisite installer:
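(Only the download pattern is shown here, with a couple of sample entries; fill in the full name-to-URL list from Craig's scripts.)
# Download each prerequisite into a local folder for offline use
$downloadFolder = "C:\SP2016-Prereqs"
if (-not (Test-Path $downloadFolder)) { New-Item -Path $downloadFolder -ItemType Directory | Out-Null }
$prereqs = @{
    "sqlncli.msi"                         = "https://download.microsoft.com/..."   # SQL Server Native Client (sample entry)
    "WindowsServerAppFabricSetup_x64.exe" = "https://download.microsoft.com/..."   # AppFabric 1.1 (sample entry)
}
foreach ($prereq in $prereqs.GetEnumerator()) {
    Start-BitsTransfer -Source $prereq.Value -Destination (Join-Path $downloadFolder $prereq.Key)
}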
The next step is to run the prerequisite installer. There is a requirement to restart the server during the installation and provisioning of settings:
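(A partial example only; verify the full switch list for your build with PrerequisiteInstaller.exe /? before relying on it.)
# Run the SharePoint 2016 prerequisite installer against the local files (partial switch list)
$sp2016Location = "D:"
$downloadFolder = "C:\SP2016-Prereqs"
& "$sp2016Location\prerequisiteinstaller.exe" `
    /SQLNCli:"$downloadFolder\sqlncli.msi" `
    /IDFX11:"$downloadFolder\MicrosoftIdentityExtensions-64.msi" `
    /AppFabric:"$downloadFolder\WindowsServerAppFabricSetup_x64.exe"
# The installer prompts for a restart part-way through; re-run it afterwards to finish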
The final step is to continue the installation of the prerequisites:
I hope that this saves you some time and headaches trying to get SP2016 installed and running correctly.

Updates

08/24/2015 Added ability to install on either Technical Preview for Server 2016 or Server 2012R2
08/25/2015 Added verbiage on updating $sp2016Location variable, and added workaround for PowerShell bug in download script.
11/25/2015 Updated for installation of SharePoint 2016 Beta 2 bits.
11/14/2017 Updated the scripts to install SharePoint 2016 on either Server 2012R2 or Server 2016.
11/29/2017 Updated the scripts to install SharePoint 2016 on Server 2016 Standard or Datacenter.
Thank you Matthew Bramer for fixing this oversight.

Monday, July 20, 2015

Copying BLOBs Between Azure Storage Containers

In the past, when I needed to move BLOBs between Azure containers, I would use a script that I put together based on Michael Washam's blog post Copying VHDs (Blobs) between Storage Accounts. However, with my latest project, I actually needed to move several BLOBs from the Azure Commercial Cloud to the Azure Government Cloud.
Right off the bat, the first problem was that the endpoint for the Government Cloud is not the default endpoint used by the PowerShell cmdlets. After spending some time updating my script to work in either Commercial or Government Azure, I was still not able to move anything, so after a bit of "this worked before, why are you not working now?" frustration, it was time for Plan B.

Plan B

Luckily, the Windows Azure Storage Team had put together a command-line utility called AzCopy. AzCopy is a very powerful tool, as it will allow you to copy items from a machine on your local network into Azure, and it will also copy items from one Azure tenant to another. The problem that I ran into is that the copy is synchronous: it copies one item at a time, and you cannot start another copy until the previous operation has finished. I also ran AzCopy from ISE rather than directly from the command line, which was not as nice. In the AzCopy command-line utility, a status is displayed letting you know the elapsed time and when the copy has completed; in ISE, you only know your BLOB is copied when the script has finished running. You can read up on and download AzCopy from Getting Started with the AzCopy Command-Line Utility. This is the script that I used to move BLOBs between the Azure Commercial tenant and the Azure Government tenant.
$sourceContainer = "https://commercialsharepoint.blob.core.windows.net/images"
$sourceKey = "insert your key here"
$destinationContainer = "https://governmentsharepoint.blob.core.usgovcloudapi.net/images"
$destinationKey = "insert your key here"
$file1 = "Server2012R2-Standard-OWA.vhd"
$file2 = "Server2012R2-Standard-SP2013.vhd"
$file3 = "Server2012R2-Standard-SQL2014-Enterprise.vhd"
$files = @($file1,$file2,$file3)
function copyFiles {
    # Copy only the files listed in $files, one pattern at a time
    foreach ($file in $files) {
        & 'C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe' /Source:$sourceContainer /Dest:$destinationContainer /SourceKey:$sourceKey /DestKey:$destinationKey /Pattern:$file 
    }
}
function copyAllFiles {
    # Copy everything in the source container in a single AzCopy call
    & 'C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe' /Source:$sourceContainer /Dest:$destinationContainer /SourceKey:$sourceKey /DestKey:$destinationKey /S
}
# Un-comment the call you want to run
# copyFiles
# copyAllFiles
While I was waiting for my BLOBs to copy over, I decided to look back at my Plan A and see if I could figure out my issue(s).

Plan A

After cleaning up my script and taking a bit of a "Type-A personality" look at it, I noticed that I was grabbing the Azure Container object but not grabbing the BLOB object before copying the item. Once I piped the container to the BLOB before copying, it all worked as expected. Below is my script; please notice that on the Start-AzureStorageBlobCopy cmdlet, I am using the -Force parameter to overwrite the destination BLOB if it exists.
# Source Storage Information
$srcStorageAccount = "commercialsharepoint"
$srcContainer = "images"
$srcStorageKey = "insert your key here"
$srcEndpoint = "core.windows.net"
# Destination Storage Information
$destStorageAccount  = "governmentsharepoint"  
$destContainer = "images"
$destStorageKey = "insert your key here"
$destEndpoint = "core.usgovcloudapi.net" 
# Individual File Names (if required)
$file1 = "Server2012R2-Standard-OWA.vhd"
$file2 = "Server2012R2-Standard-SP2013.vhd"
$file3 = "Server2012R2-Standard-SQL2014-Enterprise.vhd"
# Create file name array
$files = @($file1, $file2, $file3)
# Create blobs array
$blobStatus = @()
### Create the source storage account context ### 
$srcContext = New-AzureStorageContext   -StorageAccountName $srcStorageAccount `
                                        -StorageAccountKey $srcStorageKey `
                                        -Endpoint $srcEndpoint 
### Create the destination storage account context ### 
$destContext = New-AzureStorageContext  -StorageAccountName $destStorageAccount `
                                        -StorageAccountKey $destStorageKey `
                                        -Endpoint $destEndpoint
#region Copy Specific Files in Container
    function copyFiles {
        $i = 0
        foreach ($file in $files) {
            $files[$i] = Get-AzureStorageContainer -Name $srcContainer -Context $srcContext | 
                         Get-AzureStorageBlob -Blob $file | 
                         Start-AzureStorageBlobCopy -DestContainer $destContainer -DestContext $destContext -DestBlob $file -ConcurrentTaskCount 512 -Force 
            $i++ 
        }  
        getBlobStatus -blobs $files     
    }
#endregion
#region Copy All Files in Container
    function copyAllFiles {
        $blobs = Get-AzureStorageContainer -Name $srcContainer -Context $srcContext | Get-AzureStorageBlob
        $i = 0
        foreach ($blob in $blobs) {
            # Keep the same name at the destination (change $destBlobName here to rename)
            $destBlobName = $blob.Name
            $blobs[$i] =  Get-AzureStorageContainer -Name $srcContainer -Context $srcContext | 
                          Get-AzureStorageBlob -Blob $blob.Name | 
                          Start-AzureStorageBlobCopy -DestContainer $destContainer -DestContext $destContext -DestBlob $destBlobName -ConcurrentTaskCount 512 -Force
            $i ++
        }  
        getBlobStatus -blobs $blobs 
    }
#endregion
#region Get Blob Copy Status
    function getBlobStatus($blobs) {
        $completed = $false
        While ($completed -ne $true) {
            # Count how many blobs have not finished copying yet
            $counter = 0
            foreach ($blob in $blobs) {
                $status = $blob | Get-AzureStorageBlobCopyState
                Write-Host($blob.Name + " has a status of: " + $status.Status)
                if ($status.Status -ne "Success") {
                    $counter ++
                }
            }
            if ($counter -eq 0) {
                $completed = $true
            }
            ELSE {
                Write-Host("Waiting 30 seconds...")
                Start-Sleep -Seconds 30
            }
        }
    }
#endregion
# copyFiles
# copyAllFiles

Conclusion

Having more than one way to get something accomplished within Azure is fantastic. There is not a lot of documentation out there on how to work with Azure and PowerShell within the Government Cloud, so hopefully this will make life easier for someone. Remember that these scripts can be used across any tenant: Commercial, Government, and on-premises.

Updates

08/05/2015 Fixed cut and paste variable issues and added $destBlobName for renaming BLOBs at the destination location, and updated BLOB status check wait time.

Thursday, July 2, 2015

Provisioning SQL Server Always-On Without Rights

Separation of roles, duties, and responsibilities in a larger corporate/government environment is a good thing. It is a good thing, that is, unless you are actually trying to get something accomplished quickly on your own. But that is why the separation exists: so that one person cannot simply add objects to Active Directory on a whim, or play with the F5 because they watched a video on YouTube. I recently designed a solution that was going to take advantage of SQL Server 2012 AlwaysOn high availability and disaster recovery, including Availability Group Listeners. The problem was that I was not a domain admin and did not have rights to create a computer object for the Windows OS cluster or the SQL Availability Group Listener.

Creating the OS Cluster

Creating the OS cluster was the easy part; I just needed to find an administrator who had the rights to create a computer object in the domain. Once that was accomplished, I made sure the user had local admin rights on all of the soon-to-be clustered machines and had them run the following script:
$node1 = "Node-01.contoso.local"
$node2 = "Node-02.contoso.local"
$osClusterName = "THESPSQLCLUSTER"
$osClusterIP = "192.168.1.11"
# $ignoreAddress = "172.20.0.0/21"
$nodes = ($node1, $node2)
Import-Module FailoverClusters
function testCluster {
    # Test Cluster
    $test = Test-Cluster -Node $nodes
    $testPath = Join-Path $env:TEMP $test.Name.ToString()
    # View Report
    $IE=new-object -com internetexplorer.application
    $IE.navigate2($testPath)
    $IE.visible=$true
}
function buildCluster {
    # Build Cluster
    $new = New-Cluster -Name $osClusterName -Node $nodes -StaticAddress $osClusterIP -NoStorage # -IgnoreNetwork $ignoreAddress
    Get-Cluster | Select *
    # View Report
    $newPath = "C:\Windows\cluster\Reports\" + $new.Name.ToString()
    $IE=new-object -com internetexplorer.application
    $IE.navigate2($newPath)
    $IE.visible=$true
}
# Un-comment what you want to do...
# testCluster
buildCluster

Creating the Group Listener

Creating the Group Listener was a bit more challenging, but not too bad. Once the OS cluster computer object was created (thespsqlcluster.contoso.local), the newly created computer object needed to be given rights as well:
- The cluster identity 'thespsqlcluster' needs Create Computer Objects permissions. By default all computer objects are created in the same container as the cluster identity 'thespsqlcluster'.
- If there is an existing computer object, verify the Cluster Identity 'thespsqlcluster' has 'Full Control' permission to that computer object using the Active Directory Users and Computers tool.
You will also want to make sure that the quota for computer objects for 'thespsqlcluster' has not been reached.
The domain administrator was also given Sysadmin rights to all of the SQL Server instances in the cluster.
After all the permissions were set, the Domain admin could run the following script on the Primary SQL Instance to create the Group Listener:


Import-Module ServerManager -EA 0
Import-Module SQLPS -DisableNameChecking -EA 0
$listenerName = "LSN-TheSPDatabases"
$server = $env:COMPUTERNAME
$path = "SQLSERVER:\sql\$server\default\availabilitygroups\"
$groups = Get-ChildItem -Path $path
$groupPath = $path + $groups[0].Name
$groupPath
New-SqlAvailabilityGroupListener `
    -Name $listenerName `
    -StaticIp "192.168.1.12/255.255.255.0" `
    -Port "1433" `
    -Path $groupPath 

Important

After the group listener is created, all the rights that were put in place can once again be removed, with the understanding that if you wish to add another listener at a later time, the permissions will have to be temporarily reinstated. In my case, once all of the computer objects were created successfully, all rights were removed from the cluster computer object and the domain administrator was removed from SQL.

Updates

07/06/2015 Cleaned up diction and grammar, added the Important section.
10/21/2015 Updated computer object permission requirements

Sunday, June 28, 2015

SharePoint and FIPS Exceptions

A couple of weeks ago, I started a "Greenfield" implementation of SharePoint 2013 for a client. This organization has SharePoint 2003, 2007, 2010 already existing in their environment, so I ignorantly figured that the installation should go pretty smoothly.
All of the SharePoint and SQL bits installed correctly, however when trying to provision Central Administration, I ran into an issue where I was not able to create the config database:

What is FIPS?

FIPS stands for Federal Information Processing Standards, which are used for the standardization of information, such as FIPS 10-4 for country codes or FIPS 5-2 for state codes. However, my problem is with FIPS 140-2, the Security Requirements for Cryptographic Modules, which states:
This Federal Information Processing Standard (140-2) specifies the security requirements that will be satisfied by a cryptographic module, providing four increasing, qualitative levels intended to cover a wide range of potential applications and environments. The areas covered, related to the secure design and implementation of a cryptographic module, include specification; ports and interfaces; roles, services, and authentication; finite state model; physical security; operational environment; cryptographic key management; electromagnetic interference/electromagnetic compatibility (EMI/EMC); self-tests; design assurance; and mitigation of other attacks. [Supersedes FIPS 140-1 (January 11, 1994): http://www.nist.gov/manuscript-publication-search.cfm?pub_id=917970]
In essence, FIPS 140-2 is a standard that can be tested against and certified so that the server is hardened up to a government standard, and the US is not the only government that uses the FIPS standard for server hardening. The FIPS Local/Group Security Policy flag can be found under Local Security Policy > Local Policies > Security Options, as the setting "System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing".

FIPS and SharePoint

There are a couple of problems with using SharePoint on a FIPS-enabled server. SharePoint Server uses MD5 for computing hash values (not for security purposes), which is an unapproved algorithm. According to Microsoft (https://technet.microsoft.com/en-us/library/cc750357.aspx), the Schannel Security Package is forced to negotiate sessions using TLS 1.0, and the following supported cipher suites are disabled:

  • TLS_RSA_WITH_RC4_128_SHA
  • TLS_RSA_WITH_RC4_128_MD5
  • SSL_CK_RC4_128_WITH_MD5
  • SSL_CK_DES_192_EDE3_CBC_WITH_MD5
  • TLS_RSA_WITH_NULL_MD5
  • TLS_RSA_WITH_NULL_SHA
If you want to read up more, here are some good posts:

What's Next?

Disabling FIPS is easy, however a larger discussion needs to be had. Is FIPS set at the GPO level or is it part of the image that was provisioned and FIPS was enabled by default? Will the security team come after you if you disable it without their knowledge? Why do they have FIPS enabled, and what are they trying to accomplish with FIPS? All of these questions will need to be answered before changing your server settings.

Fixing FIPS with PowerShell

This is how I reset the FIPS Algorithm Policy so that I could get Central Administration provisioned. Remember that FIPS will need to be disabled on all of your SharePoint Servers.

$sets = @("CurrentControlSet","ControlSet001","ControlSet002")
foreach ($set in $sets) {
    $path = "HKLM:\SYSTEM\$set\Control\LSA\FipsAlgorithmPolicy"
    if ((Get-ItemProperty -Path $path).Enabled -ne 0) {
        Set-ItemProperty -Path $path -Name "Enabled" -Value "0"
        Write-Host("Set $path Enabled to 0")
    }
}