Cleanup user information list and remove users from Site

Recently one of our analysts asked whether we could clean up the profiles in the User Information List of a site collection, as he wanted to create a template from it. This is an interesting topic: not only does it matter for templates, but this is also an extranet (internet-facing) site, and it is good practice to minimise the user information exposed externally. On most extranet site collections, if this is not taken care of, the details can be queried by pointing directly at the following URL:

http://<site_collection_url>/_layouts/userdisp.aspx?Force=True&ID=20  (change the ID number to enumerate groups and users and their personal details)

Coming back to the point: to delete user profiles in a site collection and also clean up the user information list, check the PowerShell below.

Note: this also deletes the users and their permissions on the site collection.

$sc = Read-Host "Enter Site Collection URL"
$site = Get-SPSite $sc
$users = Get-SPUser -Web $site.RootWeb -Limit All
foreach ($user in $users) {
    Remove-SPUser -Identity $user.UserLogin -Web $site.RootWeb -Confirm:$false -ErrorAction SilentlyContinue
}
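Before running the removal, it may be worth previewing which accounts would be affected. A read-only sketch using the same cmdlets (it makes no changes, and assumes you are on a farm server with the SharePoint 2010 snap-in available):

```powershell
# Read-only preview: list the accounts the removal script would act on.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$sc = Read-Host "Enter Site Collection URL"
$site = Get-SPSite $sc
Get-SPUser -Web $site.RootWeb -Limit All |
    Select-Object UserLogin, DisplayName, IsSiteAdmin |
    Format-Table -AutoSize
$site.Dispose()
```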

Below is another approach, which filters and cleans up the user information list directly. Note that it removes groups as well unless you filter by the item (e.g. on item.ID) before passing it to the remove call.

# Note: take a backup of your site collection before attempting this script.
# Note: this script can strip all the users and their permissions on the site.
$sc = Read-Host "Enter Site Collection URL"
$web = Get-SPWeb $sc
$list = $web.SiteUserInfoList
# GetItems() returns both users and groups, so be careful which items you pass to remove.
$items = $list.GetItems()

foreach ($item in $items) {
    # RemoveByID is a .NET method, so -ErrorAction does not apply; use try/catch instead.
    # Filter here (e.g. on $item.ID) if you want to keep groups.
    try {
        $web.SiteUsers.RemoveByID($item.ID)
    }
    catch {
        Write-Warning ("Could not remove item {0}: {1}" -f $item.ID, $_.Exception.Message)
    }
}

This is a simple post to achieve what I was looking for, but there is potential to corrupt your site collections, so I took care to test it in my DEV farm first. This has been tested on SharePoint 2010 SP1 only.

The below posts can help you further on this topic.



Route SharePoint Email from non-prod farms to mail enabled group or SP list.

Why do we need to route emails coming from SharePoint DEV and TEST farms to a mail-enabled group or SP list? This covers both Nintex and SharePoint alert mails.

Where this can be useful:

  • When carrying out testing of SharePoint alerts or Nintex workflows, this email routing helps confirm delivery of emails to users.
  • When we move site collections from PROD to non-prod farms, accidental notifications to users will not occur.
  • Preventing emails sent mistakenly by a workflow or SharePoint alert from reaching a larger audience.
  • Gauging which users are active on non-prod site collections, so we can notify them when we plan changes to the DEV and TEST SharePoint environments.

Issues this may cause:

  • Any power user or user trying to test something in non-prod will not receive emails; they will have to contact SharePoint Support to confirm delivery, or SharePoint Support can forward the email to them.
  • You need to provide the “from address” used with the SMTP server to the Exchange admin, who can set up a rule routing these emails to a newly created mail-enabled group; also specify which group/list should be a member of that mail-enabled group to receive the emails.

From Address          | Mail-enabled group | Members of the group
SharePoint Dev Alerts | DEV Nintex:        | TST Farm:
SharePoint Test Alerts| TST Nintex:        |

“Alert Me” Missing in the SharePoint Ribbon

This is a strange issue: the “Alert Me” button was missing from the SharePoint ribbon, but only for a few site collections and for one complete web application. Check the following paths in Central Administration.


  • Central Admin –> System Settings –> E-Mail and Text Messages –> Configure outgoing e-mail settings
  • Also make sure the web application outgoing email settings are filled as well.


But in this case some web applications were still showing the “Alert Me” button, and I vaguely remembered something about an STSADM property: I had previously suggested to the customer that alerts be disabled on any databases restored from PROD to TEST, to stop those site collections from sending alerts to users.

There is an STSADM property which is enabled by default; we can set it to false for each site collection or for a complete web application.

#To disable alerts, use the following syntax:
stsadm -o setproperty -url http://server_name -pn alerts-enabled -pv false

#To view the setting for the alerts-enabled property, use the following syntax:

stsadm -o getproperty -url http://server_name -pn alerts-enabled
PS C:\> stsadm -o getproperty -url -pn alerts-enabled

<Property Exist="Yes" Value="no" />

PS C:\> stsadm -o setproperty -url -pn alerts-enabled -pv true

Operation completed successfully.
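If many site collections need alerts disabled (for example, after a PROD-to-TEST restore), the stsadm call can be wrapped in a PowerShell loop. A sketch, assuming stsadm.exe is on the PATH and the SharePoint 2010 snap-in is available; the web application URL is a placeholder:

```powershell
# Sketch: disable alerts on every site collection in one web application.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$webApp = Get-SPWebApplication "http://server_name"
foreach ($site in $webApp.Sites) {
    # Same stsadm property as above, applied per site collection.
    stsadm -o setproperty -url $site.Url -pn alerts-enabled -pv false
    $site.Dispose()
}
```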

The following documentation will be helpful to start on this: Here

At the time of writing, these details apply to SharePoint 2010.


tags: SharePoint 2010, powershell, SharePoint Alerts, STSADM

SharePoint – Workflow Timer Job Paused

On a routine daily check of Central Administration we found the workflow timer job in a “Paused” state, with multiple instances, one per server in the farm, all paused. Many blog posts suggest that restarting the timer service on each server and clearing the timer cache (configuration cache) resolves the issue.

After the cache was cleared, the jobs ran for a couple of instances and then paused again. The reason was a workflow modified into a “dead end” state, which was a mistake by the user. The workflow timer job on each server became stuck processing items that had reached this dead end, and at that point all workflows with pauses or state changes ceased to be processed.

Under normal operations, a workflow is first executed in the World Wide Web Publishing Service (w3wp.exe). After a delay or pause, in which the workflow “sleeps,” it wakes up and is then run by the SharePoint Timer Service (the owstimer.exe process).

When w3wp.exe is under excessive load, it postpones the workflow. The Timer Service is then used to continue the process. SharePoint dictates which Workflow Timer service will run the workflow; the operation can occur on any server running the “Microsoft SharePoint Foundation Workflow Timer Service”. However, when Nintex Workflow is installed, it only deploys the DLLs required to run workflows to WFE servers, specifically those running the “Microsoft SharePoint Foundation Web Application” service.

The issue was discovered by enabling verbose logging both in CA and in the Nintex global settings, and by querying for the workflow name and its site collections using Nintex's NWAdmin.exe.

After the workflow was corrected and the instances started with the broken workflow were cleared, restarting the timer services on the servers and clearing the cache again resolved the issue.

This issue was observed on SharePoint 2010 with Nintex Workflow 2010.



Tags: nintex workflow, State changes error, workflow timer job paused, sharepoint timer service restart.

Automate monitoring SharePoint and Windows Services

As a SharePoint admin, the most important part of the role is monitoring servers and maintaining SLAs; I doubt there are second thoughts on this. As the environment grows to multiple farms and servers in different locations, automation becomes necessary, mostly of routine tasks, to save time, and here I attempt to show how it saved me some: automatically monitoring and starting SharePoint services on farm servers. You can script a check of these services at every restart, and if you are unlucky enough that some services keep stopping or are unstable, you can schedule the script to check the services, attempt to start them, and email you the status.

So, we are looking at the script to:

  1. Check the services at startup or on scheduled time.
  2. Attempt to start the services if they are stopped
  3. Send an email with the status of the services from each server.

What you need to know before trying this: an SMTP server or mailhost server (which has to allow unauthenticated SMTP).

I had this script pieced together for my requirements and scheduled it to run on each reboot. To call it, create two files in the same path: a batch file that invokes the PowerShell script.

  1. servicescheck.bat
@echo off
PUSHD "%~dp0"
powershell -file "servicescheck.ps1" < NUL
  2. servicescheck.ps1
param(
    [Parameter(Mandatory=$false, HelpMessage='-ServiceNames Optional, provide a set of service names to restart.')]
    [Array]$ServiceNames = @("SharePoint 2010 Tracing","Simple Mail Transfer Protocol (SMTP)","SharePoint 2010 Timer","SharePoint 2010 Administration","IIS Admin Service","World Wide Web Publishing Service","Net.Tcp Listener Adapter","Net.Pipe Listener Adapter")
)

$server = hostname
$emailbody = ""

Write-Host "Attempting to start services on" $server -ForegroundColor White
foreach ($serviceName in $ServiceNames) {
    $serviceInstance = Get-Service -DisplayName $serviceName -ErrorAction SilentlyContinue
    if (($serviceInstance -ne $null) -and ($serviceInstance.Status -eq "Stopped")) {
        # Write-Host "Attempting to start service" $serviceName ".." -ForegroundColor White -NoNewline
        try {
            Start-Service -InputObject $serviceInstance
        }
        catch {
            Write-Output "Error Occurred:" $_.Exception.Message
        }
    }
}

# Capture the current status of each service as HTML table fragments.
$emailbody = foreach ($serviceName in $ServiceNames) {
    Get-Service $serviceName -ErrorAction SilentlyContinue |
        Select-Object Status, DisplayName | ConvertTo-Html -Fragment
}

$users = "Venu Madhav <>"
$fromemail = "$"
$smtpserver = ""

# Assemble the HTML for the body of the email report.
$HTMLmessage = @"
$emailbody
"@

Send-MailMessage -From $fromemail -To $users -Subject "$server Rebooted - Services Status" -BodyAsHTML -Body $HTMLmessage -Priority High -SmtpServer $smtpserver
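To run the batch file at each reboot, as described above, Task Scheduler can be used. A sketch (run from an elevated prompt; the path and task name are placeholders):

```shell
:: Sketch: register servicescheck.bat to run at system startup as SYSTEM.
:: The path C:\scripts\servicescheck.bat and the task name are placeholders.
schtasks /create /tn "ServicesCheck" /tr "C:\scripts\servicescheck.bat" /sc onstart /ru SYSTEM
```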










Tags: Monitor Services, Automate Service Monitoring, PowerShell, SharePoint, Automate monitoring SharePoint and Windows Services

Disable windows update in Windows 10

I was looking for a way to disable Windows Update in Windows 10. I want to control the download of new builds on one of my machines, as I need it to stay stable while my virtual machines are running.

  • Open MMC (type MMC in the Windows search box)
  • Click File -> Add/Remove Snap-in -> choose “Group Policy Object Editor”
  • Navigate to “Computer Configuration” -> “Administrative Templates” -> “Windows Components” -> “Windows Update”
  • The picture below explains the rest.






tags: Disable windows update in Windows 10; Automatic windows update; Windows 10 Updates;

SharePoint Server 2016 Overview

Here are my notes from Ignite 2015. These notes are from this session: What’s New for IT Professionals in SharePoint Server 2016

Launch dates

  • Beta 1 – Q4/2015
  • RTM – Q2/2016


  • Zero downtime patching for build-to-build updates (like Cumulative Updates)
  • Patch size significantly decreases.
  • Responsive interface and new mobile experiences based on what’s available in O365

Hardware Requirements

About the same as SP2013. Compute doesn’t really change.

Software Requirements

  • W2k12R2 and Windows Server 10
  • .NET FW 4.5.2 for w2k12R2 or .NET FW 4.5.7 for WS10.
  • 64bit SQL Server 2014 SP1 at a minimum


  • Standalone installations are no longer supported. A single-server install requires a regular install of SQL Server on the local machine.
  • The role guidance we had in SP2013 becomes specific roles in code in SP2016. Conceptually, there are three role types:
  • User services roles – any request coming directly from a user.
  • Robot services – any request that does not originate from an end user.
  • Cache services – Distributed Cache (DCS)

The goal is that a request from a user is managed from end-to-end by a single machine. This will reduce latency. No need to traverse the topology and send the request from server to server. This is called “min-role topology”

During install, you pick from one of server roles.

  • Special load does not use min role. Same flexibility you had in SP2013. This server can play any role just like before. Recommended for third-party or custom-dev services.
  • Web Front End – responding to a web user request from end to end. User services role.
  • Search – Indexing/Crawling.
  • App Server role – robot services role.
  • Single Server Farm – This does not include SQL Server Express any longer. You need to install SQL Server on that machine separately. All SP bits are installed, like Special Load.


  • SP2010-mode (v14) site collections need to be upgraded to v15 (SP2013) before an upgrade.
  • DB-attach upgrade from SP2013 to SP2016. There will be no direct path from SP2010.
  • Think of SP2013 as the base kernel for future SP releases. For the most part, there is schema parity from 2013 to 2016.
  • Service app architecture does not change.
  • When developing SP2016, MSFT took a point-in-time snapshot from O365 and created a new branch.


  • SAML is the default and a first class citizen. Basically, there is one auth provider that is both claims and cloud ready. This helps make the cloud more transparent from an app-dev point of view.
  • MSFT is moving away from domain/classic mode auth and moving towards cloud-based auth. What about Windows Identity over SAML? This is supported but still deprecated.
  • SMTP Connection encryption. StartTLS connection encryption. Can use non default ports, other than TCP 25. Fallback to unencrypted SMTP is not supported.

Perf and reliability

  • Performance is expected to be significantly improved with the min-role concept covered during installation.
  • Health analyzer is specific for the role. It detects a service that deviates from the role. Can you start a service on a server that is incompatible with its role? Health analyzer does not run against Special Load.
  • Not in compliance means the health analyzer compares what’s running and what does it expect to find. There is a Fix link to resolve this.


  • Size of the patch has been reduced significantly. Number of MSIs and MSPs reduces down to 2 and 4 respectively. Previously this was 37 + (10 x number of language packs).
  • The upgrades install faster with no downtime. In the past, achieving 99.9% uptime has been too difficult. This is definitely possible now.
  • Build-to-build upgrades are an online operation. Completely transparent to users. Upgraders used to run offline where services were stopped.
  • With fewer configuration combinations (e.g. using min-role topology), testing is simplified and the overall stability of the system increases. This is also how the patching footprint can be much smaller and faster.

Distributed Cache (DCS)

In SP2013, AD needed to be consulted for each cache request which really slowed down the perf. This has been eliminated by using a new transport layer.

Boundaries and limits

  • Moving away from FSSHTTP. Using Background Intelligent Transfer Service (BITS). What about Shredded Storage? Nothing discussed here.
  • SPSite (site collection) provisioning is much faster. SPSite.Copy is used at the DB layer; basically just adding new rows at the table level based on the template. Things like feature activation do not slow it down now.
  • MSFT is still thinking about incorporating Traffic Mgmt as with O365. New end point on web servers. This is not official yet, but in the planning/feasibility stage.

User Profile Services

Two-way FIM support is pulled out of SP. If you need this, you can use the full external ForeFront Identity Manager. SP only supports the simple, one-way sync from AD.

Project Server database gets merged into a content db. Brings Project Server closer to SP. SP2016 doesn’t include Project Server.

Durable Links

Files can be moved in between site collections or renamed and the durable link still works. This is based on a resource ID that has an affinity on the document.


Lots of new telemetry in SP2016. Lots of new reporting on usage, storage, health, perf. This is the first time I’ve seen this degree of reporting in any edition of SharePoint. This will definitely diminish the value of Report Centre.


  • O365 Compliance Centre can also cover on-prem content.
  • In-place hold and e-discovery on both O365 and on-prem.
  • Classification ID – a discrete representation of a particular piece of IP. For example, there is a credit card class ID, but in addition, they look for something else to corroborate it. For example, expiration date. There will be many others, SSN, Driver’s license, etc.

Search – unified search result set.

No longer separate result blocks. One search to rule them all. Also brings the power of Delve and Office Graph onto On-Prem. Is there a unified search index in O365? Did I hear that right?




Keywords: SharePoint Server 2016 Overview

Notes captured by: Randy Williams

Nintex Workflow Migration – Data and Configuration Migration

This is the migration process for Nintex Workflow data from one SharePoint farm to another, where the two farms are mutually exclusive and running with their own accounts and instances: for example, when you want to bring PROD data into non-prod for a content refresh before testing CUs or other major enhancements, or when we have to consolidate two SP farms into one by bringing over web applications from one farm and merging them into another.

The idea is to preserve the current Nintex configuration of the destination farm and move only the Nintex data and configuration from the source farm. Nintex has a command-line tool which migrates data from the source Nintex database into the destination database. There are two options: move the data into the existing Nintex database, or create an additional Nintex content database and move the data into the newly created database. I prefer the second option, as rollback is easier if we have to. (SP2010 and Nintex Workflow 2010 are the context of this document.)

The process can be divided into 5 Major activities:

  • DB creation
  • Migrate data
  • Migrate configurations
  • Testing
  • (optional) Roll back


How do you add a new Nintex content database: Either from Central Administration or Nintex command line tool: Nwadmin

For me, it's easier to create the DB from CA rather than the command line.

Nintex has a command-line tool to migrate only the workflow data; all other configuration has to be either manually created/updated on the workflows, or imported by moving the necessary DB table data into the destination. We will look at that a little later in the process.

Nwadmin is a command line tool that ships with Nintex Workflow 2010. It is used to perform various administration operations.

By default, the NWAdmin.exe tool is installed to the SharePoint hive, typically at the following path.

%ProgramFiles%\Common Files\Microsoft Shared\Web Server Extensions\14\BIN

Note: For some versions of Nintex Workflow 2010, the Nwadmin tool is installed in the installation directory, typically at the following path:

%ProgramFiles%\Nintex\Nintex Workflow 2010 (I found it here, not in the 14 hive)

Click to access NWAdmin_Operations_2010.pdf

Migration Process:


Backup Source (Prod) and Destination (Non-prod) Nintex DB’s

Either bring the source database to the destination SQL Server and grant the destination farm account access to the DB, or grant the destination farm account access to the DB wherever it is.

Grant the FARM_ADMIN account the db_owner role on the destination farm DBs: SP_NW2010; SP_NW2010Source


Consider the following points, which are important while migrating the Nintex data.

While migrating, web applications need to be stopped (IIS), and the Timer service on all the servers needs to be stopped as well. Consider gathering all the site collections where Nintex workflows are active.

From SQL, run the following query: "SELECT SiteID FROM [NintexDBname].[dbo].[Storage]"
Or gather the list of site collections in “Central Administration : Nintex Workflow Management : Nintex Workflow Database Setup : View Database Mapping”.

While the web apps and timer service are off, run the following command to migrate data:
nwadmin -o MoveData -SiteID "GUID" -SourceDatabase "Prod Nintex DB Connection String" -TargetDatabase "Adelaide Nintex DB Connection String" -RetainSourceData

Connection string format: "Server=DBserver;Database=DBName;Integrated Security=True"

Make sure you create a SQL alias so that you pass a single string as the server name rather than using "server\instancename".

The command window output looks like below when it is success:

C:\Program Files\Nintex\Nintex Workflow 2010> nwadmin -o MoveData -SiteID "F1E52EF2-996F-46A1-A0D1-250D8970E725" -SourceDatabase "Server=TSTSP10DB-BNE;Database=SP_NW2010DB;Integrated Security=True" -TargetDatabase "Server=TSTSP10DB;Database=SP_NW2010_BNE;Integrated Security=True" -RetainSourceData

Before continuing this operation, please stop the following on each server:

SharePoint IIS web site

SharePoint 2010 Timer Service

If these services are not stopped, workflows may continue adding data, leaving the instance in an invalid state.

It is recommended that the source and target content databases are backed up before continuing.

Data will be moved for site collection ID: f1e52ef2-996f-46a1-a0d1-250d8970e725.

Restart each service to continue workflow operation.

C:\Program Files\Nintex\Nintex Workflow 2010>
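When several site collections need moving, the MoveData calls can be scripted. A sketch, assuming the SiteIDs have been gathered from the Storage-table query shown earlier; the server and database names below are placeholders for your own environment:

```powershell
# Sketch: run nwadmin MoveData for each gathered site collection GUID.
# $siteIds would come from the Storage-table query; the connection
# strings are placeholders. nwadmin.exe is assumed to be on the PATH.
$siteIds = @("F1E52EF2-996F-46A1-A0D1-250D8970E725")  # example GUID from above
$src = "Server=SourceDBServer;Database=SP_NW2010_Src;Integrated Security=True"
$dst = "Server=TargetDBServer;Database=SP_NW2010_New;Integrated Security=True"

foreach ($id in $siteIds) {
    & nwadmin -o MoveData -SiteID $id `
        -SourceDatabase $src -TargetDatabase $dst -RetainSourceData
}
```

Remember the same precondition as the manual run: IIS web sites and the timer service must be stopped first.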


  • Grant db_owner access and assign the WSS_Content_Applications_Pools role to all the app pool accounts of the migrated web applications on this DB.
  • Start web applications, SharePoint Timer services.
  • Login to each site collection and go to Nintex workflow and check for any errors.

Migrate Configuration:

Up to here we have completed one part of the migration, moving the WF data; there are other components we need to consider moving, some of them listed below.

  • Workflow constants
  • Workflow Schedules (need to be recreated manually)
  • Managed Allowed Actions
  • User Defined Actions
  • Lazy Approval settings
  • Error Notifications
  • Delegation
  • EventReceivers

If you have some expertise in SQL queries, that will be handy in identifying differences between the tables in the two DBs.

Some of the configurations I can help identify here and provide out-of-the-box solutions for; for the rest, we need to use the DB table move method or assess the need case by case.

Workflow Constants:

These will not be migrated and need to be moved manually. If there are only a few, it's easy to add them back to the workflows; otherwise, consider the commands below to get them updated.

Use nwadmin -o exportworkflowconstants and then import with nwadmin -o importworkflowconstants

NWAdmin.exe -o ExportWorkflowConstants -siteUrl siteUrl -outputFile pathToFile [-includeSite] [-includeSiteCollection] [-includeFarm]

NWAdmin.exe -o ImportWorkflowConstants -siteUrl siteUrl -inputFile pathToFile -handleExisting Skip|Overwrite|Abort [-includeSite] [-includeSiteCollection] [-includeFarm]
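As a concrete illustration of the two operations above, the calls might look like this (the site URLs and file path are placeholders, and -handleExisting is shown with Overwrite):

```shell
:: Sketch: export constants from the source site collection, then import
:: them at the destination. URLs and the file path are placeholders.
NWAdmin.exe -o ExportWorkflowConstants -siteUrl http://source/sites/hr -outputFile C:\temp\constants.xml -includeSite -includeSiteCollection
NWAdmin.exe -o ImportWorkflowConstants -siteUrl http://dest/sites/hr -inputFile C:\temp\constants.xml -handleExisting Overwrite -includeSite -includeSiteCollection
```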

Use the nwadmin documentation I referred to earlier for detailed steps.

MessageTemplate: the entries can be added back via the UI (Central Admin – Nintex Workflow Management – Message Templates). The impact is that customized templates will not show in emails.

EventReceivers: the impact is that if event receiver entries are missing, task-related actions might not work properly. The entries are added to this table when the Nintex Workflow features are activated; if the destination configuration table already has entries for the sites, this can be ignored. Otherwise, deactivate and re-activate the Nintex Workflow features (“Nintex Workflow 2010” & “Nintex Workflow 2010 InfoPath Forms”) on all the migrated site collections where you have Nintex workflows active/running.

Changes in Functionality:

Please note that the send-notification email address will change for users (if the source has a different email address), as only one address can be used per farm.

Environment | From Address | Reply To Address


I suggested that the configuration areas below be tested in a test environment before the actual live production migration:

  • Lazy approval (not required as it isn’t configured in either farm)
  • send notifications
  • query AD
  • execute SQL
  • Web request
  • call web service
  • Any other actions that are using stored credentials to connect to other systems

Rollback Procedure:

Drop the new Nintex content destination DB; the source database is untouched and still holds the original WF data.

Having said that, when migrating some web applications from one farm to another you cannot bring all the Nintex configuration, as the destination farm already has its own; so a good amount of testing is needed of what must be brought over and how. I hope this document provides some insight into that.

The reason this was created is that we could not find proper documentation of this process either online or from Nintex support.

SharePoint Calculated Column – how to work around

Q: I work in Communications and we collect the news related to our department and send it out by email twice daily, once in the morning and once in the afternoon. I have set up a SharePoint library to group by Date, so it groups uploaded files by the date they were added, which is great. I was wondering if I could group the files under these dates by morning and afternoon (ie. before 12:00 and after 12:00) so by the time that they were added, so that they correspond with our morning and afternoon news alerts? Continue reading