Find All InfoPath Form Libraries and Lists in SharePoint

As part of a cleanup activity, one of the SharePoint analysts requested a report listing all form libraries and lists, along with document count and last modified date. Finding lists with forms enabled was tricky: I used SharePoint Manager to search for a property that separates lists with forms from lists without them, but had no luck.

Then I found the following property on one of the blogs and married it up with Rajack’s script to produce a script that works for me.


[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
#Get the web application
Write-Host "Enter the Web Application URL:"
$WebAppURL = Read-Host
$SiteCollection = Get-SPSite $WebAppURL
$WebApp = $SiteCollection.WebApplication
#Write the CSV header
"Site Collection `t Site `t List Name `t List Url `t Docs Count `t Last Modified `t Form Template" > InfoPathLibs.csv
#Loop through all site collections of the web app
foreach ($site in $WebApp.Sites)
{
    # get the collection of webs
    foreach ($web in $site.AllWebs)
    {
        Write-Host "Scanning site" $web.Title "@" $web.Url
        foreach ($list in $web.Lists)
        {
            if ($list.BaseType -eq "DocumentLibrary" -and $list.BaseTemplate -eq "XMLForm")
            {
                $listModDate = $list.LastItemModifiedDate.ToShortDateString()
                $listTemplate = $list.ServerRelativeDocumentTemplateUrl
                #Write data to CSV file
                $site.RootWeb.Title + "`t" + $web.Title + "`t" + $list.Title + "`t" + $web.Url + "/" + $list.RootFolder.Url + "`t" + $list.ItemCount + "`t" + $listModDate + "`t" + $listTemplate >> InfoPathLibs.csv
            }
            elseif ($list.ContentTypes[0].ResourceFolder.Properties["_ipfs_infopathenabled"])
            {
                $listModDate = $list.LastItemModifiedDate.ToShortDateString()
                $listTemplate = $list.ServerRelativeDocumentTemplateUrl
                #Write data to CSV file
                $site.RootWeb.Title + "`t" + $web.Title + "`t" + $list.Title + "`t" + $web.Url + "/" + $list.RootFolder.Url + "`t" + $list.ItemCount + "`t" + $listModDate + "`t" + $listTemplate >> InfoPathLibs.csv
            }
        }
        $web.Dispose()
    }
    #Dispose of the site object
    $site.Dispose()
}
Write-Host "Report generated in the same path as the PowerShell script: InfoPathLibs.csv" -ForegroundColor Green

The generated report looks like this:

In the report below, any list that does not have a form template is a list; all the others are document libraries.




Delete duplicate fields in a SharePoint list based on internal name

I had a similar issue where a content and structure migration created duplicate fields, and I had to delete fields based on their internal name.

My code:

$web = Get-SPWeb https://myWeb
$list = $web.Lists["Technical"]
$field = $list.Fields | ?{$_.InternalName -eq "InternalNameofthefield"}
$field.ReadOnlyField = $false
$field.AllowDeletion = $true
$field.Sealed = $false
# if I don't update prior, I can't delete - on SharePoint 2010, so run the Update command first
$field.Update()
$field.Delete()
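When the migration produced several duplicates, an internal-name filter can match more than one field. Here is a hedged sketch (using the same hypothetical web, list, and field names as above) that snapshots the matches first and then deletes each in turn:

```powershell
$web = Get-SPWeb https://myWeb
$list = $web.Lists["Technical"]
# snapshot the matches first - deleting while enumerating the live field collection can fail
$dupes = @($list.Fields | ?{$_.InternalName -like "InternalNameofthefield*"})
foreach ($f in $dupes)
{
    $f.ReadOnlyField = $false
    $f.AllowDeletion = $true
    $f.Sealed = $false
    $f.Update()   # SharePoint 2010 requires the update before the field can be deleted
    $f.Delete()
}
$web.Dispose()
```

The `-like` prefix match is just one way to catch the duplicates; adjust the filter to whatever pattern your migration produced.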


Script to find SharePoint group members across sites

Below is a script to look for a group and its members across all sites in the SharePoint farm.

You can customize it for a specific site collection or web application, and adjust the export columns.

$sites = Get-SPWebApplication "P_Teams" | Get-SPSite -Limit All
"Site Collection`t Group`t User Name`t User Login" | Out-File groupmembersreport.csv
foreach ($site in $sites)
{
    $webs = $site.AllWebs
    foreach ($web in $webs)
    {
        $group = $web.Groups | ?{$_.Name -like "*power*"}
        # you can also filter by an exact group name {$_.Name -eq "Power Users"} or look for a word in the group name
        if ($group)
        {
            foreach ($user in $group.Users)
            {
                "$($web.Url) `t $($group.Name) `t $($user.DisplayName) `t $($user)" | Out-File groupmembersreport.csv -Append
            }
        }
    }
}




PowerShell Script – list all SharePoint group members

A quick script to query the site collections in a web application for a SharePoint group and list all the members of that group, exporting the results to a CSV file. In the example below I am looking for the “Power Users” group and listing all of its members.

$sites = Get-SPWebApplication | Get-SPSite -Limit All
"Site Collection`t Group`t User Name`t User Login" | Out-File groupmembersreport.csv
foreach ($site in $sites)
{
    $sitegroup = $site.RootWeb.SiteGroups | ?{$_.Name -eq "Power Users"}
    foreach ($user in $sitegroup.Users)
    {
        "$($site.Url) `t $($sitegroup.Name) `t $($user.DisplayName) `t $($user)" | Out-File groupmembersreport.csv -Append
    }
}

If you are interested in querying all the groups in the site collections and listing all their members:

# if you want to query all site collections and their group members, uncomment the next line and comment out the one after it
# $sites = Get-SPSite -Limit All
$sites = Get-SPWebApplication | Get-SPSite -Limit All
"Site Collection`t Group`t User Name`t User Login" | Out-File groupmembersreport.csv
foreach ($site in $sites)
{
    foreach ($sitegroup in $site.RootWeb.SiteGroups)
    {
        foreach ($user in $sitegroup.Users)
        {
            "$($site.Url) `t $($sitegroup.Name) `t $($user.DisplayName) `t $($user)" | Out-File groupmembersreport.csv -Append
        }
    }
}

You can also write it as a single line if you quickly want to query a single site collection:

Get-SPWeb | Select -ExpandProperty SiteGroups | Where {$_.Name -eq "Power Users"} | Select -ExpandProperty Users | Select Name, UserLogin, Email
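If you want the one-liner’s output in a file as well, a sketch (the site URL and file name here are placeholders) piping the same query to Export-Csv:

```powershell
Get-SPWeb http://sitecollection | Select -ExpandProperty SiteGroups |
    Where {$_.Name -eq "Power Users"} | Select -ExpandProperty Users |
    Select Name, UserLogin, Email |
    Export-Csv powerusers.csv -NoTypeInformation
```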


DocAve 5 Agent does not start

I received the following error when attempting to start the DocAve 5 service on a server.
This issue occurs specifically on servers that have CRL check issues (no internet connection, or certificate servers not configured).

The DocAve Communication service failed to start due to the following error:
The service did not respond to the start or control request in a timely fashion.


A timeout was reached (30000 milliseconds) while waiting for the DocAve Communication service to connect.

The solution is to add a configuration file to the DocAve Agent bin folder,
e.g.: C:\Program Files\AvePoint\DocAve6\Agent\bin

Copy the lines below into a file named DocAveAVPCService.exe.config:

<?xml version="1.0"?>
<configuration>
  <runtime>
    <generatePublisherEvidence enabled="false"/>
  </runtime>
</configuration>

This is only effective for the DocAve 5 agent. Look out for another post on the DocAve 6 agent steps.

Cleanup user information list and remove users from Site

Recently one of the analysts asked whether we could clean up the profiles in the User Information List of a site collection, as he was looking to create a template from it. This is an interesting topic: not only does it matter for the template, but the site is also an extranet (internet-facing) site, and it is good practice to minimize the user information exposed externally. On most extranet site collections, if this is not taken care of, the details can be queried by pointing directly at the following URL:

http://<site_collection_url>/_layouts/userdisp.aspx?Force=True&ID=20  (change the number to query groups and users and their personal details)

Coming back to the point: to delete user profiles in a site collection and clean up the User Information List, check the PowerShell below.

Note: It also deletes the users and user permission on the site collection.

$sc = Read-Host "Enter Site Collection URL"
$site = Get-SPSite $sc
$users = Get-SPUser -Web $site.RootWeb -Limit All
foreach ($user in $users) { Remove-SPUser -Identity $user.UserLogin -Web $site.RootWeb -Confirm:$false -ErrorAction SilentlyContinue }

Another approach to filter and clean up the User Information List is below. This removes the groups as well if the items are not filtered before being passed to remove.

# Note: take a backup of your site collection before attempting this script
# Note: this script could strip all the users and their permissions on the site
$sc = Read-Host "Enter Site Collection URL"
$web = Get-SPWeb $sc
$list = $web.SiteUserInfoList
# this will get all the users AND groups, so be careful while passing $item to remove
$items = $list.GetItems()

foreach ($item in $items)
{
    try   { $web.SiteUsers.RemoveByID($item.ID) }
    catch { }   # ignore items (e.g. groups) that cannot be removed this way
}
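To keep the SharePoint groups intact, one way is to filter on the item’s content type before removing. This is an untested sketch: it assumes user entries in the User Information List carry the "Person" content type, and it snapshots the matching IDs first so the collection is not modified while being enumerated.

```powershell
$sc = Read-Host "Enter Site Collection URL"
$web = Get-SPWeb $sc
# assumption: user rows use the "Person" content type; group rows do not
$personIds = @($web.SiteUserInfoList.GetItems() |
    ?{$_.ContentType.Name -eq "Person"} | %{$_.ID})
foreach ($id in $personIds)
{
    try { $web.SiteUsers.RemoveByID($id) } catch { }
}
$web.Dispose()
```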

This is a simple post to achieve what I was looking for, and there is potential to corrupt your site collections; I took care to test it in my DEV farm. It has been tested on SharePoint 2010 SP1 only.

The below posts can help you further on this topic.


Route SharePoint email from non-prod farms to a mail-enabled group or SP list

Why do we need to route emails coming from SharePoint DEV & TEST farms to a mail-enabled group or SP list? This covers both Nintex and SharePoint alert emails.

 Where this can be useful:

  • When carrying out testing for SharePoint alerts or Nintex workflows to confirm delivery of emails to users – this email routing will help validate this.
  • When we move site collections to non-prod farms from PROD accidental notifications to users will not occur.
  • Preventing emails reaching a larger audience sent mistakenly by a workflow or SharePoint alerts.
  • Gauging which users are active on non-prod site collections, so we can notify them when we plan to implement changes to the DEV and TEST SharePoint environments.

Issues this may cause:

  • Any power user/user trying to test something in non-prod will not receive emails; they will have to contact SharePoint Support for confirmation, or SharePoint Support can forward the email to them.
  • You need to provide the “from address” and SMTP server to the Exchange admin, who can set up a rule to route these emails to a newly created mail-enabled group; also mention which group/list needs to be a member of that mail-enabled group to receive the emails.
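On the Exchange side, the rule the admin sets up might look like the following sketch (Exchange Management Shell; every name and address here is a placeholder, not from this post):

```powershell
# Hypothetical transport rule: redirect anything sent from the non-prod farms'
# from-addresses to the mail-enabled group. Replace all addresses with your own.
New-TransportRule -Name "Route non-prod SharePoint mail" `
    -From "spdevalerts@contoso.local","devnintex@contoso.local" `
    -RedirectMessageTo "sp-nonprod-mail@contoso.local"
```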

From Address                  Mail-enabled group            Members of the group

SharePoint Dev Alerts
DEV Nintex:

TST Farm:
SharePoint Test Alerts
TST Nintex:
Automate monitoring SharePoint and Windows Services

As a SharePoint admin, one of the most important roles is monitoring servers and maintaining the SLAs; I guess there will be no second thoughts on this. As the environment grows to multiple farms and servers in different locations, it calls for automation, mostly of routine tasks, to save some time for your personal life, and here I am attempting to show how it saved me time: automate monitoring and starting SharePoint services on farm servers. You can script it to check these services at every restart, and if you are the unlucky one whose services keep stopping or are unstable, you can schedule the script to check the services, attempt to start them, and email you the status.

So, we are looking at the script to:

  1. Check the services at startup or on scheduled time.
  2. Attempt to start the services if they are stopped
  3. Send an email with the status of the services from each server.

What you need to know before trying this option: the SMTP server or mailhost name (it has to allow unauthenticated SMTP).

I had this script pieced together for my requirements and then scheduled to run on each reboot. To call the script, two files are created: a batch file that invokes the PowerShell script (keep both files in the same path).

  1. servicescheck.bat
@echo off
PUSHD "%~dp0"
powershell -file "servicescheck.ps1" < NUL
  2. servicescheck.ps1
param(
    [Parameter(Mandatory=$false, HelpMessage='-ServiceNames Optional, provide a set of service names to restart.')]
    [Array]$ServiceNames=@("SharePoint 2010 Tracing","Simple Mail Transfer Protocol (SMTP)","SharePoint 2010 Timer","SharePoint 2010 Administration","IIS Admin Service","World Wide Web Publishing Service","Net.Tcp Listener Adapter","Net.Pipe Listener Adapter")
)

$server = hostname
$emailbody = ""

Write-Host "Attempting to start services on" $server -ForegroundColor White
foreach ($serviceName in $ServiceNames)
{
    $serviceInstance = Get-Service -DisplayName $serviceName -ErrorAction SilentlyContinue
    if (($serviceInstance -ne $null) -and ($serviceInstance.Status -eq "Stopped"))
    {
        # Write-Host "Attempting to start service" $serviceName ".." -ForegroundColor White -NoNewline
        try
        {
            Start-Service -InputObject $serviceInstance -ErrorAction Stop
        }
        catch
        {
            Write-Output "Error Occurred:" $_.Exception.Message
        }
    }
}

# collect the current status of each service as HTML table fragments
$emailbody = foreach ($servicename in $ServiceNames) { Get-Service -DisplayName $servicename -ErrorAction SilentlyContinue | Select-Object Status, DisplayName | ConvertTo-Html -Fragment }

$users = "Venu Madhav <>"
$fromemail = "$"
$smtpserver = ""
# assemble the HTML for the body of the email report
$HTMLmessage = @"
$emailbody
"@

Send-MailMessage -From $fromemail -To $users -Subject "$server Rebooted - Services Status" -BodyAsHTML -Body $HTMLmessage -Priority High -SmtpServer $smtpserver
$emailbody = ""
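To schedule the batch file to run at each reboot, as described above, one option is schtasks. This is a sketch: run it from an elevated prompt, and the path is a placeholder for wherever you keep the two files.

```powershell
# Hypothetical path - adjust to wherever servicescheck.bat lives
schtasks /Create /TN "SharePoint Services Check" /TR "C:\Scripts\servicescheck.bat" /SC ONSTART /RU SYSTEM
```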










Tags: Monitor Services, Automate Service Monitoring, PowerShell, SharePoint, Automate monitoring SharePoint and Windows Services

SharePoint Server 2016 Overview

Here are my notes from Ignite 2015. These notes are from this session: What’s New for IT Professionals in SharePoint Server 2016

Launch dates

  • Beta 1 – Q4/2015
  • RTM – Q2/2016


What’s new

  • Zero-downtime patching for build-to-build updates (like Cumulative Updates).
  • Patch size significantly decreases.
  • Responsive interface and new mobile experiences based on what’s available in O365.

Hardware Requirements

About the same as SP2013. Compute doesn’t really change.

Software Requirements

  • W2k12R2 and Windows Server 10
  • .NET FW 4.5.2 for w2k12R2 or .NET FW 4.5.7 for WS10.
  • 64bit SQL Server 2014 SP1 at a minimum


Installation and server roles

  • Standalone installations are no longer supported. A single-server install will require a regular install of SQL Server on the local machine.
  • The role guidance that we had in SP2013 becomes specific roles in code in SP2016. Conceptually, there are three role types:
      • User services roles – any request coming directly from a user.
      • Robot services – any request that does not originate from an end user.
      • Cache services – DCS.

The goal is that a request from a user is managed from end-to-end by a single machine. This will reduce latency. No need to traverse the topology and send the request from server to server. This is called “min-role topology”

During install, you pick one of the server roles.

  • Special load does not use min role. Same flexibility you had in SP2013. This server can play any role just like before. Recommended for third-party or custom-dev services.
  • Web Front End – responding to a web user request from end to end. User services role.
  • Search – Indexing/Crawling.
  • App Server role – robot services role.
  • Single Server Farm – This does not include SQL Server Express any longer. You need to install SQL Server on that machine separately. All SP bits are installed, like Special Load.


Upgrade

  • v14 mode (SP2010 mode) site collections need to be upgraded to v15 (SP2013) before an upgrade.
  • DB-attach upgrade from SP2013 to SP2016. There will be no direct path from SP2010.
  • Think of SP2013 as the base kernel for future SP releases. For the most part, there is schema parity from 2013 to 2016.
  • Service app architecture does not change.
  • When developing SP2016, MSFT took a point-in-time snapshot of O365 and created a new branch.


Authentication and security

  • SAML is the default and a first-class citizen. Basically, there is one auth provider that is both claims- and cloud-ready. This helps make the cloud more transparent from an app-dev point of view.
  • MSFT is moving away from domain/classic-mode auth and towards cloud-based auth. What about Windows Identity over SAML? It is supported but deprecated.
  • SMTP connection encryption via StartTLS. Can use non-default ports, other than TCP 25. Fallback to unencrypted SMTP is not supported.

Perf and reliability

  • Performance is expected to be significantly improved with the min-role concept covered during installation.
  • Health analyzer is specific for the role. It detects a service that deviates from the role. Can you start a service on a server that is incompatible with its role? Health analyzer does not run against Special Load.
  • “Not in compliance” means the health analyzer compares what’s running with what it expects to find. There is a Fix link to resolve this.


Patching

  • The size of the patch has been reduced significantly. The number of MSIs and MSPs drops to 2 and 4 respectively. Previously this was 37 + (10 x number of language packs).
  • Upgrades install faster with no downtime. In the past, achieving 99.9% uptime was too difficult; it is definitely possible now.
  • Build-to-build upgrades are an online operation, completely transparent to users. Upgrades used to run offline, with services stopped.
  • With fewer configuration combinations (e.g. using the min-role topology), testing is simplified and the overall stability of the system increases. This is also how the patching footprint can be much smaller and faster.

Distributed Cache (DCS)

In SP2013, AD needed to be consulted for each cache request which really slowed down the perf. This has been eliminated by using a new transport layer.

Boundaries and limits

  • Moving away from FSSHTTP. Using Background Intelligent Transfer Service (BITS). What about Shredded Storage? Nothing discussed here.
  • SPSite (site collection) provisioning is much faster. SPSite.Copy is used at the DB layer: basically just adding new rows at the table level based on the template. Things like feature activation do not slow it down now.
  • MSFT is still thinking about incorporating Traffic Mgmt as with O365. New end point on web servers. This is not official yet, but in the planning/feasibility stage.

User Profile Services

Two-way FIM support is pulled out of SP. If you need this, you can use the full external ForeFront Identity Manager. SP only supports the simple, one-way sync from AD.

Project Server database gets merged into a content db. Brings Project Server closer to SP. SP2016 doesn’t include Project Server.

Durable Links

Files can be moved in between site collections or renamed and the durable link still works. This is based on a resource ID that has an affinity on the document.


Reporting and telemetry

Lots of new telemetry in SP2016, with new reporting on usage, storage, health, and perf. This is the first time I’ve seen this degree of reporting in any edition of SharePoint. This will definitely diminish the value of Report Centre.


Compliance

  • O365 Compliance Centre can also cover on-prem content.
  • In-place hold and e-discovery on both O365 and on-prem.
  • Classification ID – a discrete representation of a particular piece of IP. For example, there is a credit card class ID, but in addition they look for something else to corroborate it, such as an expiration date. There will be many others: SSN, driver’s license, etc.

Search – unified search result set.

No longer separate result blocks. One search to rule them all. Also brings the power of Delve and Office Graph onto On-Prem. Is there a unified search index in O365? Did I hear that right?




Keywords: SharePoint Server 2016 Overview

Notes captured: Randy Williams