SharePoint patching consists of three types of updates: service packs, cumulative updates, and hotfixes. Patching is considered a B2B (build-to-build) upgrade.
Service packs are not on a set schedule, but they generally arrive every 18 to 24 months. They fix all the bugs identified since the last service pack or the RTM release, and sometimes they introduce new features and functionality. Although they are thoroughly tested before release, a service pack should still be validated immediately in a test environment. Once its features and functionality have been verified in a test/non-production environment, it should be released to production at the earliest opportunity, since service packs can dramatically affect performance and security.
Cumulative updates are normally released in every even-numbered month. They attempt to fix bugs identified during those months, and they also include all previously released fixes, so each CU grows in size as new ones are released. Unless a CU resolves an issue currently affecting your production environment, installing it is generally not necessary. These CUs can sometimes break things if they do not work properly, and patches cannot be uninstalled, so you would be left helpless, waiting until the next CU is released. Test the fix a CU provides thoroughly in a test farm, and wait a couple of weeks for any regressions to surface before applying it to production.
Hotfixes and security patches are generally pushed through Windows Update. They attempt to fix urgent security issues in SharePoint and are released as early as possible, which makes them the least tested of all the patches. Install the patch and run the configuration wizard to complete the process. Then cross your fingers and pray they do not break any other functionality.
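For any of these patch types, installing the binaries alone does not finish the B2B upgrade; the configuration wizard (or its command-line equivalent) must be run on every server in the farm. A typical command-line invocation looks like the following (the exact switches can vary between SharePoint versions, so treat this as a sketch):

```bat
REM Completes the build-to-build upgrade after the patch binaries are installed.
REM Run from an elevated command prompt on each farm server.
PSConfig.exe -cmd upgrade -inplace b2b -wait -force
```

Running it with -wait keeps the console open until the upgrade finishes, which makes it easier to script and to spot failures.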
In this new article, I will summarize the basic PowerShell cmdlets we have available to upgrade SharePoint 2010 content databases to SharePoint 2013:
Before going into detail, you can list the available cmdlets by executing the following PowerShell command in the SharePoint 2013 Administration Console:
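The command itself did not survive in this copy of the article; a common way to produce this list, assuming the Microsoft.SharePoint.PowerShell snap-in is loaded (it is by default in the SharePoint 2013 Management Shell), is:

```powershell
# Lists every cmdlet whose noun is SPContentDatabase
Get-Command -Noun SPContentDatabase
```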
After executing this command, you will see a list of all the available cmdlets:
From this list, the cmdlets we are interested in are the following: Test-SPContentDatabase, Upgrade-SPContentDatabase and Mount-SPContentDatabase.
The Test-SPContentDatabase cmdlet allows you to identify any issue or customization you should take into account when upgrading a content database to SharePoint 2013: features installed in the SharePoint 2010 environment that should also be present in the SharePoint 2013 one, installed language packs, etc. Below, you can find the general syntax for Test-SPContentDatabase.
Test-SPContentDatabase -Name <DatabaseName> -WebApplication <WebApplicationURL>
If the command execution detects issues in the content database, they will appear in the output window. For instance, a common issue is having orphaned objects in the content database.
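Because the output can be long, it is convenient to capture it to a file for review. As an illustration (the database name, URL, and output path below are hypothetical), a run might look like:

```powershell
# Hypothetical database name and web application URL; adjust for your farm
Test-SPContentDatabase -Name "WSS_Content_Intranet" `
    -WebApplication "http://intranet.contoso.com" |
    Out-File C:\Temp\TestResults.txt
```

The resulting file lists each issue with a category, an error flag, and remedy text you can work through before mounting the database.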
The Upgrade-SPContentDatabase cmdlet allows you to upgrade a content database that has some issues, or to upgrade it from one product build to another. Note that this command is used only to resume a failed upgrade. The syntax of this cmdlet is quite simple, as you can see in the following script:
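The script itself was not preserved in this copy; the basic form, with a hypothetical database name, is:

```powershell
# Resumes a failed upgrade; -Identity accepts the content database name or GUID
Upgrade-SPContentDatabase -Identity "WSS_Content_Intranet"
```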
If you execute this command in the SharePoint 2013 Administration Console, you will be asked to confirm that you want to start the upgrade process on the related database. If there are no problems, a warning message will be shown indicating that the database does not need to be upgraded.
Finally, you have to execute the Mount-SPContentDatabase cmdlet in order to add a content database to an existing web application. Please note that you should execute this command only once you have fixed every problem found in the content database. The Mount-SPContentDatabase syntax is quite simple, as you can see below:
Mount-SPContentDatabase -Name <DatabaseName> -WebApplication <WebApplicationURL>
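Putting it together, a mount operation might look like the following (the database, server, and URL names are hypothetical; -DatabaseServer can be omitted when the database lives on the farm's default SQL instance):

```powershell
# Attaches the upgraded SharePoint 2010 content database to a 2013 web application
Mount-SPContentDatabase -Name "WSS_Content_Intranet" `
    -DatabaseServer "SQL01" `
    -WebApplication "http://intranet.contoso.com"
```

The mount itself triggers the schema upgrade, so expect this step to take a while on large databases.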
And that’s all there is to know about upgrading SharePoint 2010 content databases with these basic PowerShell cmdlets. Courtesy: Juan Carlos
Now that we have talked about PerfMon counters for SharePoint and Project Server in the last post, the next challenging task after collecting these counters is to analyze the data and make some sense out of those huge log files. There may be different ways to analyze them by applying knowledge gained from experience of such incidents, but we cannot rely on that every time when the counters are large in number. The standard PerfMon interface tends to get messy when many counters are involved.
I have two recommendations:
1. Manual PerfMon Analysis Workbook — for when you want a tool that gives you a quick overview of the server without having to dig through a thick report or a cluttered PerfMon diagram with lots of counters.
This tool is all you need to complete your analysis. When you want to analyze SharePoint counters and see how your server fares against best-practice results, or the expected results of a well-performing server, you need to feed it the threshold file for SharePoint along with the dependent threshold files. For a SharePoint server that may mean the system overview, IIS, and SQL threshold files as well, since the counters we normally monitor for a SharePoint server also cover IIS, processes, and system hardware.
You can get an idea by looking at the sample report this tool prepares at this URL.
Manual PerfMon Analysis Workbook – you can get details and the latest update by visiting the CodePlex link provided above.
This tool gives you an overview of the many counters collected using PerfMon. This release is workable. To upload the data to the workbook, read the manual: Manual Perfmon Analysis Workbook.docx
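If your data collector writes binary .blg files, the built-in relog utility can convert them into a format the workbook can ingest. The file names below are examples:

```bat
REM Convert a binary PerfMon log to CSV for import into the analysis workbook
relog C:\PerfLogs\SP_Counters.blg -f CSV -o C:\PerfLogs\SP_Counters.csv
```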
When we run into issues and need further investigation to troubleshoot the root cause, we look to gathering server and database statistics. I always look to gather as much data as possible from performance counters. These counters are specific to the application and operating system, and as SharePoint admins we need to gather data that makes sense; the following counters have been very helpful for us in such instances.
Every situation is different and we need to add the counters that are necessary and apt for that situation; below are the ones I use most of the time for SharePoint and Project Server. While working with Microsoft we got these templates to collect the counters. I have also uploaded these templates, which you can download at:
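Once downloaded, a counter template can be turned into a running data collector set with the built-in logman utility. The collector name and template path below are examples:

```bat
REM Create a data collector set from an exported template and start it
logman import SP_Counters -xml C:\Templates\SharePoint_Counters.xml
logman start SP_Counters
```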
As the SharePoint farm is adopted more widely by the company and its customers, the data keeps growing. This brings more challenges in keeping up with the pace of requests and data administration. Today we are going to address one part of that problem: managing SharePoint logs, whether ULS, diagnostic, or analytic logging, and IIS logs as well. To address this challenge I wrote a small script to move these logs to a different directory, zip them, and move the archive to network storage or to another directory on the local hard drive.
Some of the challenges in achieving this were:
· There are logs with the same file names in different directories (mainly IIS logs), so we need to maintain the directory structure in order not to miss any logs.
· The criteria for moving these logs out is based on a number of days; for example, I need to move all the logs 15 days or older away from the default logs folder or drive.
· The logs then need to be zipped and added to the same archive file, since this script runs daily and we keep a single archive file on each machine; the zip name should reflect the server name for easy recognition on the network store.
· All this is to be scripted in batch, using a free compression utility (7-Zip) with a command-line interface so compression can be included in the script.
· Download 7-Zip and install it to Program Files, as we need to add this install path to the PATH environment variable to use it in the script directly.
REM SharePoint ULS and IIS log cleanup script (backup and compress)
REM Venu Madhav

REM Enter the file path for the ULS/IIS log directory (example path)
SET log=C:\Logs
REM Enter the number of days; logs older than this are archived (example value)
SET days=15
REM Enter the location for the temporary log backup before compression (example path)
SET backup=C:\LogBackup

REM Add the compression utility directory to the PATH variable
SET PATH=%Path%;C:\Program Files\7-Zip

REM Copy the current log directory structure (folders only)
echo D | xcopy "%log%" "%backup%" /t

REM Copy all *.log files matching the age criteria from the root directory
forfiles -p "%log%" -m *.log -d -%days% -c "cmd /c copy /Y @path %backup%\@file"

REM Repeat for each subdirectory, preserving the directory structure
FOR /F %%v IN ('dir /ad /b "%log%"') DO forfiles -p "%log%\%%v" -m *.log -d -%days% -c "cmd /c copy /Y @path %backup%\%%v\@file"

REM Create a compressed file named after the host; if it already exists, these logs
REM are appended to it, either in a local directory or a network location
for /F %%v in ('hostname') DO 7z a d:\logs\%%v.zip %backup%\*

REM Once the logs are compressed, remove the log files meeting the criteria
forfiles -p "%log%" -m *.log -s -d -%days% -c "cmd /c del @path /F /Q"

REM Remove the temporary folder after the logs are compressed and zipped
rmdir "%backup%" /S /Q
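Since the script is meant to run daily, it can be registered with the built-in schtasks utility. The task name, script path, and start time below are examples:

```bat
REM Register the cleanup script to run daily at 2:00 AM under the SYSTEM account
schtasks /create /tn "SPLogCleanup" /tr "C:\Scripts\logcleanup.bat" /sc daily /st 02:00 /ru SYSTEM
```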