When SharePoint Content Data Gets Too Big
One of the first limitations that SharePoint administrators encounter is content database size. Content databases can start at a reasonable size, but they grow increasingly bloated as new content is added – degrading performance and responsiveness.
It’s a problem that hasn’t gone unnoticed. Microsoft has studied the issue and published recommendations on database size limits. For SharePoint 2007, the recommendation was a scant 100 GB. It was subsequently raised to 200 GB for SharePoint 2010 and 2013. Even at double the previous figure, this soft limit is insufficient for most organizational needs, as my own research has shown.
Over the past year, I have interviewed more than 1,500 organizations about their SharePoint usage trends. The research found that more than 70 percent of SharePoint administrators report having at least one content database greater than 200 GB, a percentage that continues to rise.
Many factors contribute to this slowdown, including:
- Insufficient Input/Output Operations Per Second (IOPS)
- Inefficiency in the API layer between SQL Server and SharePoint
- Growing contention: as more content enters the database, more users compete to access documents at the same time
Despite Microsoft’s recommended best practices, organizations inevitably find that their databases outgrow the limits of a given environment, architecture, and technical design – a problem that dramatically impacts SharePoint performance and end-user satisfaction. It’s no surprise that managing performance issues caused by content database size is reported as a top priority for many administrators.
Interested in learning more about how to better manage SharePoint storage? Take a moment to read Shattering SharePoint Storage Limitations for a deeper examination of these issues and the solutions for addressing them.