Thorvald Bøe
When dealing with SharePoint Online, there are some limitations to consider.

The most famous one is the list view threshold of 5,000 items in a list view. But there are also others. In my company, we have implemented a project workspace solution for managing projects. In essence, we create a subsite for every project in a project list. This has led us to another potential limitation: the number of subsites per site collection.

When googling the limitations of SharePoint Online, I find a page saying that the maximum number of subsites per site collection is 2000. It does not say whether this applies per hierarchy level or to the site collection as a whole, but as written it seems to be the latter. If so, it means you cannot escape the limit by nesting sub-subsites; you would actually have to create more site collections (the maximum there is 500 000 per tenant, so that should be fine).

But while googling, I also found this link:
http://www.metalogix.com/Blog/Blog-Article/metalogix-software-blog/2014/09/17/sharepoint-online-ups-site-collection-sub-sites-storage-and-file-sizes-beyond-sharepoint-on-premises

This article says that the limit has been lifted from 2000 to 100 000. If that is true, it is good news for us. But I cannot find any official Microsoft documentation of this, so I guess what remains is to actually test it.

Time to code! :)

I start off by building a small console app using the SPO SDK. It runs in a loop, creating 10 subsites per batch.

        private static void AddSubsites(int count, string targetSiteUrl, string userName, string password)
        {
            ClientContext context = GetContext(targetSiteUrl, userName, password);
            var web = context.Web;
            context.Load(web);
            context.Load(web.Webs);
            context.ExecuteQuery();
            Console.WriteLine(web.Webs.Count + " existing subsites, now adding " + count + " more...");

            for (int i = 0; i < count; i++)
            {
                // Give each subsite a unique title and URL by appending a
                // counter and the first characters of a new guid.
                var creationParams = new WebCreationInformation();
                Guid id = Guid.NewGuid();
                string title = "Subsite " + i.ToString() + " " + id.ToString().Substring(0, 6);
                string url = "Subsite" + "_" + i.ToString() + "_" + id.ToString().Substring(0, 6);
                creationParams.Title = title;
                creationParams.Url = url;

                web.Webs.Add(creationParams);
            }

            // Send all the creations to the server as a single batch.
            context.ExecuteQuery();
        }
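
The GetContext helper is not shown in this post; here is a minimal sketch of what mine looks like, assuming the classic SharePointOnlineCredentials authentication from the SPO SDK:

        private static ClientContext GetContext(string targetSiteUrl, string userName, string password)
        {
            // SharePointOnlineCredentials requires a SecureString password.
            var securePassword = new System.Security.SecureString();
            foreach (char c in password)
                securePassword.AppendChar(c);

            var context = new ClientContext(targetSiteUrl);
            context.Credentials = new SharePointOnlineCredentials(userName, securePassword);
            return context;
        }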


But it doesn't take long before I find out that this doesn't work. The program crashes with the following error message:

"The request uses too many resources"

When googling, I find that the reason is probably that the batch takes too long to execute within a single ExecuteQuery call. I change the code to call ExecuteQuery on each iteration of the loop, and it works fine.
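
The adjusted loop looks something like this (a sketch; only the placement of ExecuteQuery changes):

            for (int i = 0; i < count; i++)
            {
                var creationParams = new WebCreationInformation();
                Guid id = Guid.NewGuid();
                creationParams.Title = "Subsite " + i.ToString() + " " + id.ToString().Substring(0, 6);
                creationParams.Url = "Subsite" + "_" + i.ToString() + "_" + id.ToString().Substring(0, 6);
                web.Webs.Add(creationParams);

                // Execute each creation as its own request so no single
                // call exceeds the resource limit.
                context.ExecuteQuery();
            }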

However, the second problem is that it takes almost 2 minutes to add 10 subsites. At this speed it will take hours to create 2000. At first I try calling ExecuteQuery only at every 2nd or 3rd request, but that does not help much.

I decide it is time to find out what SharePoint Online can handle, and plan to fire up many more instances of the same program to speed things up. But first I need to handle something.

Throttling
If you are not familiar with throttling in SharePoint Online, that is not so strange; it is rarely an issue in "normal" usage. Microsoft will not tell us the exact limits, but if a process puts too much load on the service, requests start failing with an HTTP 429 response (surfaced as a WebException in CSOM). That means the process has to wait a certain period of time (again, they won't say how long) before trying again. Each subsequent request within the throttling window returns another 429, and also extends the window. And if you just keep on hammering, you will eventually be blocked outright, and will have to call Microsoft support to lift the block, and probably explain yourself.

It is recommended to implement a "back off algorithm" to handle throttling responses (the 429 code). This is basically an incremental wait-and-retry approach: after each 429, wait a while before retrying, and increase the wait time on every consecutive failure.

Since I was planning to potentially set SPO on fire, I found it wise to add some throttling handling to prevent myself from being blocked. Here is the code:


    using System;
    using System.Net;
    using System.Threading;
    using Microsoft.SharePoint.Client;

    public static class ClientContextExtension
    {
        //Throttling handling: retry with an exponentially increasing delay
        public static void ExecuteQueryWithIncrementalRetry(this ClientContext context, int retryCount, int delay)
        {
            int retryAttempts = 0;
            int backoffInterval = delay;
            if (retryCount <= 0)
                throw new ArgumentException("Provide a retry count greater than zero.");
            if (delay <= 0)
                throw new ArgumentException("Provide a delay greater than zero.");

            while (retryAttempts < retryCount)
            {
                try
                {
                    context.ExecuteQuery();
                    return;
                }
                catch (WebException wex)
                {
                    var response = wex.Response as HttpWebResponse;
                    if (response != null && response.StatusCode == (HttpStatusCode)429)
                    {
                        Console.WriteLine(string.Format("CSOM request exceeded usage limits. Sleeping for {0} milliseconds before retrying.", backoffInterval));
                        //Wait before retrying.
                        Thread.Sleep(backoffInterval);
                        //Add to the retry count and double the delay.
                        retryAttempts++;
                        backoffInterval = backoffInterval * 2;
                    }
                    else
                    {
                        throw;
                    }
                }
            }
            throw new Exception(string.Format("Maximum retry attempts ({0}) exceeded.", retryCount));
        }
    }

As you can see, it is basically an extension method replacing ExecuteQuery that catches the 429 response and sleeps for a while when it occurs. If it happens again, it doubles the sleep length.

I replace my previous call to ExecuteQuery with a call to ExecuteQueryWithIncrementalRetry. I set the retry count to 5 and the initial interval to 30 seconds (30,000 ms).

context.ExecuteQueryWithIncrementalRetry(5, 30000);

Then I fire up 5 simultaneous windows.
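
(If you would rather script this than open the windows by hand, a hypothetical launcher could look like the following; the exe name and arguments are made up for illustration.)

            // Launch 5 copies of the console app in parallel instead of
            // starting the windows manually. Path and arguments are examples.
            for (int i = 0; i < 5; i++)
            {
                System.Diagnostics.Process.Start(
                    "SubsiteCreator.exe",
                    "--count 400 --url https://tenant.sharepoint.com/sites/test");
            }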

The result is promising. Now the subsites are created five times faster than before. But it doesn't take long until I get throttled.

Fortunately the back off algorithm works, so I do not risk getting blocked. But maybe 5 simultaneous processes is too much, so I reduce it to 3. After about 30 minutes, I've reached 1000 subsites. Halfway through, yay! :)

But it is a bit slower now; maybe I should risk starting up one more window.

Seems to be working fine, closing in on 1200 now, "just" 800 to go.

About 40 minutes later (I was interrupted and had to resume, so the time in the output does not match the actual running time), I am approaching 2000 subsites. Now one of three things can happen:
1. I am allowed to create the 2001st subsite. If so, it indicates that the limit has actually been lifted to 100 000 (I am not going to test that). This is the best outcome.
2. It crashes somewhere around the 1950th site or so. This would indicate that the 2000 limit applies to the total number of subsites in the site collection, and together with the other subsites in different parts of my hierarchy (I assume about 50 or so) we have now reached it. This is the worst outcome.
3. It crashes when trying to add the 2001st subsite. This means the 2000 limit applies per hierarchy level. This is a fairly OK outcome, because it means we can work around the problem by adding more hierarchy.

Anxious for the result? So am I! Here it is: the total count is well beyond 2000 and still going strong.
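
If you want to verify the count yourself, here is a minimal CSOM sketch that recursively tallies every subsite under a web (using the same context helper as before; call it with context.Web to count the whole site collection):

        private static int CountSubsites(ClientContext context, Web web)
        {
            // Load the immediate children of this web.
            context.Load(web.Webs);
            context.ExecuteQuery();

            int total = web.Webs.Count;
            foreach (Web subweb in web.Webs)
                total += CountSubsites(context, subweb);
            return total;
        }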

So I guess we can rest assured that the limit has been lifted to something larger than 2000 - and if we are to believe the article above, the new number is 100 000. Which should be enough for quite a while :-)



Posted on Friday, December 18, 2015 1:14 PM. Tags: sharepoint, sharepoint online, CSOM, throttling


Comments on this post: Finding the maximum number of sub sites in SharePoint Online

# re: Finding the maximum number of sub sites in SharePoint Online
Thanks for the verification. We're happy with the results, as our main site collection will soon reach the 1000+ subsite range, and we're expecting to hit the 2000 'limit' by next year.
Left by LG on Apr 21, 2016 8:31 PM

# re: Finding the maximum number of sub sites in SharePoint Online
Thanks for your research - did you notice any performance degradation as you created more and more subsites? The MSDN docs say:

Performance can degrade as the number of subsites surpasses 2,000 at the site collection level.


(this page: https://technet.microsoft.com/en-GB/library/cc262787.aspx?f=255&MSPPError=-2147217396#SiteCollection)

So I would not have expected it to break at 2000 subsites, just certain operations to slow down. Did you check any performance metrics?

Thanks again!
Left by Jonathan Cardy on Nov 04, 2016 11:57 AM

# re: Finding the maximum number of sub sites in SharePoint Online
Hi Jonathan,
sorry for the late reply, as I am not really maintaining this blog much anymore.

To answer your question, I did not detect any performance degradation, but I did not really investigate it either.

I would, however, not expect any big performance issues at the 2001st subsite, but rather at, say, the 3000th or 4000th - but I never added that many subsites.

Did you try it out or experience any performance issues?
Left by Thorvald on Jan 14, 2017 6:26 PM


Copyright © Thorvald Bøe