Capt_Ron asked:

Expandable list of URLs

I'm trying to build a hierarchical list of web sites in our environment.  We have about 1500 sites, and I am able to traverse the tree and list them, but because of the number of sites the page takes 12+ seconds to load.

I would like to rebuild this to only bring back one level at a time.  Each time I click the [+] sign I'd like to generate a tree branch 1 level deep just for that URL.

Example:

[+] Site 1
[+] Site 2
[-] Site 3
   [+] Sub site 1
   [-] Sub site 2
      [+] Sub Sub site 1
[+] Site N
etc...

I have all the code I need to generate each piece.  I can get a single level list of sites based on a URL.

My problem is displaying this and getting it to work right.  When the [+] is clicked, GetSubsites(URL) will fire.  But if "Site 1" itself is clicked, I want the user to be sent to Site 1.
I need help with generating and handling the [+] clicks, and with displaying the results as a sub-list of the node I clicked.

Thanks
Ron
ASKER CERTIFIED SOLUTION from Miguel Oz

Capt_Ron (ASKER):

I've looked at the example and I'm still confused about one part.
I can load the root level fine, but how do I show the [+] sign if there are no nodes yet?  I can click the root node and go to the URL, but I have no way of expanding it because the subnodes aren't there yet.

I must be missing something.
Thanks
Ron

KoenVosters:

With this kind of thing you have to take into account that you are opening the SPSite and SPWeb objects to get all the information.

What I would suggest is that you run a daily (or hourly) timer job that creates the hierarchy in XML and work from there (a rough sketch of such an export follows the snippet below).

You can use the following to get the subsite information for your sites:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Navigation;

namespace ConsoleApp
{
    class Program
    {
        static void Main(string[] args)
        {
            // Open the site collection at http://localhost and the subweb named "test".
            using (SPSite site = new SPSite("http://localhost"))
            {
                using (SPWeb web = site.OpenWeb("test"))
                {
                    string format = "{0, -30} {1, -5} {2}";
                    Console.WriteLine(format, "Title", "Id", "Url");

                    // GlobalNodes holds the web's top (global) navigation nodes.
                    foreach (SPNavigationNode node in web.Navigation.GlobalNodes)
                    {
                        Console.WriteLine(format, node.Title, node.Id, node.Url);
                    }
                }
            }
            Console.Write("\nPress ENTER to continue....");
            Console.ReadLine();
        }
    }
}
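
A rough sketch of that timer-job export idea (written as a console app for brevity; the root URL and output path are placeholders, and a real deployment would wrap this in an SPJobDefinition):

using System;
using System.Xml.Linq;
using Microsoft.SharePoint;

namespace SiteHierarchyExport
{
    class Program
    {
        static void Main(string[] args)
        {
            // Root site collection URL and output path are placeholders.
            using (SPSite site = new SPSite("http://localhost"))
            using (SPWeb rootWeb = site.OpenWeb())
            {
                BuildNode(rootWeb).Save(@"C:\Temp\SiteHierarchy.xml");
            }
        }

        // Recursively describes a web and all of its subwebs as XML.
        static XElement BuildNode(SPWeb web)
        {
            XElement element = new XElement("Site",
                new XAttribute("Title", web.Title),
                new XAttribute("Url", web.Url));

            foreach (SPWeb subWeb in web.Webs)
            {
                try
                {
                    element.Add(BuildNode(subWeb));
                }
                finally
                {
                    subWeb.Dispose();   // webs enumerated from .Webs must be disposed
                }
            }
            return element;
        }
    }
}
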
Thank you.
We thought of that but we need the list to be security trimmed.
SharePoint 2010 removed the My Links list and we are trying to replace it (without its limitations).

Our thought process is:
Get a list of all site collections and check the logged in user's security.
When the [+] is clicked next to a site collection, go get the next level (check security)
Repeat for every level of sub sites

This way we are only getting one level at a time.

We've successfully done this for all sites at all levels, but the load time is 12+ seconds, which is unacceptable.

We're trying to reduce the load time by only getting one level at a time (a rough sketch of that one-level fetch follows this comment).

Thanks
Ron

PS: Once we figure this out, I'll share the code with whoever would like it. :-)
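
A minimal sketch of the one-level, security-trimmed GetSubsites(URL) fetch described in that approach; the class name and return type are placeholders, and it assumes the code runs in the context of the logged-in user:

using System;
using System.Collections.Generic;
using Microsoft.SharePoint;

// Hypothetical helper: fetch exactly one level of subsites for a URL,
// trimmed to what the current (logged-in) user is allowed to see.
public static class SiteHelper
{
    public static List<KeyValuePair<string, string>> GetSubsites(string url)
    {
        var result = new List<KeyValuePair<string, string>>();

        using (SPSite site = new SPSite(url))
        using (SPWeb web = site.OpenWeb())
        {
            // GetSubwebsForCurrentUser does the security trimming for us.
            foreach (SPWeb subWeb in web.GetSubwebsForCurrentUser())
            {
                try
                {
                    result.Add(new KeyValuePair<string, string>(subWeb.Title, subWeb.Url));
                }
                finally
                {
                    subWeb.Dispose();
                }
            }
        }
        return result;
    }
}
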
You can do that by populating the nodes in the TreeView dynamically.
Can you post your code/markup? It seems that the populate-on-demand nodes may not be set up correctly.

Thanks,
Note: another alternative I have used before, if the TreeView gets very slow because of the number of nodes, is a third-party control called obout tree.
Had to implement the TreeNodePopulate event handler properly.
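
A minimal sketch of what that populate-on-demand wiring could look like (not the certified solution itself); the page class, control ID, and root URL are placeholders, and it reuses the hypothetical SiteHelper.GetSubsites helper sketched a few comments up:

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

// Assumed markup (IDs and handler name are placeholders):
//   <asp:TreeView ID="SitesTreeView" runat="server" ExpandDepth="0"
//                 OnTreeNodePopulate="SitesTreeView_TreeNodePopulate" />
public partial class SiteListPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Root node: clicking the title navigates to the site,
            // clicking [+] fires TreeNodePopulate for that branch only.
            TreeNode root = new TreeNode("Site 1", null, null, "http://localhost", "_self");
            root.PopulateOnDemand = true;   // shows [+] before any children exist
            SitesTreeView.Nodes.Add(root);
        }
    }

    // Fires the first time a PopulateOnDemand node is expanded.
    protected void SitesTreeView_TreeNodePopulate(object sender, TreeNodeEventArgs e)
    {
        // SiteHelper.GetSubsites is the hypothetical one-level helper sketched earlier.
        foreach (var subsite in SiteHelper.GetSubsites(e.Node.NavigateUrl))
        {
            TreeNode child = new TreeNode(subsite.Key, null, null, subsite.Value, "_self");
            child.PopulateOnDemand = true;  // keep lazy loading further down the tree
            e.Node.ChildNodes.Add(child);
        }
    }
}
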
Thank you so much.  My developer was able to implement the TreeNodePopulate event handler properly and the dynamic population started working the way I expected it to.  We are now loading over 1000 SharePoint sites into the list in about 1.5 to 2 seconds.  Much better than the 12 seconds before.

Also we are looking into KoenVosters' suggestion to run a timer job to export an XML file of all sites.  We believe that it may be faster (I emphasize may) to read the XML instead of hitting the SP server multiple times.  
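
If the XML route pans out, reading one level back out of the exported file could look roughly like this; the file path and element/attribute names follow the export sketch above and are assumptions, and the results would still need per-user security trimming:

using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

// Hypothetical reader for the exported hierarchy: returns the immediate
// children of the <Site> element whose Url attribute matches parentUrl.
public static class XmlSiteTree
{
    public static List<XElement> GetChildren(string parentUrl)
    {
        XDocument doc = XDocument.Load(@"C:\Temp\SiteHierarchy.xml");
        return doc.Descendants("Site")
                  .Where(s => (string)s.Attribute("Url") == parentUrl)
                  .Elements("Site")
                  .ToList();
    }
}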

In addition, we have created a new project on CodePlex called SharePoint 2010 My Links Web Part and will be releasing the source code once I've had it cleaned up and commented properly.

Thanks again for all your help.

Now, if I could only convince Microsoft to add a Sort function to the SPSite collections... (we had to manually sort the sites in alpha order :-( )
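
On the sorting point: SPWebCollection has no Sort method, so one workaround is to order each level of webs in memory before adding them to the tree, roughly like this (the helper name is a placeholder):

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.SharePoint;

// Hypothetical workaround for the missing Sort: order one level of webs
// by title in memory before they go into the tree.
public static class WebSorter
{
    public static List<SPWeb> GetSubwebsSortedByTitle(SPWeb parent)
    {
        // Caller is responsible for disposing each SPWeb in the returned list.
        return parent.GetSubwebsForCurrentUser()
                     .Cast<SPWeb>()
                     .OrderBy(w => w.Title, StringComparer.OrdinalIgnoreCase)
                     .ToList();
    }
}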