This article applies to: ML v7, ML8, ML, MultiStore

Performance Tuning for Large Sites

The AspDotNetStorefront platform is very powerful and has no hard-coded internal limits on the size of site you can use it for. However, as your site grows (in number of products, categories, manufacturers, etc.), it may be necessary to make some tuning adjustments to achieve maximum performance.

The techniques below can be used to help optimize performance in the software. Large sites must also be properly hosted, with sufficient web server and database server hardware; many other aspects of the hosting environment also affect overall site performance.

Large sites are defined as sites that have more than 5,000 entities combined (e.g. Categories, Departments, and Manufacturers) or more than 100,000 products. Certain combinations of these two numbers can also mean a site needs these optimizations (e.g. 30,000 products and 3,000 entities). The entity count usually refers to a single entity type having this many entries.

Site performance is impacted more by a large number of entities than by a large number of products. This is because the EntityHelper is a cached XML document: the document can consume a large amount of memory, and creating it can take a long time when the application starts up.

Trimming the EntityHelper data

This data comes from the aspdnsf_EntityMgr stored procedure. Many of the fields are not used in the default site. The following list of fields can usually be removed without impact to the site:

- SortByLooks
- ContentsBGColor
- PageBGColor
- GraphicsColor
- QuantityDiscountID
- TemplateName

Others may be eliminated, but test your site after removing them to confirm there is no impact. Trimming these fields reduces the memory footprint of your site and the time needed to create the entity XmlDocument on startup.
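As a hedged sketch of the idea (the actual SELECT list and query structure in your aspdnsf_EntityMgr vary by version, so compare against your installed procedure before editing), the trim amounts to deleting the unused columns from each entity query in the procedure:

```sql
-- Hypothetical excerpt from aspdnsf_EntityMgr: drop the unused presentation
-- columns from the entity SELECT list so the cached XML document shrinks.
select c.CategoryID,
       c.Name
       -- removed: c.SortByLooks, c.ContentsBGColor, c.PageBGColor,
       -- removed: c.GraphicsColor, c.QuantityDiscountID, c.TemplateName
from dbo.Category c with (nolock)
where c.Deleted = 0
-- ... the procedure's remaining entity queries (Manufacturer, Section, etc.)
-- are trimmed the same way ...
```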

Prevent building some navigation controls from entity data

This step requires source code and recompiling the AspDotNetStoreFrontCommon project.

The Parser class has a method in it named BuildPageStaticTokens. In it is the following code block:

foreach (String EntityName in AppLogic.ro_SupportedEntities)
{
    String ENU = EntityName.ToUpperInvariant();
    StringBuilder tmpSx = new StringBuilder(4096);
    EntityHelper Helper = AppLogic.LookupHelper(EntityName);
    m_StaticTokens.Add("(!" + ENU + "_BROWSE_BOX!)", Helper.GetEntityBrowseBox(SkinID,

If you are not using any of the skin tokens for navigation in your templates, you can comment out this code, or add a test for specific entities to exclude them from this expensive process. Likewise, if you have a large number of manufacturers and don’t need the browse box navigation for them, you can comment out the following line in the same method:

m_StaticTokens.Add("(!MANUFACTURER_BOX!)", AppLogic.GetManufacturersBox(SkinID, ThisCustomer.LocaleSetting));
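A minimal sketch of the exclusion test inside BuildPageStaticTokens (the entity names to skip are assumptions — use whichever entities your templates don't navigate by):

```csharp
foreach (String EntityName in AppLogic.ro_SupportedEntities)
{
    // Skip the expensive browse-box build for entities whose
    // (!..._BROWSE_BOX!) tokens are not used in this site's templates:
    if (EntityName == "Manufacturer" || EntityName == "Distributor")
    {
        continue;
    }

    String ENU = EntityName.ToUpperInvariant();
    EntityHelper Helper = AppLogic.LookupHelper(EntityName);
    // ... original token-building code continues unchanged ...
}
```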

Modifying XmlPackages

The entity XmlPackages by default include the EntityHelper Xml data in the package’s data document. This data is added for convenience so that you don’t have to add another query to your package for data that is already cached on the web server. This is done using the includeentityhelper attribute on the package element:

<package version="2.1" displayname="Entity Grid" debug="false" includeentityhelper="true">

Use of this becomes very costly when there are a large number of entities. Set it to false and add a query to the package that retrieves only the data you need.
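A sketch of what that change can look like. The query shown is an assumption — a simple child-category lookup standing in for whatever data your package actually renders:

```xml
<package version="2.1" displayname="Entity Grid" debug="false" includeentityhelper="false">
  <!-- Replacement for the EntityHelper data: fetch only the rows this page needs. -->
  <query name="Entities" rowElementName="Entity">
    <sql>
      <![CDATA[
        select CategoryID, Name
        from Category with (nolock)
        where ParentCategoryID = @CatID and Deleted = 0 and Published = 1
        order by DisplayOrder, Name
      ]]>
    </sql>
    <queryparam paramname="@CatID" paramtype="runtime" requestparamname="CategoryID" sqlDataType="int" defvalue="0" validationpattern="" />
  </query>
  <!-- ... XSL transform as before ... -->
</package>
```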

Use XML data returned from SQL Server queries
When your queries return a lot of data, you can have the database return it already formatted as XML, which saves the web server from converting the data itself. To do this, add the FOR XML clause to your query. In the package, add the retType attribute to the query element and set its value to "xml":

<query name="Products" rowElementName="Product" retType="xml">

Don’t use GetMLValue extension function on single language sites
All XmlPackages use the GetMLValue extension function to retrieve the proper locale data from ML fields (such as product name, product description, and category name). Because of a bug in the .NET Framework, this function can leave undisposed objects, causing high CPU usage due to excessive garbage collection. On single-language sites, replace aspdnsf:GetMLValue(Fieldname) with just the Fieldname.
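In a package's XSL transform, the replacement looks like this (the field name Name is illustrative):

```xml
<!-- before: per-locale lookup through the extension function -->
<xsl:value-of select="aspdnsf:GetMLValue(Name)" />

<!-- after, on a single-language site: read the field directly -->
<xsl:value-of select="Name" />
```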

Optimize SQL queries

These optimizations are best done by a skilled DBA.

The aspdnsf_GetProducts stored procedure is used in all entity packages and implements every filter option that the store supports, which can make it very expensive.

The first step is to remove the filter tables that are not needed. Below is the FROM clause from one of the queries in the procedure. If you are not using some of these tables, you can remove them from the query along with the associated WHERE criteria.

    product p with (nolock)
    join #displayorder do on p.ProductID = do.ProductID
    left join ProductVariant pv with (NOLOCK) ON p.ProductID = pv.ProductID and pv.IsDefault >= @ViewType
    left join productcategory pc with (nolock) on p.ProductID = pc.ProductID
    left join productsection ps with (nolock) on p.ProductID = ps.ProductID
    left join ProductManufacturer pm with (nolock) on p.ProductID = pm.ProductID
    left join ProductDistributor pd with (nolock) on p.ProductID = pd.ProductID
    left join ProductGenre px with (nolock) on p.ProductID = px.ProductID
    left join ProductVector px2 with (nolock) on p.ProductID = px2.ProductID
    left join ProductLocaleSetting pl with (nolock) on p.ProductID = pl.ProductID
    left join ProductCustomerLevel pcl with (nolock) on p.ProductID = pcl.ProductID
    left join ProductAffiliate pa with (nolock) on p.ProductID = pa.ProductID
    join #inventoryfilter i on pv.VariantID = i.VariantID
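For example, a site that assigns products only to categories and manufacturers (an assumption — keep whatever joins your site actually filters on) could trim the FROM clause above to:

```sql
product p with (nolock)
join #displayorder do on p.ProductID = do.ProductID
left join ProductVariant pv with (nolock) on p.ProductID = pv.ProductID and pv.IsDefault >= @ViewType
left join productcategory pc with (nolock) on p.ProductID = pc.ProductID
left join ProductManufacturer pm with (nolock) on p.ProductID = pm.ProductID
join #inventoryfilter i on pv.VariantID = i.VariantID
```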

You can also remove other queries related to tables that you are not using, and there are other WHERE criteria that could be removed as well. For example, if you’re not using start and stop dates you can remove this criterion:

((@IncludeAll = 1) or (getdate() between isnull(p.AvailableStartDate, '1/1/1900') and isnull(p.AvailableStopDate, '1/1/2999')))

Once you make these changes you can evaluate the procedure for additional indexing.


Use SQL Profiler to find queries that are long-running or require a large number of reads. Creating indexes can be complicated, and if you create too many it can also negatively affect performance, so be careful to design indexes that do only what you need.
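As a hypothetical example of acting on a Profiler finding (the table and column choices here are assumptions — design against your own workload):

```sql
-- If Profiler shows expensive scans on ProductManufacturer during entity
-- page loads, a narrow covering index on the join column may help:
create nonclustered index IX_ProductManufacturer_ProductID
    on dbo.ProductManufacturer (ProductID)
    include (ManufacturerID);
```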

You may also want to create some indexed views. These are useful when you need to include aggregate data in your queries (like inventory data when you’re tracking inventory by size and color). So, instead of doing something like this in your query every time it’s run:

select VariantID, sum(quan) as TotalQuan from Inventory group by VariantID

Create this as a view, then add an index to it. This forces SQL Server to maintain an index containing the aggregate value sum(quan) and update it only when the Inventory table changes, which happens far less often than entity pages are viewed (and hence far less often than the query above would otherwise run). The downside is that an indexed view prevents physical changes to the underlying tables: if an upgrade tries to add or remove a field on one of those tables, it will throw an error. You will need to drop all indexed views, perform the upgrade, then re-create your views and indexes.
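A sketch of the indexed-view version (the names are assumptions; note that SQL Server requires SCHEMABINDING, two-part table names, and a COUNT_BIG(*) column in an indexed view that uses GROUP BY, and SUM over a nullable column is not allowed):

```sql
create view dbo.InventoryTotals
with schemabinding
as
select VariantID,
       sum(quan)    as TotalQuan,   -- the cached aggregate
       count_big(*) as RowCnt       -- required by indexed views with GROUP BY
from dbo.Inventory
group by VariantID;
go

-- The unique clustered index is what materializes the view:
create unique clustered index IX_InventoryTotals on dbo.InventoryTotals (VariantID);
```

Queries can then select TotalQuan from dbo.InventoryTotals instead of re-aggregating the Inventory table on every entity page view.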