
Enterprise Level Page Speed Optimisation – Part 1

Enterprise Level Page Speed for SEO

You might be surprised to learn that most of the page speed recommendations you have implemented, received from your SEO agency or found in online guides are almost pointless.

General recommendations such as:

  • Browser caching
  • Optimising images
  • Gzip compression
  • Minify CSS
  • Minify HTML
  • Minify JavaScript
  • etc

will yield almost no noticeable difference in page load times. The one page speed recommendation that actually makes a significant difference is reducing external requests.

External Requests

Most webmasters underestimate the impact external requests have on a website. Large enterprise websites suffer the most, which is why large sites such as Macy’s, John Lewis and Best Buy tend to score fairly poorly when tested with popular tools such as WebPageTest or GTMetrix.

To give a quick demonstration of how external on-page requests can affect a website’s page load speed, we prepared a side-by-side comparison. On the left is the John Lewis homepage optimised by us, with all unnecessary external requests removed; on the right is the original, untouched version of John Lewis.

 

We were able to cut the page load time by 9.5 seconds by removing 170 external requests that are not required for the page to function correctly.

Note: Nothing else on the website has changed, everything has remained identical.
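If you want to quantify this on your own site, here is a minimal Python sketch (our own illustrative example, not part of the original test) that counts requests per external domain from a HAR file exported from WebPageTest, GTMetrix or Chrome DevTools:

```python
import json
from collections import Counter
from urllib.parse import urlparse

def external_request_counts(har_path, first_party_host):
    """Count requests per hostname in a HAR file, excluding the
    first-party host, so the heaviest third parties surface first."""
    with open(har_path) as f:
        har = json.load(f)
    counts = Counter()
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname
        if host and host != first_party_host:
            counts[host] += 1
    return counts

# Example (assuming you exported johnlewis.har from your testing tool):
# external_request_counts("johnlewis.har", "www.johnlewis.com")
# returns a Counter mapping each third-party hostname to its request count.
```

Sorting the resulting counter by value gives you a shortlist of blacklist candidates ordered by how many requests each domain is responsible for.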

How do you remove external requests?

Most decent page speed tools have an advanced feature called a ‘blacklist’. This allows you to enter a list of domains that you would like to blacklist, in other words, prevent from loading.

Within GTMetrix, when you run a page speed report, you will find ‘Reduce DNS lookups’ under the YSlow tab, as seen below.

GTMetrix Reduce DNS Lookups

Once you open this section, you will find a long list of domains, similar to the one below.

DNS Domains

The next step requires a bit of technical knowledge. From this list, we need to pick out the domains that are not required to render the page. The first domain, for obvious reasons, should not be included in the blacklist. For John Lewis specifically, we also did not blacklist certain external domains such as johnlewis.scene7.com, which is the domain John Lewis uses to load static assets such as images.

Once you have gone through the list, prefix each domain with a protocol and append a slash and a star (/*). The final list should look similar to the one below.

https://assets.adobedtm.com/*
https://nexus.ensighten.com/*
https://media.richrelevance.com/*
https://se.monetate.net/*
https://sec.levexis.com/*
https://ssite.johnlewis.com/*
https://f.monetate.net/*
https://d2oh4tlt9mrke9.cloudfront.net/*
https://tapestry.tapad.com/*
https://ws.sessioncam.com/*
https://ad.adlegend.com/*
https://pfa.levexis.com/*
https://recs.richrelevance.com/*
https://public.edigitalresearch.com/*
https://d3c3cq33003psk.cloudfront.net/*
https://dd6zx4ibq538k.cloudfront.net/*
https://orca.qubitproducts.com/*
https://dtxtngytz5im1.cloudfront.net/*
https://johnlewis.ugc.bazaarvoice.com/*
https://static.criteo.net/*
https://cp.impdesk.com/*
https://connect.facebook.net/*
https://secure-ib.adnxs.com/*
https://pix.impdesk.com/*
https://sslwidget.criteo.com/*
https://opentag-stats.qubit.com/*
https://www.facebook.com/*
https://ib.adnxs.com/*
https://x.bidswitch.net/*
https://cm.g.doubleclick.net/*
https://simage2.pubmatic.com/*
https://pixel.rubiconproject.com/*
https://ads.yahoo.com/*
https://tags.bluekai.com/*
https://mmtro.com/*
https://stags.bluekai.com/*
https://ih.adscale.de/*
https://secure.adnxs.com/*
https://aimfar.solution.weborama.fr/*
https://i.w55c.net/*
https://nzaza.com/*
https://pixel.mathtag.com/*
https://gum.criteo.com/*
https://tracker.adotmob.com/*
https://loadeu.exelator.com/*
https://idsync.rlcdn.com/*
https://d22rutvoghj3db.cloudfront.net/*
https://load.s3.amazonaws.com/*
https://stash.qubitproducts.com/*
https://d1m54pdnjzjnhe.cloudfront.net/*
https://gong-eb.qubit.com/*
https://www.googleadservices.com/*
https://dis.eu.criteo.com/*
https://edigitalsurvey.com/*
https://bam.nr-data.net/*
https://tags.bkrtx.com/*
https://us-u.openx.net/*
https://dis.criteo.com/*

 

Using GTMetrix, you can paste the list above directly into the URL Blacklist field.
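The formatting step (protocol prefix plus trailing /*) can be sketched in a few lines of Python; `to_blacklist_entries` is a hypothetical helper of our own, not a GTMetrix feature:

```python
def to_blacklist_entries(domains, scheme="https"):
    """Prefix each bare domain with a protocol and append /* so the
    entry matches every URL served from that domain."""
    return [f"{scheme}://{d.strip()}/*" for d in domains if d.strip()]

# to_blacklist_entries(["tags.bluekai.com", "ib.adnxs.com"])
# -> ["https://tags.bluekai.com/*", "https://ib.adnxs.com/*"]
```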

It’s important to note that some domains are indeed compulsory, for example Google tracking and other important tracking requests. It is the client’s job to review all external requests and remove any that are not compulsory. Furthermore, where possible, host external resources internally.
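As a rough sketch of that review process, you could filter the candidate list against a keep-list of domains you know the page needs. The keep-list below is our own illustrative example; yours will depend on the site:

```python
# Domains the page genuinely needs (illustrative example only).
KEEP = {
    "www.johnlewis.com",         # first-party pages
    "johnlewis.scene7.com",      # static assets such as images
    "www.googleadservices.com",  # tracking the client wants to keep
}

def build_blacklist(candidate_domains, keep=KEEP):
    """Return only the candidate domains that are safe to block,
    i.e. those not on the keep-list."""
    return [d for d in candidate_domains if d not in keep]
```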

Conclusion

Never underestimate the impact external requests can have on a website. Watch this space for part 2 of our Enterprise Level Page Speed Optimisation series, in which we will cover server response time in comprehensive detail and demonstrate the impact it can have on SEO and the user experience.

 
