Steve Souders gave this talk at the 2010 jQuery conference in Mountain View, California. My notes are below, and you can find the slides on Steve's website.
My last five years have been about making slow things faster. While I was at Yahoo I created YSlow. Now I'm at Google, working on making the web even faster. I taught a course at Stanford based on the books I wrote.
The biggest thing I've brought to the world of web performance is flipping things upside down. Even at some of the biggest companies out there, there's a bias to seeing performance only on the back-end. But when you look at the total user experience, a lot of it is front end. For example, for a page request on iGoogle, only 9% of the page load time is the actual HTML of the page, and 91% is other resources, usually static resources, that are being served out of memory on the server or from a CDN.
If you're concerned about scalability, focus on the back end. But if you want to improve the user experience, you have to look at the front end too.
Site Speed and Search Engines
There was an announcement from Google that site speed is going to be taken into account when ranking pages. The reaction was mixed, slightly negative, and raised a lot of questions. We made a similar announcement two years ago with regard to AdWords quality score.
The timing data from Webmaster Tools is one metric we use in determining search result ranking; it's gathered from users who have installed Google Toolbar. Webmaster Tools gives you fairly good visibility into the page load times real users are experiencing.
YSlow and Page Speed
Run both of these when evaluating your site. About half of the rules and best practices are covered by both tools, but there are some specific rules that only apply to one tool or the other. For example, YSlow tests CDN usage and CSS sprites. There are other sites like ShowSlow which let you send a beacon of results up to the server, and it keeps a historical archive of YSlow and Page Speed results over time.
WebPageTest.org

This is a free, open source tool where you can select locations from around the world and measure your page load times under different conditions. It gives you stats, performance suggestions, and a waterfall chart. It can even capture a filmstrip of how your page appears as it loads, at intervals as short as 1/10th of a second, and you can export this filmstrip as a video. One great way to make performance a compelling metric is to show a video of your site loading side-by-side with your competitor's site.
How do we speed up 3rd party content?
Ads, Widgets and Analytics are commonly blamed for slow site performance. Five years ago, perhaps 25% of the performance slowdown on a site would come from ads. However, today there are more ads, different ad networks, and lots of mashups and analytics out there. Today, half or more of the slow site performance can come from external sources, such as Google Maps, AddThis or the Meebo Toolbar. How can you optimize that code when you're relying on a third party to serve that code?
appendChild or insertBefore?
One way of loading a script asynchronously is to get the page's head element and append the script to it. However, not all pages have head elements. When Google was trying to load ga.js asynchronously, they started using document.getElementsByTagName('script') to find an existing script element and insert the new one before it. Facebook tries to get the head, and if that fails, gets the body; that works for them because they're not putting it into a third-party system. jQuery has a great approach: it gets the head if one exists, and falls back to the documentElement otherwise.
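A minimal sketch of the two insertion strategies described above (the function names here are illustrative, not from any particular library):

```javascript
// jQuery-style fallback: use <head> if the page has one, otherwise
// fall back to documentElement (<html>), which always exists.
function getInsertionParent(doc) {
  return doc.getElementsByTagName('head')[0] || doc.documentElement;
}

// ga.js-style approach: any page currently running script contains at
// least one <script> element (the one executing), so insert before it.
function loadScriptAsync(doc, src) {
  var script = doc.createElement('script');
  script.async = true;
  script.src = src;
  var first = doc.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(script, first);
  return script;
}
```

The insertBefore variant is the safest for third-party snippets, since it makes no assumptions about head or body being present.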
Steve gave an in-depth discussion of the Collective Media blocking JS ad loader. He's already documented this in a blog post on his site.
How do we fix it?
Eventually browsers will adopt it, and then publishers will. More and more website owners will wrap their third-party content in FRAG tags with sandboxing enabled, and the third parties will be forced to improve security.
This is just a twinkle in my eye right now.
Browser Caches are Tiny!
- IE: 8-50M
- Firefox: 50M
- Chrome: Opera: 20M
Eviction algorithms can be improved. Why not push out images first, and not scripts? Keep the scripts and stylesheets in the cache! I would like to see cache items that had a large impact on the page load speed cached for longer, as well as preferred sites that should have the highest priority in the cache.
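The type-aware eviction Steve is asking for could look something like this sketch (entirely hypothetical; real browser cache eviction is far more involved):

```javascript
// Hypothetical eviction sketch: evict least-recently-used entries first,
// but prefer evicting images before scripts and stylesheets.
// Each entry: { url, type ('image'|'script'|'stylesheet'), size, lastUsed }.
function chooseEvictions(entries, bytesNeeded) {
  // Lower priority = evicted sooner; scripts/stylesheets stay longest.
  var priority = { image: 0, script: 2, stylesheet: 2 };
  var sorted = entries.slice().sort(function (a, b) {
    var p = (priority[a.type] || 1) - (priority[b.type] || 1);
    return p !== 0 ? p : a.lastUsed - b.lastUsed; // LRU within a class
  });
  var evicted = [], freed = 0;
  for (var i = 0; i < sorted.length && freed < bytesNeeded; i++) {
    evicted.push(sorted[i].url);
    freed += sorted[i].size;
  }
  return evicted;
}
```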
What makes sites seem slow?
The number one pattern I recommend for today's webapps is to focus on progressive enhancement. Do as much as you can on the server, defer as much JS as possible, avoid touching the DOM, and add any rendering and event handling later.
If you focus on progressive enhancement, you're going to get progressive rendering, and it will make your site appear to load faster and be more responsive to the user.
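The "defer JS and enhance later" part of that pattern can be sketched with a small helper (my own illustrative name, not from the talk):

```javascript
// Hypothetical helper: run enhancement code (event wiring, DOM tweaks)
// only after the window 'load' event, so it doesn't block initial render.
function enhanceAfterLoad(win, enhance) {
  if (win.document.readyState === 'complete') {
    enhance(); // page already rendered; enhance immediately
  } else {
    win.addEventListener('load', enhance, false);
  }
}
```

The server-rendered HTML is usable the whole time; the JavaScript only layers interactivity on top once the page has painted.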
Split the Initial Payload
Thanks to Steve for the great talk.