Fri Jun 10 09:41:29 PDT 2011

adventures in HTML optimization

[1]

Last week, someone linked me to Google Page Speed. This sucked, since it directly resulted in me spending rather a lot more time than strictly necessary dicking around with Apache configuration files.

My server doesn't get a whole lot of traffic,[2] so I hadn't bothered setting Expires: headers, under the "who cares" school of thought. When Apache CPU utilization stays under 0.01% even while you're getting 5 hits a second from HN, there's not a lot of incentive to aggressively cache files. But when I ran bbot.org through Page Speed, I received the humiliating news that it only scored 68/100. 68! That's a low number!

Resolving most of the issues was easy (turning on Expires headers, gzip compression, changing the black and white logo image from full 24-bit color to grayscale, etc.), but going from 98/100 to 100/100 was kinda painful.
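For reference, both of the server-side fixes boil down to a couple of lines of Apache config. This is just a sketch, assuming mod_expires and mod_deflate are available; the exact types and lifetimes are placeholders, not my actual config:

    # Far-future Expires headers for static files (needs mod_expires)
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType text/css "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        ExpiresDefault "access plus 1 day"
    </IfModule>

    # gzip text responses on the way out (needs mod_deflate)
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>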

Page Speed is a vast improvement over YSlow, but it shares some of the problems inherent to any automated performance tool.

For one, it doesn't seem to care much about actual page load times, as long as you follow its rules. It gives cracked.com ("Auschwitz for javascript engines") a phat 90/100, even though the front page takes 8.2 seconds to load, makes 188 HTTP requests to about a billion domains, loads 36 javascript files, throws 166 warnings in Chrome's audit tab, has 466 unused CSS rules, and is, in fact, pure evil. For a while during my testing, the links div loaded its own font-face, the smallcaps version of Linux Libertine. Now, anyone with half a brain can tell you that loading a 300 kilobyte font to style 20 words of text, something you can do in plain CSS with font-variant: small-caps anyway, is pants-on-head stupid; but Page Speed was totally fine with it.
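To spell out how cheap the CSS alternative is: the selector below is hypothetical, but the entire 300 kilobyte font download collapses into a single rule the browser already knows how to satisfy:

    /* hypothetical selector for the links div; one line replaces
       an entire @font-face download of a smallcaps font file */
    #links h2 {
        font-variant: small-caps;
    }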

For two, while it doesn't actually come out and say that you should "use a CMS" like YSlow does (a monumentally useless piece of "advice"), it sure does wink a lot and nod suggestively in that direction.

The sticking point, the one that robbed me of two points and kept me from that tantalizing perfect score, was "Inline small CSS". I, of course, kept the stylesheet in an external file and linked to it from every page, because it's easier to maintain that way. Except, of course, it's a small stylesheet, and page rendering would be faster if you just stuck it in the HTML file. Doing that by hand would be a pain... unless you used a CMS, which could seamlessly inline a stylesheet when publishing a document. Funny.
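For anyone who hasn't run into this rule: "inlining" just means swapping the link element for a style block in the head of every page. The sample rule here is a placeholder, not my actual stylesheet:

    <!-- before: small stylesheet, but one extra HTTP request -->
    <link rel="stylesheet" type="text/css" href="/style.css">

    <!-- after: what "Inline small CSS" wants instead -->
    <style type="text/css">
    body { max-width: 40em; margin: 0 auto; }
    </style>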

Inlining the CSS granted me the two points, but then Page Speed turned right around and docked me one, since the stylesheet had a lot of whitespace and pushed the file over the tipping point where Google thought it would be worthwhile to minify the source. This is a pain, since I hand-edit my code, and minifying makes code hideously ugly. It would have been trivially easy if I used a CMS, of course. Minifying was also tedious and fiddly, since the tool I used liked to munge the inlined CSS and scribble all over my link formatting. It was worth it, though! After twenty minutes of swearing, I finally trimmed off those last 120 bytes and scored a perfect 100/100! Yeah!
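If you've never watched a minifier chew on hand-formatted CSS, here's the flavor of it, with made-up rules standing in for mine:

    /* before: readable, hand-edited */
    a:link,
    a:visited {
        color: #236;
        text-decoration: none;
    }

    /* after: every expendable byte of whitespace gone */
    a:link,a:visited{color:#236;text-decoration:none}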

Now, if you'll excuse me, I have to go put everything back to the way it was.


1: Google showing off their mad UI skillz there on the "refresh results" button.

2: How much traffic does it get? Last month, my host called in a panic. Apparently, my box had consumed 1000% more bandwidth than it did the month before -- it had used up 10 megabytes! My site doesn't get a lotta traffic, I tell ya: every page load takes 30 seconds, because the disks end up spinning down between hits! My traffic is so low, my Alexa site rank is measured in scientific notation! It's low, I tell ya!


Posted by | Permanent link | File under: important, Linux