[QUOTE=gaydemon;62223]Ah, but there is a very simple way of solving the problem. I only use it for images and scripts that are used on all pages, and I have a domain alias + htaccess setup for forcing a refresh.
It's still the same file in the same location, just a simple htaccess rewrite that points the alias to the real location. Same with JavaScript. :)[/QUOTE]
The additional DNS lookup and HTTP connection are probably hurting you more than you're saving with the expires header. Putting everything on www will make things a bit faster. You can still do your rewrite trick on a /static/ directory instead of a subdomain.
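A minimal sketch of what that /static/ version could look like, assuming Apache with mod_rewrite and mod_expires enabled (the /assets/ target directory is just a placeholder):

[CODE]
# Map the /static/ alias onto the real asset directory (same host, no extra DNS lookup)
RewriteEngine On
RewriteRule ^static/(.*)$ /assets/$1 [L]

# Far-future expiry for images, CSS and scripts served this way
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>
[/CODE]

When you want to force a refresh, you'd presumably just bump the alias (say /static2/) the same way you rotate the subdomain now; browsers see a new URL, but it still maps to the same files on the same host.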
[QUOTE=William Godin;62232]It’s crazy hard to compute Apdex scores! For the sake of our sanity, we’ve outsourced that part. (We’re using NewRelic.)
Apdex doesn’t calculate business performance per se (sales, etc.). It just takes those hard-to-read technical metrics and transforms them into a measure of user satisfaction.
It basically tells you whether the performance of your websites is a source of frustration for your customers, in a way that non-über-geeks like us can understand.
Happy customers tend to be more loyal. And I just looove loyal customers. lol[/QUOTE]
It sounds really interesting; I’m a sucker for efficiency and stats… What do you need to use it? What data is necessary to get a good “picture”, or to be able to calculate performance this way?
Sure, there are probably quite a few providers out there who can compute Apdex for you. I’m using NewRelic RPM, but they require your website to be built in Ruby (or in Java, I believe).
It all depends on the architecture of your websites. If you can’t find a provider but you have access to programmers, they may be able to implement it for you.
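If you do end up having someone implement it in-house, the formula itself is public and simple: pick a target response time T, count requests at or under T as satisfied, requests between T and 4T as tolerating (worth half), and anything slower as frustrated. A rough sketch in Python, with the 0.5-second threshold and the sample times made up purely for illustration:

[CODE]
def apdex(response_times_s, threshold_s=0.5):
    """Apdex = (satisfied + tolerating / 2) / total samples."""
    satisfied = sum(1 for t in response_times_s if t <= threshold_s)
    tolerating = sum(1 for t in response_times_s if threshold_s < t <= 4 * threshold_s)
    total = len(response_times_s)
    return (satisfied + tolerating / 2) / total if total else None

# Example: page response times in seconds, e.g. pulled from your access logs
samples = [0.2, 0.4, 0.9, 1.3, 2.5, 0.3]
print(apdex(samples))  # 0.666... -- the closer to 1.0, the happier your users
[/CODE]

So the only data you really need is a response time per request (ideally broken down per page type), which most access logs or monitoring tools can already give you.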
What I’ve found makes the most noticeable difference is parallel downloads. Browsers will only open 2 connections per hostname at any one time. So I’ve set things up so that my CSS and JavaScript files sit on www.gaydemon.com with cookies and no expiration.
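Just to illustrate the parallel-download idea: since the 2-connection limit is per hostname, spreading assets over two hostnames lets the browser open connections to each in parallel. The static.gaydemon.com hostname below is hypothetical, only there to show the pattern:

[CODE]
<!-- These share the 2-connection budget for www.gaydemon.com -->
<link rel="stylesheet" href="http://www.gaydemon.com/css/site.css">
<script src="http://www.gaydemon.com/js/site.js"></script>

<!-- A second (hypothetical) hostname gets its own 2 connections,
     so these images can download alongside the CSS/JS above -->
<img src="http://static.gaydemon.com/img/header.jpg" alt="">
<img src="http://static.gaydemon.com/img/thumb1.jpg" alt="">
[/CODE]

The trade-off is exactly the extra DNS lookup mentioned earlier, so it only pays off on pages that request enough files to keep both hostnames busy.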