You don't have to believe in PSI.
Do you think that if you made your scripts, pages, and images as small as possible (without losing quality) and enabled compression (so fewer bytes have to travel over the network to download them), your site would benefit?
In other words, do you think that if you followed best practices on your website, it might just be faster?
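To put a number on the compression point, here's a minimal sketch (Python, standard library only) that gzips a local text asset and compares sizes. The filename "app.js" is just a placeholder; point it at any script, stylesheet, or HTML file you actually serve.

```python
# Rough illustration: how many bytes gzip saves on a text asset
# before it ever crosses the network. "app.js" is a hypothetical file.
import gzip
import os

path = "app.js"  # substitute any real script, stylesheet, or HTML file

original_size = os.path.getsize(path)

with open(path, "rb") as f:
    compressed = gzip.compress(f.read(), compresslevel=6)  # level 6 mirrors common server defaults

print(f"original: {original_size:,} bytes")
print(f"gzipped:  {len(compressed):,} bytes")
print(f"savings:  {1 - len(compressed) / original_size:.0%}")
```

On typical JavaScript or CSS, the savings run well over half the original size, which is exactly the kind of win PSI is nudging you toward.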
Take the Thomson Reuters website: horrible code, nothing validates, and PSI gives it a bad score, BUT they throw a LOT of bandwidth at it, so the speed isn't too bad. What's your take on that? Are they getting their money's worth?
How does your car run when it hasn't had a tune-up in 10 years? Get my point?