Does anyone know of a tool that can analyze one or more CSS files, determine the similarity between the various rules they contain, and present the user with options for merging and reducing rulesets?
I ask because a project I am working on has accumulated so much CSS that Internet Explorer (still the baseline, I'm afraid) chokes on it after page load, locking up interactivity for 3-5 seconds until the processing finishes.
In case you're wondering: Yes, I am sure it is the CSS causing this issue.
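To illustrate the kind of analysis I'm after, here is a minimal sketch that groups selectors sharing an identical declaration block, which would be merge candidates. It uses a naive regex rather than a real CSS parser, so it ignores comments, `@media` nesting, and other edge cases; it's only meant to show the idea, not to be the tool itself.

```python
import re
from collections import defaultdict

def find_mergeable_rules(css_text):
    # Naive parse: capture "selector { declarations }" pairs.
    # (Ignores @media blocks, comments, and nested rules.)
    rules = re.findall(r'([^{}]+)\{([^}]*)\}', css_text)
    by_body = defaultdict(list)
    for selector, body in rules:
        # Normalize declarations so ordering/whitespace differences
        # don't hide duplicates.
        decls = tuple(sorted(d.strip() for d in body.split(';') if d.strip()))
        by_body[decls].append(selector.strip())
    # Rule bodies shared by more than one selector are merge candidates.
    return {body: sels for body, sels in by_body.items() if len(sels) > 1}

css = ".a { color: red; margin: 0 } .b { margin: 0; color: red }"
for body, sels in find_mergeable_rules(css).items():
    print(", ".join(sels), "->", "; ".join(body))
    # -> .a, .b -> color: red; margin: 0
```

A real tool would also need to handle partial overlaps (rules sharing only some declarations) and selector specificity, which is where this naive approach stops being useful.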