
View Full Version : ext-all.js compressed, packing vs gzip



trbs
27 May 2007, 4:34 PM
Can it be true? Can we save up to 200KB on ext-all.js?

I've started playing around with another packer for the Python ExtJS Build script.

This new packer is called JSPacker (how original :) ) and it shrinks ext-all.js by an extra 200KB(!) after the file has already been compressed with the Rhino Compressor and JSMin.

It's a very fast and aggressive packer. I've tested the resulting ext-all.js against several of the examples in SVN and they all seem to work fine, but I would like some more input and testing from other people to see if it truly doesn't break anything in Ext JS. (Aggressive packers tend to break code.)

For all people with SVN access who are also interested in squeezing every last byte out of ext-all.js, I would ask you to give this a try with your Ext application.

Please report if it worked (or not).

Instructions:

Download the Python ExtJS Build Script into your Ext SVN directory: http://extjs-py-builder.googlecode.com/svn/trunk/build_ext_packages.py

Run the application with:


user@host:~/ext$ ./build_ext_packages.py -S -p .

or, on Windows:

C:\Ext\build_ext_packages.py -S -p .


If you have ShrinkSafe (custom_rhino.jar) in the same directory, or anywhere else on your CLASSPATH, you can remove the '-S' flag to also pack with ShrinkSafe. The default (no command line options) is to use both ShrinkSafe and JSMin, but not JSPacker, hence the -p flag. (JSMin is included in the application; ShrinkSafe must be downloaded separately.)

The next version of the build script will also be able to build the documentation with JSDoc. Hopefully we can get some feedback from the Ext JS community on how good (or bad) this new packer is, and then decide whether to include it in future releases or drop it because it breaks JavaScript.

Thanks in advance B)

thejoker101
27 May 2007, 7:19 PM
Yeah, it does remove newlines and shorten variable names. If you think that's insane, you should see how small server compression (gzip) will make it.

nassaja-rus
27 May 2007, 7:36 PM
Yes, Apache's mod_deflate is very useful for gzipping. :)
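
For reference, enabling it can be as simple as one directive. A minimal sketch for Apache 2.x, assuming mod_deflate is loaded (the MIME types are illustrative):

# compress HTML, CSS and JS on the fly
AddOutputFilterByType DEFLATE text/html text/css text/javascript application/x-javascript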

jack.slocum
27 May 2007, 10:10 PM
Packed and gzipped vs. just gzipped generally results in almost identical file sizes. The difference is that gzipped files are unzipped by the browser (C code) and only unzipped once, while packed files are unpacked by JS code on every page load. For small files it isn't really an issue, but for larger files (like ext-all.js) I have found it can take 300-500ms to unpack on every page load, which is unacceptable.

nassaja-rus
27 May 2007, 10:47 PM
The optimal solution for me is using Apache's mod_deflate to gzip content on the fly, or fronting the heavy Apache server with the small and fast nginx, with gzipping enabled.

nginx is a high-performance HTTP and reverse proxy server that I use for serving static content: images, CSS and JS. It has many features, is faster than Apache at serving statics, and also eats fewer resources than Apache.
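
For reference, a minimal nginx gzip setup along those lines (the directive values are illustrative, not a recommendation):

# gzip sketch for nginx; tune level and types for your site
gzip            on;
gzip_comp_level 5;
gzip_min_length 1024;
gzip_types      text/css application/x-javascript;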

TommyMaintz
28 May 2007, 1:46 AM
I created a script that gzips all my JavaScript files and caches them on the server so it doesn't gzip on the fly (saves a lot of CPU time). It also checks the modification date to determine whether to gzip and cache again.

Create a folder called cache in your root.
In .htaccess set:

RewriteEngine On
RewriteRule (.*\.css|.*\.js)$ compress.php?f=$1 [L]

compress.php


<?php
// setting variables
$cache = true;
$cachedir = 'cache';

// Determine the directory and file extension
$fn = $_GET['f'];
$t = explode('.', $fn);
$ext = $t[count($t)-1];
switch($ext) {
    case 'css':
        $type = 'css';
        break;
    case 'js':
        $type = 'javascript';
        break;
}

$base = dirname($fn);

// Determine last modification date of the files
$lastmodified = filemtime($fn);

// Send Etag hash
$hash = $lastmodified . '-' . md5($fn);
header ("Etag: \"" . $hash . "\"");

if (
isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
stripslashes($_SERVER['HTTP_IF_NONE_MATCH']) == '"' . $hash . '"'
){
// Return visit and no modifications, so do not send anything
header ("HTTP/1.0 304 Not Modified");
header ('Content-Length: 0');
} else {
// First time visit or files were modified
if ($cache) {
// Determine supported compression method
$gzip = strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip');
$deflate = strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'deflate');

// Determine used compression method
$encoding = $gzip ? 'gzip' : ($deflate ? 'deflate' : 'none');

// Check for buggy versions of Internet Explorer
if (!strstr($_SERVER['HTTP_USER_AGENT'], 'Opera') &&
preg_match('/^Mozilla\/4\.0 \(compatible; MSIE ([0-9]\.[0-9])/i', $_SERVER['HTTP_USER_AGENT'], $matches)) {
$version = floatval($matches[1]);

if ($version < 6)
$encoding = 'none';

if ($version == 6 && !strstr($_SERVER['HTTP_USER_AGENT'], 'SV1')) // 'SV1' marks IE6 on XP SP2; earlier IE6 mishandles gzip
$encoding = 'none';
}

// Try the cache first to see if the compressed file is already generated
$cachefile = 'cache-' . $hash . '.' . str_replace('/', '-', $fn) . ($encoding != 'none' ? '.' . $encoding : '');

if (file_exists($cachedir . '/' . $cachefile)) {
if ($fp = fopen($cachedir . '/' . $cachefile, 'rb')) {

if ($encoding != 'none') {
header ("Content-Encoding: " . $encoding);
}

header ("Content-Type: text/" . $type);
header ("Content-Length: " . filesize($cachedir . '/' . $cachefile));

fpassthru($fp);
fclose($fp);
exit;
}
}
}

// Get contents of the files
$content = file_get_contents($fn);
// Send Content-Type
header ("Content-Type: text/" . $type);

if (isset($encoding) && $encoding != 'none') {
//Send compressed contents
$content = gzencode($content, 9, $gzip ? FORCE_GZIP : FORCE_DEFLATE);
header ("Content-Encoding: " . $encoding);
header ('Content-Length: ' . strlen($content));
echo $content;
} else {
// Send regular contents
header ('Content-Length: ' . strlen($content));
echo $content;
}

// Store cache
if ($cache) {
if ($fp = fopen($cachedir . '/' . $cachefile, 'wb')) {
fwrite($fp, $content);
fclose($fp);
}
}
}
?>

dj
28 May 2007, 3:21 AM
Hi Tommy,

I did nearly the same. Pre-gzipping saves quite a few CPU cycles on the server side and makes delivery of your JS quicker. The optimal .js and .css delivery, as I see it, would be:


- combine all files into one (saves the overhead of per-file gzip headers and multiple TCP connections)
- run ShrinkSafe and JSMin over the file (better compressibility because of fewer distinct variable names)
- save a pre-compressed version (no overhead for on-the-fly compression)
- generate a unique URL for each different version of your JS (longer caching time possible, so the browser doesn't even ask if the file is up-to-date; see the sketch after this list)
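
For the last point, a minimal sketch (the rewrite rule, file names and version scheme are illustrative, not part of any script in this thread):

<?php
// Hypothetical versioned URL: embed a hash of the built file in the
// script tag so every release gets a fresh, long-cacheable URL.
// A rewrite rule maps it back to the real file, e.g.:
//   RewriteRule ^ext-all\.[0-9a-f]+\.js$ ext-all.js [L]
$ver = substr(md5_file('ext-all.js'), 0, 8);
echo '<script type="text/javascript" src="ext-all.' . $ver . '.js"></script>';
?>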


You really should add some sanity checks to your script. As far as I understand it, if you call it directly with compress.php?f=/etc/passwd it will return the passwd file of your server...
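
Something along these lines would close that hole (a sketch, assuming the script lives in the web root; adapt the checks to your layout):

<?php
// Hypothetical sanity check for compress.php: only serve .css/.js
// files that really live inside the web root.
$fn = $_GET['f'];
$path = realpath($fn);
$root = realpath($_SERVER['DOCUMENT_ROOT']);
if ($path === false ||
    strpos($path, $root) !== 0 ||           // path escapes the web root
    !preg_match('/\.(css|js)$/', $path)) {  // only allow .css and .js
    header('HTTP/1.0 403 Forbidden');
    exit;
}
?>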

trbs
28 May 2007, 6:51 AM
Hi Tommy,
You really should add some sanity checks to your script. As far as I understand it, if you call it directly with compress.php?f=/etc/passwd it will return the passwd file of your server...

Thanks a lot for the script Tommy :) I can use it for some PHP sites and I think it will help a lot of people (as long as they make sure to check file locations, so you don't reveal /etc/passwd or the like).

I mostly use mod_deflate on the server for small sites/simple shared hosting. I don't think it can cache the gzipped content directly (thus wasting a few CPU cycles), but for most of the shared hosting places where I use the full ext-all.js, admin sites, that isn't such a big deal.

For production sites I have a middleware for my application server that acts much like Tommy's PHP file. When a client requests a compressible file type, it compresses it on the fly and caches it in memcached (or any other caching backend).

trbs
28 May 2007, 7:01 AM
Packed and gzipped vs. just gzipped generally results in almost identical file sizes. The difference is that gzipped files are unzipped by the browser (C code) and only unzipped once, while packed files are unpacked by JS code on every page load. For small files it isn't really an issue, but for larger files (like ext-all.js) I have found it can take 300-500ms to unpack on every page load, which is unacceptable.

Thanks for the input. I too use gzip on the server and just recently started playing with packers. The 0.5s delay for unpacking is indeed unacceptable, so I'll leave the packer in the builder for people to play/work with, but not enable it by default.

papasi
28 May 2007, 10:06 AM
Just FYI, your script has quite a few bugs. For one, this is too naive; you need to read up on the RFC to learn about q-values. Also, if there are encodings with names like x-gzip, your script will break:

$gzip = strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip');
$deflate = strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'deflate');
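
For illustration, a q-value-aware replacement might look like this (a sketch per RFC 2616, section 14.3; the function name is made up):

<?php
// Parse Accept-Encoding with q-values instead of a plain strstr() match.
// Also accepts the legacy 'x-gzip' token.
function preferred_encoding($header) {
    $q_by_name = array();
    foreach (explode(',', $header) as $part) {
        $pieces = explode(';', trim($part));
        $name = strtolower(trim($pieces[0]));
        $q = 1.0;
        if (isset($pieces[1]) && preg_match('/q\s*=\s*([0-9.]+)/', $pieces[1], $m)) {
            $q = floatval($m[1]);
        }
        $q_by_name[$name] = $q;
    }
    arsort($q_by_name);                     // highest q first
    foreach ($q_by_name as $name => $q) {
        if ($q <= 0) continue;              // q=0 means "not acceptable"
        if ($name == 'gzip' || $name == 'x-gzip') return 'gzip';
        if ($name == 'deflate') return 'deflate';
    }
    return 'none';
}

// Usage: $encoding = preferred_encoding($_SERVER['HTTP_ACCEPT_ENCODING']);
?>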

Hani
28 May 2007, 1:53 PM
For Java users, there's http://www.galan.de/projects/packtag

No hackery required. It'll reduce the size of ext-all-debug.js to around 100-120k (I point it at the debug version rather than ext-all.js, since it runs JSMin as part of the process). The CSS shrinker has issues though, so don't use that.

franklt69
28 May 2007, 2:00 PM
Is there a similar script for the ASP.NET world?

kind regards
Frank

Serlo
28 May 2007, 2:35 PM
You could take a look at how TinyMCE does it for ASP.NET:

http://wiki.moxiecode.com/index.php/TinyMCE:Compressor/ASPX

franklt69
28 May 2007, 3:54 PM
Hi Serlo, that looks good. Do you have any experience with the TinyMCE Compressor and Ext? I mean, does it work OK? Any comments are welcome.

kind regards
Frank

digeomel
31 May 2007, 5:15 AM
If you want to do this in PHP, and you have multiple JS/CSS files to include, I am using a modified version of this script:

http://rakaz.nl/item/make_your_pages_load_faster_by_combining_and_compressing_javascript_and_css_files

The original script on the site has a bug that means it never sends compressed content to IE6, and it does not handle clients that use HTTP 1.0 instead of 1.1.

If anyone is interested I can post my fixes here.

But overall, it is a great solution, with combining, compressing, caching and all the bells and whistles. For example, combining the YUI adapter libraries and ext-all results in one gzipped file of 138k.
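
The core combining idea is simple; a stripped-down illustration (this is not rakaz's actual code, and the file names are examples):

<?php
// Hypothetical minimal combiner: concatenate a fixed, whitelisted set
// of files and serve them as a single, optionally gzipped response.
$files = array('yui-utilities.js', 'ext-yui-adapter.js', 'ext-all.js');
header('Content-Type: text/javascript');
ob_start('ob_gzhandler');   // gzips only if the client advertises support
foreach ($files as $f) {
    readfile($f);
    echo "\n";
}
?>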

jack.slocum
1 Jun 2007, 5:50 PM
I use a .htaccess file that makes js and css files processed as PHP. Then I use an auto prepended file that starts an output buffer and an auto appended file that gzips it. Then I drop these in a parent directory and all js and css is automatically compressed. It's probably not the most efficient way, but it's easy and it doesn't seem to slow the server down.
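
The wiring for that pattern is roughly as follows (a sketch, not Jack's actual files; the paths are placeholders):

# .htaccess: hand .js/.css to PHP and buffer every response
AddHandler application/x-httpd-php .js .css
php_value auto_prepend_file /path/to/prepend.php
php_value auto_append_file  /path/to/append.php

prepend.php:

<?php ob_start('ob_gzhandler'); // compresses only if the client accepts gzip ?>

append.php:

<?php ob_end_flush(); ?>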

digeomel
2 Jun 2007, 12:49 AM
I use a .htaccess file that makes js and css files processed as PHP. Then I use an auto prepended file that starts an output buffer and an auto appended file that gzips it. Then I drop these in a parent directory and all js and css is automatically compressed. It's probably not the most efficient way, but it's easy and it doesn't seem to slow the server down.

Does your .htaccess file check whether the client supports gzip compression? You wouldn't believe it, but I recently ran across a PC in our organization with IE6 SP2 that was talking HTTP 1.0 (for whatever reason). I had to modify the aforementioned PHP script to combine the files without gzipping them if the client does not send the right headers.

manugoel2003
22 Jun 2007, 10:27 PM
I use a .htaccess file that makes js and css files processed as PHP. Then I use an auto prepended file that starts an output buffer and an auto appended file that gzips it. Then I drop these in a parent directory and all js and css is automatically compressed. It's probably not the most efficient way, but it's easy and it doesn't seem to slow the server down.

Hey Jack, can you post your solution here please? [Edit] Both the .htaccess file and the gzipping script :)