JSON Gzip Compression



emredagli
8 May 2009, 4:19 AM
Hi,
I have a huge object (about 130 KB) coming from the server as a JSON string.
I compressed it using gzip on the server side and I am sending the compressed string to the client.

The browser decompresses it. That part is OK.

But after I make some changes I want to send it back to the server as a compressed JSON string.

I searched but I couldn't find a gzip implementation in JavaScript.

As far as I know, ExtJS doesn't have such a function... Am I correct?

Thanks in advance.

Condor
8 May 2009, 4:31 AM
GZip using JavaScript? :)):)):))

Do you have any idea how long it would take JavaScript to compress using the GZip algorithm?

The long compression time would completely outweigh any advantage you gained from the smaller transfer.

P.S. Base62 encoding would also make the data smaller (but not as much as compression).
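
For illustration, here is a minimal sketch of the Base62 idea: encoding integer values in base 62 shortens their decimal representation. The function names are invented for this example, not from any library.


// Minimal Base62 sketch (illustrative names, not a library API).
var BASE62 = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";

// Encode a non-negative integer as a base-62 string.
function toBase62(n) {
    if (n === 0) return "0";
    var s = "";
    while (n > 0) {
        s = BASE62.charAt(n % 62) + s;
        n = Math.floor(n / 62);
    }
    return s;
}

// Decode a base-62 string back to an integer.
function fromBase62(s) {
    var n = 0;
    for (var i = 0; i < s.length; i++) {
        n = n * 62 + BASE62.indexOf(s.charAt(i));
    }
    return n;
}

// 1234567890 (10 characters) becomes "1LY7VK" (6 characters).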

jay@moduscreate.com
8 May 2009, 4:33 AM
Correct, there is no native gzip implementation for JavaScript.

VinylFox
8 May 2009, 4:35 AM
The short answer - no.

Maybe you could trim the size by shortening the names in your JSON. It's a common compression tactic.

i.e.:


{'some_long_name':'bla'}

Becomes...


{'sln':'bla'}

Then you alter your scripts on the server to translate the short names back to what you need.
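
As a rough sketch of that idea (the key map and function names here are invented for the example; the server would hold the reverse map):


// Illustrative key-shortening sketch; KEY_MAP and shortenKeys are made up.
var KEY_MAP = { some_long_name: 'sln', another_long_name: 'aln' };

// Replace long key names with their agreed short forms before sending.
function shortenKeys(obj) {
    var out = {};
    for (var key in obj) {
        if (obj.hasOwnProperty(key)) {
            out[KEY_MAP[key] || key] = obj[key];
        }
    }
    return out;
}

// shortenKeys({'some_long_name':'bla'}) gives {'sln':'bla'}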

emredagli
8 May 2009, 4:55 AM
I previously found a JavaScript implementation of LZW compression:


// LZW-compress a string. Note: the output can contain character codes
// above 255, so it needs escaping before it can go over the wire.
function lzw_encode(s) {
    var d = new Date();
    var dict = {};
    var data = (s + "").split("");
    if (!data.length) {
        return "";
    }
    var out = [];
    var currChar;
    var phrase = data[0];
    var code = 256;
    for (var i = 1; i < data.length; i++) {
        currChar = data[i];
        if (dict[phrase + currChar] != null) {
            // Known phrase: keep extending it.
            phrase += currChar;
        } else {
            // Emit the code for the longest known phrase and register the new one.
            out.push(phrase.length > 1 ? dict[phrase] : phrase.charCodeAt(0));
            dict[phrase + currChar] = code;
            code++;
            phrase = currChar;
        }
    }
    out.push(phrase.length > 1 ? dict[phrase] : phrase.charCodeAt(0));
    for (var i = 0; i < out.length; i++) {
        out[i] = String.fromCharCode(out[i]);
    }
    var result = out.join("");
    console.log("Input: " + s.length / 1024 + " KB, Output: " + result.length / 1024 + " KB, Rate: " + (s.length / result.length));
    console.log((new Date()).getTime() - d.getTime() + " ms.");
    return result;
}


// Decompress an LZW-encoded string.
function lzw_decode(s) {
    var dict = {};
    var data = (s + "").split("");
    var currChar = data[0];
    var oldPhrase = currChar;
    var out = [currChar];
    var code = 256;
    var phrase;
    for (var i = 1; i < data.length; i++) {
        var currCode = data[i].charCodeAt(0);
        if (currCode < 256) {
            // Plain character: it stands for itself.
            phrase = data[i];
        } else {
            // Dictionary code; the fallback handles LZW's
            // "code not yet in the dictionary" corner case.
            phrase = dict[currCode] ? dict[currCode] : (oldPhrase + currChar);
        }
        out.push(phrase);
        currChar = phrase.charAt(0);
        dict[code] = oldPhrase + currChar;
        code++;
        oldPhrase = phrase;
    }
    return out.join("");
}



Its results are:
Input: 136.7041015625kb Output:9.1962890625kb Rate: 14.865137517256027
290 ms.

Is my computer too fast? (Intel Pentium Dual, 2 GHz) :))
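
For what it's worth, a round trip with those functions might look like this. Ext.encode/Ext.decode are the standard ExtJS JSON helpers; the data object is a placeholder. Since LZW output contains character codes above 255, it would still need escaping (e.g. with encodeURIComponent) before being sent:


// Round-trip sketch using the functions above; the data object is a placeholder.
var json = Ext.encode({ some_long_name: 'bla', values: [1, 2, 3] });
var packed = lzw_encode(json);

// The packed string can contain char codes above 255; escape it before sending.
var safe = encodeURIComponent(packed);

// ...and reverse the steps on the way back:
var restored = Ext.decode(lzw_decode(decodeURIComponent(safe)));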

jay@moduscreate.com
8 May 2009, 5:10 AM
Personally, I say stop messing with that stuff and let the server and browser perform the gzip compression.

Instead of sending JSON objects, send arrays, which will significantly reduce your transfer overhead.
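
A rough sketch of the difference on the wire (the field names and records here are invented for the example; the field order is whatever you agree on with the server):


// Object form: the keys repeat for every record.
var asObjects = [
    { id: 1, name: 'foo', price: 9.95 },
    { id: 2, name: 'bar', price: 4.50 }
];

// Array form: positions replace keys, agreed with the server up front.
var FIELDS = ['id', 'name', 'price'];
var asArrays = [
    [1, 'foo', 9.95],
    [2, 'bar', 4.50]
];

// Rebuilding an object from a row on the client:
function toRecord(row) {
    var rec = {};
    for (var i = 0; i < FIELDS.length; i++) {
        rec[FIELDS[i]] = row[i];
    }
    return rec;
}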

ypandey
19 Feb 2014, 3:37 AM
Hi Jay,
I liked your approach: instead of sending data as JSON objects, use arrays. :)
But how will the server serialize a value object to array data?
We have been using Jackson for VO-to-JSON conversion.
Thanks

osmantekin
21 Aug 2014, 12:09 PM
You have to be smart when it comes to network optimization, because yes, the payload size is important.

Common compression algorithms like gzip and zlib aren't fast enough to go unnoticed on the client side.

But LZ4 (https://code.google.com/p/lz4/) has been around since 2011 and it's the fastest at compressing/decompressing right now. It reaches compression speeds of about 400 MB/s per core and decompression speeds of about 1.8 GB/s per core.

Some benchmarks against other compression algorithms:



Name             Ratio   C.speed (MB/s)   D.speed (MB/s)
LZ4 (r101)       2.084   422              1820
LZO 2.06         2.106   414              600
QuickLZ 1.5.1b6  2.237   373              420
Snappy 1.1.0     2.091   323              1070
LZF              2.077   270              570
zlib 1.2.8 -1    2.730   65               280
LZ4 HC (r101)    2.720   25               2080
zlib 1.2.8 -6    3.099   21               300

This means that you can compress server side and decompress client side without any performance/speed issue. In the (very) rare case it becomes a problem, you could always use JavaScript Web Workers (http://www.w3schools.com/Html/html5_webworkers.asp) to offload the work.
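
As a minimal sketch of that offloading idea (the worker file name is hypothetical, and lzw_decode stands in for whatever decompressor you actually use, e.g. the one posted earlier in this thread):


// decompress-worker.js (hypothetical file name) -- runs off the UI thread.
// lzw.js is assumed to contain the lzw_decode function from earlier.
importScripts('lzw.js');

self.onmessage = function (e) {
    // Decompress in the worker and hand the plain string back.
    self.postMessage(lzw_decode(e.data));
};


// Main page: hand the compressed payload to the worker, parse the result.
var worker = new Worker('decompress-worker.js');

worker.onmessage = function (e) {
    var data = JSON.parse(e.data);
    // ... use the data ...
};

worker.postMessage(compressedPayload); // compressedPayload is a placeholder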