1. #1
    Sencha User emredagli's Avatar
    Join Date
    Jun 2008
    Posts
    301
    Vote Rating
    2
    emredagli is on a distinguished road

      1  

    Default JSON Gzip Compression



    Hi,
    I have a huge object (about 130 KB) coming from the server as a JSON string.
    I compressed it with gzip on the server side and I am sending the compressed string to the client.

    The browser decompresses it. That part is OK.

    But after performing some operations, I want to send it back to the server as a compressed JSON string.

    I searched, but I couldn't find a gzip implementation in JavaScript.

    As far as I know, ExtJS doesn't have such a function... am I correct?

    Thanks in advance.

  2. #2
    Sencha - Community Support Team Condor's Avatar
    Join Date
    Mar 2007
    Location
    The Netherlands
    Posts
    24,246
    Vote Rating
    92
    Condor has much to be proud of

      -1  

    Default


    GZip using JavaScript?

    Do you have any idea how long it would take JavaScript to compress using the GZip algorithm?

    The long compression time would completely outweigh any advantage you would gain from the smaller transfer.

    P.S. Base62 encoding would also make the data smaller (but not as much as compression).
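    Condor's Base62 suggestion can be sketched in a few lines. This is a generic illustration (not anything from ExtJS): large non-negative integers such as IDs or timestamps take fewer characters in base 62 than in base 10.

```javascript
// Base-62 encoding for non-negative integers: digits, then upper-case,
// then lower-case letters, giving 62 symbols per character.
const ALPHABET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz';

function toBase62(n) {
    if (n === 0) return '0';
    let out = '';
    while (n > 0) {
        out = ALPHABET[n % 62] + out; // take lowest base-62 digit
        n = Math.floor(n / 62);
    }
    return out;
}

function fromBase62(s) {
    let n = 0;
    for (const ch of s) n = n * 62 + ALPHABET.indexOf(ch);
    return n;
}
```

    For example, a 10-digit decimal ID fits in 6 base-62 characters. This only helps for numeric fields, which is why it saves less than real compression.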

  3. #3
    jay@moduscreate.com's Avatar
    Join Date
    Mar 2007
    Location
    Frederick MD, NYC, DC
    Posts
    16,361
    Vote Rating
    81
    jay@moduscreate.com is a name known to all

      0  

    Default


    Correct, there is no native gzip implementation for javascript.

  4. #4
    Sencha - Community Support Team VinylFox's Avatar
    Join Date
    Mar 2007
    Location
    Baltimore, MD
    Posts
    1,501
    Vote Rating
    8
    VinylFox will become famous soon enough

      1  

    Default


    The short answer: no.

    Maybe you could trim the size by shortening the names in your JSON. It's a common compression tactic.

    i.e.:

    Code:
    {"some_long_name":"bla"}
    Becomes...

    Code:
    {"sln":"bla"}
    and you alter your scripts on the server to translate the short names back to what you need.
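    The server-side translation step can be sketched like this; the lookup table and key names here are made up for illustration:

```javascript
// Map the short wire keys back to the full names the server code expects.
// Unknown keys pass through unchanged.
const KEY_MAP = { sln: 'some_long_name', qty: 'quantity' };

function expandKeys(obj) {
    const out = {};
    for (const [k, v] of Object.entries(obj)) {
        out[KEY_MAP[k] || k] = v;
    }
    return out;
}
```

    The same table, inverted, shortens the keys before sending, so client and server only have to agree on one mapping.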

  5. #5
    Sencha User emredagli's Avatar
    Join Date
    Jun 2008
    Posts
    301
    Vote Rating
    2
    emredagli is on a distinguished road

      1  

    Default


    I previously found a JavaScript implementation of LZW compression:
    Code:
    // LZW-compress a string
    function lzw_encode(s) {
        var d = new Date();

        var dict = {};
        var data = (s + "").split("");
        var out = [];
        var currChar;
        var phrase = data[0];
        var code = 256;
        for (var i = 1; i < data.length; i++) {
            currChar = data[i];
            if (dict[phrase + currChar] != null) {
                phrase += currChar;
            } else {
                out.push(phrase.length > 1 ? dict[phrase] : phrase.charCodeAt(0));
                dict[phrase + currChar] = code;
                code++;
                phrase = currChar;
            }
        }
        out.push(phrase.length > 1 ? dict[phrase] : phrase.charCodeAt(0));
        for (var i = 0; i < out.length; i++) {
            out[i] = String.fromCharCode(out[i]);
        }

        var result = out.join("");
        console.log("Input: " + s.length / 1024 + " kb Output: " + result.length / 1024 + " kb Rate: " + (s.length / result.length));
        console.log((new Date()).getTime() - d.getTime() + ' ms.');
        return result;
    }

    // Decompress an LZW-encoded string
    function lzw_decode(s) {
        var dict = {};
        var data = (s + "").split("");
        var currChar = data[0];
        var oldPhrase = currChar;
        var out = [currChar];
        var code = 256;
        var phrase;
        for (var i = 1; i < data.length; i++) {
            var currCode = data[i].charCodeAt(0);
            if (currCode < 256) {
                phrase = data[i];
            } else {
                phrase = dict[currCode] ? dict[currCode] : (oldPhrase + currChar);
            }
            out.push(phrase);
            currChar = phrase.charAt(0);
            dict[code] = oldPhrase + currChar;
            code++;
            oldPhrase = phrase;
        }
        return out.join("");
    }



    Its results are:
    Input: 136.7041015625 kb Output: 9.1962890625 kb Rate: 14.865137517256027
    290 ms.

    Is my computer too fast? (Intel Pentium Dual PC, 2 GHz.)

  6. #6
    jay@moduscreate.com's Avatar
    Join Date
    Mar 2007
    Location
    Frederick MD, NYC, DC
    Posts
    16,361
    Vote Rating
    81
    jay@moduscreate.com is a name known to all

      0  

    Default


    Personally, I say stop messing with that stuff and let the server and browser perform the gzip compression.

    Instead of sending JSON objects with named keys, use arrays, which will significantly reduce your transfer overhead.
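    The saving Jay describes comes from not repeating field names in every record. A quick sketch of the two payload shapes (field names invented for the example):

```javascript
// Keyed objects: every record carries the field names.
const objects = [
    { id: 1, name: 'foo', price: 9.99 },
    { id: 2, name: 'bar', price: 4.5 }
];

// Bare arrays: client and server agree on the field order instead.
const arrays = objects.map(o => [o.id, o.name, o.price]);

console.log(JSON.stringify(objects).length); // names repeated per record
console.log(JSON.stringify(arrays).length);  // names omitted entirely
```

    The gap grows with record count, since the per-record key overhead is paid once per row in the object form.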

  7. #7
    Sencha Premium Member
    Join Date
    Dec 2011
    Posts
    235
    Vote Rating
    3
    ypandey is on a distinguished road

      0  

    Question How to send array data from the server side?


    Hi Jay,
    I liked your approach: instead of sending data as JSON objects, use arrays.
    But how will the server serialize a value object to array data?
    We have been using Jackson for VO-to-JSON conversion.
    Thanks
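    Whatever the server uses to emit the rows (Jackson can do this with a custom serializer, for instance), the client-side half is just rebuilding objects from an agreed field order. A sketch, with hypothetical field names:

```javascript
// The field order is the contract between server and client.
const FIELDS = ['id', 'name', 'price'];

// Turn rows of values back into keyed objects.
function rowsToObjects(rows) {
    return rows.map(row =>
        Object.fromEntries(FIELDS.map((f, i) => [f, row[i]]))
    );
}
```

    The field list can also be sent once in the response metadata, so the server stays the single source of truth for the order.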

  8. #8
    Sencha User
    Join Date
    Aug 2014
    Posts
    2
    Vote Rating
    0
    osmantekin is on a distinguished road

      0  

    Default You can decompress client side without any performance loss


    You have to be smart when it comes to network optimization, because yes, the payload size is important.

    Common compression algorithms like Gzip, Zlib etc. aren't fast enough for decompression to go unnoticed on the client side.

    But LZ4 has been around since 2011 and it's the fastest at compressing/decompressing right now. It reaches 400 MB/s compression and 1.8 GB/s decompression speed per core.

    Some benchmarks against other compression algorithms:

    Code:
    Name            Ratio  C.speed D.speed
                            MB/s    MB/s
    LZ4 (r101)      2.084    422    1820
    LZO 2.06        2.106    414     600
    QuickLZ 1.5.1b6 2.237    373     420
    Snappy 1.1.0    2.091    323    1070
    LZF             2.077    270     570
    zlib 1.2.8 -1   2.730     65     280
    LZ4 HC (r101)   2.720     25    2080
    zlib 1.2.8 -6   3.099     21     300
    This means that you can compress server side and decompress client side without any performance/speed issue. In the (very) rare case it becomes a problem, you could always use JavaScript Web Workers to offload the work.
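    For what it's worth, modern browsers (and Node 18+) now ship native streaming gzip through the standard CompressionStream/DecompressionStream APIs, which did not exist when this thread started. A minimal round-trip sketch, assuming an environment that supports them:

```javascript
// Gzip a string to bytes using the native CompressionStream API.
async function gzipText(text) {
    const stream = new Blob([text]).stream()
        .pipeThrough(new CompressionStream('gzip'));
    return new Uint8Array(await new Response(stream).arrayBuffer());
}

// Reverse the round trip: gunzip bytes back to the original string.
async function gunzipToText(bytes) {
    const stream = new Blob([bytes]).stream()
        .pipeThrough(new DecompressionStream('gzip'));
    return new Response(stream).text();
}
```

    Since these run native code off the JavaScript heap, they largely make the "gzip in JavaScript is too slow" objection earlier in this thread moot where they are available.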