
View Full Version : Ext.ux.data.JsonCsvReader - consumes a lightweight form of record data



danh2000
4 Nov 2008, 12:13 AM
I've created a new data reader to consume a lightweight JSON format modelled on CSV. It avoids repeating property names in every record, which reduces bandwidth and response times.

Here's a basic example:


var Employee = Ext.data.Record.create([
    {name: 'firstname'},
    {name: 'job', mapping: 'occupation'}
]);
var myReader = new Ext.ux.data.JsonCsvReader({
    totalProperty: "results",
    root: "records",
    id: "id"
}, Employee);

//dummy data
var recs1 = {
    'results': 2,
    'records': {
        'cols': ['id','firstname','occupation'],
        'rows': [
            [1,'Bill','Gardener'],
            [2,'Ben','Horticulturist']
        ]
    }
};

//consume test
var readRecs1 = myReader.readRecords(recs1);
console.log(readRecs1);
console.log(readRecs1.records[0].get('job'));
console.log(readRecs1.records[1].get('firstname'));
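The reader's core transformation is just re-expanding the shared cols array against each row. A plain-JavaScript sketch of that step, outside Ext (the expandCsvJson helper name is mine, not part of the extension):

```javascript
// Expand a {cols, rows} block back into an array of plain record objects,
// the shape a normal JsonReader would have received.
function expandCsvJson(block) {
    var cols = block.cols, rows = block.rows, out = [];
    for (var i = 0; i < rows.length; i++) {
        var rec = {};
        for (var j = 0; j < cols.length; j++) {
            rec[cols[j]] = rows[i][j];
        }
        out.push(rec);
    }
    return out;
}

var expanded = expandCsvJson({
    cols: ['id', 'firstname', 'occupation'],
    rows: [
        [1, 'Bill', 'Gardener'],
        [2, 'Ben', 'Horticulturist']
    ]
});
// expanded[0] → {id: 1, firstname: 'Bill', occupation: 'Gardener'}
```

The actual reader does exactly this and then hands the expanded objects to the JsonReader superclass.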

There are a couple of posts on my blog if interested:

Ajax Data Transfer - XML, JSON or CSV (http://technomedia.co.uk/blog/2008/10/ajax-data-transfer-xml-json-csv/)

Ajax Data Transfer - JsonCsvReader for ExtJS (http://technomedia.co.uk/blog/2008/11/ajax-data-transfer-jsoncsvreader-for-extjs/)


And here's the code:


Ext.namespace('Ext.ux.data');
/**
 * @description Data reader class to create an Array of Ext.data.Record objects from a specialised block of JSON.
 * @author <a href="dan.humphrey@technomedia.co.uk">Dan Humphrey</a>
 * @class Ext.ux.data.JsonCsvReader
 * @extends Ext.data.JsonReader
 * @constructor
 * @version 0.1
 * @license Unlikely
 * @param {Object} meta Metadata configuration options.
 * @param {Object} recordType Either an Array of field definition objects as passed to Ext.data.Record.create, or an Ext.data.Record constructor created using Ext.data.Record.create.
 */
Ext.ux.data.JsonCsvReader = function(meta, recordType){
    meta = meta || {};
    Ext.ux.data.JsonCsvReader.superclass.constructor.call(this, meta, recordType || meta.fields);
};
Ext.extend(Ext.ux.data.JsonCsvReader, Ext.data.JsonReader, {
    /**
     * @cfg {String} totalProperty Name of the property from which to retrieve the total number of records
     * in the dataset. This is only needed if the whole dataset is not passed in one go, but is being
     * paged from the remote server.
     */
    totalProperty: null,
    /**
     * @cfg {String} successProperty Name of the property from which to retrieve the success attribute used by forms.
     */
    successProperty: null,
    /**
     * @cfg {String} cols Name of the property from which to retrieve the columns.
     */
    cols: 'cols',
    /**
     * @cfg {String} rows Name of the property from which to retrieve the rows.
     */
    rows: 'rows',
    /**
     * @cfg {String} root Name of the property which contains the Array of row objects.
     */
    root: null,
    /**
     * @cfg {String} id Name of the property within a row object that contains a record identifier value.
     */
    id: null,
    /**
     * Create a data block containing Ext.data.Records from a Json object containing JsonCsv formatted data.
     * @methodOf Ext.ux.data.JsonCsvReader
     * @param {Object} o An object which contains an Array of row objects in the property specified
     * in the config as 'root', and optionally a property, specified in the config as 'totalProperty',
     * which contains the total size of the dataset.
     * @return {Object} data A data block which is used by an Ext.data.Store object as
     * a cache of Ext.data.Records.
     */
    readRecords: function(o){
        this.jsonData = o;
        //dynamic meta
        if (o.metaData) {
            delete this.ef;
            this.meta = o.metaData;
            this.recordType = Ext.data.Record.create(o.metaData.fields);
            this.onMetaChange(this.meta, this.recordType, o);
        }
        var s = this.meta;

        var root = o;
        if (s.root) {
            try {
                root = o[s.root] || o;
            }
            catch (e) {
                root = o;
            }
        }
        //extra accessors
        var colsProp = s.cols ? s.cols : 'cols';
        var rowsProp = s.rows ? s.rows : 'rows';
        this.getCols = function(){
            return root[colsProp];
        };
        this.getRows = function(){
            return root[rowsProp];
        };
        if (s.totalProperty) {
            this.getTotal = this.getJsonAccessor(s.totalProperty);
        }
        if (s.successProperty) {
            this.getSuccess = this.getJsonAccessor(s.successProperty);
        }

        //default properties
        var cols = this.getCols(), rows = this.getRows(), c = cols.length, r = rows.length, totalRecords = r, success = true, v;
        var data = {}; // data block to pass to super
        //override defaults
        if (s.totalProperty) {
            v = parseInt(this.getTotal(o), 10);
            if (!isNaN(v)) {
                totalRecords = v;
            }
            data[s.totalProperty] = totalRecords;
        }
        if (s.successProperty) {
            v = this.getSuccess(o);
            if (v === false || v === 'false') {
                success = false;
            }
            data[s.successProperty] = success;
        }

        //loop through data and recreate normal JSON data for the superclass
        var recs = [];
        for (var iRow = 0; iRow < r; iRow++) {
            var rec = {};
            for (var iCol = 0; iCol < c; iCol++) {
                rec[cols[iCol]] = rows[iRow][iCol];
            }
            recs.push(rec);
        }
        if (s.root) {
            data[s.root] = recs;
        }
        else {
            data = recs;
        }
        return Ext.ux.data.JsonCsvReader.superclass.readRecords.call(this, data);
    }
});

Hope someone finds it useful.

Animal
4 Nov 2008, 12:50 AM
Is that not just an ArrayReader?

danh2000
4 Nov 2008, 1:03 AM
Is that not just an ArrayReader?

Yes I suppose it is essentially - this just also gives the ability to specify the totalRows which I don't think the ArrayReader does (I might be wrong there though).

To be honest, I didn't look at the ArrayReader source - I'll have to check out the totalRows and root usage.

This might be redundant if I misunderstood ArrayReader, but it was fun anyway.

danh2000
4 Nov 2008, 1:17 AM
Just skimming over the source of ArrayReader, I'm surprised that it extends JsonReader - I didn't expect that.

It seems DataReader is just an interface that doesn't do much; it's just that much of what ArrayReader inherits from JsonReader goes unused.

Also, root and totalProperty are redundant in the ArrayReader, so the version I posted would be better suited to paged Ajax calls - for a grid, for instance.

Animal
4 Nov 2008, 2:25 AM
root and totalProperty are optional.

You can just send an Array, and if you don't specify a root in your ArrayReader config, it just uses the evaluated data block as an Array.

I can't see the difference between your code and an ArrayReader. ArrayReaders handle paged data. Just use the totalProperty to include a property which contains the total dataset size.

Now if you consume a comma-delimited string, that's a different matter. That would be a CSVReader.

Something like:


Ext.ns("com.aspicio");

Ext.apply(Ext.util.Format, {
number: function(v, format) {
if(!format){
return v;
}
v *= 1;
if(typeof v != 'number' || isNaN(v)){
return '';
}
var hasComma = -1 < format.indexOf(','),
psplit = format.replace(/[^\d\.]/g,'').split('.');

// compute precision
// error: too many periods (check this first - after the 1 < length
// test this branch would be unreachable)
if (2 < psplit.length) {
throw('NumberFormatException: invalid format, formats should have no more than 1 period: ' + format);
}
// compute precision
else if (1 < psplit.length) {
// fix number precision
v = v.toFixed(psplit[1].length);
}
// remove precision
else {
v = v.toFixed(0);
}

// get the string now that precision is correct
var fnum = v.toString();

// format has comma, then compute commas
if (hasComma) {
// remove precision for computation
psplit = fnum.split('.');

var cnum = psplit[0],
parr = [],
j = cnum.length,
m = Math.floor(j / 3),
n = cnum.length % 3 || 3; // n cannot be ZERO or causes infinite loop

// break the number into chunks of 3 digits; first chunk may be less than 3
for (var i = 0; i < j; i += n) {
if (i != 0) {n = 3;}
parr[parr.length] = cnum.substr(i, n);
m -= 1;
}

// put chunks back together, separated by comma
fnum = parr.join(',');

// add the precision back in
if (psplit[1]) {fnum += '.' + psplit[1];}
}

// replace the number portion of the format with fnum
return format.replace(/[\d,?\.?]+/, fnum);
},

/**
* Returns a number rendering function that can be reused to apply a number format multiple times efficiently
* @param {String} format Any valid number format string for {@link #number}
* @return {Function} The number formatting function
*/
numberRenderer : function(format){
return function(v){
return Ext.util.Format.number(v, format);
};
}
});

com.aspicio.CSVReader = Ext.extend(Ext.data.DataReader, {
read: function(xhr) {
return this.readRecords(xhr.responseText);
},

readRecords: function(o) {

// Create a hidden grid table for calculating max column width
if (!com.aspicio.CSVReader.prototype.testCell) {
var l = Ext.getBody().createChild({tag: "table", cls: 'x-hide-offsets', cn: {
tag: 'tbody', cn: {
tag: 'tr', cls: 'x-grid3-row', cn: {
tag: 'td', cls: 'x-grid3-col x-grid3-cell', cn: {
cls: 'x-grid3-cell-inner'
}
}
}
}});
com.aspicio.CSVReader.prototype.testCell = l.child('td', true);
com.aspicio.CSVReader.prototype.testInner = com.aspicio.CSVReader.prototype.testCell.firstChild;
}

var s = this.meta;
var columns;

// Split data into an Array of lines.
o = o.split('\n');

// Create a recordType from the header row.
if (!this.recordType) {

var headers = o[0].split(',');
var names = new Array(headers.length);
columns = new Array(headers.length);
var fields = this.createFieldSpec(o, headers.length);

for (var i = 0; i < headers.length; i++) {
headers[i] = headers[i].trim();

// Generate a field name from the header, but ensure it is unique
var fname = headers[i].replace(/\W/g, '');
if (names.indexOf(fname) != -1) {
fname += i;
}
names[i] = fields[i].name = fname;

columns[i] = {
id: fname,
header: headers[i],
dataIndex: fname,
width: this.calculateColumnWidth(headers[i]),
sortable: true
}

// Create special renderers for Dates and floats
if (fields[i].type == 'date') {
columns[i].renderer = Ext.util.Format.dateRenderer(fields[i].dateFormat);
} else if (fields[i].type == 'float') {
var f = '0.';
for (var j = 0; j < fields[i].precision; j++) {
f += '0';
}
columns[i].renderer = Ext.util.Format.numberRenderer(f);
columns[i].css = 'text-align:right;';
} else if (fields[i].type == 'int') {
columns[i].css = 'text-align:right;';
}

}
this.recordType = Ext.data.Record.create(fields);
this.columnModel = new Ext.grid.ColumnModel(columns);

// Inform anyone interested that we have changed metadata
this.onMetaChange(s || {}, this.recordType, o)
}

var sid = this.meta ? this.meta.id : null;
var Record = this.recordType, fields = Record.prototype.fields, fl = fields.length, fi = fields.items;
var records = [];
for (var i = 1; i < o.length; i++) {
var n = o[i].trim();

// Ignore blank lines
if (!n.length) continue;

// Break into fields in an Array
n = n.split(',');

var values = {};
var id = ((sid || sid === 0) && n[sid] !== undefined && n[sid] !== "" ? n[sid] : null);
for (var j = 0; j < fl; j++) {
var f = fi[j];
var k = f.mapping !== undefined && f.mapping !== null ? f.mapping : j;
var v = n[k] !== undefined ? n[k] : f.defaultValue;
v = f.convert(v, n);
values[f.name] = v;

// Keep track of the maximum data width. columns is only populated
// while the column model is being built on the first read, so guard
// against it being undefined on subsequent reads.
if (columns) {
columns[j].width = Math.max(columns[j].width, this.calculateColumnWidth(this.columnModel.getRenderer(j).call(this, v)));
}

}
var record = new Record(values, id);
record.csv = n;
records.push(record);
}
return {
success : true,
records : records
};
},

onMetaChange: Ext.emptyFn,

getColumnModel: function() {
return this.columnModel;
},

calculateColumnWidth: function(v) {
this.testInner.innerHTML = v;
return this.testCell.scrollWidth;
},

createFieldSpec: function(o, fieldCount) {
var dataLine;
var finished = false;
var result = new Array(fieldCount);

// Loop through dataset to determine data type.
for (var i = 1; !finished && i < o.length; i++) {
dataLine = o[i].trim();
if (dataLine.length) {
var r = dataLine.split(',');
this.updateFieldSpec(result, r);
}
}

// If no data type could be determined.
for (var i = 0; i < result.length; i++) {
if (!result[i]) {
result[i] = {
type: 'string'
};
}
}

return result;
},

updateFieldSpec: function(fieldSpec, r) {
var df;
for (var i = 0; i < r.length; i++) {

// We can only infer data type from non-empty data fields.
var d = r[i].trim();
if (!d.length) continue;

var dot = d.indexOf('.');
if (!fieldSpec[i]) {
if (!isNaN(Number(d))) {
if (dot == -1) {
fieldSpec[i] = {
type: 'int'
};
} else {
fieldSpec[i] = {
type: 'float',
precision: d.length - dot - 1
};
}
} else if (df = this.getDateFormat(d)) {
fieldSpec[i] = {
type: 'date',
dateFormat: df
};
}
} else if (fieldSpec[i].type == 'int') {
if (isNaN(Number(d))) {
fieldSpec[i].type = 'string';
} else if (dot != -1) {
fieldSpec[i].type = 'float';
fieldSpec[i].precision = d.length - dot - 1;
}
} else if (fieldSpec[i].type == 'float') {
if (isNaN(Number(d))) {
fieldSpec[i].type = 'string';
} else if (dot != -1) {
fieldSpec[i].type = 'float';
fieldSpec[i].precision = Math.max(fieldSpec[i].precision, d.length - dot - 1);
}
}
}
},

dateFormats: [
// Default 01-Jun-2001 format
'd-M-Y',
'j-M-Y',
'd-M-y',
'j-M-y',

// UK 01/06/2001 format
'd/m/Y',
'd/n/Y',
'j/m/Y',
'j/n/Y',
'd/m/y',
'd/n/y',
'j/m/y',
'j/n/y',

// Daft and ambiguity-causing US 06/01/2001 format
'm/d/y',
'n/d/y',
'm/d/Y',
'n/d/Y',
'm/j/y',
'n/j/y',
'm/j/Y',
'n/j/Y'
],

getDateFormat: function(d) {
for (var i = 0; i < this.dateFormats.length; i++) {
if (Date.parseDate(d, this.dateFormats[i])) {
return this.dateFormats[i];
}
}
}
});
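The type inference in updateFieldSpec boils down to scanning each column's non-empty values and promoting int to float to string as the data demands. A standalone plain-JavaScript sketch of that idea (inferColumnTypes is a hypothetical name, and this simplification skips the date and precision handling):

```javascript
// Sketch of the column type inference: scan every data row and settle
// each column on 'int', 'float' or 'string'.
function inferColumnTypes(lines) {
    var rows = lines.map(function (l) { return l.split(','); });
    var types = new Array(rows[0].length);
    rows.forEach(function (r) {
        for (var i = 0; i < r.length; i++) {
            var d = r[i].trim();
            if (!d.length) continue;              // empty cells tell us nothing
            if (isNaN(Number(d))) {
                types[i] = 'string';              // any non-numeric value wins
            } else if (types[i] !== 'string') {
                // promote int to float when a decimal point appears
                types[i] = (d.indexOf('.') !== -1) ? 'float' : (types[i] || 'int');
            }
        }
    });
    // columns with no data at all default to 'string', as in the reader above
    return types.map(function (t) { return t || 'string'; });
}

var colTypes = inferColumnTypes(['1,2.5,Bill', '2,3.25,Ben']);
// colTypes → ['int', 'float', 'string']
```

Once a column has been demoted to 'string' it stays there - a single non-numeric value settles the matter, which is the same one-way promotion the reader uses.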

danh2000
4 Nov 2008, 2:40 AM
Nice CSVReader :)

I still don't see how ArrayReader can be used with paged recordsets though - the return object from the readRecords method has its totalRecords property set to the length of the loaded array, not the total records on the server... unless I'm missing something blindingly obvious.

Animal
4 Nov 2008, 3:23 AM
Ah, yes, you're right. The ArrayReader does not accept any totalRecords data, just a pure Array.

To read an Array, but use paging, I'd use a JsonReader. Pass the root as an Array, and use numeric dataIndex values. JsonReader uses the dataIndex as an Array subscript, so it will work fine.
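The reason the numeric-dataIndex trick works is that bracket access treats a numeric mapping as an array subscript. A minimal plain-JavaScript sketch of the idea, with hypothetical names and no Ext dependency (JsonReader does the equivalent internally via its generated accessors):

```javascript
// Read one page of array-shaped rows plus a total count, resolving each
// field through its mapping - a number simply subscripts into the row.
function readPaged(response, fields) {
    var records = response.root.map(function (row) {
        var rec = {};
        fields.forEach(function (f) {
            rec[f.name] = row[f.mapping];   // works for mappings 0, 1, 2, ...
        });
        return rec;
    });
    return { totalRecords: response.total, records: records };
}

var result = readPaged(
    { total: 500, root: [[1, 'Bill'], [2, 'Ben']] },   // one page of a 500-row set
    [{ name: 'id', mapping: 0 }, { name: 'firstname', mapping: 1 }]
);
// result.totalRecords → 500; result.records[1].firstname → 'Ben'
```

This is why a JsonReader configured with numeric mappings can page over an Array payload even though ArrayReader itself has no totalProperty support.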

The J