
When to apply filterBy store VS ux.grid.FiltersFeature and Remote Vs Local performance



Aero
2 Feb 2013, 7:00 PM
Hi!
What is the preferred method of filtering a grid (or indirectly the store)?

I have tried several methods which end up equal, i.e. they all filter the grid.



store.filterBy(function (rec, id) {
    var value = rec.get('market');
    return (value == 'SE' || value == 'NO');
});

gets the job done of filtering the grid, but so does


grid.filters.createFilters();
grid.filters.getFilter('market').setActive(true);
grid.filters.getFilter('market').setValue(myMarketArray);


Neither of them seems to update the paging toolbar counter (which I guess only gets updated on store.load), and in this case the store is initially loaded from a remote Ajax proxy. So even if I filter down to a subset of the 1000 entries, it will still show 1000 items and the corresponding pages (300 per page). I think I can live with that, but it would be nice to also update the actual number after the filter has been applied (even if it is local).
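As far as I know, the Ext 4 paging toolbar binds its "of N" total to the store's totalCount, which comes from the server response, so a purely local filter never touches it. One workaround is to recompute the display from the filtered record count (e.g. store.getCount() after filtering). The sketch below is plain JavaScript, outside Ext; pageDisplay is a made-up helper, not an Ext API:

```javascript
// Sketch: after a local filter, derive the toolbar's "Page X of Y (N items)"
// text from the filtered count instead of the server-reported totalCount.
function pageDisplay(filteredCount, pageSize, currentPage) {
    var totalPages = Math.max(1, Math.ceil(filteredCount / pageSize));
    return 'Page ' + currentPage + ' of ' + totalPages +
           ' (' + filteredCount + ' items)';
}

console.log(pageDisplay(120, 25, 1)); // "Page 1 of 5 (120 items)"
```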

What is the performance of local/client-side filtering with, say, 1000 records? And why would you want remote filtering when local filtering exists (assuming your initial load has everything your user is ever expected to need in the application)?
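To get a rough feel for the client-side cost: running a predicate like the one above over 1000 plain records is trivial for any modern JavaScript engine; the expensive part is usually redrawing the grid rows, not the filtering itself. A standalone sketch (plain JavaScript, outside Ext; the record shape is invented):

```javascript
// Build 1000 synthetic records spread evenly over four markets,
// then filter them with the same kind of predicate used in filterBy.
var markets = ['SE', 'NO', 'DK', 'FI'];
var records = [];
for (var i = 0; i < 1000; i++) {
    records.push({ id: i, market: markets[i % 4] });
}

function marketFilter(rec) {
    return rec.market === 'SE' || rec.market === 'NO';
}

var filtered = records.filter(marketFilter);
console.log(filtered.length); // 500 of the 1000 records match
```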

Previously I added the filtering criteria to the extraParams and did the filtering and searching server-side, but I guess in a large-scale application with many users that would be quite a big penalty if users frequently search and sort grid items that were already loaded initially.
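For comparison, the server-side approach boils down to serializing the filter criteria into request parameters, which is what setting extraParams on the proxy ends up doing. A minimal standalone sketch (plain JavaScript; the /tickets endpoint and parameter names are hypothetical):

```javascript
// Sketch: serialize filter criteria into a query string, the way a proxy
// would append extraParams to a paged request.
function buildQuery(baseUrl, params) {
    var pairs = Object.keys(params).map(function (k) {
        return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
    });
    return baseUrl + '?' + pairs.join('&');
}

console.log(buildQuery('/tickets', { market: 'SE,NO', page: 1, limit: 300 }));
// "/tickets?market=SE%2CNO&page=1&limit=300"
```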

Thanks for sharing any experience with filtering the grid.

existdissolve
3 Feb 2013, 7:24 AM
Personally, I would advise against *ever* loading 1000 records, whether locally or remotely. In my experience, there are extremely few cases where such a large recordset would ever be needed, given how easy it is to provide rich search tools to allow users to narrow the results to something meaningful and manageable (honestly, have you ever made it past the first couple of "pages" of Google results? By page 3 or 4, you've probably found what you're looking for, or you've refined your search).

And if you do need to deal with a dataset of that size, the only reasonable way to handle it, IMO, is server-side. The latency cost of fetching 25-50 records will probably be much smaller, anyway, than the client-side lag of filtering and redrawing hundreds and hundreds of grid rows.

Yes, you'll get extra hits to the server for filtering/sorting operations, but so what? If you're dealing in small datasets (e.g., 25-50 per *page*), it should be very fast. And besides, if your app can't handle the insignificant load of these types of frequent operations, well, chances are it has bigger problems anyway.

Aero
3 Feb 2013, 8:32 AM
Hi,

Thanks, I hear you and I agree. The application I am working on is a ticket/support system where the user might need to see tickets that cover a rather large dataset, even though by default the user has their own preferences and filters.
On the other hand, you're right: it is not very meaningful for a user to see 1000 tickets if you can supply search tools to pinpoint a range of markets rather than all of them.

I am just thinking that in that kind of application users will frequently search for #id numbers, names and whatever, and if that could be handled locally out of the initial dataset, that would be nice. But you're right, the server load should be OK, I guess (well, it's up to me to design the server-side query properly =)).