How to handle loading a very large XML file into stores
17 Jul 2012, 1:52 PM
I'm well on my way in my first Ext JS project. My application has to draw a lot of data initially from an XML file that can vary dramatically in size, from 1 MB to 15 MB. I've been prototyping with a smaller database for speed's sake, but when I tried one of the 15 MB files, Chrome decided the script had crashed and gave me the 'Aw, Snap' page before loading finished. I'm only using one hard-coded 'hasMany' association in my models, so all it's doing is parsing the XML.
What can I do to prevent Chrome from jumping the gun? I can't get around the initial loading, as the application requires a lot of data to begin with. Is JSON faster than XML? Is there some kind of message pump I can ping to keep the page responsive? What are my options?
17 Jul 2012, 2:25 PM
Is it required that you use 15 MB XML files? JSON is preferred, yes. If you are loading into a grid, it would also help to use paging and load only the visible records, or a buffered grid if required.
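To illustrate the buffered-grid idea, here is a minimal sketch of a buffered store as it might look in Ext JS 4.x. The model name and URL are placeholders; the point is that with `buffered: true` the store fetches records in pages as the grid scrolls, instead of parsing the whole 15 MB payload up front:

```javascript
// Hypothetical buffered store -- 'MyApp.model.Record' and the url
// are placeholders. The grid bound to this store only requests the
// pages of records it needs to render the visible rows.
var store = Ext.create('Ext.data.Store', {
    model: 'MyApp.model.Record',
    buffered: true,
    pageSize: 200,                  // records fetched per request
    proxy: {
        type: 'ajax',
        url: 'data/records.json',   // server endpoint returning one page
        reader: {
            type: 'json',
            root: 'records',
            totalProperty: 'total'  // server must report the total count
        }
    },
    autoLoad: true
});
```

Note this only pays off if the server can actually serve pages; a single static 15 MB file still has to be downloaded and parsed whole.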
17 Jul 2012, 4:30 PM
No, the data isn't being loaded into controls directly. The data from the XML file is processed in the controller before anything is loaded into a view. A lot of data must be processed in the back-end for only a small amount that ends up visible, and as the application is used, that data must be quickly accessible.
The XML data is simply a flat file generated from a database, so I could export the database as JSON if that is faster than XML. I'd rather not split the data file into multiple pieces, although that is possible. Does the XML proxy have to scan through the entire file for each store?
In some cases I could also use a SQL database, but I haven't been able to find any way to access a database directly from Ext JS.
17 Jul 2012, 4:50 PM
I posted a few examples here using PHP/MySQL.
This is your best bet ... it uses JSON to communicate between server and client.
JSON is always better than XML (XML is full of artifacts).
18 Jul 2012, 1:43 PM
Just to update: JSON actually seemed to perform worse. I ended up breaking the data into multiple JSON files and it loaded in a snap, but now I have to manage 100 different files holding the data.
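For anyone doing the same thing, the splitting step itself is straightforward. This is a generic sketch (the function name and chunk size are my own, not Ext JS APIs): slice the full record array into fixed-size chunks, then write each chunk to its own JSON file on the server side.

```javascript
// Sketch: split a large array of records into fixed-size chunks, so
// each chunk can be serialized to its own JSON file and fetched on
// demand. chunkRecords and the chunk size are illustrative choices.
function chunkRecords(records, chunkSize) {
    const chunks = [];
    for (let i = 0; i < records.length; i += chunkSize) {
        chunks.push(records.slice(i, i + chunkSize));
    }
    return chunks;
}

// Example: 1,000 records split into chunks of 250
const records = Array.from({ length: 1000 }, (_, i) => ({ id: i }));
const chunks = chunkRecords(records, 250);
console.log(chunks.length);      // 4
console.log(chunks[3].length);   // 250
```

Each chunk would then be written out as e.g. `JSON.stringify(chunks[n])`, giving files small enough for the browser to parse without tripping Chrome's hang detector.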
18 Jul 2012, 2:18 PM
JSON performs worse than?
18 Jul 2012, 2:39 PM
I couldn't get the 14 MB JSON file to load at all, whereas the XML file would occasionally load eventually if you refreshed enough. It doesn't matter now, though, because after breaking the data file down into smaller JSON pieces, it's working great.
18 Jul 2012, 2:42 PM
Oh ... yes ... you do not want to load a 14 MB JSON file. The best option would be to place the data in a database and load it from the tables via paged JSON requests. That would eliminate the need for multiple JSON files.
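The DB-backed approach might look like this on the client side (a sketch only; `records.php` and the field names are placeholders for whatever server script fronts the database). When you call `loadPage`, Ext JS appends `page`, `start`, and `limit` parameters to the request, so the server can run a LIMIT/OFFSET query and return just one page of JSON plus the total row count:

```javascript
// Paged store backed by a server endpoint (placeholder url). Each
// request fetches only one page; the server replies with e.g.
//   { "total": 120000, "records": [ { "id": 1, "name": "..." }, ... ] }
var pagedStore = Ext.create('Ext.data.Store', {
    fields: ['id', 'name'],
    pageSize: 50,
    proxy: {
        type: 'ajax',
        url: 'records.php',
        reader: {
            type: 'json',
            root: 'records',
            totalProperty: 'total'   // lets paging toolbars show page counts
        }
    }
});
pagedStore.loadPage(1);   // requests only the first 50 rows
```

This replaces the 100-file juggling with a single endpoint, at the cost of needing a server-side script in front of the database.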
Powered by vBulletin® Version 4.1.5 Copyright © 2016 vBulletin Solutions, Inc. All rights reserved.