Saturday, February 22, 2014

22k objects

Let's imagine that there is a device (an iPhone) and a web server. On the server there are 22 000 objects (each with a name, date, count, and title), and we need to download all 22 000 of them (~2 MB). The faster the better. What are our options?
  1. Download in batches (using a revision marker, downloading a dictionary of ~500 objects each time);
  2. Download a dictionary with all objects at once;
  3. Download an archived JSON string (containing all objects);
  4. Download archived binary data built from the JSON string (containing all objects).
Downloading in batches is the slowest option, because there is a delay between downloads; on the other hand, the user sees new data appear in the list (in the application) every few seconds. Still, with batches of 500 objects it would take 44 requests, and if the user needs all 22 000 objects, they must wait until all of the data has been downloaded.
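Just to illustrate option 1, batched downloading could look roughly like the sketch below. This is not the code from this project: the endpoint, the revision semantics, and the saveBatchToDatabase: / reloadVisibleList helpers are made up.

// A rough sketch of option 1: fetch ~500 objects per request until the server
// returns an empty batch. The URL and the helper methods are hypothetical.
- (void)downloadBatchFromRevision:(NSInteger)revision {
    NSString *urlString = [NSString stringWithFormat:
        @"http://example.com/api/objects?revision=%ld", (long)revision]; // hypothetical endpoint
    NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];

    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        if (error || data.length == 0) {
            return; // failed, or no more batches to download
        }
        NSDictionary *batch = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
        [self saveBatchToDatabase:batch];              // hypothetical persistence step
        [self reloadVisibleList];                      // the user sees new rows every few seconds
        [self downloadBatchFromRevision:revision + 1]; // request the next ~500 objects
    }];
}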

Downloading a dictionary with all objects at once is faster overall (no delays between downloads), but the downside is that it takes much longer before the user finally sees any data.

Then there is the option of archiving the data on the server side, so that it takes less time to actually download those 22 000 objects. This post is about the results for options 3 and 4.

I exported all 22 000 objects from the SQL database in phpMyAdmin as a .json file: ~2 MB. Then I archived it: ~450 KB. Great! But this can be optimised even more. In the SQL database I renamed all the column names:
  • aName => n
  • aDate => d
  • aCount => c
  • aInfo => i
Then I exported the .json file again: ~1.4 MB, and once the .json file was archived: ~290 KB.
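On the device, the shortened keys can simply be mapped back to readable property names while parsing. A small sketch of that idea (the Item class and its property names are my own illustration, not code from this project):

// Hypothetical model object; maps the shortened JSON keys back to readable properties.
@interface Item : NSObject
@property (nonatomic, copy)   NSString *name;
@property (nonatomic, copy)   NSString *date;
@property (nonatomic, assign) NSInteger count;
@property (nonatomic, copy)   NSString *info;
@end

@implementation Item
+ (instancetype)itemWithJSON:(NSDictionary *)json {
    Item *item = [Item new];
    item.name  = json[@"n"];                // was aName
    item.date  = json[@"d"];                // was aDate
    item.count = [json[@"c"] integerValue]; // was aCount
    item.info  = json[@"i"];                // was aInfo
    return item;
}
@end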

Then I uploaded this archive file to the server and started the experiment:
  1. downloaded it in the application (1.304701 seconds)
  2. saved the archive file in the Caches directory (0.016478 seconds)
  3. unarchived it and saved the file (0.078202 seconds)
  4. loaded the file into a text string (0.064022 seconds)
  5. converted it to NSData (binary) (0.027673 seconds)
  6. converted it to JSON (0.377311 seconds)
  7. iterated through all items and wrote them to the DB, without checking whether such an object already exists (1.209763 seconds)
  TOTAL TIME until the user sees results: 3.07815 seconds
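For context, steps 2-7 condensed into code could look roughly like this. It is only a sketch: the post does not say which unzip library or database layer was used, so SSZipArchive and the insertItemFromJSON: method below are assumptions.

#import <Foundation/Foundation.h>
#import "SSZipArchive.h" // assumed unzip library; any ZIP library would do

- (void)runJSONExperimentWithArchive:(NSData *)downloadedArchive {
    CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();

    // 2. save the downloaded archive into the Caches directory
    NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                               NSUserDomainMask, YES) firstObject];
    NSString *archivePath = [cachesDir stringByAppendingPathComponent:@"objects.json.zip"];
    [downloadedArchive writeToFile:archivePath atomically:YES];

    // 3. unarchive it (assuming the archive contains objects.json)
    [SSZipArchive unzipFileAtPath:archivePath toDestination:cachesDir];
    NSString *jsonPath = [cachesDir stringByAppendingPathComponent:@"objects.json"];

    // 4.-5. load the file into a string, then convert it to NSData
    NSString *jsonString = [NSString stringWithContentsOfFile:jsonPath
                                                     encoding:NSUTF8StringEncoding
                                                        error:NULL];
    NSData *jsonData = [jsonString dataUsingEncoding:NSUTF8StringEncoding];

    // 6. parse the JSON into an array of dictionaries
    NSArray *items = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:NULL];

    // 7. iterate and write to the database (no check whether the object already exists)
    for (NSDictionary *json in items) {
        [self insertItemFromJSON:json]; // hypothetical persistence method
    }

    NSLog(@"Total time: %f seconds", CFAbsoluteTimeGetCurrent() - start);
}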

Pretty good results. But in one of my previous projects, where I had to work with .wavefront object files, I ended up converting the .wavefront objects to binary files, which could be loaded straight into the necessary arrays whenever an object needed to be loaded. So maybe I could do the same here: convert to binary, download the binary, and fill the JSON array from the binary data.

First problem: converting the 1.4 MB .json file to binary boosts its size to 4.9 MB. Once archived: 2 MB.
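The post does not state which binary format was used; as one possibility, the JSON array could be serialized as a binary property list, roughly like this:

// One possible JSON-to-binary conversion (binary property list); the format here
// is an assumption on my part, the original binary format is not specified.
NSData *BinaryDataFromJSON(NSData *jsonData) {
    NSArray *items = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:NULL];
    return [NSPropertyListSerialization dataWithPropertyList:items
                                                      format:NSPropertyListBinaryFormat_v1_0
                                                     options:0
                                                       error:NULL];
}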

So, the second experiment (with the binary file):

  1. downloaded it in the application (9.994034 seconds)
  2. saved the archive file in the Caches directory (0.070027 seconds)
  3. unarchived it and saved the file (0.255974 seconds)
  4. loaded the file into a JSON array (1.064016 seconds)
  5. iterated through all items and wrote them to the DB, without checking whether such an object already exists (1.224640 seconds)
  TOTAL TIME until the user sees results: 12.608691 seconds
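Step 4 above, loading the array straight from the binary file, would then be the mirror image of the conversion (again assuming the binary property list format):

// Read the whole array back from the binary file in one call.
NSArray *ItemsFromBinaryFile(NSString *path) {
    NSData *binary = [NSData dataWithContentsOfFile:path];
    return [NSPropertyListSerialization propertyListWithData:binary
                                                     options:NSPropertyListImmutable
                                                      format:NULL
                                                       error:NULL];
}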

With the .wavefront 3D objects, using binary data paid off, because:
  • a .wavefront file contains only float values, so converting it to binary did not increase the file size;
  • in order to show an object, I had to load the necessary vertices, normals, and texture coordinates into arrays, and it is much faster to fill an array from binary data than to iterate through arrays and copy each value from one array to another.
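For comparison, this is roughly what made the binary route so convenient for the .wavefront data: if the file is just a packed sequence of 32-bit floats, a whole vertex array can be filled with a single copy. A sketch under that assumption, not the original project's code:

#include <stdlib.h>
#import <Foundation/Foundation.h>

// Fill a float array straight from binary data, assuming the file is a packed
// sequence of 32-bit floats (vertices, normals, texture coordinates).
float *FloatsFromBinaryFile(NSString *path, NSUInteger *outCount) {
    NSData *data = [NSData dataWithContentsOfFile:path];
    NSUInteger count = data.length / sizeof(float);
    float *buffer = malloc(count * sizeof(float));
    [data getBytes:buffer length:count * sizeof(float)]; // one bulk copy, no per-value loop
    if (outCount) *outCount = count;
    return buffer; // caller is responsible for free()
}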


(Experiments were conducted on an iPhone 5 connected to Wi-Fi.)

