StreamTable.js – The next generation search filter – Josh Software
So, you think you should search data? Or filter it?  And now, stream … what?

Searching for data has a performance price – it requires heavy-duty server resources, and responses are delayed. See the LinkedIn search.

Filtering data requires us to load the entire dataset on the client side (via JSON) and then filter the results. So it’s slow to load but gives faster search results.
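Once the data is in memory, client-side filtering is just an array scan. A minimal sketch (the row shape and helper name here are illustrative, not any library’s API):

```javascript
// Illustrative in-memory dataset, as it might arrive via JSON.
var rows = [
  {name: 'Acme Corp', type: 'Client'},
  {name: 'Beta LLC',  type: 'Vendor'},
  {name: 'Acme Labs', type: 'Client'}
];

// Case-insensitive substring match over one field.
function filterRows(rows, query) {
  var q = query.toLowerCase();
  return rows.filter(function (row) {
    return row.name.toLowerCase().indexOf(q) !== -1;
  });
}

filterRows(rows, 'acme'); // matches 'Acme Corp' and 'Acme Labs'
```

No server round-trip is involved, which is why the results feel instant once the data has loaded.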

StreamTable.js is a JavaScript library that helps us stream large tabular data and filter it. This gives us the best of both worlds – super-fast rendering of pages and tables, no more “Loading data…” delays, no more waiting for the page to load, and fast client-side search filtering.

DataTables does client-side filtering but doesn’t quite get it right. The problem with DataTables is that it is too heavy and has complex JavaScript options (aaData, aoData and fnRender, to name a few). To add to our woes, loading data via Ajax (sAjaxSource) can be rather slow to render, especially if the table requires a LOT of information – say 10,000 records or more.

Here is a ‘simple’ example of using dataTable:

$('#ClientsList').dataTable({
  "sAjaxSource": '/accounts.json',
  "aaSorting": [[1, "desc"]],
  "aoColumns": [
    {'sTitle': 'Client Type'},
    {'sTitle': 'Reports Due',
     'fnRender': function(obj) {
       return '<span class="badge badge-important">' + obj.aData[4] + '</span>';
     }},
    {'sTitle': 'Actions',
     'fnRender': function(obj) {
       return '<a href="" class="btn btn-primary btn-mini">View</a> ' +
              '<a href="/accounts/' + obj.aData[5] + '/detail" data-remote="true" class="btn btn-primary btn-mini load_view">Filings</a>';
     }}
  ],
  "bFilter": true,
  "sPaginationType": "bootstrap",
  "oLanguage": {
    "sLengthMenu": "_MENU_ records per page"
  }
});

Now, to me that was congested, confusing and prone to error. Furthermore, I have no control over the sAjaxSource. So, if the server sends a large amount of data, we are done for!

StreamTable.js to the rescue

Now, not only can we stttrrreeeaaam data to our tables silently and populate them, but we also get live filters showing instant results. The page never ‘hangs’, we get millisecond responses, and we can manage the table configuration easily with a lot of control.

StreamTable uses Mustache templates for rendering but can work seamlessly with any other templating mechanism too.

StreamTable can get data in chunks to the table or can fetch all of it silently. So, whether there are 50 rows or 5000 rows, the page loads with data instantly.

StreamTable can handle various JSON data formats: an array of arrays or an array of objects. This lets us integrate easily with Rails’ to_json or as_json methods.
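The two record shapes differ only in how the view function reads fields. A sketch of both (the field names and markup below are illustrative examples, not StreamTable’s API):

```javascript
// Record as an array, e.g. ['Acme Corp', 'Client']:
var arrayView = function (record, index) {
  return '<tr><td>' + record[0] + '</td><td>' + record[1] + '</td></tr>';
};

// Record as an object, e.g. produced by Rails' as_json:
var objectView = function (record, index) {
  return '<tr><td>' + record.name + '</td><td>' + record.type + '</td></tr>';
};

arrayView(['Acme Corp', 'Client'], 0);
objectView({name: 'Acme Corp', type: 'Client'}, 0);
```

Whichever shape the server emits, only the view function changes; the streaming and filtering stay the same.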

StreamTable has live filtering, so a user can see filtering results even while the data is streaming.

Here is an example of using StreamTable.js:

var options = {
  view: view,                    // View function to render rows
  data_url: 'clients/all.json',  // Data-fetching URL
  stream_after: 2,               // Start streaming after 2 seconds
  fetch_data_limit: 500          // Stream data in batches of 500
};

$('#clients_table').stream_table(options, data);

And here is the JavaScript

# app/assets/javascripts/clients.js
var template = Mustache.compile($.trim($("#template").html()));

var view = function(record, index){
  return template({record: record, index: index});
};

and the HTML template

# app/views/client/index.html.haml
<script id="template" type="text/html">
  <tr>
    <td><span class="badge badge-info">{{due}}</span></td>
  </tr>
</script>

A clean and fast approach. It’s important to note that the first request should have the first page of data ready to render. So we can, for example, fetch the first 10 records and render them directly in HAML.

# app/views/clients/index.html.haml
  @data = #{@entities.to_json}

For the remaining data, StreamTable sends chunked Ajax requests with offset and limit parameters.

clients/all.json?q="search text"&limit=500&offset=1000
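The chunked-fetch loop can be sketched as follows. (streamAll and fetchBatch are illustrative names for this sketch, not StreamTable’s internals; the real library issues the Ajax calls itself.)

```javascript
// Keep requesting batches of `limit` rows, advancing `offset`,
// until a short batch signals the end of the data.
function streamAll(fetchBatch, limit, onBatch) {
  var offset = 0;
  function next() {
    fetchBatch(offset, limit, function (rows) {
      onBatch(rows);
      offset += limit;
      if (rows.length === limit) next(); // a full batch means more may remain
    });
  }
  next();
}

// Simulate a server holding 1200 rows, served 500 at a time.
var server = [];
for (var i = 0; i < 1200; i++) server.push(i);

var received = [];
streamAll(function (offset, limit, cb) {
  cb(server.slice(offset, offset + limit));
}, 500, function (rows) {
  received = received.concat(rows);
});
```

Because each batch renders as it arrives, the table is usable long before the final chunk lands.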

Did I just hear you say benchmarks? Check it out for yourself. We have tested loading up to 100,000 records in chunks of 500. The server response was in milliseconds and the page renders completely in under 2 seconds. The rest of the data is then silently loaded into the tables.

Check out the demo here (it simulates chunking).

You can fork the repository on GitHub. STREAM AWAY!