Go vs JavaScript JSON parsing
Recently, I needed to parse the JSON that the Chrome web browser produces when you record events in its dev tools, and get some timing data out of it. Chrome can produce a pretty large amount of data in a small amount of time, so the Ruby parser I originally built was quite slow.

Since I'm learning Go, I decided to write scripts in both Go and JavaScript/Node and compare them.

The simplest possible form of the JSON file is what I have in [this Gist](https://gist.github.com/jclem/5978436#file-data-json). It contains an event representing the request sent to fetch a page, and the event representing the response. Typically, there's a *huge* amount of extra data to sift through. That's its own problem, but not what I'm worried about in this question.

The JavaScript script that I wrote is [here](https://gist.github.com/jclem/5978436#file-parser-js), and the Go program I wrote is [here](https://gist.github.com/jclem/5978436#file-parser-go). This is the first useful thing I've written in Go, so I'm sure it's all sorts of bad. However, one thing I noticed is that it's *much* slower than JavaScript at parsing a large JSON file.

Time with a 119 MB JSON file in Go:

```
$ time ./parse data.json
= 22 Requests
Min Time: 0.77
Max Time: 0.77
Average Time: 0.77
./gm data.json  4.54s user 0.16s system 99% cpu 4.705 total
```

Time with a 119 MB JSON file in JavaScript/Node:

```
$ time node parse.js data.json
= 22 Requests
Min Time: 0.77
Max Time: 0.77
Avg Time: 0.77
node jm.js data.json  1.73s user 0.24s system 100% cpu 1.959 total
```

(The min/max/average times are all identical in this example because I duplicated JSON objects to get a very large data set, but that's irrelevant.)

I'm curious whether JavaScript/Node is simply much faster at parsing JSON (which wouldn't be particularly surprising, I guess), or whether there's something I'm doing totally wrong in the Go program. I'm also curious what I'm doing wrong in the Go program in general, because I'm sure there's plenty wrong with it.

Note that while these two scripts do more than parsing, it's *definitely* `json.Unmarshal()` in Go that is adding most of the time in the program.

**Update**

I added a [Ruby script](https://gist.github.com/jclem/5978436#file-parse-rb):

```
$ ruby parse.rb
= 22 Requests
Min Time: 0.77
Max Time: 0.77
Avg Time: 0.77
ruby parse.rb  4.82s user 0.82s system 99% cpu 5.658 total
```
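For reference, here is a minimal sketch of the Unmarshal-based approach described above. The actual parser is in the linked Gist and is not reproduced here, so the `Event` struct, its `method` and `timestamp` fields, and the output line are illustrative assumptions rather than the real schema.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Event is an illustrative stand-in for one recorded DevTools event; the real
// field names and nesting are defined by data.json in the linked Gist.
type Event struct {
	Method    string  `json:"method"`
	Timestamp float64 `json:"timestamp"`
}

func main() {
	// Read the entire file into memory, then decode it in a single call.
	raw, err := os.ReadFile(os.Args[1])
	if err != nil {
		panic(err)
	}

	var events []Event
	// json.Unmarshal of the full byte slice is the step the question
	// identifies as dominating the Go program's runtime on a ~119 MB input.
	if err := json.Unmarshal(raw, &events); err != nil {
		panic(err)
	}

	fmt.Printf("= %d Requests\n", len(events))
}
```

The sketch decodes the whole byte slice with one `json.Unmarshal` call, which is the pattern being timed; `os.ReadFile` requires Go 1.16 or later (earlier releases used `ioutil.ReadFile`).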