An Overview of Result Sets in the Node.js Driver

July 23, 2015

I’m happy to announce that version 0.7.0 of the Oracle Database driver for Node.js was released earlier this week! Check out Chris Jones’ post for a rundown of the changes in that release. In this post, I’ll be diving into my favorite new feature: the ResultSet class.

Result sets are great for huge queries or when you don’t know how many rows will be returned. And how often do you really know how many rows will be returned from a query? Probably not very often, which is why I highly recommend using result sets as the default means through which you execute queries with the Node.js driver.

Test setup

Let’s take a look at an example. We’ll start by creating a new table and populating it with a lot of random rows. You can adjust the following script as needed, but I populated the test table with about 1.1 gigs of random data. Why 1.1 gigs? Because my Node.js test server only has 1 gig of memory. Hmmmmm, that can’t be good! 🙂 Check out SQL Developer if you need a tool to execute this script. It took a few minutes to run on my test database (a 1 gig XE server), so be prepared for that.
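The setup script was along these lines: a table filled with random strings. The table and column names below are my assumptions, not the original script; adjust the rownum limit (or repeat the insert) to control the data volume.

```sql
-- A sketch of the kind of setup used; test_table and its columns are
-- assumed names. Repeat the insert to multiply the volume.
create table test_table (
  id     number,
  data_1 varchar2(100),
  data_2 varchar2(100),
  data_3 varchar2(100)
);

insert into test_table
select rownum,
       dbms_random.string('x', 100),
       dbms_random.string('x', 100),
       dbms_random.string('x', 100)
from dual
connect by rownum <= 1000000;

commit;
```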

In my case I ended up with 7.2 million rows – fun! Let’s see what happens when we try to ingest all that data with the default execute method. We’ll start by putting the database configuration info in a file named dbconfig.js so it can be reused.

Using the default execute

Next we’ll create a file named standard-fetch.js to get the data.
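standard-fetch.js looked roughly like the following sketch. It assumes the test_table name from the setup step, uses the callback API of the 0.7 driver, and needs a running Oracle Database to execute.

```javascript
// standard-fetch.js - fetch rows with the default execute behavior.
var oracledb = require('oracledb');
var dbConfig = require('./dbconfig.js');

oracledb.getConnection(
  {
    user: dbConfig.user,
    password: dbConfig.password,
    connectString: dbConfig.connectString
  },
  function(err, connection) {
    if (err) throw err;

    var startTime = Date.now();

    connection.execute(
      'select * from test_table', // test_table is an assumed name
      [],                         // no bind variables
      function(err, result) {
        if (err) throw err;

        console.log('Rows fetched:', result.rows.length);
        console.log('Total time (in seconds):', (Date.now() - startTime) / 1000);

        connection.release(function(err) {
          if (err) console.error(err.message);
        });
      }
    );
  }
);
```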

Then we can test standard-fetch.js in Node.js.

Woohoo, it worked! Wait, only 100 rows? What happened? Oh yeah, there’s a maxRows setting that defaults to 100. Hopefully this doesn’t come as a surprise; it’s the default behavior of the driver.


Let’s crank the maxRows up a bit. I’ll try 8 million just in case our table grows a little.
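Raising the limit is a one-line change. The setting can go on the base driver class, as sketched below, or in the options object of an individual execute call.

```javascript
// standard-fetch2.js is the same as standard-fetch.js with one addition:
// raise the driver-wide row limit before calling execute.
var oracledb = require('oracledb');

oracledb.maxRows = 8000000; // default is 100
```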

And when we test standard-fetch2.js in Node.js.

Segmentation fault? That’s not good! Depending on your configuration, you might just get “Killed”, or the execute might even complete and return all 7.2 million records. It all depends on the query, the system’s resources, the load on the system, and so on.

Using result sets

Clearly, moving this amount of data around at once isn’t reliable. Let’s see how we could do this with result sets instead! Here’s a solution I’ll save in a file named result-set.js. I set the resultSet property of the options object passed to the execute method to true and then set up a recursive function to process the result set. Note that the maxRows property is ignored when using result sets.
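The result-set.js approach can be sketched as follows: getRow hands back one row per call, and an undefined row signals that the result set is exhausted. As before, test_table is an assumed name and a running database is required.

```javascript
// result-set.js - stream rows one at a time via a ResultSet.
var oracledb = require('oracledb');
var dbConfig = require('./dbconfig.js');

oracledb.getConnection(
  {
    user: dbConfig.user,
    password: dbConfig.password,
    connectString: dbConfig.connectString
  },
  function(err, connection) {
    if (err) throw err;

    var rowsProcessed = 0;
    var startTime = Date.now();

    connection.execute(
      'select * from test_table', // test_table is an assumed name
      [],                         // no bind variables
      { resultSet: true },        // return a ResultSet instead of rows
      function(err, result) {
        if (err) throw err;

        function processResultSet() {
          result.resultSet.getRow(function(err, row) {
            if (err) throw err;

            if (row) {
              rowsProcessed += 1;
              // do work on the row here

              processResultSet(); // try to get another row
              return;             // exit prior to closing the result set
            }

            console.log('Finished processing', rowsProcessed, 'rows');
            console.log('Total time (in seconds):', (Date.now() - startTime) / 1000);

            result.resultSet.close(function(err) {
              if (err) console.error(err.message);

              connection.release(function(err) {
                if (err) console.error(err.message);
              });
            });
          });
        }

        processResultSet();
      }
    );
  }
);
```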

Then we can run result-set.js in Node.js.

Woot, it finished without errors! But 12 minutes, can’t we do better than that? Of course!

Tuning result sets

There are two things we can do to tune result sets:

Row prefetching allows multiple rows to be returned from Oracle Database to Node.js in each network round-trip. We can control how many rows are prefetched via the prefetchRows property of either the base driver class or the options object passed to the execute method of a connection. The default value is 100.

When rows are returned from the database they are queued in an internal buffer. We can retrieve those rows by invoking either the getRow or getRows methods of the ResultSet class. The context switch from the JavaScript event loop to the driver’s C code is generally very fast, but it’s not free. The getRows method can be used to retrieve many rows from the internal buffer at once, greatly reducing the number of context switches needed to get all the rows.

Let’s see what happens when we set prefetchRows to 1000 and use getRows with a matching number (though they don’t have to match). I’ll put the following code in a file named result-set2.js.
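result-set2.js combines both tuning knobs in a sketch like the one below: prefetchRows controls how many rows cross the network per round-trip, while the getRows batch size controls how many rows are pulled from the internal buffer per context switch. Again, test_table is an assumed name and a running database is required.

```javascript
// result-set2.js - a tuned result set: prefetch 1,000 rows per round-trip
// and pull 1,000 rows from the internal buffer per getRows call.
var oracledb = require('oracledb');
var dbConfig = require('./dbconfig.js');

var batchSize = 1000; // rows per getRows call; need not match prefetchRows

oracledb.getConnection(
  {
    user: dbConfig.user,
    password: dbConfig.password,
    connectString: dbConfig.connectString
  },
  function(err, connection) {
    if (err) throw err;

    var rowsProcessed = 0;
    var startTime = Date.now();

    connection.execute(
      'select * from test_table', // test_table is an assumed name
      [],                         // no bind variables
      {
        resultSet: true,
        prefetchRows: 1000        // rows returned per network round-trip
      },
      function(err, result) {
        if (err) throw err;

        function processResultSet() {
          result.resultSet.getRows(batchSize, function(err, rows) {
            if (err) throw err;

            if (rows.length) {
              rows.forEach(function(row) {
                rowsProcessed += 1;
                // do work on the row here
              });

              processResultSet(); // try to get more rows from the result set
              return;             // exit prior to closing the result set
            }

            console.log('Finished processing', rowsProcessed, 'rows');
            console.log('Total time (in seconds):', (Date.now() - startTime) / 1000);

            result.resultSet.close(function(err) {
              if (err) console.error(err.message);

              connection.release(function(err) {
                if (err) console.error(err.message);
              });
            });
          });
        }

        processResultSet();
      }
    );
  }
);
```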

And when we run it in Node.js…

Nice, with relatively minor changes to our code, we got the execution time down from 12.26 to 1.65 minutes. That’s over 7x faster – not too shabby! Of course, you’ll need to adjust these numbers in your environment to strike the best balance between performance and resource utilization.


  • Daniel Hair says:

    I appreciate you writing this tutorial. Although I had created a function that looked identical to yours before reading this, as I was going through your code, you helped me see a few ways I could improve my code. You saved me a lot of time. Thanks!

  • Ravi says:

    Is there any way we can read the cursor into a JSON string?

    • danmcghan says:

      The data comes in using native JavaScript data types, but it’s easy enough to do a JSON.stringify() on each row. I’d probably recommend setting outFormat to oracledb.OBJECT. If you’re writing the data to a file, for example, you’d want to start by writing a “[” (start array) to the file before appending stringified rows. After each row (except the last) you would write a “,” before the next row. After you’ve written all the rows you’d write a final “]” (end array).

      Does that help?
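The bracket-and-comma bookkeeping described above can be sketched with a small helper that is independent of the driver. The helper names here are mine, not part of node-oracledb; the write callback could append to a string, an HTTP response, or a file stream.

```javascript
// A driver-independent sketch of building a JSON array string from rows
// delivered in batches (as getRows would deliver them).
function createJsonArrayWriter(write) {
  var first = true;

  write('['); // start array

  return {
    writeRows: function(rows) {
      rows.forEach(function(row) {
        if (!first) {
          write(','); // separator before every row except the first
        }
        write(JSON.stringify(row));
        first = false;
      });
    },
    end: function() {
      write(']'); // end array
    }
  };
}
```

For example, feeding it two batches of objects and then calling end produces one parseable JSON array, no matter how the rows were batched.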

      • Ravi says:

        I am using Node.js as an API. I want to return the JSON string as a response when an HTTP call to the method is made. Below is my code. I always get outString as [] because, by the time processResultSet gets called, the rest of the code has already executed.

        var db = require('../config/dbconnect');
        var oracledb = require('oracledb');
        oracledb.outFormat = oracledb.OBJECT;

        exports.getSource = function(req, res) {
          db.getConnection(function(err, connection) {
            if (err) throw err;

            connection.execute(
              "BEGIN ECRR.PKG_FEEDSSOURCE.ECRR_FeedsSource_GetAll(:refCur); END;",
              { refCur: { type: oracledb.CURSOR, dir: oracledb.BIND_OUT } },
              function(err, result) {
                if (err) throw err;

                var rowsProcessed = 0;
                var startTime = Date.now();
                var outString = "[";

                function processResultSet() {
                  result.outBinds.refCur.getRows(1000, function(err, rows) {
                    if (err) throw err;

                    if (rows.length) {
                      rows.forEach(function(row) {
                        rowsProcessed += 1;
                        outString += JSON.stringify(row) + ",";
                        //do work on the row here
                      });

                      processResultSet(); //try to get more rows from the result set

                      return; //exit recursive function prior to closing result set
                    }

                    console.log('Finish processing ' + rowsProcessed + ' rows');
                    console.log('Total time (in seconds):', ((Date.now() - startTime) / 1000));

                    result.outBinds.refCur.close(function(err) {
                      if (err) console.error(err.message);

                      connection.release(function(err) {
                        if (err) console.error(err.message);
                      });
                    });
                  });
                }

                processResultSet();

                outString += "]";
                res.json({ message: outString });
              }
            );
          });
        };

  • I have implemented the result-set2.js sample above and it works great with a simple query such as “select * from Broker”. But when I put the same query in a procedure following the get_emp_rs example in the Oracle refcursor.js sample, I fail on connection.execute with “NJS-019: ResultSet cannot be returned for non-query statements”.

    I supply connection.execute with “BEGIN kprs.kp_get_brokers(:cursor); END;” and use “cursor: { type: oracledb.CURSOR, dir : oracledb.BIND_OUT }” just like in the sample.

    I have verified that the procedure works by entering into SQL*Plus:

    var r refcursor;
    exec kprs.kp_get_brokers(:r);
    print r;

    I’ve Googled NJS-019 and not found anything helpful. I’ve tried other very rudimentary experiments but always get the same error.

    Seems like I’m missing something obvious but am very frustrated. Any guidance is appreciated. Thanks.

  • In case anyone else has this issue, my mistake was using the “resultSet: true” option in the conn.execute call. The bind variables ({ type: oracledb.CURSOR, dir: oracledb.BIND_OUT }) are all that are required for a PL/SQL call that returns a REF CURSOR.

  • CCC says:

    Great, learnt a lot from this.

    Thank you so much.

  • Achutananda says:

    How can I insert multiple rows into Oracle Database at a time?

Leave a Reply