Stop Buffering: Stream JSON to S3 in CSV Format Using Node.js and Axios | HackerNoon

News Room
Published 6 April 2025

Like me, you may have run into a challenge where you needed to stream a large JSON response (not a file) without storing the entire JSON array in memory before processing it and uploading it as a document.

Streaming a JSON response with Axios returns the data in chunks, each a substring of the full JSON response. One approach is to append all the chunks into a single string and run JSON.parse on the final string.

let data = '';
const response = await axios.get(url, { responseType: 'stream' });
response.data.on('data', (chunk) => {
  data += chunk;
});
// Wait for the stream to end before parsing; otherwise data is still incomplete
await new Promise((resolve, reject) => {
  response.data.on('end', resolve);
  response.data.on('error', reject);
});
const parsedData = JSON.parse(data);

This certainly works, but depending on the size of the records it is not memory efficient: for over a hundred thousand records, you could see memory usage of over 1GB. Having multiple users make concurrent requests, with each request consuming over 1GB, is not an optimal use of system resources.
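If you want to quantify this for your own payloads, here is a rough sketch using Node's built-in process.memoryUsage. The URL is a placeholder, and the heap numbers are approximate since garbage collection timing varies:

const axios = require('axios');

async function measureBufferedParse(url) {
  const before = process.memoryUsage().heapUsed;

  let data = '';
  const response = await axios.get(url, { responseType: 'stream' });
  for await (const chunk of response.data) {
    data += chunk; // the entire payload accumulates in this string
  }
  JSON.parse(data);

  const after = process.memoryUsage().heapUsed;
  console.log(`Heap grew by roughly ${((after - before) / 1024 / 1024).toFixed(1)} MB`);
}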

To convert the chunks of data into CSV for upload, you need to parse the data from its text format into a compatible one. This can be achieved with AsyncParser from the @json2csv/node npm package.

The steps below highlight how to go about it.

Step 1

Install the @json2csv/node package

npm i --save @json2csv/node

Step 2

Define your parser

const { AsyncParser } = require('@json2csv/node');

const opts = {
  fields: ['header1', 'header2', 'header3']
};
const transformOpts = {};
const asyncOpts = {};
const parser = new AsyncParser(opts, asyncOpts, transformOpts);

For further configuration options, check out the docs.
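As an illustration, a slightly richer configuration might look like the sketch below. The label/value field mapping, delimiter, and header settings are standard json2csv options; the field names themselves are made up:

const opts = {
  fields: [
    { label: 'User ID', value: 'id' },      // rename the column in the CSV header
    { label: 'Full Name', value: 'name' },
    'email',                                // plain field, used as-is
  ],
  delimiter: ',',   // switch to ';' if your consumers expect it
  header: true,     // emit the header row
};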

Step 3

Use your parser to parse your stream data and generate a PassThrough stream

const { PassThrough } = require('stream');
const axios = require('axios');

function generateStream(url) {
  // Create a PassThrough stream to pipe data through
  const passThroughStream = new PassThrough();

  // Stream JSON data from the URL using Axios
  axios.get(url, { responseType: 'stream' })
    .then((response) => {
      // Pipe the parser's CSV output into the PassThrough stream
      parser.parse(response.data).pipe(passThroughStream);
    })
    .catch((error) => {
      // Surface request failures to consumers of the stream
      passThroughStream.destroy(error);
    });

  return passThroughStream;
}

The generateStream function makes an API call with { responseType: 'stream' }, which tells Axios to return the data as a continuous stream instead of a single buffered response. Using a stream is memory efficient: rather than loading all the records into memory at once, you load one chunk at a time and process it.
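As a quick sanity check that data really flows through incrementally, you can listen to the returned stream directly. The URL below is a hypothetical endpoint:

const stream = generateStream('https://api.example.com/records'); // placeholder URL

stream.on('data', (chunk) => {
  // Each chunk is a piece of the CSV output, not the whole document
  console.log(`received ${chunk.length} bytes`);
});
stream.on('end', () => console.log('CSV stream complete'));
stream.on('error', (err) => console.error('stream failed:', err));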

Step 4

Upload your data to S3

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage'); // Upload lives in lib-storage, not client-s3

const s3Client = new S3Client({
  region: 'region',
  credentials: {
    accessKeyId: 'access_key',
    secretAccessKey: 'secret_key',
  },
});

async function streamJsonToS3(url, s3Bucket, s3Key) {
  try {
    const passThroughStream = generateStream(url);

    // Define the S3 upload parameters
    const uploadParams = {
      Bucket: s3Bucket,
      Key: s3Key,
      Body: passThroughStream,
      ContentType: 'text/csv', // the standard MIME type for CSV
    };

    // Upload the data to S3 as it streams through
    const upload = new Upload({
      client: s3Client,
      params: uploadParams,
    });

    await upload.done();

    console.log(`Successfully uploaded JSON data to S3 as CSV at ${s3Bucket}/${s3Key}`);
  } catch (error) {
    console.error('Error streaming and uploading data:', error);
  }
}

The PassThrough stream generated is then handed to S3 for upload.
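Putting it all together, a minimal invocation might look like this. The URL, bucket name, and key are placeholders:

streamJsonToS3(
  'https://api.example.com/records', // hypothetical JSON endpoint
  'my-data-bucket',                  // your S3 bucket
  'exports/records.csv'              // destination key for the CSV
).then(() => console.log('done'));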

Conclusion

By streaming data to AWS S3 as a CSV file using Axios, you can significantly enhance the performance and efficiency of your application. This method allows for handling large datasets without overwhelming memory resources, as data is processed in manageable chunks. Utilizing AsyncParser from the @json2csv/node package ensures that the data is accurately parsed and formatted, while using a PassThrough stream facilitates seamless data transfer to S3. Overall, this approach optimizes resource usage and maintains consistent performance regardless of the volume of data being processed.
