
JavaScript Heap Out Of Memory | S3.getObject In React/Node App

We have a React/Node app hosted on Digital Ocean. We're also utilizing Digital Ocean Spaces, which is interoperable with AWS S3, for object storage. Basically, the app is an in-house Dropbox: Admins can create folders and upload content to them, and Clients can log in and download any files we grant them access to.

We're able to upload all files to Digital Ocean Spaces successfully, no matter how large or small they are.

The problem is that when we try to download (as Admin or Client) any content over 100MB, we hit a JavaScript heap out of memory error. This error occurs on the backend of the system.

Some solutions we've attempted are:

  1. Provisioning the VM with more RAM
  2. Increasing the Node heap limit with '--max-old-space-size' (a Node flag, not a browser setting)
  3. Enabling the Digital Ocean CDN to stream content through edge servers
  4. Manually streaming file data to our backend and then sending it to the frontend

Frontend code

downloadFile = (id, name, type) => {
  axios
    .get(
      `/test-download/${id}`,
      // Note: this fires the snackbar immediately; it is not an axios option
      this.props.handleSnackBar("Your download has been started. Please wait.")
    )
    .then(res => {
      console.log(res);
      console.log(res.data.data.Body);
      // res.data.data.Body.data is the JSON-serialized Buffer's byte array
      download(new Blob([new Uint8Array(res.data.data.Body.data)]), `${name}.${type}`);
      this.props.handleSnackBar("Your download is now ready.");
    })
    .catch(err => console.log(err));
};

Backend code

app.get("/test-download/:id", (req, res) => {
  var params = {
    Bucket: bucketName,
    Key: req.params.id
  };

  s3.getObject(params, function(err, data) {
    console.log(data);
    if (!err) {
      // Buffers the entire object in memory before responding
      res.send({ data, key: params.Key });
    } else {
      console.log({ err }); // an error occurred
    }
  });
});
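As an aside on the response shape: res.send({ data, ... }) JSON-serializes the Buffer in data.Body as { type: "Buffer", data: [...] }, which is why the frontend above unpacks res.data.data.Body.data, and why the payload is far larger than the file itself. A quick self-contained illustration:

```javascript
// How a Node Buffer looks after JSON serialization (what res.send produces).
const body = Buffer.from("Hi"); // stand-in for data.Body from getObject
const json = JSON.stringify({ Body: body });
console.log(json); // {"Body":{"type":"Buffer","data":[72,105]}}

// The frontend reverses it the same way the question's code does:
const parsed = JSON.parse(json);
const bytes = new Uint8Array(parsed.Body.data);
console.log(Buffer.from(bytes).toString()); // "Hi"
```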

Backend code with stream

app.get("/test-download/:id", (req, res) => {
  var params = {
    Bucket: bucketName,
    Key: req.params.id
  };

  const fileRequest = s3.getObject(params);

  let chunks = [];
  fileRequest
    .createReadStream()
    .on("data", function(data) {
      console.log(`Received ${data.length} bytes of data`);
      chunks.push(data);
    })
    .on("end", function() {
      console.log("no more data");
      // Still accumulates the whole file in memory, so the heap problem remains
      const bufferData = Buffer.concat(chunks);
      console.log(bufferData);
      res.send({ bufferData, key: params.Key });
    });
});

So, basically I'm sort of stuck. Any assistance that can be offered is greatly appreciated. Thanks.


Answer

Thanks to Marcos, I revisited the piping code we had attempted. Once I fully understood the raw data response I was receiving from createReadStream().pipe(), I was able to convert the data on the frontend.

Backend code

app.get("/test-download/:id", (req, res) => {
  var params = {
    Bucket: bucketName,
    Key: req.params.id
  };

  s3.getObject(params)
    .createReadStream()
    .pipe(res)
    .on("finish", () => {
      console.log("** done");
    });
});
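One gap worth noting in this route (not addressed in the original answer): nothing sets a Content-Type or Content-Disposition, and a stream error after piping begins just drops the connection. A hedged sketch of one way to fill that in, where downloadHeaders is a hypothetical helper, not part of the app:

```javascript
// Hypothetical helper: headers to set before piping the S3 stream to res.
function downloadHeaders(filename, contentType = "application/octet-stream") {
  return {
    "Content-Type": contentType,
    "Content-Disposition": `attachment; filename="${filename}"`
  };
}

// Sketch of wiring it into the route (assumes the same s3/params as above):
//   res.set(downloadHeaders(req.params.id));
//   s3.getObject(params)
//     .createReadStream()
//     .on("error", err => {
//       console.log({ err });
//       if (!res.headersSent) res.status(500).end();
//     })
//     .pipe(res);

console.log(downloadHeaders("report.pdf", "application/pdf"));
```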

Frontend code

downloadFile = (id, name, type) => {
  axios
    .get(
      `/test-download/${id}`,
      { responseType: "arraybuffer" }, // receive the piped bytes as an ArrayBuffer
      this.props.handleSnackBar("Your download has been started. Please wait.")
    )
    .then(res => {
      console.log(res);
      download(res.data, `${name}.${type}`);
      this.props.handleSnackBar("Your download is now ready.");
    })
    .catch(err => console.log(err));
};
source: stackoverflow.com