When a user creates a custom template, they specify a Dockerfile. If the Dockerfile requires copying more than 4,294,967,295 bytes of local data, the build fails with the following error:
```
Preparing sandbox template building (35639 files in Docker build context).
Found ./e2b.Dockerfile that will be used to build the sandbox template.

node:internal/blob:165
    throw new ERR_BUFFER_TOO_LARGE(0xFFFFFFFF);
    ^

RangeError [ERR_BUFFER_TOO_LARGE]: Cannot create a Buffer larger than 4294967295 bytes
    at new NodeError (node:internal/errors:399:5)
    at new Blob (node:internal/blob:165:13)
    at R1.<anonymous> (/Users/default/Library/pnpm/global/5/.pnpm/@e2b+cli@0.1.20_openai@4.24.1/node_modules/@e2b/cli/dist/index.js:104:1093)
    at R1.emit (node:events:513:28)
    at n1.updateNonPrimary (/Users/default/Library/pnpm/global/5/.pnpm/@e2b+cli@0.1.20_openai@4.24.1/node_modules/@e2b/cli/dist/index.js:54:11372)
    at n1.update (/Users/default/Library/pnpm/global/5/.pnpm/@e2b+cli@0.1.20_openai@4.24.1/node_modules/@e2b/cli/dist/index.js:54:11193)
    at n1.ez (/Users/default/Library/pnpm/global/5/.pnpm/@e2b+cli@0.1.20_openai@4.24.1/node_modules/@e2b/cli/dist/index.js:54:13685)
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11) {
  code: 'ERR_BUFFER_TOO_LARGE'
}

Node.js v18.16.1
```
We're currently using Node's Blob to represent the build context and upload it all at once, which caps the upload at the 4,294,967,295-byte Buffer limit. One potential solution is to stream the data to the backend in chunks instead.
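As a rough illustration of the streaming approach, here is a minimal sketch (not the actual E2B CLI code). It assumes the build context has already been packed into a tarball on disk, and `UPLOAD_URL` is a hypothetical endpoint that accepts a raw octet stream:

```ts
// Minimal sketch of streaming the build context instead of buffering it.
// UPLOAD_URL and the endpoint's contract are hypothetical placeholders.
import { createReadStream } from "node:fs";
import { Readable } from "node:stream";

const UPLOAD_URL = "https://example.invalid/build-context"; // hypothetical

async function uploadBuildContext(tarPath: string): Promise<void> {
  // Stream the tarball from disk rather than materializing it as one Blob,
  // so memory stays flat and we never hit the 0xFFFFFFFF-byte Blob ceiling.
  const body = Readable.toWeb(createReadStream(tarPath));

  const res = await fetch(UPLOAD_URL, {
    method: "POST",
    headers: { "Content-Type": "application/octet-stream" },
    body: body as unknown as BodyInit,
    // Node's fetch (undici) requires `duplex: "half"` for streamed request
    // bodies; the cast is only for TS lib defs that don't declare it.
    duplex: "half",
  } as RequestInit);

  if (!res.ok) {
    throw new Error(`Upload failed: ${res.status} ${res.statusText}`);
  }
}
```

A variant of this would split the stream into fixed-size parts and upload them with retries (multipart-style), which also makes very large uploads resumable; either way, nothing larger than a small window of the context is ever held in memory.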
mlejva changed the title to "[E2B-457] Unable to create custom templates with Dockerfiles containing more than ~4.3GB of local data" on Dec 31, 2023.