SvelteKit S3 Multipart Upload: Video Cloud Storage #

🏋🏽 Uploading Video and other Large Files to S3 Compatible Storage #

This post on SvelteKit S3 multipart upload follows on from the earlier post on uploading small files to S3 compatible storage. This time, we will see how to upload large video files to cloud storage. In that earlier post, we saw that using an S3 compatible API (even while using Backblaze, Cloudflare R2, Supabase or another cloud storage provider) makes your code more flexible than using the provider's native API. We also saw the benefits of using pre-signed URLs for file upload and download. We level up the code from that tutorial here and introduce multipart uploads with pre-signed URLs. Sticking with an S3 compatible API, we still get the flexibility benefits that brings. I hope you find this a useful and interesting extension to the previous tutorial.

⚙️ SvelteKit S3 Multipart Upload: Getting Started #

Instead of building everything from scratch, we will use the previous tutorial on SvelteKit S3 Compatible Storage Uploads as a starting point. You can start here and check out the other tutorial another day, although the multipart code will probably make more sense if you work through that tutorial first. If you did work through the pre-signed URL upload tutorial, you can create a new branch in your repo and carry on from your existing code. Otherwise, clone the following repo to get going:

    
git clone https://github.com/rodneylab/sveltekit-s3-compatible-storage.git sveltekit-s3-multipart-upload
cd sveltekit-s3-multipart-upload
pnpm install

We won't need to add any extra packages beyond the ones we used last time.

🔨 Utility Functions #

With multipart uploads, the pre-signed URL part works much as it did for a single upload. The workflow is a little different, though. We will keep the single file upload code and only use it when the file is small. With a multipart upload, we need to create a signed URL for each part we upload. Another difference is that once we have uploaded all the parts to their respective URLs, we then need to tell the provider we are done, so that they can combine the pieces at their end. For this to work, we need to add a few more utility functions to our src/lib/utilities/storage.js file. On top, we will be restructuring our app slightly, so we need to export some of the existing functions.
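
In outline, the flow we are building (using the S3 SDK command names we import in a moment) looks like this:

javascript
// 1. CreateMultipartUploadCommand    -> provider returns an UploadId for the new multipart upload
// 2. UploadPartCommand (pre-signed)  -> one signed URL per part; the browser PUTs each file slice
// 3. CompleteMultipartUploadCommand  -> provider stitches the uploaded parts back into one object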

To get going, let us import a few extra functions from the S3 SDK. Remember, although we are using the S3 SDK, we can expect our code to work with any S3 compatible provider (recall that only the initial authorization step varies from provider to provider).

src/lib/utilities/storage.js
javascript
    
1 import {
2   CompleteMultipartUploadCommand,
3   CreateMultipartUploadCommand,
4   GetObjectCommand,
5   PutObjectCommand,
6   S3,
7   UploadPartCommand,
8 } from '@aws-sdk/client-s3';

Continuing, in line 18, export the authoriseAccount function because we will want to access it from our SvelteKit endpoint:

src/lib/utilities/storage.js
javascript
    
18 export async function authoriseAccount() {
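
If you do not have the previous tutorial code to hand, authoriseAccount (for Backblaze B2) boils down to calling the b2_authorize_account endpoint and returning the fields we need later. The sketch below is illustrative rather than the exact repo code, and the environment variable names are placeholders for however you store your credentials:

javascript
// Illustrative sketch of authoriseAccount; see the previous tutorial for the real implementation.
// The environment variable names below are placeholders.
export async function authoriseAccount() {
  const credentials = Buffer.from(
    `${process.env.S3_COMPATIBLE_KEY_ID}:${process.env.S3_COMPATIBLE_APP_KEY}`,
  ).toString('base64');
  const response = await fetch('https://api.backblazeb2.com/b2api/v2/b2_authorize_account', {
    headers: { Authorization: `Basic ${credentials}` },
  });
  const { absoluteMinimumPartSize, recommendedPartSize, s3ApiUrl } = await response.json();
  return { absoluteMinimumPartSize, recommendedPartSize, s3ApiUrl };
}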

Multipart Upload Functions #

Next, we have to create the function which tells the provider we are done uploading. Add this code to the same file:

src/lib/utilities/storage.js
javascript
    
62 export async function completeMultipartUpload({ parts, client, key, uploadId }) {
63   try {
64     const { VersionId: id } = await client.send(
65       new CompleteMultipartUploadCommand({
66         Key: key,
67         Bucket: S3_COMPATIBLE_BUCKET_NAME,
68         MultipartUpload: { Parts: parts },
69         UploadId: uploadId,
70       }),
71     );
72     if (id) {
73       return { successful: true, id };
74     }
75   } catch (error) {
76     console.error('Error in completing multipart upload: ', error);
77   }
78   return { successful: false };
79 }
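
The parts argument we pass in here follows the shape the S3 CompleteMultipartUpload API expects: an array of objects, each pairing a PartNumber with the ETag returned when that part was uploaded. For example (values made up for illustration):

javascript
const parts = [
  { ETag: '"0f1e2d3c4b5a69788796a5b4c3d2e1f0"', PartNumber: 1 },
  { ETag: '"aabbccddeeff00112233445566778899"', PartNumber: 2 },
];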

As with authoriseAccount, we will need to export getS3Client:

src/lib/utilities/storage.js
javascript
    
85 export function getS3Client({ s3ApiUrl }) {
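
As a reminder of what that function does (the full version is in the previous tutorial), getS3Client simply wires the provider's S3 endpoint and your credentials into an S3 client instance. Roughly, with the region and constant names below standing in for your own configuration:

javascript
// Illustrative sketch: adapt the region and credential handling to your provider and setup.
export function getS3Client({ s3ApiUrl }) {
  return new S3({
    endpoint: s3ApiUrl,
    region: S3_COMPATIBLE_BUCKET_REGION, // e.g. 'us-west-000' for a Backblaze US West bucket
    credentials: {
      accessKeyId: S3_COMPATIBLE_ACCESS_KEY_ID,
      secretAccessKey: S3_COMPATIBLE_SECRET_ACCESS_KEY,
    },
  });
}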

Next, we want a function to generate pre-signed URLs. This works much like the function we had for single file upload pre-signed URLs, except here we create one URL for each part:

src/lib/utilities/storage.js
javascript
    
100 export async function generatePresignedPartUrls({ client, key, uploadId, partCount }) {
101   const signer = new S3RequestPresigner({ ...client.config });
102   const createRequestPromises = [];
103
104   for (let index = 0; index < partCount; index += 1) {
105     createRequestPromises.push(
106       createRequest(
107         client,
108         new UploadPartCommand({
109           Key: key,
110           Bucket: S3_COMPATIBLE_BUCKET_NAME,
111           UploadId: uploadId,
112           PartNumber: index + 1,
113         }),
114       ),
115     );
116   }
117
118   const uploadPartRequestResults = await Promise.all(createRequestPromises);
119
120   const presignPromises = [];
121   uploadPartRequestResults.forEach((element) => presignPromises.push(signer.presign(element)));
122   const presignPromiseResults = await Promise.all(presignPromises);
123   return presignPromiseResults.map((element) => formatUrl(element));
124 }
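
Note generatePresignedPartUrls leans on three helpers (S3RequestPresigner, createRequest and formatUrl) which should already be imported at the top of the file if you followed the previous tutorial. If you are wiring this up from scratch, these are the imports it expects:

javascript
import { S3RequestPresigner } from '@aws-sdk/s3-request-presigner';
import { createRequest } from '@aws-sdk/util-create-request';
import { formatUrl } from '@aws-sdk/util-format-url';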

Talking of the single upload, the generatePresignedUrls function needs exporting too:

src/lib/utilities/storage.js
javascript
    
126 export async function generatePresignedUrls({ key, s3ApiUrl }) {

Lastly, we will create a function to initiate a multipart upload using the S3 SDK:

src/lib/utilities/storage.js
javascript
    
139 export const initiateMultipartUpload = async ({ client, key }) => {
140   const { UploadId: uploadId } = await client.send(
141     new CreateMultipartUploadCommand({ Key: key, Bucket: S3_COMPATIBLE_BUCKET_NAME }),
142   );
143   return uploadId;
144 };

That was a lot of pasting! Do not worry if it is not 100% clear what we are doing yet. We will start to pull everything together in the next section, where we call these functions from our endpoint.

📹 Multipart Pre‑signed Upload Endpoint #

You might remember, from our SvelteKit frontend, we called an endpoint to tell us the pre-signed URL to upload the file to. Once we had that URL back, we proceeded with the upload directly from the frontend to the cloud provider. With multipart uploads, our ambition is again to upload directly from the frontend to our provider. For this to work, we will change the logic in the endpoint.

We will pass the file size to the endpoint when we request the pre-signed upload URLs. Based on the file size, our logic will decide whether to do a single file or multipart upload. When we authorise the account with the provider, the response includes parameters like the minimum and recommended part size, and we use these to split the file. To look at a concrete example, let's say we want to upload a 16 MB video and the recommended part size is 5 MB. In this case, we will need four parts: the first three parts will be 5 MB and the final one, 1 MB. Typically, the provider does not enforce the minimum part size on the final part of a multipart upload.
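
Expressed in code, the part arithmetic is just a ceiling division (sizes in bytes, values from the example above):

javascript
const size = 16 * 1024 * 1024; // 16 MB file
const partSize = 5 * 1024 * 1024; // 5 MB recommended part size
const partCount = Math.ceil(size / partSize); // 4 parts: 5 MB + 5 MB + 5 MB + 1 MB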

Now we know what we are doing, let's get coding!

SvelteKit S3 Multipart Upload: presigned‑urls.json Endpoint Code #

This is a substantial refactor of the previous code for the file at src/routes/api/presigned-urls.json/+server.js:

src/routes/api/presigned-urls.json/+server.js
javascript
    
1 import {
2   authoriseAccount,
3   generatePresignedPartUrls,
4   getS3Client,
5   initiateMultipartUpload,
6   generatePresignedUrls,
7 } from '$lib/utilities/storage';
8
9 export async function POST({ request, setHeaders }) {
10   const { key, size } = await request.json();
11
12   try {
13     const { absoluteMinimumPartSize, recommendedPartSize, s3ApiUrl } = await authoriseAccount();
14     if (s3ApiUrl) {
15       const client = getS3Client({ s3ApiUrl });
16       if (absoluteMinimumPartSize && size > absoluteMinimumPartSize) {
17         const uploadId = await initiateMultipartUpload({ client, key });
18         if (recommendedPartSize) {
19           const partSize =
20             size < recommendedPartSize ? absoluteMinimumPartSize : recommendedPartSize;
21           const partCount = Math.ceil(size / partSize);
22           if (uploadId) {
23             const multipartUploadUrls = await generatePresignedPartUrls({
24               client,
25               key,
26               uploadId,
27               partCount,
28             });
29
30             const { readSignedUrl, writeSignedUrl } = await generatePresignedUrls({ key, s3ApiUrl });
31
32             setHeaders({
33               'Content-Type': 'application/json',
34             });
35
36             return new Response(
37               JSON.stringify({
38                 multipartUploadUrls,
39                 partCount,
40                 partSize,
41                 readSignedUrl,
42                 writeSignedUrl,
43                 uploadId,
44               }),
45             );
46           }
47         }
48       }
49
50       const { readSignedUrl, writeSignedUrl } = await generatePresignedUrls({ key, s3ApiUrl });
51
52       setHeaders({
53         'Content-Type': 'application/json',
54       });
55
56       return new Response(
57         JSON.stringify({
58           partCount: 1,
59           readSignedUrl,
60           writeSignedUrl,
61         }),
62       );
63     }
64   } catch (error) {
65     const message = `Error in route api/presigned-urls.json: ${error}`;
66     console.error(message);
67     throw new Error(message);
68   }
69 }

At the top of the file, you can see we now import the functions we just exported from the utilities file. In line 13, we get the file size parameters we talked about, and in line 16 we use them to work out whether to do a multipart or a single upload. For a single upload, we jump to line 50 and the code is not too different to what we had last time. We just add a partCount field to the response (line 58), to let the frontend code know there is only one part.

For multipart uploads, we work out how big each part is, based on the recommendedPartSize provided in our authorization response. Once we have that, it is just a case of generating the pre-signed URLs and returning these to the frontend with some extra metadata we will find handy.
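
For a multipart upload, the JSON our endpoint returns to the frontend then looks something like this (shown as a JavaScript object, with placeholder values):

javascript
// Illustrative response body for a multipart upload; values and URLs are placeholders.
const exampleResponseBody = {
  multipartUploadUrls: ['<signed PUT URL for part 1>', '<signed PUT URL for part 2>' /* … */],
  partCount: 4,
  partSize: 5242880,
  readSignedUrl: '<signed GET URL>',
  writeSignedUrl: '<signed PUT URL>',
  uploadId: '<provider generated upload id>',
};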

🚛 Complete Multipart Upload Endpoint #

Once the parts have been uploaded, we need to let the provider know, so they can piece the parts together. We will have a separate endpoint for this. Let’s create the file now at src/routes/api/complete-multipart-upload.json/+server.js, pasting in the content below:

src/routes/api/complete-multipart-upload.json/+server.js
javascript
    
import { authoriseAccount, completeMultipartUpload, getS3Client } from '$lib/utilities/storage';

export async function POST({ request, setHeaders }) {
  const { key, parts, uploadId } = await request.json();
  try {
    const { s3ApiUrl } = await authoriseAccount();
    if (s3ApiUrl) {
      const client = getS3Client({ s3ApiUrl });
      await completeMultipartUpload({ parts, client, key, uploadId });
      return new Response();
    }
    setHeaders({
      'Content-Type': 'application/json',
    });
    return new Response(JSON.stringify({ message: 'unauthorised' }));
  } catch (error) {
    const message = `Error in route api/complete-multipart-upload.json: ${error}`;
    console.error(message);
    throw new Error(message);
  }
}

That’s all the endpoint code in place now. Let's move on to the client page next.

🧑🏽 Client Homepage Svelte Code #

There’s not too much to change vs. the single file upload code. We'll start by adding a completeMultipartUpload function which calls that last endpoint we created. Add this block to src/routes/+page.svelte:

src/routes/+page.svelte
svelte
    
async function completeMultipartUpload({ key, parts, uploadId }) {
  try {
    await fetch('/api/complete-multipart-upload.json', {
      method: 'POST',
      credentials: 'omit',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ key, parts, uploadId }),
    });
  } catch (error) {
    console.error(`Error in completeMultipartUpload on / route: ${error}`);
  }
}

Handle Submit #

Next we need to check in handleSubmit whether we have a single or multipart upload. If you are using this code in your own new project, you will probably want to refactor the block into separate functions, possibly in different files. Anyway, for now, paste in this block:

src/routes/+page.svelte
svelte
    
41 const handleSubmit = async (event) => {
42   event.preventDefault();
43   try {
44     if (files.length === 0) {
45       errors.files = 'Select a file to upload first';
46       return;
47     }
48
49     isSubmitting = true;
50     const { name: key, size, type } = files[0];
51
52     // get signed upload URL
53     const response = await fetch('/api/presigned-urls.json', {
54       method: 'POST',
55       credentials: 'omit',
56       headers: {
57         'Content-Type': 'application/json',
58       },
59       body: JSON.stringify({ key, size }),
60     });
61     const json = await response.json();
62     const { multipartUploadUrls, partCount, partSize, readSignedUrl, writeSignedUrl, uploadId } =
63       json;
64     const reader = new FileReader();
65     if (partCount === 1) {
66       downloadUrl = readSignedUrl;
67
68       // Upload (single part) file
69       reader.onloadend = async () => {
70         await fetch(writeSignedUrl, {
71           method: 'PUT',
72           body: reader.result,
73           headers: {
74             'Content-Type': type,
75           },
76         });
77         uploadComplete = true;
78         isSubmitting = false;
79       };
80       reader.readAsArrayBuffer(files[0]);
81     } else {
82       downloadUrl = readSignedUrl;
83       const lastIndex = multipartUploadUrls.length - 1;
84
85       // Upload (multipart) file
86       reader.onloadend = async () => {
87         const uploadPromises = multipartUploadUrls.map((element, index) =>
88           fetch(element, {
89             method: 'PUT',
90             body:
91               index !== lastIndex
92                 ? reader.result.slice(index * partSize, (index + 1) * partSize)
93                 : reader.result.slice(index * partSize),
94             headers: {
95               'Content-Type': type,
96               'Content-Length': index !== lastIndex ? partSize : size - index * partSize,
97             },
98           }),
99         );
100         const uploadResults = await Promise.all(uploadPromises);
101         const parts = uploadResults.map((element, index) => ({
102           ETag: element.headers.get('etag'),
103           PartNumber: index + 1,
104         }));
105         await completeMultipartUpload({ parts, key, uploadId });
106         uploadComplete = true;
107         isSubmitting = false;
108       };
109       reader.readAsArrayBuffer(files[0]);
110     }
111   } catch (error) {
112     console.error(`Error in handleSubmit on / route: ${error}`);
113   }
114 };
115 </script>

Notice in line 50 we now get the file size, so we can pass that to the pre-signed URL endpoint. The value we have is in bytes. For single part uploads, nothing really changes, so let's jump to the reader.onloadend block for multipart uploads, starting at line 85.

We use JavaScript's Promise API so that we do not need to wait for one part to finish uploading before we start on the next one. This allows for faster uploads. For larger files, where there will be dozens of parts, it would make sense to extend this code to throttle the uploads, so we only upload, say, three or four parts simultaneously and wait for one of those to finish before starting on a new part. We won't build that into the tutorial code, though the sketch below gives a flavour of the idea.
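
A minimal sketch of a concurrency limit might look like this (uploadWithConcurrencyLimit is a hypothetical helper name, not something from the tutorial repo):

javascript
// Hypothetical helper: run the upload tasks with at most `limit` requests in flight at once.
async function uploadWithConcurrencyLimit(tasks, limit = 4) {
  const results = [];
  let nextIndex = 0;

  async function worker() {
    while (nextIndex < tasks.length) {
      const index = nextIndex;
      nextIndex += 1;
      results[index] = await tasks[index]();
    }
  }

  await Promise.all(Array.from({ length: Math.min(limit, tasks.length) }, () => worker()));
  return results;
}

// Usage sketch: wrap each part upload in a function instead of starting it immediately, e.g.
// const uploadResults = await uploadWithConcurrencyLimit(
//   multipartUploadUrls.map((url, index) => () => fetch(url, { /* PUT options as above */ })),
// );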

The code in lines 90 – 93 splits the file into chunks of the right size. We compute each part's length and send it in the Content-Length header in line 96.

Multipart Upload Completion #

When we complete the multipart upload, to help the provider piece the parts together, we send an ID to identify each part. That ID comes in the form of an ETag, which is included in the response headers our provider sends back for each uploaded part. We collate this data into the parts variable in lines 101 – 104.

That parts object is passed to our completeMultipartUpload in this file and subsequently passed to the endpoint and the utility function.

Allowing Video Upload #

The final change is to update the user interface to accept video as well as image files:

src/routes/+page.svelte
svelte
    
165 <input
166   id="file"
167   aria-invalid={errors.files != null}
168   aria-describedby={errors.files != null ? 'files-error' : null}
169   type="file"
170   multiple
171   formenctype="multipart/form-data"
172   accept="image/*,video/*"
173   title="File"
174   onchange={handleChange}
175 />

Remember, you can change this to be more restrictive or, in fact, allow other types based on your own needs.

⛔️ CORS Update #

Because we want to look at a new header (the ETag header) from the client browser, we will need to update the bucket CORS policy. Check how to do this with your storage provider. If you are using Backblaze, you can update the backblaze-bucket-cors-rules.json file we introduced in the previous tutorial and submit this to Backblaze using the CLI.

backblaze-bucket-cors-rules.json
json
    
1 [
2   {
3     "corsRuleName": "development",
4     "allowedOrigins": ["https://test.localhost.com:5173"],
5     "allowedHeaders": ["content-type", "range"],
6     "allowedOperations": ["s3_put"],
7     "exposeHeaders": ["etag", "x-amz-version-id"],
8     "maxAgeSeconds": 300
9   },
10   {
11     "corsRuleName": "production",
12     "allowedOrigins": ["https://example.com"],
13     "allowedHeaders": ["content-type", "range"],
14     "allowedOperations": ["s3_put"],
15     "exposeHeaders": ["etag", "x-amz-version-id"],
16     "maxAgeSeconds": 3600
17   }
18 ]
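
If you are using the Backblaze B2 CLI, applying the rules looks roughly like the command below; the exact syntax has changed between CLI releases, so check b2 update-bucket --help and swap in your own bucket name and bucket type:

b2 update-bucket --corsRules "$(cat backblaze-bucket-cors-rules.json)" your-bucket-name allPrivate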


🙌🏽 SvelteKit S3 Multipart Upload: What we Learned #

In this post we looked at:

  • how you can upload larger files to S3 compatible cloud storage,
  • generating pre-signed URLs for multipart upload,
  • how you can determine whether to use single or multipart upload and also calculate part size when choosing multipart upload.

I do hope there is at least one thing in this article which you can use in your work or a side project. As an extension, you might consider throttling uploads, especially when uploading very large files with many parts. You could also extend the UI to show existing uploads in the bucket, or generate download pre-signed links with custom parameters, like link validity. On top, consider adding code to abort failed multipart uploads; abandoned parts can otherwise sit in the bucket and add to storage costs.

You can see the full completed code for this tutorial on the Rodney Lab GitHub repo.

🏁 SvelteKit S3 Multipart Upload: Summary #

When should you use an S3 multipart upload? #

Multipart uploads make sense for larger files; typically videos and other files of similar size. For images, small PDFs and so on, single part uploads are generally fine. As a rule of thumb, consider multipart uploads for files larger than 5 MB. Multipart uploads let your file upload more quickly because the file can be split into parts which are uploaded simultaneously. Consider a 16 MB file: instead of uploading 16 MB from start to finish, you can split the file into four chunks and start uploading all of them at the same time.

Can you upload large files to S3 with a pre-signed URL? #

Yes. Much like you can upload single files using pre-signed URLs, you can upload files in multiple parts too. As an initial step, you need to work out how many parts you will use and get a pre-signed URL for each of them. You then upload the parts, keeping hold of the ETag header included in the server response as each part finishes uploading. You use those ETags in a final call to let the provider know the upload is complete; they help the provider combine the parts into a single object (or file) in the cloud.

Is it possible to upload large files to S3 compatible storage from a SvelteKit app? #

Yes! We look at SvelteKit S3 multipart uploads in this tutorial. On top, we work out when a multipart or single part upload makes most sense, and how to upload directly from the client browser using pre-signed URLs with S3 compatible cloud storage providers.

🙏🏽 SvelteKit S3 Multipart Upload: Feedback #

Have you found the post useful? Would you prefer to see posts on another topic instead? Get in touch with ideas for new posts. Also, if you like my writing style, get in touch if I can write some posts for your company site on a consultancy basis. Read on to find ways to get in touch, further below. If you want to support posts similar to this one and can spare a few dollars, euros or pounds, please consider supporting me through Buy me a Coffee.


Finally, feel free to share the post on your social media accounts for all your followers who will find it useful. As well as leaving a comment below, you can get in touch via @askRodney on Twitter and also askRodney on Telegram. Also, see further ways to get in touch with Rodney Lab. I post regularly on SvelteKit as well as other topics. Also, subscribe to the newsletter to keep up-to-date with our latest projects.

Thanks for reading this post. I hope you found it valuable. Please get in touch with your feedback and suggestions for posts you would like to see. Read more about me …
