Hitting the Pain Points Head-On: A One-Stop Guide to AWS S3 Large-File Uploads
2023-12-22 13:11:09
How to Integrate AWS S3 File Uploads into JavaScript
Introduction
In the era of exploding data volumes, we routinely need to upload large files such as videos, images, and audio. Traditional upload approaches are often slow and error-prone. AWS S3, an object storage service, provides an efficient and reliable upload solution. This article explains how to integrate the AWS S3 large-file upload APIs into JavaScript (using the AWS SDK for JavaScript v2), covering parallel multi-file uploads, multipart (chunked) uploads, resumable uploads, merging uploaded parts, pausing and cancelling uploads, and displaying an upload progress bar.
1. Multi-File Parallel Upload
Multi-file parallel upload means uploading several files at the same time. This greatly improves throughput, especially when uploading many small files. In JavaScript it can be implemented as follows:
const S3 = require('aws-sdk/clients/s3');
const fs = require('fs');

const s3 = new S3();
const files = ['file1.txt', 'file2.txt', 'file3.txt'];

// Each s3.upload call starts immediately, so the files are transferred in parallel
files.forEach((file) => {
  s3.upload({
    Bucket: 'my-bucket',
    Key: file,
    Body: fs.createReadStream(file),
  }, (err, data) => {
    if (err) {
      console.error(err);
    } else {
      console.log(`File ${file} uploaded successfully`);
    }
  });
});
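If the caller needs to know when every file has finished (or fail fast when any upload errors out), each call can be turned into a Promise with the SDK's .promise() helper and awaited together. A small sketch based on the same files array as above:

// Start all uploads in parallel and wait for every one of them to finish
const uploads = files.map((file) =>
  s3.upload({
    Bucket: 'my-bucket',
    Key: file,
    Body: fs.createReadStream(file),
  }).promise()
);

Promise.all(uploads)
  .then(() => console.log('All files uploaded'))
  .catch((err) => console.error('At least one upload failed:', err));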
2. Multipart (Chunked) Upload
Multipart upload splits a large file into smaller parts and uploads each part separately, so a transient network error only costs a single part instead of the whole file. In the AWS SDK this is the multipart upload API: createMultipartUpload starts the upload, uploadPart sends each part, and completeMultipartUpload (see section 4) merges them. In JavaScript it can be implemented as follows:
const S3 = require('aws-sdk/clients/s3');
const fs = require('fs');
const s3 = new S3();
const file = 'large-file.txt';
const chunkSize = 10 * 1024 * 1024; // 10MB per part (S3 requires at least 5MB for every part except the last)

async function multipartUpload() {
  const buffer = fs.readFileSync(file);
  const numChunks = Math.ceil(buffer.length / chunkSize);

  // Step 1: start the multipart upload and get an UploadId
  const { UploadId } = await s3.createMultipartUpload({
    Bucket: 'my-bucket',
    Key: file,
  }).promise();

  // Step 2: upload each part and keep the returned ETag for the final merge
  const parts = [];
  for (let i = 0; i < numChunks; i++) {
    const startByte = i * chunkSize;
    const endByte = Math.min(startByte + chunkSize, buffer.length);
    const { ETag } = await s3.uploadPart({
      Bucket: 'my-bucket',
      Key: file,
      UploadId,
      PartNumber: i + 1,
      Body: buffer.slice(startByte, endByte),
    }).promise();
    parts.push({ PartNumber: i + 1, ETag });
    console.log(`Part ${i + 1} of file ${file} uploaded successfully`);
  }
  return { UploadId, parts };
}

multipartUpload().catch(console.error);
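The example above reads the file with fs, which only exists in Node.js. In a browser, a File picked from an input element is a Blob and can be cut into parts with Blob.prototype.slice; each resulting Blob can then be passed as the Body of an uploadPart call. A rough sketch (the helper name is just illustrative):

// Split a browser File (a Blob) into fixed-size parts for uploadPart
function sliceFile(file, partSize = 10 * 1024 * 1024) {
  const parts = [];
  for (let start = 0; start < file.size; start += partSize) {
    parts.push(file.slice(start, Math.min(start + partSize, file.size)));
  }
  return parts;
}

// Example usage: const parts = sliceFile(fileInput.files[0]);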
3. Resumable Upload
Resumable upload means that if the connection drops partway through, the upload can continue from where it stopped instead of starting over. With S3 multipart uploads this works by keeping the UploadId, asking S3 which parts it has already received (listParts), and uploading only the missing parts. In JavaScript it can be implemented as follows:
const S3 = require('aws-sdk/clients/s3');
const fs = require('fs');
const s3 = new S3();
const file = 'large-file.txt';
const chunkSize = 10 * 1024 * 1024; // 10MB per part
const uploadId = 'my-upload-id'; // The UploadId saved when the multipart upload was started

async function resumeUpload() {
  const buffer = fs.readFileSync(file);
  const numChunks = Math.ceil(buffer.length / chunkSize);

  // Ask S3 which parts were already uploaded before the interruption
  const listed = await s3.listParts({
    Bucket: 'my-bucket',
    Key: file,
    UploadId: uploadId,
  }).promise();
  const uploadedParts = new Set(listed.Parts.map((p) => p.PartNumber));

  // Upload only the parts that are still missing
  for (let i = 0; i < numChunks; i++) {
    const partNumber = i + 1;
    if (uploadedParts.has(partNumber)) {
      console.log(`Part ${partNumber} already uploaded, skipping`);
      continue;
    }
    const startByte = i * chunkSize;
    const endByte = Math.min(startByte + chunkSize, buffer.length);
    await s3.uploadPart({
      Bucket: 'my-bucket',
      Key: file,
      UploadId: uploadId,
      PartNumber: partNumber,
      Body: buffer.slice(startByte, endByte),
    }).promise();
    console.log(`Part ${partNumber} of file ${file} uploaded successfully`);
  }
}

resumeUpload().catch(console.error);
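For resuming to survive a page reload or process restart, the UploadId has to be stored somewhere durable. A minimal sketch, assuming a browser environment where localStorage is available (the key naming and helper names are just examples):

// Save the UploadId when the multipart upload is started
function saveUploadId(key, uploadId) {
  localStorage.setItem(`s3-upload:${key}`, uploadId);
}

// Read it back later: if it exists, resume via listParts/uploadPart; otherwise start a new upload
function loadUploadId(key) {
  return localStorage.getItem(`s3-upload:${key}`);
}

// Remove it once completeMultipartUpload (or abortMultipartUpload) has succeeded
function clearUploadId(key) {
  localStorage.removeItem(`s3-upload:${key}`);
}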
4. Merging the Uploaded Parts
Merging the parts means asking S3 to assemble the uploaded parts into one complete object. This is done with completeMultipartUpload, which requires the UploadId together with the PartNumber and ETag of every uploaded part. In JavaScript it can be implemented as follows:
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();
const file = 'large-file.txt';

// `parts` must contain the PartNumber/ETag pairs returned by each uploadPart call
// in the previous sections; S3 uses them to verify and assemble the parts
const parts = [
  // { PartNumber: 1, ETag: '"..."' },
  // { PartNumber: 2, ETag: '"..."' },
];

const params = {
  Bucket: 'my-bucket',
  Key: file,
  UploadId: 'my-upload-id', // The UploadId returned by createMultipartUpload
  MultipartUpload: {
    Parts: parts,
  },
};

s3.completeMultipartUpload(params, (err, data) => {
  if (err) {
    console.error(err);
  } else {
    console.log(`File ${file} uploaded successfully`);
  }
});
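If the ETags were not kept locally (for example, the process restarted between uploading the parts and merging them), they can be read back from S3 with listParts and fed straight into completeMultipartUpload. A sketch, assuming the same bucket, key, and placeholder UploadId as above:

// Recover the PartNumber/ETag pairs from S3 and complete the upload with them
async function completeFromListedParts() {
  const listed = await s3.listParts({
    Bucket: 'my-bucket',
    Key: file,
    UploadId: 'my-upload-id',
  }).promise();

  return s3.completeMultipartUpload({
    Bucket: 'my-bucket',
    Key: file,
    UploadId: 'my-upload-id',
    MultipartUpload: {
      Parts: listed.Parts.map((p) => ({ PartNumber: p.PartNumber, ETag: p.ETag })),
    },
  }).promise();
}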
5. Pausing and Cancelling an Upload
Pausing and cancelling mean the upload can be stopped partway through, but they are not the same thing: pausing simply stops sending further parts while keeping the UploadId so the upload can be resumed later, whereas cancelling calls abortMultipartUpload, which tells S3 to discard everything uploaded so far. In JavaScript it can be implemented as follows:
const S3 = require('aws-sdk/clients/s3');
const fs = require('fs');
const s3 = new S3();
const file = 'large-file.txt';
const chunkSize = 10 * 1024 * 1024; // 10MB per part

let paused = false;   // set to true to pause between parts
let uploadId = null;  // kept so a paused upload can be resumed or cancelled
let nextPart = 0;     // index of the next part to send

async function uploadWithPauseSupport() {
  const buffer = fs.readFileSync(file);
  const numChunks = Math.ceil(buffer.length / chunkSize);

  // Start the multipart upload only once; when resuming, the existing UploadId is reused
  if (!uploadId) {
    ({ UploadId: uploadId } = await s3.createMultipartUpload({
      Bucket: 'my-bucket',
      Key: file,
    }).promise());
  }

  while (nextPart < numChunks) {
    if (paused) {
      console.log(`Upload of file ${file} paused before part ${nextPart + 1}`);
      return; // stop sending parts; uploadId and nextPart are kept for resuming
    }
    const startByte = nextPart * chunkSize;
    const endByte = Math.min(startByte + chunkSize, buffer.length);
    await s3.uploadPart({
      Bucket: 'my-bucket',
      Key: file,
      UploadId: uploadId,
      PartNumber: nextPart + 1,
      Body: buffer.slice(startByte, endByte),
    }).promise();
    console.log(`Part ${nextPart + 1} of file ${file} uploaded successfully`);
    nextPart++;
  }
}

uploadWithPauseSupport().catch(console.error);
// Pause the upload: stop sending new parts but keep uploadId and nextPart,
// so the parts already stored in S3 are preserved and the upload can be resumed later
function pauseUpload() {
  paused = true;
}
// Cancel the upload: abort the multipart upload so that S3 discards all uploaded parts
s3.abortMultipartUpload({
  Bucket: 'my-bucket',
  Key: file,
  UploadId: uploadId,
}, (err, data) => {
  if (err) {
    console.error(err);
  } else {
    console.log(`Upload of file ${file} cancelled successfully`);
  }
});
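Resuming a paused upload is then just the reverse of pausing: clear the flag and run the loop again. Because uploadId and nextPart are kept, a sketch like the following (reusing the variables introduced above) continues right where it left off:

// Resume a paused upload: the existing UploadId is reused and the loop continues from nextPart
function resumePausedUpload() {
  paused = false;
  return uploadWithPauseSupport();
}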
6. Upload Progress Bar
An upload progress bar shows how much of the file has been transferred. The SDK's managed upload emits httpUploadProgress events containing the number of bytes sent, which can be used to drive a progress element. In JavaScript it can be implemented as follows:
const S3 = require('aws-sdk/clients/s3');
const fs = require('fs');
const s3 = new S3();
const file = 'large-file.txt';
// Create a progress element to display upload progress (assumes a browser-like page is available)
const progressBar = document.createElement('progress');
progressBar.max = 100;
document.body.appendChild(progressBar);
// The managed upload emits 'httpUploadProgress' events with loaded/total byte counts
s3.upload({
  Bucket: 'my-bucket',
  Key: file,
  Body: fs.readFileSync(file), // a Buffer, so the total size is known up front
}).on('httpUploadProgress', (progress) => {
  progressBar.value = Math.round((progress.loaded / progress.total) * 100);
}).send((err, data) => {
  if (err) {
    console.error(err);
  } else {
    console.log(`File ${file} uploaded successfully`);
  }
});