Description
When uploading e.g. 100 GB split into tens of thousands of files, the upload is significantly slower than uploading the same total size as one or only a few files.
Since the number of chunks (and therefore the number of upload requests to the server) is the same in both cases, the problem has to be the number of files.
We suspect the reason is that uploadNextChunk() in resumable.ts iterates over all files (and, inside each resumableFile, over all of its chunks) until it finds a chunk that still needs to be uploaded.
This means that on every call the code first has to iterate over all already finished files (and their chunks) before it can upload the next chunk.
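For illustration, a simplified sketch of the pattern we suspect (the interfaces and names here are ours, not the actual resumable.ts code):

```typescript
// Simplified sketch of the suspected iteration pattern; the
// interfaces and names are illustrative, not actual resumable.ts code.
type ChunkStatus = 'pending' | 'uploading' | 'success';

interface Chunk {
  status(): ChunkStatus;
  send(): void;
}

interface ResumableFile {
  chunks: Chunk[];
}

function uploadNextChunk(files: ResumableFile[]): boolean {
  // Every call starts from the first file again, so once the first
  // k files are finished, each call still walks all of their chunks
  // before reaching an uploadable one: O(total chunks) per call.
  for (const file of files) {
    for (const chunk of file.chunks) {
      if (chunk.status() === 'pending') {
        chunk.send();
        return true;
      }
    }
  }
  return false; // nothing left to upload
}
```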
Is there a better way to do this? Probably some solution that keeps track of the last uploaded chunk and automatically starts the next one without searching for it? One possible approach is sketched after the caveats below.
Caveats:
- Make sure that such a new solution will still work correctly with simultaneous uploads of multiple chunks. The "last uploaded" chunk might change when another concurrent upload is started.
- If the upload keeps track of the last uploaded chunk and ignores previous ones, how should resumed uploads be handled? Maybe, if no "last chunk" is known, just iterate over all chunks as before until one is found that needs uploading?
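For illustration, here is one possible cursor-based approach that tries to address both caveats (a sketch under our assumptions, reusing the Chunk and ResumableFile interfaces from the snippet above; it is not a proposal for the actual resumable.ts internals):

```typescript
// Hypothetical cursor-based dispatcher, not part of resumable.ts:
// remember where the last chunk was dispatched, search forward from
// there, and fall back to a full scan when needed.
interface Cursor {
  fileIdx: number;
  chunkIdx: number;
}

class ChunkDispatcher {
  private cursor: Cursor | null = null;

  constructor(private files: ResumableFile[]) {}

  uploadNextChunk(): boolean {
    // Fast path: continue right after the last dispatched chunk.
    if (this.cursor && this.dispatchFrom(this.cursor)) {
      return true;
    }
    // Slow path: no cursor yet (e.g. a resumed upload), or nothing
    // pending ahead of it (e.g. a failed chunk behind the cursor
    // that needs a retry): scan everything, as the code does today.
    return this.dispatchFrom({ fileIdx: 0, chunkIdx: 0 });
  }

  private dispatchFrom(start: Cursor): boolean {
    for (let f = start.fileIdx; f < this.files.length; f++) {
      const chunks = this.files[f].chunks;
      const firstChunk = f === start.fileIdx ? start.chunkIdx : 0;
      for (let c = firstChunk; c < chunks.length; c++) {
        if (chunks[c].status() === 'pending') {
          chunks[c].send();
          // Point just past the dispatched chunk so the next call
          // resumes the search here instead of at the first file.
          this.cursor = { fileIdx: f, chunkIdx: c + 1 };
          return true;
        }
      }
    }
    return false;
  }
}
```

Because JavaScript callbacks run on a single thread, the cursor cannot be mutated mid-scan by a concurrent upload, and the cursor is only ever set just past a chunk that was actually dispatched. The worst case with simultaneous chunk uploads is that the fallback full scan runs, which is no slower than the current behaviour.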