Bug Report
Summary
Azure calculates an MD5 checksum only when a file is uploaded in a single chunk (see SO post for more info). The chunk size defaults to 32MB but can be raised up to 256MB; anything over that requires local calculation of the MD5 hash. Alternatively, Azure offers an ETag, which does seem to be calculated for larger files, though this value doesn't appear to correspond to a content hash.
Not having checksums reliably calculated causes issues with Pimcore, which assumes the checksum method won't error out. I filed a downstream ticket: pimcore/pimcore#17438.
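For files above the single-upload threshold, a client would have to compute the MD5 itself while reading the file. A minimal sketch of that local calculation, in plain Python with no Azure SDK involved (the function name and 4MB read size are illustrative; the base64 encoding matches how Azure stores the Content-MD5 blob property):

```python
import base64
import hashlib

def local_content_md5(path: str, chunk_size: int = 4 * 1024 * 1024) -> str:
    """Compute a file's MD5 incrementally and return it base64-encoded,
    the encoding Azure uses for the Content-MD5 blob property."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # Read in fixed-size chunks so arbitrarily large files
        # never have to fit in memory.
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return base64.b64encode(digest.digest()).decode("ascii")
```

A client could set this value as the blob's Content-MD5 after a chunked upload, so that later checksum lookups behave the same as for small files.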
How to reproduce
Create a file larger than 32MB (e.g. on macOS with mkfile -n 44m ~/Downloads/44mb) and then call checksum on that file.
Suggestion
Expose an option to set singleBlobUploadThresholdInBytes on the client.
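To illustrate why exposing the threshold would help, here is a hedged sketch of the decision described above, as a standalone Python helper (the function and constant names are hypothetical; the real logic lives in the Azure client, and the actual adapter is PHP):

```python
DEFAULT_SINGLE_BLOB_UPLOAD_THRESHOLD = 32 * 1024 * 1024  # current default
MAX_SINGLE_BLOB_UPLOAD_THRESHOLD = 256 * 1024 * 1024     # service maximum

def server_computes_md5(
    file_size: int,
    threshold: int = DEFAULT_SINGLE_BLOB_UPLOAD_THRESHOLD,
) -> bool:
    """True when the file fits in a single-chunk upload, in which case the
    service stores a Content-MD5; chunked uploads get no server-side MD5."""
    if threshold > MAX_SINGLE_BLOB_UPLOAD_THRESHOLD:
        raise ValueError("threshold cannot exceed 256MB")
    return file_size <= threshold
```

With the default threshold the 44MB file from the reproduction gets no server-side MD5, but a raised threshold would cover it:

```python
server_computes_md5(44 * 1024 * 1024)                             # False
server_computes_md5(44 * 1024 * 1024, threshold=64 * 1024 * 1024) # True
```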