I guess this is something that can be done file by file, since we compute the digest of the Cardano database immutable files one by one with the CardanoImmutableDigester 👍
Why
Considering that the digest currently takes over 10 minutes to compute on Mainnet, showing progress to the user would make for a better UX.
What
I see in this code fragment that no progress is reported while the digest is being computed.
This can be tricky because I'm assuming that the digest is recursive. But we could check the whole unpacked directory size first? Maybe even in parallel, delaying the first progress update so as not to lose unpacking time? Or even just progress in terms of the number of files processed would work well; the speed wouldn't be constant, but it would still help.
(original Slack thread)
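For illustration, here is a minimal sketch of the "progress in terms of the number of files processed" idea, in Rust. It does not reproduce the CardanoImmutableDigester or Mithril's actual digest scheme; the function name, the ProgressFn callback, the sha2 dependency, and the ./db/immutable path are assumptions made for the example.

```rust
// Assumed dependency: sha2 = "0.10"; everything else is std.
use sha2::{Digest, Sha256};
use std::fs::File;
use std::io::{self, Read};
use std::path::{Path, PathBuf};

/// Hypothetical progress callback: (files processed so far, total files).
type ProgressFn = dyn Fn(usize, usize);

/// Hash the immutable files one by one, reporting progress after each file.
/// The digest scheme here (a hash of per-file hashes, over *.chunk files only)
/// is illustrative, not Mithril's actual computation.
fn digest_immutable_files(immutable_dir: &Path, on_progress: &ProgressFn) -> io::Result<String> {
    // Cheap pre-scan: list the files up front so the total is known before hashing.
    let mut files: Vec<PathBuf> = std::fs::read_dir(immutable_dir)?
        .filter_map(|entry| entry.ok().map(|e| e.path()))
        .filter(|p| p.extension().map_or(false, |ext| ext == "chunk"))
        .collect();
    files.sort();

    let total = files.len();
    let mut global_hasher = Sha256::new();
    let mut buffer = vec![0u8; 1 << 20]; // 1 MiB read buffer

    for (index, path) in files.iter().enumerate() {
        let mut file = File::open(path)?;
        let mut file_hasher = Sha256::new();
        loop {
            let read = file.read(&mut buffer)?;
            if read == 0 {
                break;
            }
            file_hasher.update(&buffer[..read]);
        }
        global_hasher.update(file_hasher.finalize());
        // One update per processed file: coarse, but cheap and monotonic.
        on_progress(index + 1, total);
    }

    Ok(global_hasher
        .finalize()
        .iter()
        .map(|byte| format!("{byte:02x}"))
        .collect())
}

fn main() -> io::Result<()> {
    // Hypothetical unpacked database location.
    let dir = Path::new("./db/immutable");
    let digest = digest_immutable_files(dir, &|done, total| {
        eprintln!("Computing digest: {done}/{total} immutable files");
    })?;
    println!("digest: {digest}");
    Ok(())
}
```

Because the total file count is known before hashing starts, the callback can drive a percentage bar; pre-scanning the total byte size instead (possibly in parallel with unpacking, as suggested above) would give a steadier progress rate at the cost of an extra directory walk.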