Implement a more accurate algorithm for z interpolation #2000
Implements a more accurate model for z interpolation, which closely replicates the hardware's precision loss.
The amount of precision lost is directly correlated with how wide a polygon is.
This behavior seems to apply only to interpolation along x?
One theory I have is that the hardware takes the difference between the left and right depth values and right-shifts it by 1, then, in the case where z0 > z1, adds the remainder of the division of (z0 - z1) by xdiff towards the end of the process.
Note: there are a few alternative ways to do this while still getting an equivalent result, such as left-shifting xdiff by 1 before the division.
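To make the theory concrete, here is a minimal sketch of what such an interpolator might look like. This is not the PR's actual implementation; `InterpZ` and its parameters are illustrative names, and the exact placement of the remainder correction is an assumption.

```c
#include <stdint.h>

/* Hedged sketch of the theory above: interpolate a depth value at
 * offset x across a span of width xdiff, going from z0 to z1.
 * Illustrative only; not code from this PR. */
int32_t InterpZ(int32_t z0, int32_t z1, int32_t x, int32_t xdiff)
{
    if (xdiff <= 0 || x <= 0) return z0;
    if (x >= xdiff)           return z1;

    if (z0 > z1)
    {
        int32_t diff = z0 - z1;
        /* Right-shift the difference by 1 before dividing: the dropped
         * low bit is one source of the width-correlated precision loss.
         * (Equivalently, xdiff could be left-shifted by 1 instead.) */
        int32_t step = (diff >> 1) / xdiff;
        int32_t z = z0 - (step * x) * 2;
        /* Per the theory, the remainder of (z0 - z1) / xdiff is added
         * back towards the end when z0 > z1 (placement assumed). */
        z += diff % xdiff;
        return z;
    }
    else
    {
        int32_t diff = z1 - z0;
        int32_t step = (diff >> 1) / xdiff;
        return z0 + (step * x) * 2;
    }
}
```

Because `step` is truncated before being scaled back up by 2 and multiplied across the span, the accumulated error grows with the span width, matching the observation above.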
Some samples (before > after > console):