Fix transformer decoding when using attention other than dot_product.
Note that _beam_decode_slow should not be wrapped in variable_scope(name), unlike _fast_decode, which must be wrapped. Fixes #674 and allows decoding with transformer_relative.
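A minimal sketch of the scoping pattern this fix describes, assuming the beam-decode entry point dispatches on the configured self-attention type (the exact attribute names and signatures here are illustrative, not taken from the commit):

```python
import tensorflow as tf


def _beam_decode(self, features, decode_length, beam_size, top_beams, alpha):
  """Sketch: route beam search to the slow or fast decoder.

  Attention types other than dot_product (e.g. the relative attention used
  by transformer_relative) cannot use the cached fast decoder, so they fall
  back to the slow decoder, which manages its own variable scoping and must
  NOT be re-wrapped in variable_scope(name). The fast decoder does need the
  wrapping.
  """
  if self._hparams.self_attention_type != "dot_product":
    # Slow path: no extra variable_scope here, per the commit message.
    return self._beam_decode_slow(features, decode_length, beam_size,
                                  top_beams, alpha)
  # Fast path: wrap in the model's variable scope so cached decoding
  # reuses the variables created during training.
  with tf.variable_scope(self.name):
    return self._fast_decode(features, decode_length, beam_size, top_beams,
                             alpha)
```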