make it clear in the rfc that the proposed semantics is optional
sdasgup3 committed Oct 4, 2023
1 parent fd80e3a commit 696570b
Showing 1 changed file with 8 additions and 5 deletions.
13 changes: 8 additions & 5 deletions rfcs/20230622-quantized-reduction.md
@@ -41,7 +41,7 @@ The RFC introduces the following proposal, which emerged out of the discussion in the
[thread](https://github.com/openxla/stablehlo/pull/1538#issuecomment-1599476906)
, along with its tradeoffs.

- The proposal allows the reducer block to express the computation in a different
+ The proposal optionally allows the reducer block to express the computation in a different
element type (preferably a wider accumulation type) than the one used in the reduce
op's arguments and return type. For illustrative purposes, in the following
example, the operand element type
@@ -71,11 +71,14 @@ block return (`tensor<!quant.uniform<i32:f32, accum_scale:accum_zp>>`).

### Semantics

- If (1) the input operand type is different from the reduction block
- argument type or (2) the op result type is different from the reduction block
- return type, there will be implicit type conversion defined by either
+ If (1) the input operand type is different from the reduction block argument
+ type or (2) the op result type is different from the reduction block return
+ type, there will be implicit type conversion defined by either
`stablehlo.convert`, `stablehlo.uniform_quantize`, or
- `stablehlo.uniform_dequantize`. For example,
+ `stablehlo.uniform_dequantize`. When the types are not different, i.e., when
+ neither (1) nor (2) holds, no implicit conversion is needed.
+
+ For example,

| Implicit type conversion op | Element type of operand or block return | Element type of block argument or op return |
|-----------------------------------|-----------------------------------------|---------------------------------------------|
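For concreteness, the following is a rough sketch (illustrative, not verbatim from the RFC) of the kind of `stablehlo.reduce` the proposal covers. It uses the diff's own symbolic placeholder notation for scales and zero points (`accum_scale:accum_zp` appears above; `input_scale:input_zp` and `output_scale:output_zp` are analogous placeholders assumed here): the operand and result carry an `i8` quantized element type, while the reducer block computes in a wider `i32` quantized type, so the implicit conversions from the table apply at the block boundaries.

```mlir
// Illustrative sketch, not verbatim from the RFC: the operand and result
// use an i8 quantized element type, while the reducer block accumulates in
// a wider i32 quantized type. The type mismatch at the block boundary is
// resolved by the implicit conversions described above.
%result = "stablehlo.reduce"(%operand, %init_value) ({
  ^bb0(%arg0: tensor<!quant.uniform<i32:f32, accum_scale:accum_zp>>,
       %arg1: tensor<!quant.uniform<i32:f32, accum_scale:accum_zp>>):
    // Accumulation happens in the wider i32 quantized type.
    %0 = stablehlo.add %arg0, %arg1
        : tensor<!quant.uniform<i32:f32, accum_scale:accum_zp>>
    stablehlo.return %0 : tensor<!quant.uniform<i32:f32, accum_scale:accum_zp>>
}) {
  dimensions = dense<1> : tensor<1xi64>
} : (tensor<1x6x!quant.uniform<i8:f32, input_scale:input_zp>>,
     tensor<!quant.uniform<i8:f32, input_scale:input_zp>>)
  -> tensor<1x!quant.uniform<i8:f32, output_scale:output_zp>>
```

The mismatch between the `i8` operand/result types and the `i32` block types is what triggers the implicit `stablehlo.uniform_quantize`/`stablehlo.uniform_dequantize` conversions; if the block instead used the same `i8` types, conditions (1) and (2) would not hold and no conversion would be implied, which is what makes the proposed semantics optional.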
