An operator is the basic computation unit in neural network models. Currently, ONNX covers a core set of operators, but there are operators that users need which are still missing from the ONNX operator set. For example, GroupNormalization was proposed this year and many developers already use it in their models, yet it is still missing from the ONNX operator set. To make ONNX more comprehensive, we encourage the community to contribute high-quality operator specifications to expand the operator set.
We maintain a list of requested operators in a GitHub issue. Please check it and post your request in that thread.
Determine whether the proposed operator is common enough; usually it should be supported by at least two frameworks, such as PyTorch/Caffe2, TensorFlow, MXNet, etc.
If the requested operator can easily be expressed as a composition of a few (e.g., 2 or 3) existing operators, we can define it as a Function (i.e., a composite operator). MeanVarianceNormalization is the first example of an operator registered as a Function.
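For intuition, here is a minimal NumPy sketch of the composite idea, not the actual Function registration (which lives in the C++ schema definitions): it shows how a MeanVarianceNormalization-style operator decomposes into existing primitives such as ReduceMean, Sub, Mul, Sqrt, and Div. The axes and epsilon below are illustrative assumptions, not the normative spec.

```python
import numpy as np

def mvn_reference(x, axes=(0, 2, 3), eps=1e-9):
    # Composite behavior built only from primitives that already exist as ONNX ops:
    # ReduceMean -> Sub -> Mul/ReduceMean -> Sqrt -> Div
    mean = np.mean(x, axis=axes, keepdims=True)                    # ReduceMean
    centered = x - mean                                            # Sub
    variance = np.mean(centered ** 2, axis=axes, keepdims=True)    # Mul + ReduceMean
    return centered / np.sqrt(variance + eps)                      # Sqrt + Div (eps is illustrative only)

x = np.random.randn(2, 3, 4, 5).astype(np.float32)
print(mvn_reference(x).shape)  # (2, 3, 4, 5)
```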
The spec should include:
- the opset version in which the operator is added; check out our versioning doc for more details
- a description of the operator with enough detail to avoid ambiguity, adding links to references if necessary
- inputs
- outputs
- attributes
- type constraints on the input and output tensors
- shape inference function (see the sketch after this list). Example: https://github.com/onnx/onnx/blob/master/onnx/defs/nn/defs.cc#L1183
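To illustrate what the shape inference function provides to users, the following sketch builds a small two-node model from existing operators and runs onnx.shape_inference.infer_shapes, which invokes the inference functions registered in each operator's schema. The operator choice and shapes are arbitrary assumptions made for this example.

```python
import onnx
from onnx import TensorProto, helper, shape_inference

# Two-node model (Relu followed by Abs) so there is an intermediate value
# whose shape must be inferred from the registered schemas.
relu = helper.make_node("Relu", inputs=["x"], outputs=["t"])
abs_node = helper.make_node("Abs", inputs=["t"], outputs=["y"])
graph = helper.make_graph(
    [relu, abs_node],
    "shape_inference_demo",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 3, 224, 224])],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 3, 224, 224])],
)
model = helper.make_model(graph)

# The shape of the intermediate tensor "t" is filled in by the shape
# inference functions registered with the Relu and Abs schemas.
inferred = shape_inference.infer_shapes(model)
print(inferred.graph.value_info)
```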
Usually, if we can find a similar function in NumPy, we try to align the operator's behavior with NumPy.
The test examples will be extracted into the documentation, and binary test data is generated from them later. Example: https://github.com/onnx/onnx/blob/master/onnx/backend/test/case/node/abs.py
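A node test case follows the pattern of abs.py linked above and is meant to live under onnx/backend/test/case/node/. As a sketch mirroring that file (the Base class and expect helper are the utilities used in that directory; the operator name and random data are just the Abs example), it looks roughly like this:

```python
import numpy as np
import onnx

from ..base import Base   # test-case base class used in onnx/backend/test/case
from . import expect      # helper that records inputs/outputs for test data generation


class Abs(Base):

    @staticmethod
    def export():
        # Build a single-node graph for the operator under test.
        node = onnx.helper.make_node(
            'Abs',
            inputs=['x'],
            outputs=['y'],
        )
        # Reference inputs and outputs computed with NumPy.
        x = np.random.randn(3, 4, 5).astype(np.float32)
        y = np.abs(x)

        # 'expect' stores the example; it is later extracted into the docs
        # and serialized as binary test data.
        expect(node, inputs=[x], outputs=[y], name='test_abs')
```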
Run the script to update the documentation and generate the test data.
PR 1428, which adds the EyeLike generator operator, is a good example to follow.