[MKLDNN] Add quantized relu #14604
Conversation
Thanks for your contributions @huangzhiyuan.
@mxnet-label-bot Add [MKLDNN, pr-awaiting-review]
mkldnn::algorithm GetMKLDNNActAlgo(const ActivationParam& param);
mkldnn::eltwise_forward::primitive_desc GetActFwdDescImpl(
    const ActivationParam& param, bool is_train,
Please unify the format of "&".
It's not necessary there for bool or int types; please refer to https://github.com/apache/incubator-mxnet/blob/master/src/operator/nn/mkldnn/mkldnn_pooling-inl.h#L139
cc @anirudh2290
 */
/*!
 * Copyright (c) 2019 by Contributors
 * \file mkldnn_act-inl.h
const std::vector<NDArray>& out_data) {
  CHECK(in_data[0].dtype() == mshadow::kUint8 ||
        in_data[0].dtype() == mshadow::kInt8)
      << "mkldnn_quantized_activation op only supports uint8 and int8 as input "
mkldnn_quantized_activation is not a valid operator name.
if (param.act_type == activation::kReLU) {
  TYPE_ASSIGN_CHECK(*out_type, 0, mshadow::kInt8);
} else {
  LOG(FATAL) << "QuantizedActivationOp only supports act_type=relu for now";
QuantizedActivationOp is not a valid operator name.
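For illustration, here is a minimal Python sketch (hypothetical names, not the actual C++ operator) of the relu-only dispatch that the quoted diff enforces, and of why ReLU can run directly on the quantized integers:

```python
import numpy as np

def quantized_act(x_q, act_type="relu"):
    # Hypothetical Python mimic of the operator's dispatch; the real
    # implementation lives in quantized_activation.cc.
    if act_type != "relu":
        # Mirrors the LOG(FATAL) path: only relu is supported for now.
        raise RuntimeError(
            "quantized activation only supports act_type=relu for now")
    # ReLU only zeroes negative values, so with a zero point of 0
    # (assumed here) it can be applied to the int8 data directly,
    # leaving the quantization scale unchanged.
    return np.maximum(np.asarray(x_q, dtype=np.int8), 0).astype(np.int8)

print(quantized_act([-128, -1, 0, 1, 127]))
```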
check_quantized_flatten((3, 4, 23, 23), qdtype)
@with_seed()
def test_quantized_act():
Do we have cases for non-relu activation types?
Negative test cases are not necessary there; it will prompt that _contrib_quantized_act only supports act_type=relu for now.
Fine. I'm personally okay with there being no non-relu cases in the UT. But do you mind running one on your machine and pasting the error log here? I just want to make sure that the error happens at an appropriate place and the message is accurate.
Sure, the error log is as below when changing the activation type from relu to a non-relu type (e.g. sigmoid):
[16:03:13] src/operator/quantization/quantized_activation.cc:54: _contrib_quantized_act only supports act_type=relu for now
Aborted
Please retrigger the CI
@TaoLv @ZhennanQin @ciyongch please help review the PR again.
TaoLv
left a comment
Thank you for addressing my comments. Approved now.
pengzhao-intel
left a comment
Thanks for your contribution. LGTM
@huangzhiyuan please rebase the code since I have merged the quantize v2 changes. I will merge the PR if everything is fine.
Merging now, thanks for your contribution :)
* add quantized relu
* fix testcase
* add author and skip quantized-relu for gpu
* fix comments
* retrigger ci
* retrigger ci
* comment fix
* retrigger ci
* retrigger ci
Description
This PR is to add quantized relu op and its MKLDNN implementation.
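As a rough numerical illustration of why ReLU is cheap to quantize (assuming symmetric int8 quantization with zero point 0; this sketch is not MXNet's API), applying max(., 0) to the int8 values gives the same result as quantizing relu(x) computed in float:

```python
import numpy as np

# Assumed symmetric int8 quantization (zero point 0); the scale is
# derived from the data range. Numerical illustration only.
def quantize_int8(x, scale):
    return np.clip(np.round(x / scale), -127, 127).astype(np.int8)

x = np.array([-1.5, -0.2, 0.0, 0.4, 1.2], dtype=np.float32)
scale = np.abs(x).max() / 127.0

q = quantize_int8(x, scale)
# ReLU commutes with zero-point-0 quantization: quantize is monotone
# and maps 0 to 0, so max(., 0) can be applied before or after.
lhs = np.maximum(q, 0).astype(np.int8)          # relu on int8 data
rhs = quantize_int8(np.maximum(x, 0.0), scale)  # quantized float relu
assert np.array_equal(lhs, rhs)
```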
@pengzhao-intel @ZhennanQin @TaoLv
Checklist
Essentials
Please feel free to remove inapplicable items for your PR.
Changes
Comments