[MXNET-867] Pooling1D with "same" padding #12594
Conversation
Thanks for your contribution @ChaiBapchya
```cpp
        param.stride[0]));
  } else {
    oshape[2] = static_cast<int>(ceil(
        static_cast<float>(dshape[2] + 2 * param.pad[0]) /
```
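The quoted C++ branch computes the output length for the new convention with a ceiling division. A minimal Python sketch of the shape rules (a hypothetical helper, not MXNet code; the "same" branch mirrors the `ceil((n + 2*pad) / stride)` expression quoted above):

```python
import math

def pool1d_out_size(n, kernel, stride, pad, convention):
    """Output length of a 1-D pooling axis of length n (sketch only)."""
    if convention == "valid":
        # floor division: partial windows at the edge are dropped
        return (n + 2 * pad - kernel) // stride + 1
    elif convention == "full":
        # ceiling: partial windows are kept
        return math.ceil((n + 2 * pad - kernel) / stride) + 1
    elif convention == "same":
        # TensorFlow-style "same": output length tracks ceil(n / stride)
        return math.ceil((n + 2 * pad) / stride)
    raise ValueError(convention)

print(pool1d_out_size(10, 2, 2, 0, "valid"))  # 5
print(pool1d_out_size(11, 2, 2, 0, "same"))   # 6
```

Note that with `pad = 0`, "same" depends only on the input length and stride, never on the kernel size.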
Unable to find the indentation issue. Also, is there a similar linting tool for Python, so I don't miss those? Do we use pylint?
```python
stride = 2
kernel = 2
output_data = mx.nd.Pooling(
    input_data,
```
Do you also want to test the case where the user selects the "same" pad type but also supplies pad values?
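To make the expected behavior concrete, here is a pure-Python reference for 1-D max pooling under the "same" convention (a sketch of the TensorFlow-style scheme the PR targets, not the MXNet implementation; padded positions are treated as absent so they never win the max):

```python
import math

def max_pool1d_same(x, kernel, stride):
    """Reference 1-D max pooling with "same"-style padding (sketch)."""
    n = len(x)
    out_len = math.ceil(n / stride)
    # total padding needed so every output position has a window
    pad_total = max((out_len - 1) * stride + kernel - n, 0)
    pad_left = pad_total // 2  # asymmetric split: the extra unit goes right
    out = []
    for i in range(out_len):
        start = i * stride - pad_left
        # clip window positions that fall outside the input
        window = [x[j] for j in range(start, start + kernel) if 0 <= j < n]
        out.append(max(window))
    return out

print(max_pool1d_same([1, 3, 2, 5, 4], kernel=2, stride=2))  # [3, 5, 4]
```

A test could compare the operator's output against such a reference for several input lengths, with and without user-supplied pad values.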
Thanks for making this contribution. LGTM overall. You may also want to update the operator description in operator/nn/pooling.cc, since you added a new type, "same".
As discussed with Sina Afrooze and Lin, this is just the first step towards a complete solution/fix. The immediate demand was for 1-D max pooling to support "same" padding (the way TensorFlow does), and the above pull request addresses that. However, for a complete solution, the following things need to be understood:

- **Asymmetric padding** — the feature request is about asymmetric padding.
- **Why isn't asymmetric padding supported by cuDNN, MXNet?** — due to misalignment of input and output if padding happens on only one side (left or right alone).
- **Current work-around** — a level of abstraction is introduced.
- **Future steps** — to fully implement asymmetric padding, the following changes need to be made
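To illustrate why "same" can require asymmetric padding: TensorFlow splits the total pad between the two sides of each axis, with the odd unit going to the right. A small sketch of that split (hypothetical helper names; not MXNet code, whose single `pad` value pads both sides equally):

```python
import math

def same_pad_split(n, kernel, stride):
    """Left/right padding under TensorFlow-style "same" (sketch)."""
    out_len = math.ceil(n / stride)
    pad_total = max((out_len - 1) * stride + kernel - n, 0)
    pad_left = pad_total // 2
    pad_right = pad_total - pad_left  # the extra unit lands on the right
    return pad_left, pad_right

print(same_pad_split(5, 2, 2))  # (0, 1): asymmetric, right-heavy
print(same_pad_split(5, 3, 1))  # (1, 1): symmetric
```

Whenever `pad_total` is odd, no single symmetric pad value reproduces the result, which is why a full fix needs more than the shape change in this PR.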
@ChaiBapchya can you create an issue for the future steps, giving the full description as in the above comment? Issues are tracked; the above comment could get lost.
Sure. Thanks for suggesting :)
Description
MaxPooling 1D: added another pooling convention (a.k.a. padding type) called "same".
Checklist
Essentials
Please feel free to remove inapplicable items for your PR.