refine doc for prelu #17810

Merged
merged 4 commits on Jun 5, 2019
Changes from 2 commits
2 changes: 1 addition & 1 deletion paddle/fluid/API.spec
@@ -170,7 +170,7 @@ paddle.fluid.layers.pow (ArgSpec(args=['x', 'factor', 'name'], varargs=None, key
paddle.fluid.layers.stanh (ArgSpec(args=['x', 'scale_a', 'scale_b', 'name'], varargs=None, keywords=None, defaults=(0.6666666666666666, 1.7159, None)), ('document', '959936a477efc6c1447a9c8bf8ce94bb'))
paddle.fluid.layers.hard_sigmoid (ArgSpec(args=['x', 'slope', 'offset', 'name'], varargs=None, keywords=None, defaults=(0.2, 0.5, None)), ('document', '607d79ca873bee40eed1c79a96611591'))
paddle.fluid.layers.swish (ArgSpec(args=['x', 'beta', 'name'], varargs=None, keywords=None, defaults=(1.0, None)), ('document', 'ef745e55a48763ee7b46b21a81dc7e84'))
paddle.fluid.layers.prelu (ArgSpec(args=['x', 'mode', 'param_attr', 'name'], varargs=None, keywords=None, defaults=(None, None)), ('document', 'f6acef7ff7d887e49ff499fbb1dad4a9'))
paddle.fluid.layers.prelu (ArgSpec(args=['x', 'mode', 'param_attr', 'name'], varargs=None, keywords=None, defaults=(None, None)), ('document', 'e3ba1188359f4343650bea77831a9b74'))
paddle.fluid.layers.brelu (ArgSpec(args=['x', 't_min', 't_max', 'name'], varargs=None, keywords=None, defaults=(0.0, 24.0, None)), ('document', '3db337c195e156e6ef2b8b4a57113600'))
paddle.fluid.layers.leaky_relu (ArgSpec(args=['x', 'alpha', 'name'], varargs=None, keywords=None, defaults=(0.02, None)), ('document', 'f878486c82b576938151daad0de995a0'))
paddle.fluid.layers.soft_relu (ArgSpec(args=['x', 'threshold', 'name'], varargs=None, keywords=None, defaults=(40.0, None)), ('document', '3490ed5c9835ae039a82979daf3918a4'))
22 changes: 15 additions & 7 deletions python/paddle/fluid/layers/nn.py
@@ -8817,14 +8817,19 @@ def prelu(x, mode, param_attr=None, name=None):
.. math::
y = \max(0, x) + \\alpha * \min(0, x)

There are three modes for the activation:

.. code-block:: text

all: all elements share the same alpha
channel: elements in the same channel share the same alpha
element: each element has its own alpha

Args:
x (Variable): The input tensor.
param_attr(ParamAttr|None): The parameter attribute for the learnable
weight (alpha).
mode (string): The mode for weight sharing. It supports all, channel
and element. all: all elements share same weight
channel:elements in a channel share same weight
element:each element has a weight
weight (alpha), which can be created by ParamAttr.
mode (string): The mode for weight sharing. It supports all, channel and element.
name(str|None): A name for this layer(optional). If set None, the layer
will be named automatically.

@@ -8835,9 +8840,12 @@

.. code-block:: python

x = fluid.layers.data(name="x", shape=[10,10], dtype="float32")
from paddle.fluid.param_attr import ParamAttr
x = fluid.layers.data(name="x", shape=[5,10,10], dtype="float32")
mode = 'channel'
output = fluid.layers.prelu(x,mode)
output = fluid.layers.prelu(
x,mode,param_attr=ParamAttr(name='alpha'))

"""
helper = LayerHelper('prelu', **locals())
if mode not in ['all', 'channel', 'element']:
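
As a usage note beyond this diff: the sketch below illustrates what the three weight-sharing modes imply for the shape of alpha. It is a minimal NumPy reference, not Paddle code; the channel-axis placement and the 0.25 value are assumptions chosen only for illustration.

```python
import numpy as np

def prelu_ref(x, alpha):
    # y = max(0, x) + alpha * min(0, x); alpha broadcasts against x
    return np.maximum(0.0, x) + alpha * np.minimum(0.0, x)

# One sample shaped like the docstring example shape=[5, 10, 10]
# (the first axis is treated as the channel axis, an assumption of this sketch).
x = np.random.randn(5, 10, 10).astype("float32")

alpha_all = np.float32(0.25)                         # 'all': a single alpha shared by every element
alpha_channel = np.full((5, 1, 1), 0.25, "float32")  # 'channel': one alpha per channel
alpha_element = np.full(x.shape, 0.25, "float32")    # 'element': an independent alpha per element

for alpha in (alpha_all, alpha_channel, alpha_element):
    y = prelu_ref(x, alpha)
    assert y.shape == x.shape  # output keeps the input shape in every mode
```

In every mode alpha broadcasts against the input, so the output shape matches the input; only the number of learnable alpha parameters differs between the modes.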