Commit b661505

doc for pylayer (#3483)

* doc for pylayer
* polish doc
* polish doc
* alias paddle.autograd.PyLayer
* alias PyLayerContext
* edit doc according to comment
* edit ref
* edit ref
* polish doc
* polish doc
* polish doc

1 parent d9da4bf commit b661505

File tree

4 files changed: +481 / -6 lines changed

doc/paddle/api/alias_api_mapping

Lines changed: 3 additions & 1 deletion

@@ -575,4 +575,6 @@ paddle.nn.layer.container.LayerDict paddle.nn.LayerDict
 paddle.hapi.hub.Overview paddle.hub.Overview
 paddle.hapi.hub.list paddle.hub.list
 paddle.hapi.hub.help paddle.hub.help
-paddle.hapi.hub.load paddle.hub.load
+paddle.hapi.hub.load paddle.hub.load
+paddle.autograd.py_layer.PyLayer paddle.autograd.PyLayer
+paddle.autograd.py_layer.PyLayerContext paddle.autograd.PyLayerContext
Lines changed: 100 additions & 0 deletions

@@ -0,0 +1,100 @@

.. _cn_api_autograd_PyLayerContext:

PyLayerContext
-------------------------------

.. py:class:: paddle.autograd.PyLayerContext

A ``PyLayerContext`` object assists :ref:`cn_api_autograd_PyLayer` in implementing certain functionality.

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x):
            # ctx is an object of PyLayerContext.
            y = paddle.tanh(x)
            ctx.save_for_backward(y)
            return y

        @staticmethod
        def backward(ctx, dy):
            # ctx is an object of PyLayerContext.
            y, = ctx.saved_tensor()
            grad = dy * (1 - paddle.square(y))
            return grad
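The ``backward`` above computes ``dy * (1 - y**2)`` using the identity tanh'(x) = 1 - tanh(x)^2, which holds because ``y = tanh(x)``. As a quick sanity check of that identity (plain Python with the standard library only, independent of Paddle), a central finite difference agrees with the analytic gradient:

```python
import math

def tanh_grad(x):
    # Analytic gradient used in the example: 1 - tanh(x)**2
    y = math.tanh(x)
    return 1.0 - y * y

def numeric_grad(f, x, eps=1e-6):
    # Central finite-difference approximation of df/dx.
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert abs(tanh_grad(x) - numeric_grad(math.tanh, x)) < 1e-6
```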
.. py:method:: save_for_backward(self, *tensors)

Saves ``Tensor`` objects needed by ``backward``; call ``saved_tensor`` in ``backward`` to retrieve them.

.. note::
    This API can be called only once, and only inside ``forward``.

Parameters
::::::::::
- **tensors** (list of Tensor) - the ``Tensor`` objects to be saved

Returns: None

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x):
            # ctx is a context object that stores some objects for backward.
            y = paddle.tanh(x)
            # Pass tensors to backward.
            ctx.save_for_backward(y)
            return y

        @staticmethod
        def backward(ctx, dy):
            # Get the tensors passed by forward.
            y, = ctx.saved_tensor()
            grad = dy * (1 - paddle.square(y))
            return grad

.. py:method:: saved_tensor(self)

Retrieves the ``Tensor`` objects saved by ``save_for_backward``.

Returns: the ``Tensor`` objects saved by ``save_for_backward`` if any were saved; otherwise, None.

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x):
            # ctx is a context object that stores some objects for backward.
            y = paddle.tanh(x)
            # Pass tensors to backward.
            ctx.save_for_backward(y)
            return y

        @staticmethod
        def backward(ctx, dy):
            # Get the tensors passed by forward.
            y, = ctx.saved_tensor()
            grad = dy * (1 - paddle.square(y))
            return grad
Lines changed: 162 additions & 0 deletions

@@ -0,0 +1,162 @@

.. _cn_api_autograd_PyLayer:

PyLayer
-------------------------------

.. py:class:: paddle.autograd.PyLayer

Paddle implements Python-side custom operators through subclasses of ``PyLayer``. A subclass must follow these rules:

1. The subclass must define static ``forward`` and ``backward`` methods, and the first argument of each must be a :ref:`cn_api_autograd_PyLayerContext` object. If a return value of ``backward`` corresponds to a ``Tensor`` in ``forward`` that requires gradients, that return value must be a ``Tensor``.

2. Except for its first argument, every argument of ``backward`` is a gradient of an output ``Tensor`` of ``forward``; therefore, the number of ``Tensor`` inputs to ``backward`` must equal the number of ``Tensor`` outputs of ``forward``. If you need to use input ``Tensor`` objects of ``forward`` inside ``backward``, pass them to the ``save_for_backward`` method of :ref:`cn_api_autograd_PyLayerContext` and retrieve them in ``backward``.

3. The output of ``backward`` can be a ``Tensor`` or a ``list/tuple`` of ``Tensor`` objects, which are the gradients of the input ``Tensor`` objects of ``forward``. Therefore, the number of ``Tensor`` outputs of ``backward`` equals the number of ``Tensor`` inputs of ``forward``.

After building the custom operator, run it via ``apply``.

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    # Inherit from PyLayer
    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x, func1, func2=paddle.square):
            # ctx is a context object that stores some objects for backward.
            ctx.func = func2
            y = func1(x)
            # Pass tensors to backward.
            ctx.save_for_backward(y)
            return y

        @staticmethod
        # forward has only one output, so there is only one gradient in the input of backward.
        def backward(ctx, dy):
            # Get the tensors passed by forward.
            y, = ctx.saved_tensor()
            grad = dy * (1 - ctx.func(y))
            # forward has only one input, so only one gradient tensor is returned.
            return grad

    data = paddle.randn([2, 3], dtype="float64")
    data.stop_gradient = False
    z = cus_tanh.apply(data, func1=paddle.tanh)
    z.mean().backward()

    print(data.grad)
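The arity rules above can be illustrated outside Paddle with a minimal pure-Python sketch (a hypothetical ``ScaledAdd`` class, standard library only, not Paddle's autograd engine): ``forward`` takes two inputs and produces one output, so ``backward`` receives one incoming gradient and must return two gradients, one per input.

```python
# A pure-Python sketch of the PyLayer forward/backward contract
# (illustrative only; it does not use Paddle's autograd engine).
class ScaledAdd:
    @staticmethod
    def forward(ctx, a, b, scale=2.0):
        # Two inputs -> one output: y = a + scale * b.
        ctx["scale"] = scale  # a dict stands in for PyLayerContext here
        return a + scale * b

    @staticmethod
    def backward(ctx, dy):
        # One incoming gradient (forward has one output) ->
        # two outgoing gradients (forward has two inputs).
        da = dy * 1.0
        db = dy * ctx["scale"]
        return da, db

ctx = {}
y = ScaledAdd.forward(ctx, 3.0, 4.0)   # 3 + 2*4 = 11.0
da, db = ScaledAdd.backward(ctx, 1.0)  # da = 1.0, db = 2.0
```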
.. py:method:: forward(ctx, *args, **kwargs)

``forward`` must be overridden by the subclass. Its first argument is a :ref:`cn_api_autograd_PyLayerContext` object; the type and number of the remaining arguments are arbitrary.

Parameters
::::::::::
- **\*args** (tuple) - inputs of the custom operator
- **\*\*kwargs** (dict) - inputs of the custom operator

Returns: a Tensor or a list/tuple containing at least one Tensor

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x):
            y = paddle.tanh(x)
            # Pass tensors to backward.
            ctx.save_for_backward(y)
            return y

        @staticmethod
        def backward(ctx, dy):
            # Get the tensors passed by forward.
            y, = ctx.saved_tensor()
            grad = dy * (1 - paddle.square(y))
            return grad

.. py:method:: backward(ctx, *args, **kwargs)

``backward`` computes the gradients. It must be overridden by the subclass. Its first argument is a :ref:`cn_api_autograd_PyLayerContext` object, and the remaining arguments are the gradients of the output ``Tensor`` objects of ``forward``. Its output ``Tensor`` objects are the gradients of the input ``Tensor`` objects of ``forward``.

Parameters
::::::::::
- **\*args** (tuple) - gradients of the output ``Tensor`` objects of ``forward``.
- **\*\*kwargs** (dict) - gradients of the output ``Tensor`` objects of ``forward``.

Returns: gradients of the input ``Tensor`` objects of ``forward``.

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x):
            y = paddle.tanh(x)
            # Pass tensors to backward.
            ctx.save_for_backward(y)
            return y

        @staticmethod
        def backward(ctx, dy):
            # Get the tensors passed by forward.
            y, = ctx.saved_tensor()
            grad = dy * (1 - paddle.square(y))
            return grad
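In the class-level example, ``z.mean().backward()`` seeds ``backward`` with ``dy`` equal to ``1/numel`` for each element, so ``data.grad`` comes out as ``(1 - tanh(data)**2) / numel`` elementwise. A plain-Python check of that per-element value against a finite difference (standard library only, not Paddle):

```python
import math

def mean_tanh(xs):
    # Scalar loss: mean(tanh(x)) over a flat list of values.
    return sum(math.tanh(x) for x in xs) / len(xs)

def analytic_grad(xs):
    # Per-element gradient: dy = 1/n from the mean,
    # times tanh'(x_i) = 1 - tanh(x_i)**2.
    n = len(xs)
    return [(1.0 - math.tanh(x) ** 2) / n for x in xs]

def numeric_grad(xs, i, eps=1e-6):
    # Central finite difference with respect to element i.
    up = list(xs); up[i] += eps
    dn = list(xs); dn[i] -= eps
    return (mean_tanh(up) - mean_tanh(dn)) / (2.0 * eps)

xs = [0.3, -1.2, 2.0]
for i in range(len(xs)):
    assert abs(analytic_grad(xs)[i] - numeric_grad(xs, i)) < 1e-6
```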
.. py:method:: apply(cls, *args, **kwargs)

After building the custom operator, run it via ``apply``.

Parameters
::::::::::
- **\*args** (tuple) - inputs of the custom operator
- **\*\*kwargs** (dict) - inputs of the custom operator

Returns: a Tensor or a list/tuple containing at least one Tensor

**Example**

.. code-block:: python

    import paddle
    from paddle.autograd import PyLayer

    class cus_tanh(PyLayer):
        @staticmethod
        def forward(ctx, x, func1, func2=paddle.square):
            ctx.func = func2
            y = func1(x)
            # Pass tensors to backward.
            ctx.save_for_backward(y)
            return y

        @staticmethod
        def backward(ctx, dy):
            # Get the tensors passed by forward.
            y, = ctx.saved_tensor()
            grad = dy * (1 - ctx.func(y))
            return grad

    data = paddle.randn([2, 3], dtype="float64")
    data.stop_gradient = False
    # run custom Layer.
    z = cus_tanh.apply(data, func1=paddle.tanh)
