python/paddle/amp/auto_cast.py (1 addition, 1 deletion)
```diff
@@ -96,7 +96,7 @@ def decorate(
     Args:
         models(Layer|list of Layer, optional): The defined models by user, models must be either a single model or a list of models. Default is None.
         optimizers(Optimizer|list of Optimizer, optional): The defined optimizers by user, optimizers must be either a single optimizer or a list of optimizers. Default is None.
-        level(str, optional): Auto mixed precision level. Accepted values are "O1" and "O2": O1 represent mixed precision, the decorator will do nothing;
+        level(str, optional): Auto mixed precision level. Accepted values are O1 and O2: O1 represent mixed precision, the decorator will do nothing;
             O2 represent Pure float16/bfloat16, the decorator will cast all parameters of models to float16/bfloat16, except BatchNorm and LayerNorm. Default is O1(amp)
         dtype(str, optional): Whether to use 'float16' or 'bfloat16'. Default is 'float16'.
         master_weight(bool, optinal): For level='O2', whether to use multi-precision during weight updating. If master_weight is None, in O2 level optimizer will use multi-precision. Default is None.
```
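For context, here is a minimal usage sketch of `paddle.amp.decorate` exercising the `level`, `dtype`, and `master_weight` arguments documented in the docstring above; the `Linear` model and `SGD` optimizer are illustrative placeholders, not part of this change.

```python
# Minimal sketch of paddle.amp.decorate as described in the docstring above;
# the model and optimizer below are illustrative placeholders.
import paddle

model = paddle.nn.Linear(10, 10)
optimizer = paddle.optimizer.SGD(
    learning_rate=0.01, parameters=model.parameters()
)

# level='O2' casts model parameters to float16, except BatchNorm and LayerNorm.
# master_weight=None lets the optimizer default to multi-precision weight
# updates at O2 level, per the documented behavior.
model, optimizer = paddle.amp.decorate(
    models=model,
    optimizers=optimizer,
    level='O2',
    dtype='float16',
    master_weight=None,
)
```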