From 7182522cc582da055c626525f34c476aacc01298 Mon Sep 17 00:00:00 2001
From: Anthony Lee
Date: Mon, 13 Mar 2023 10:18:37 -0600
Subject: [PATCH 1/2] Fixed a typo. I believe the input (i.e., x) is supposed to be 5.

---
 docs/Module3_IntroducingNumpy/AutoDiff.html | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/Module3_IntroducingNumpy/AutoDiff.html b/docs/Module3_IntroducingNumpy/AutoDiff.html
index 85c620ab..4c946b90 100644
--- a/docs/Module3_IntroducingNumpy/AutoDiff.html
+++ b/docs/Module3_IntroducingNumpy/AutoDiff.html
@@ -439,7 +439,7 @@

Introduction to MyGrad

As expected, MyGrad computes the appropriate value for the evaluated derivative: \(\frac{\mathrm{d}f}{\mathrm{d}x}\big|_{x=5}=2 \times 5=10\). Note that all Tensor instances have a grad attribute, but prior to invoking fx.backward(), x.grad would have simply returned None.

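For context, a minimal sketch (not part of this patch) of the computation the passage above describes, assuming MyGrad's documented tensor/backward API; the function f(x) = x**2 is inferred from the evaluated derivative 2 x 5 = 10 and is not spelled out in this hunk:

    import mygrad as mg

    x = mg.tensor(5.0)   # the input being differentiated at, x = 5
    fx = x ** 2          # f(x) = x**2, so df/dx = 2x
    fx.backward()        # backpropagate; this populates x.grad
    x.grad               # -> array(10.), df/dx evaluated at x = 5; would be None before backward()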
-It is important to reiterate that MyGrad never gives us the actual function \(\frac{\mathrm{d}f}{\mathrm{d}x}\); it only computes the derivative evaluated at a specific input \(x=10\).
+It is important to reiterate that MyGrad never gives us the actual function \(\frac{\mathrm{d}f}{\mathrm{d}x}\); it only computes the derivative evaluated at a specific input \(x=5\).

MyGrad Adds “Drop-In” AutoDiff to NumPy

MyGrad’s functions are intentionally designed to mirror NumPy’s functions almost exactly. In fact, for all of the NumPy functions that MyGrad mirrors, we can pass a tensor to a NumPy function and it will be “coerced” into returning a tensor instead of a NumPy array – thus we can differentiate through NumPy functions!

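A brief illustration (not part of this patch) of the "drop-in" behavior described above, assuming np.square is among the NumPy functions that MyGrad mirrors; the specific function is an assumption chosen for the example:

    import numpy as np
    import mygrad as mg

    x = mg.tensor(3.0)
    fx = np.square(x)    # the NumPy call is coerced into returning a mygrad Tensor, not an ndarray
    fx.backward()        # so we can differentiate straight through the NumPy function
    x.grad               # -> array(6.), d(x**2)/dx evaluated at x = 3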
@@ -840,4 +840,4 @@

Reading Comprehension Exercise Solutions