From 5c48b607fb6889dae5e838437783ffce7cf69e56 Mon Sep 17 00:00:00 2001
From: partev
Date: Sat, 26 Jul 2025 17:20:02 -0400
Subject: [PATCH] fix a typo noinlinearities -> nonlinearities

---
 docs/notebooks/nonlinear_gaussian_ssm/ekf_mlp.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/notebooks/nonlinear_gaussian_ssm/ekf_mlp.ipynb b/docs/notebooks/nonlinear_gaussian_ssm/ekf_mlp.ipynb
index f1c7a3ef..914cd40f 100644
--- a/docs/notebooks/nonlinear_gaussian_ssm/ekf_mlp.ipynb
+++ b/docs/notebooks/nonlinear_gaussian_ssm/ekf_mlp.ipynb
@@ -154,7 +154,7 @@
    "source": [
     "## Neural network\n",
     "\n",
-    "We aim to approximate the true data generating function, $f(x)$, with a parametric approximation, $h(\\theta, x)$, where $\\theta$ are the parameters and $x$ are the inputs. We use a simple feedforward neural network — a.k.a. multi-layer perceptron (MLP) — with sigmoidal noinlinearities. Here, $\\theta$ corresponds to the flattened vector of all the weights from all the layers of the model. "
+    "We aim to approximate the true data generating function, $f(x)$, with a parametric approximation, $h(\\theta, x)$, where $\\theta$ are the parameters and $x$ are the inputs. We use a simple feedforward neural network — a.k.a. multi-layer perceptron (MLP) — with sigmoidal nonlinearities. Here, $\\theta$ corresponds to the flattened vector of all the weights from all the layers of the model. "
    ]
   },
  {
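
For context on the cell touched by this patch, a minimal sketch of an MLP h(theta, x) with sigmoidal hidden nonlinearities and a single flattened parameter vector theta might look like the following. The layer widths, the `unflatten` helper, and the function names here are hypothetical placeholders for illustration, not code taken from the notebook.

```python
import jax
import jax.numpy as jnp

# Hypothetical layer widths; the notebook's actual architecture may differ.
LAYER_SIZES = (1, 6, 1)

def unflatten(theta, sizes=LAYER_SIZES):
    """Unpack the flat parameter vector theta into per-layer (W, b) pairs."""
    params, i = [], 0
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        W = theta[i:i + n_in * n_out].reshape(n_in, n_out)
        i += n_in * n_out
        b = theta[i:i + n_out]
        i += n_out
        params.append((W, b))
    return params

def h(theta, x):
    """MLP prediction h(theta, x): sigmoidal hidden layers, linear output."""
    a = jnp.atleast_1d(x)
    layers = unflatten(theta)
    for W, b in layers[:-1]:
        a = jax.nn.sigmoid(a @ W + b)   # sigmoidal nonlinearity
    W, b = layers[-1]
    return (a @ W + b).squeeze()        # linear read-out layer

# Example: evaluate the network at a random flattened parameter vector.
n_params = sum(i * o + o for i, o in zip(LAYER_SIZES[:-1], LAYER_SIZES[1:]))
theta0 = jax.random.normal(jax.random.PRNGKey(0), (n_params,))
print(h(theta0, 0.5))
```

Keeping theta as one flat vector matches the cell's description of the parameters as the flattened weights of all layers.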