update docs

Atsushi Sakai
2018-11-17 08:34:10 +09:00
parent ad9173f0ae
commit 5832322591
6 changed files with 67 additions and 40 deletions


@@ -27,7 +27,7 @@ In the continous form,
.. math:: \mathbb E[X] = \int_{-\infty}^\infty x\, f(x) \,dx
-.. code:: ipython3
+.. code-block:: ipython3
import numpy as np
import random
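As a numerical sanity check (a sketch, assuming a standard-normal X), the integral above can be approximated by a sample mean via the law of large numbers:

```python
import numpy as np

# Approximate E[X] by averaging many samples: the sample mean
# converges to the integral of x * f(x) dx.
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)  # X ~ N(0, 1), so E[X] = 0
estimate = samples.mean()
print(estimate)  # close to 0
```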
@@ -61,7 +61,7 @@ data meaning the spread of the data.
.. math:: \mathit{VAR}(X) = \frac{1}{n}\sum_{i=1}^n (x_i - \mu)^2
-.. code:: ipython3
+.. code-block:: ipython3
x=np.random.randn(10)
np.var(x)
@@ -123,7 +123,7 @@ normal distribution:
\begin{aligned}VAR(X) = \sigma_x^2 &= \frac{1}{n}\sum_{i=1}^n(x_i - \mu)^2\\
COV(X, Y) = \sigma_{xy} &= \frac{1}{n}\sum_{i=1}^n\big[(x_i-\mu_x)(y_i-\mu_y)\big]\end{aligned}
-.. code:: ipython3
+.. code-block:: ipython3
x=np.random.random((3,3))
np.cov(x)
@@ -141,7 +141,7 @@ normal distribution:
Covariance treating the data as a **sample**, normalizing by :math:`\frac{1}{N-1}`
-.. code:: ipython3
+.. code-block:: ipython3
x_cor=np.random.rand(1,10)
y_cor=np.random.rand(1,10)
@@ -159,7 +159,7 @@ Covariance taking the data as **sample** with :math:`\frac{1}{N-1}`
Covariance treating the data as the **population**, normalizing by :math:`\frac{1}{N}`
-.. code:: ipython3
+.. code-block:: ipython3
np.cov(x_cor,y_cor,bias=1)
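The two normalizations differ only by a constant factor, which a short check makes explicit:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random(10)
y = rng.random(10)

n = x.size
sample_cov = np.cov(x, y)              # default: divides by N - 1
population_cov = np.cov(x, y, bias=1)  # bias=1: divides by N
# The two differ exactly by the factor (N - 1) / N.
print(np.allclose(population_cov, sample_cov * (n - 1) / n))  # True
```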
@@ -183,7 +183,7 @@ According to this theorem, the average of n samples of random and
independent variables tends to follow a normal distribution as we
increase the sample size (generally, for n >= 30).
-.. code:: ipython3
+.. code-block:: ipython3
import matplotlib.pyplot as plt
import random
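A minimal numeric sketch of the theorem, averaging n uniform samples:

```python
import numpy as np

# Averages of n uniform samples concentrate around the population
# mean 0.5, with variance (1/12) / n, regardless of the uniform shape.
rng = np.random.default_rng(2)
n, trials = 30, 10_000
means = rng.random((trials, n)).mean(axis=1)
print(means.mean())  # close to 0.5
print(means.var())   # close to (1 / 12) / 30
```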
@@ -222,13 +222,15 @@ described with two parameters, the mean (:math:`\mu`) and the variance
f(x, \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\big [{-\frac{(x-\mu)^2}{2\sigma^2} }\big ]
-Range is [$-:raw-latex:`\inf`,:raw-latex:`\inf `$]
+Range is
+.. math:: [-\infty,\infty]
This is just a function of the mean (:math:`\mu`) and standard deviation
(:math:`\sigma`), and it is what gives the normal distribution its
characteristic **bell curve**.
-.. code:: ipython3
+.. code-block:: ipython3
import matplotlib.mlab as mlab
import math
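The density formula can be checked by hand against SciPy (assuming `scipy.stats` is available):

```python
import math

from scipy.stats import norm

# Implement f(x; mu, sigma) directly from the formula above
# and compare against scipy's reference implementation.
def normal_pdf(x, mu, sigma):
    coef = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coef * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

print(normal_pdf(1.0, 0.0, 2.0))
print(norm.pdf(1.0, loc=0.0, scale=2.0))  # same value
```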
@@ -284,7 +286,7 @@ New mean is
\sigma_\mathtt{new} = \frac{\sigma_z^2\bar\sigma^2}{\bar\sigma^2+\sigma_z^2}
-.. code:: ipython3
+.. code-block:: ipython3
import matplotlib.mlab as mlab
import math
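A small helper makes the fused variance concrete; the corresponding new mean, :math:`\mu_\mathtt{new} = \frac{\sigma_z^2\bar\mu+\bar\sigma^2\mu_z}{\bar\sigma^2+\sigma_z^2}`, is the standard product-of-Gaussians result (the numbers below are made up for illustration):

```python
# Fuse a prior N(mu_prior, var_prior) with a measurement N(mu_z, var_z):
# the product of the two Gaussian densities has this mean and variance.
def gaussian_multiply(mu_prior, var_prior, mu_z, var_z):
    mu_new = (var_prior * mu_z + var_z * mu_prior) / (var_prior + var_z)
    var_new = var_prior * var_z / (var_prior + var_z)
    return mu_new, var_new

mu, var = gaussian_multiply(10.0, 4.0, 12.0, 1.0)
print(mu, var)  # 11.6 0.8 -- pulled toward the more certain measurement
```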
@@ -336,7 +338,7 @@ of the two.
\begin{gathered}\mu_x = \mu_p + \mu_z \\
\sigma_x^2 = \sigma_z^2+\sigma_p^2\, \square\end{gathered}
-.. code:: ipython3
+.. code-block:: ipython3
import matplotlib.mlab as mlab
import math
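The sum case is even simpler to sketch (illustrative numbers again):

```python
# Adding two independent Gaussian variables: means add and variances
# add, so uncertainty can only grow in this step.
def gaussian_add(mu_p, var_p, mu_z, var_z):
    return mu_p + mu_z, var_p + var_z

mu, var = gaussian_add(10.0, 4.0, 2.0, 1.0)
print(mu, var)  # 12.0 5.0
```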
@@ -375,7 +377,7 @@ of the two.
.. image:: Kalmanfilter_basics_files/Kalmanfilter_basics_21_1.png
-.. code:: ipython3
+.. code-block:: ipython3
#Example from:
#https://scipython.com/blog/visualizing-the-bivariate-gaussian-distribution/
@@ -448,7 +450,7 @@ a given (X,Y) value.
**numpy einsum examples**
-.. code:: ipython3
+.. code-block:: ipython3
a = np.arange(25).reshape(5,5)
b = np.arange(5)
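Assuming the same `a` and `b`, a few einsum subscripts line up with familiar NumPy operations:

```python
import numpy as np

a = np.arange(25).reshape(5, 5)
b = np.arange(5)

# einsum subscripts vs. their plain-NumPy equivalents:
print(np.einsum('ii', a) == np.trace(a))                  # repeated index: diagonal sum
print(np.array_equal(np.einsum('ii->i', a), np.diag(a)))  # keep the diagonal
print(np.array_equal(np.einsum('ij,j', a, b), a @ b))     # matrix-vector product
```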
@@ -471,7 +473,7 @@ a given (X,Y) value.
[3 4 5]]
-.. code:: ipython3
+.. code-block:: ipython3
# this is the diagonal sum; the repeated index 'ii' selects the diagonal
np.einsum('ii', a)
@@ -490,7 +492,7 @@ a given (X,Y) value.
-.. code:: ipython3
+.. code-block:: ipython3
A = np.arange(3).reshape(3,1)
B = np.array([[ 0, 1, 2, 3],
@@ -508,7 +510,7 @@ a given (X,Y) value.
-.. code:: ipython3
+.. code-block:: ipython3
D = np.array([0,1,2])
E = np.array([[ 0, 1, 2, 3],
@@ -526,7 +528,7 @@ a given (X,Y) value.
-.. code:: ipython3
+.. code-block:: ipython3
from scipy.stats import multivariate_normal
x, y = np.mgrid[-5:5:.1, -5:5:.1]
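A self-contained version of the grid evaluation (the mean and covariance here are illustrative, not the notebook's):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Evaluate a bivariate Gaussian density over a 2-D grid; the density
# peaks at the mean and falls off along the covariance ellipse.
x, y = np.mgrid[-5:5:.1, -5:5:.1]
pos = np.dstack((x, y))
rv = multivariate_normal(mean=[0.0, 0.0], cov=[[2.0, 0.3], [0.3, 0.5]])
z = rv.pdf(pos)
print(z.shape)  # (100, 100)
```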


@@ -4,9 +4,12 @@ KF Basics - Part 2
### Probabilistic Generative Laws
-**1st Law**: The belief representing the state :math:`x_{t}`, is
-conditioned on all past states, measurements and controls. This can be
-shown mathematically by the conditional probability shown below:
+1st Law:
+^^^^^^^^
+The belief representing the state :math:`x_{t}` is conditioned on all
+past states, measurements and controls. This can be shown mathematically
+by the conditional probability shown below:
.. math:: p(x_{t} | x_{0:t-1},z_{1:t-1},u_{1:t})
@@ -27,7 +30,8 @@ Therefore the law now holds as:
.. math:: p(x_{t} | x_{0:t-1},z_{1:t-1},u_{1:t})=p(x_{t} | x_{t-1},u_{t})
-**2nd Law**:
+2nd Law:
+^^^^^^^^
If :math:`x_{t}` is complete, then:
@@ -84,7 +88,9 @@ hand, given C (Coin 1 is selected), A and B are independent.
Bayes Rule:
~~~~~~~~~~~
-Posterior = $:raw-latex:`\frac{Likelihood*Prior}{Marginal}` $
+Posterior =
+.. math:: \frac{Likelihood*Prior}{Marginal}
Here,
@@ -158,7 +164,7 @@ and the resultant covariance is smaller.
Bayes filter localization example:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-.. code:: ipython3
+.. code-block:: ipython3
from IPython.display import Image
Image(filename="bayes_filter.png",width=400)
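The two alternating steps of the Bayes filter (measurement update and motion prediction) can be sketched in discrete form; the hallway layout and sensor probabilities below are hypothetical, chosen only to illustrate the steps:

```python
import numpy as np

# Discrete Bayes filter over 10 hallway cells (hypothetical map).
def update(belief, likelihood):
    posterior = belief * likelihood     # likelihood * prior
    return posterior / posterior.sum()  # normalize by the marginal

def predict(belief, move):
    return np.roll(belief, move)        # perfect motion, for simplicity

belief = np.full(10, 0.1)               # uniform prior
doors = np.array([1.0, 1, 0, 0, 0, 0, 0, 0, 1, 0])
likelihood = np.where(doors == 1, 0.6, 0.2)  # sensor reports "door"
belief = update(belief, likelihood)     # measurement step
belief = predict(belief, 1)             # motion step
print(belief.round(3))
```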
@@ -281,9 +287,9 @@ The variance in terms of the Kalman gain:
&= (1-K)\bar\sigma^2
\end{aligned}
-**:math:`K` is the Kalman gain. Its the crux of the Kalman filter. It
+:math:`K` is the *Kalman gain*. It's the crux of the Kalman filter. It
 is a scaling term that chooses a value partway between :math:`\mu_z` and
-:math:`\bar\mu`.**
+:math:`\bar\mu`.
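In code, the gain-form update is only a few lines (a sketch with made-up numbers):

```python
# Scalar Kalman update in gain form: K blends the prior mean with the
# measurement, and the variance shrinks by the factor (1 - K).
def kf_update(prior_mu, prior_var, z_mu, z_var):
    K = prior_var / (prior_var + z_var)    # Kalman gain
    mu = prior_mu + K * (z_mu - prior_mu)  # partway between the two means
    var = (1.0 - K) * prior_var            # matches the derivation above
    return mu, var

mu, var = kf_update(10.0, 4.0, 12.0, 1.0)
print(mu, var)  # 11.6 0.8
```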
Kalman Filter - Univariate and Multivariate
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


@@ -31,7 +31,7 @@ https://robotacademy.net.au/lesson/inverse-kinematics-for-a-2-joint-robot-arm-us
First, let's define a class to make plotting our arm easier.
-.. code:: ipython3
+.. code-block:: ipython3
%matplotlib inline
from math import cos, sin
@@ -70,7 +70,7 @@ First, lets define a class to make plotting our arm easier.
Let's also define a function to make it easier to draw an angle on our
diagram.
-.. code:: ipython3
+.. code-block:: ipython3
from math import sqrt
@@ -103,7 +103,7 @@ called forward_kinematics - forward kinematics specifies the
end-effector position given the joint angles and link lengths. Forward
kinematics is easier than inverse kinematics.
-.. code:: ipython3
+.. code-block:: ipython3
arm = TwoLinkArm()
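The forward-kinematics step itself is short enough to sketch standalone (unit link lengths assumed here):

```python
from math import cos, sin

# Planar two-link arm: end-effector position from joint angles
# theta0, theta1 and link lengths l1, l2.
def forward_kinematics(theta0, theta1, l1=1.0, l2=1.0):
    x = l1 * cos(theta0) + l2 * cos(theta0 + theta1)
    y = l1 * sin(theta0) + l2 * sin(theta0 + theta1)
    return x, y

print(forward_kinematics(0.0, 0.0))  # fully stretched along x: (2.0, 0.0)
```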
@@ -172,7 +172,7 @@ kinematics for :math:`\theta_0` and :math:`\theta_1`, but that would be
the wrong move. An easier path involves going back to the geometry of
the arm.
-.. code:: ipython3
+.. code-block:: ipython3
from math import pi
@@ -245,7 +245,7 @@ to the “arm-down” configuration of the arm, which is whats shown in the
diagram. Now we'll derive an equation for :math:`\theta_0` that depends
on this value of :math:`\theta_1`.
-.. code:: ipython3
+.. code-block:: ipython3
from math import atan2
@@ -294,7 +294,7 @@ We now have an expression for this angle :math:`\beta` in terms of one
of our arm's joint angles. Now, can we relate :math:`\beta` to
:math:`\theta_0`? Yes!
-.. code:: ipython3
+.. code-block:: ipython3
arm.plot()
label_diagram()
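Putting the geometry together, a standalone inverse-kinematics sketch (one of the two elbow branches; unit link lengths assumed):

```python
from math import acos, atan2, cos, sin

# Two-link arm inverse kinematics: the law of cosines gives theta1,
# then theta0 follows from the angle beta = atan2 of the elbow offset.
def inverse_kinematics(x, y, l1=1.0, l2=1.0):
    c1 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)  # law of cosines
    theta1 = acos(c1)
    theta0 = atan2(y, x) - atan2(l2 * sin(theta1), l1 + l2 * cos(theta1))
    return theta0, theta1

# Round-trip through the forward kinematics recovers the target point
t0, t1 = inverse_kinematics(1.2, 0.8)
print(cos(t0) + cos(t0 + t1), sin(t0) + sin(t0 + t1))  # ~1.2 ~0.8
```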