Adding all gifs to the doc 2 (#585)

* update docs

* update docs

* update docs
This commit is contained in:
Atsushi Sakai
2021-11-28 16:00:02 +09:00
committed by GitHub
parent f8f10a3ec8
commit c99716d692
28 changed files with 193 additions and 140 deletions


@@ -43,6 +43,7 @@ See this paper for more details:
modules/arm_navigation/arm_navigation
modules/aerial_navigation/aerial_navigation
modules/bipedal/bipedal
modules/control/control
modules/appendix/appendix
how_to_contribute


@@ -0,0 +1,7 @@
.. _control:
Control
=================
.. include:: inverted_pendulum_mpc_control/inverted_pendulum_mpc_control.rst


@@ -0,0 +1,6 @@
Inverted Pendulum MPC Control
-----------------------------
This is an inverted pendulum on a cart balance simulation using Model Predictive Control (MPC).
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Control/InvertedPendulumCart/animation.gif


@@ -2,4 +2,41 @@
Introduction
============
TBD
Definition Of Robotics
----------------------
TBD
History Of Robotics
----------------------
TBD
Application Of Robotics
------------------------
TBD
Software for Robotics
----------------------
TBD
Python for Robotics
----------------------
TBD
Learning Robotics Algorithms
----------------------------
TBD


@@ -0,0 +1,7 @@
Ensemble Kalman Filter Localization
-----------------------------------
.. figure:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/ensamble_kalman_filter/animation.gif
This is a sensor fusion localization with Ensemble Kalman Filter (EnKF).


@@ -8,7 +8,6 @@ Extended Kalman Filter Localization
.. figure:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/extended_kalman_filter/animation.gif
:alt: EKF
This is a sensor fusion localization with Extended Kalman Filter (EKF).


@@ -0,0 +1,22 @@
Histogram filter localization
-----------------------------
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/histogram_filter/animation.gif
This is a 2D localization example with Histogram filter.
The red cross is the true position, and the black points are the RFID positions.
The blue grid shows the position probability of the histogram filter.
In this simulation, x and y are unknown, while yaw is known.
The filter integrates a speed input and range observations from the RFID
tags for localization.
An initial position estimate is not needed.
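The measurement update described above can be sketched as follows. This is a minimal illustration, not the module's actual code; the function name, grid size, and Gaussian range-sensor model are assumptions for the sketch.

```python
import numpy as np

def update_with_range(belief, xs, ys, rfid_xy, measured_range, sigma=0.5):
    """Weight every grid cell by the likelihood of one RFID range reading."""
    # Range the sensor would report if the robot were in each cell
    expected = np.hypot(xs - rfid_xy[0], ys - rfid_xy[1])
    likelihood = np.exp(-0.5 * ((expected - measured_range) / sigma) ** 2)
    belief = belief * likelihood
    return belief / belief.sum()  # renormalize so probabilities sum to 1

# 20x20 grid over a 10 m x 10 m area, uniform prior over position
xs, ys = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
belief = np.full(xs.shape, 1.0 / xs.size)
belief = update_with_range(belief, xs, ys, rfid_xy=(0.0, 0.0), measured_range=5.0)
```

After this update the probability mass concentrates on cells whose distance to the RFID tag matches the measured range; repeating it for several tags localizes the robot without an initial estimate.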
References:
~~~~~~~~~~~
- `PROBABILISTIC ROBOTICS`_


@@ -3,89 +3,11 @@
Localization
============
.. include:: extended_kalman_filter_localization.rst
Unscented Kalman Filter localization
------------------------------------
|2|
This is a sensor fusion localization with Unscented Kalman Filter (UKF).
The lines and points have the same meaning as in the EKF simulation.
References:
~~~~~~~~~~~
- `Discriminatively Trained Unscented Kalman Filter for Mobile Robot
Localization`_
Particle filter localization
----------------------------
|3|
This is a sensor fusion localization with Particle Filter (PF).
The blue line is the true trajectory, the black line is the dead
reckoning trajectory,
and the red line is the estimated trajectory with PF.
It is assumed that the robot can measure a distance from landmarks
(RFID).
These measurements are used for PF localization.
How to calculate covariance matrix from particles
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The covariance matrix :math:`\Xi` from particle information is calculated by the following equation:
.. math:: \Xi_{j,k}=\frac{1}{1-\sum^N_{i=1}(w^i)^2}\sum^N_{i=1}w^i(x^i_j-\mu_j)(x^i_k-\mu_k)
- :math:`\Xi_{j,k}` is the covariance matrix element at row :math:`j` and column :math:`k`.
- :math:`w^i` is the weight of the :math:`i` th particle.
- :math:`x^i_j` is the :math:`j` th state of the :math:`i` th particle.
- :math:`\mu_j` is the :math:`j` th mean state of particles.
References:
~~~~~~~~~~~
- `PROBABILISTIC ROBOTICS`_
- `Improving the particle filter in high dimensions using conjugate artificial process noise`_
Histogram filter localization
-----------------------------
|4|
This is a 2D localization example with Histogram filter.
The red cross is the true position, and the black points are the RFID positions.
The blue grid shows the position probability of the histogram filter.
In this simulation, x and y are unknown, while yaw is known.
The filter integrates a speed input and range observations from the RFID
tags for localization.
An initial position estimate is not needed.
References:
~~~~~~~~~~~
- `PROBABILISTIC ROBOTICS`_
.. include:: extended_kalman_filter_localization_files/extended_kalman_filter_localization.rst
.. include:: ensamble_kalman_filter_localization_files/ensamble_kalman_filter_localization.rst
.. include:: unscented_kalman_filter_localization/unscented_kalman_filter_localization.rst
.. include:: histogram_filter_localization/histogram_filter_localization.rst
.. include:: particle_filter_localization/particle_filter_localization.rst
.. _PROBABILISTIC ROBOTICS: http://www.probabilistic-robotics.org/
.. _Discriminatively Trained Unscented Kalman Filter for Mobile Robot Localization: https://www.researchgate.net/publication/267963417_Discriminatively_Trained_Unscented_Kalman_Filter_for_Mobile_Robot_Localization
.. _Improving the particle filter in high dimensions using conjugate artificial process noise: https://arxiv.org/pdf/1801.07000.pdf
.. |2| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/unscented_kalman_filter/animation.gif
.. |3| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/particle_filter/animation.gif
.. |4| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/histogram_filter/animation.gif


@@ -0,0 +1,37 @@
Particle filter localization
----------------------------
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/particle_filter/animation.gif
This is a sensor fusion localization with Particle Filter (PF).
The blue line is the true trajectory, the black line is the dead
reckoning trajectory,
and the red line is the estimated trajectory with PF.
It is assumed that the robot can measure a distance from landmarks
(RFID).
These measurements are used for PF localization.
How to calculate covariance matrix from particles
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The covariance matrix :math:`\Xi` from particle information is calculated by the following equation:
.. math:: \Xi_{j,k}=\frac{1}{1-\sum^N_{i=1}(w^i)^2}\sum^N_{i=1}w^i(x^i_j-\mu_j)(x^i_k-\mu_k)
- :math:`\Xi_{j,k}` is the covariance matrix element at row :math:`j` and column :math:`k`.
- :math:`w^i` is the weight of the :math:`i` th particle.
- :math:`x^i_j` is the :math:`j` th state of the :math:`i` th particle.
- :math:`\mu_j` is the :math:`j` th mean state of particles.
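The equation above can be sketched in NumPy as follows. The function name and the random test particles are illustrative, not part of the module:

```python
import numpy as np

def covariance_from_particles(particles, weights):
    """Weighted covariance of a particle set; weights must sum to 1."""
    mu = weights @ particles                   # weighted mean state, mu_j
    dx = particles - mu                        # deviations x^i_j - mu_j
    cov = (weights * dx.T) @ dx                # sum_i w^i (x^i - mu)(x^i - mu)^T
    return cov / (1.0 - np.sum(weights ** 2))  # the 1 / (1 - sum (w^i)^2) factor

# 100 particles with a 3D state (x, y, yaw) and uniform weights
rng = np.random.default_rng(0)
particles = rng.normal(size=(100, 3))
weights = np.full(100, 1.0 / 100)
xi = covariance_from_particles(particles, weights)
```

The result is a symmetric positive-semidefinite matrix, matching :math:`\Xi` in the equation term by term.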
References:
~~~~~~~~~~~
- `PROBABILISTIC ROBOTICS`_
- `Improving the particle filter in high dimensions using conjugate artificial process noise <https://arxiv.org/pdf/1801.07000.pdf>`_


@@ -0,0 +1,13 @@
Unscented Kalman Filter localization
------------------------------------
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Localization/unscented_kalman_filter/animation.gif
This is a sensor fusion localization with Unscented Kalman Filter (UKF).
The lines and points have the same meaning as in the EKF simulation.
References:
~~~~~~~~~~~
- `Discriminatively Trained Unscented Kalman Filter for Mobile Robot Localization <https://www.researchgate.net/publication/267963417_Discriminatively_Trained_Unscented_Kalman_Filter_for_Mobile_Robot_Localization>`_


@@ -0,0 +1,13 @@
Object shape recognition using circle fitting
---------------------------------------------
This is an example of object shape recognition using circle fitting.
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/circle_fitting/animation.gif
The blue circle is the true object shape.
The red crosses are observations from a ranging sensor.
The red circle is the estimated object shape using circle fitting.
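A minimal least-squares (Kasa) circle fit illustrates the idea; this is a generic sketch under the algebraic-error formulation, not necessarily the exact algorithm used by the module:

```python
import numpy as np

def fit_circle(x, y):
    """Least-squares circle fit.

    Solves x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2) for (cx, cy, r).
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0] / 2.0, sol[1] / 2.0
    r = np.sqrt(sol[2] + cx ** 2 + cy ** 2)
    return cx, cy, r

# Points sampled from a circle of radius 2 centered at (1, -1)
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
cx, cy, r = fit_circle(1.0 + 2.0 * np.cos(theta), -1.0 + 2.0 * np.sin(theta))
```

With noisy range observations, the same linear solve gives the best circle in the algebraic least-squares sense.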


@@ -0,0 +1,6 @@
Gaussian grid map
-----------------
This is a 2D Gaussian grid mapping example.
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/gaussian_grid_map/animation.gif
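A simplified sketch of the idea, assuming each cell's occupancy is a Gaussian of its distance to the nearest obstacle (the function name, normalization, and grid layout here are illustrative, not the module's code):

```python
import numpy as np

def gaussian_grid_map(ox, oy, xs, ys, std=1.0):
    """Occupancy of each cell is a Gaussian of its distance to the nearest obstacle."""
    grid = np.zeros(xs.shape)
    for idx in np.ndindex(*xs.shape):
        d = np.hypot(ox - xs[idx], oy - ys[idx]).min()  # nearest obstacle distance
        grid[idx] = np.exp(-0.5 * (d / std) ** 2)       # unnormalized Gaussian
    return grid

ox, oy = np.array([2.0, 8.0]), np.array([2.0, 8.0])     # two point obstacles
xs, ys = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
grid = gaussian_grid_map(ox, oy, xs, ys)
```

Cells on an obstacle get occupancy near 1.0, and the value falls off smoothly with distance, which is what the animation shows as a soft blob around each obstacle.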


@@ -0,0 +1,6 @@
k-means object clustering
-------------------------
This is a 2D object clustering with k-means algorithm.
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/kmeans_clustering/animation.gif
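A minimal sketch of Lloyd's k-means algorithm, which this kind of clustering is based on (the initialization scheme and function name are illustrative, not the module's code):

```python
import numpy as np

def kmeans(points, k, n_iter=20):
    """Plain k-means (Lloyd's algorithm): returns centroids and point labels."""
    # Initialize centroids with points spread across the data set
    centroids = points[np.linspace(0, len(points) - 1, k).astype(int)]
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest centroid
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(d, axis=1)
        # Update step: move each centroid to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated 2D blobs
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])
centroids, labels = kmeans(pts, 2)
```

Each iteration alternates assignment and update until the centroids stop moving; for well-separated blobs this converges in a few iterations.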


@@ -1,3 +1,7 @@
Lidar to grid map
--------------------
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/lidar_to_grid_map/animation.gif
This simple tutorial shows how to read LIDAR (range) measurements from a
file and convert them to an occupancy grid.
@@ -12,8 +16,7 @@ a ``numpy array``, and numbers close to 1 means the cell is occupied
free (*marked with green*). The grid has the ability to represent
unknown (unobserved) areas, which are close to 0.5.
.. figure:: lidar_to_grid_map_tutorial_files/grid_map_example.png
:alt: Example
.. figure:: lidar_to_grid_map_tutorial/grid_map_example.png
In order to construct the grid map from the measurements, we need to
discretise the values. But first, we need to ``import`` some
@@ -65,7 +68,7 @@ From the distances and the angles it is easy to determine the ``x`` and
.. image:: lidar_to_grid_map_tutorial_files/lidar_to_grid_map_tutorial_5_0.png
.. image:: lidar_to_grid_map_tutorial/lidar_to_grid_map_tutorial_5_0.png
The ``lidar_to_grid_map.py`` contains handy functions which can be used to
@@ -86,7 +89,7 @@ map. Lets see how this works.
.. image:: lidar_to_grid_map_tutorial_files/lidar_to_grid_map_tutorial_7_0.png
.. image:: lidar_to_grid_map_tutorial/lidar_to_grid_map_tutorial_7_0.png
.. code:: ipython3
@@ -103,7 +106,7 @@ map. Lets see how this works.
.. image:: lidar_to_grid_map_tutorial_files/lidar_to_grid_map_tutorial_8_0.png
.. image:: lidar_to_grid_map_tutorial/lidar_to_grid_map_tutorial_8_0.png
To fill empty areas, a queue-based algorithm can be used that can be
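The queue-based filling mentioned here can be sketched as a breadth-first flood fill; the function name and grid value conventions below are illustrative, not the module's actual API:

```python
from collections import deque

import numpy as np

def flood_fill(start, grid, fill_value=0.0):
    """Queue-based (BFS) flood fill: overwrite the region connected to `start`."""
    h, w = grid.shape
    target = grid[start]           # value being replaced, e.g. "unknown" = 0.5
    if target == fill_value:
        return grid
    queue = deque([start])
    while queue:
        i, j = queue.popleft()
        if 0 <= i < h and 0 <= j < w and grid[i, j] == target:
            grid[i, j] = fill_value
            # Enqueue the four neighbours of the cell just filled
            queue.extend([(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)])
    return grid

grid = np.full((5, 5), 0.5)   # unknown everywhere
grid[2, :] = 1.0              # a wall of occupied cells splits the map
flood_fill((0, 0), grid)      # mark the region above the wall as free (0.0)
```

The fill stops at the wall of occupied cells, so only the connected region around the start cell is overwritten.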
@@ -160,7 +163,7 @@ from a center point (e.g. (10, 20)) with zeros:
.. image:: lidar_to_grid_map_tutorial_files/lidar_to_grid_map_tutorial_12_0.png
.. image:: lidar_to_grid_map_tutorial/lidar_to_grid_map_tutorial_12_0.png
Let's use this flood fill on real data:
@@ -191,5 +194,5 @@ Lets use this flood fill on real data:
.. image:: lidar_to_grid_map_tutorial_files/lidar_to_grid_map_tutorial_14_1.png
.. image:: lidar_to_grid_map_tutorial/lidar_to_grid_map_tutorial_14_1.png


@@ -3,49 +3,10 @@
Mapping
=======
Gaussian grid map
-----------------
.. include:: gaussian_grid_map/gaussian_grid_map.rst
.. include:: ray_casting_grid_map/ray_casting_grid_map.rst
.. include:: lidar_to_grid_map_tutorial/lidar_to_grid_map_tutorial.rst
.. include:: k_means_object_clustering/k_means_object_clustering.rst
.. include:: circle_fitting/circle_fitting.rst
.. include:: rectangle_fitting/rectangle_fitting.rst
This is a 2D Gaussian grid mapping example.
|2|
Ray casting grid map
--------------------
This is a 2D ray casting grid mapping example.
|3|
Lidar to grid map
--------------------
|6|
.. include:: lidar_to_grid_map_tutorial.rst
k-means object clustering
-------------------------
This is a 2D object clustering with k-means algorithm.
|4|
Object shape recognition using circle fitting
---------------------------------------------
This is an example of object shape recognition using circle fitting.
|5|
The blue circle is the true object shape.
The red crosses are observations from a ranging sensor.
The red circle is the estimated object shape using circle fitting.
.. |2| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/gaussian_grid_map/animation.gif
.. |3| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/raycasting_grid_map/animation.gif
.. |4| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/kmeans_clustering/animation.gif
.. |5| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/circle_fitting/animation.gif
.. |6| image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/lidar_to_grid_map/animation.gif


@@ -0,0 +1,6 @@
Ray casting grid map
--------------------
This is a 2D ray casting grid mapping example.
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/raycasting_grid_map/animation.gif
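A simplified integer-grid sketch of casting one beam (an illustrative approximation with uniform steps along the long axis, not the module's implementation):

```python
import numpy as np

def cast_ray(grid, x0, y0, x1, y1):
    """Mark cells along the beam (x0, y0) -> (x1, y1) free, the hit cell occupied."""
    n = int(max(abs(x1 - x0), abs(y1 - y0)))  # one step per cell along the long axis
    for k in range(n):
        xi = x0 + round(k * (x1 - x0) / n)
        yi = y0 + round(k * (y1 - y0) / n)
        grid[xi, yi] = 0.0                    # free space along the beam
    grid[x1, y1] = 1.0                        # the cell that returned the echo
    return grid

grid = np.full((20, 20), 0.5)                 # 0.5 = unknown
grid = cast_ray(grid, 0, 0, 10, 5)            # one beam from a sensor at (0, 0)
```

Repeating this for every beam in a scan yields the grid shown in the animation: free along each ray, occupied at its endpoint, unknown elsewhere.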


@@ -0,0 +1,7 @@
Object shape recognition using rectangle fitting
------------------------------------------------
This is an example of object shape recognition using rectangle fitting.
.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/Mapping/rectangle_fitting/animation.gif