Commit a0f850b2 authored by Steven Cordwell

[doc] Add detail to README

Add more detail to the installation instructions by listing which
Debian/Ubuntu packages need to be installed to use the toolbox.
Recommend installing the package with pip. Fix the markup for the links
and put each command-line snippet on its own line.
parent a674c1c2
Markov Decision Process (MDP) Toolbox for Python
================================================

.. image:: https://travis-ci.org/sawcordwell/pymdptoolbox.svg?branch=master
    :target: https://travis-ci.org/sawcordwell/pymdptoolbox

@@ -12,8 +12,8 @@ implemented includes backwards induction, linear programming, policy iteration,
q-learning and value iteration along with several variations.

The classes and functions were developed based on the
`MATLAB <http://www.mathworks.com/products/matlab/>`_
`MDP toolbox <http://www.inra.fr/mia/T/MDPtoolbox/>`_ by the
`Biometry and Artificial Intelligence Unit <http://mia.toulouse.inra.fr/>`_ of
`INRA Toulouse <http://www.toulouse.inra.fr/>`_ (France). There are editions
available for MATLAB, GNU Octave, Scilab and R.

@@ -29,44 +29,81 @@ Features

Documentation
-------------

Documentation is available as docstrings in the module code.

.. TODO and as html in the doc folder or from `the MDPtoolbox homepage <>`_.
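
The docstrings can be read from an interactive Python session with the
built-in ``help`` function. For example (a minimal sketch; any public class
or function in the package can be inspected the same way):

>>> import mdptoolbox.mdp
>>> help(mdptoolbox.mdp.ValueIteration)  # doctest: +SKIP

The ``doctest: +SKIP`` directive is only there because ``help`` prints the
full docstring rather than a short, checkable result.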

Installation
------------

NumPy and SciPy must be on your system to use this toolbox. Please have a
look at their documentation to get them installed. If you are installing
onto Ubuntu or Debian and using Python 2, then this command will pull in all
the dependencies:

``sudo apt-get install python-numpy python-scipy python-cvxopt``

On the other hand, if you are using Python 3 then cvxopt will have to be
compiled (pip will do it automatically). To get NumPy, SciPy and all the
dependencies needed for a fully featured cvxopt, run:

``sudo apt-get install python3-numpy python3-scipy liblapack-dev libatlas-base-dev libgsl0-dev fftw-dev libglpk-dev libdsdp-dev``

I recommend using `pip <https://pip.pypa.io/en/latest/>`_ to install
the toolbox if you have it available. Just type

``pip install pymdptoolbox``

at the console and it should take care of downloading and installing everything
for you. If you also want cvxopt to be automatically downloaded and installed
so that you can solve MDPs using linear programming then type:

``pip install "pymdptoolbox[LP]"``

If you want it to be installed just for you rather than system wide, then do

``pip install --user pymdptoolbox``
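
Once pip has finished, a quick way to confirm the install is to import the
package from a Python prompt; the imports should complete without any output
or errors:

>>> import mdptoolbox
>>> import mdptoolbox.example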

Otherwise, you can download the package manually from the web:

1. Download the latest stable release from
   https://pypi.python.org/pypi/pymdptoolbox or clone the Git repository

   ``git clone https://github.com/sawcordwell/pymdptoolbox.git``

2. If you downloaded the `*.zip` or `*.tar.gz` archive, then extract it

   ``tar -xzvf pymdptoolbox-<VERSION>.tar.gz``

   ``unzip pymdptoolbox-<VERSION>``

3. Change to the PyMDPtoolbox directory

   ``cd pymdptoolbox``

4. Install via Setuptools, either to the root filesystem or to your home
   directory if you don't have administrative access.

   ``python setup.py install``

   ``python setup.py install --user``

   Read the
   `Setuptools documentation <https://pythonhosted.org/setuptools/>`_ for
   more advanced information.

Quick Use
---------

Start Python in your favourite way. The following example shows you how to
import the module, set up an example Markov decision problem using a discount
value of 0.9, solve it using the value iteration algorithm, and then check the
optimal policy.

>>> import mdptoolbox.example
>>> P, R = mdptoolbox.example.forest()
>>> vi = mdptoolbox.mdp.ValueIteration(P, R, 0.9)
>>> vi.run()
>>> vi.policy
(0, 0, 0)
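
The other solver classes in ``mdptoolbox.mdp`` follow the same pattern of
construct, ``run()``, then inspect. As a minimal sketch continuing the session
above, policy iteration should recover the same policy (the expected output
assumes it converges to the policy that value iteration found):

>>> pi = mdptoolbox.mdp.PolicyIteration(P, R, 0.9)
>>> pi.run()
>>> pi.policy
(0, 0, 0)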

Contribute
----------

@@ -76,6 +113,7 @@ Source Code: https://github.com/sawcordwell/pymdptoolbox

Support
-------

Use the issue tracker.

License
-------