mirror of
https://github.com/SheffieldML/GPy.git
synced 2026-05-05 01:32:40 +02:00
v1.10.0 (#908)
* Update self.num_data in GP when X is updated
* Update appveyor.yml
* Update setup.cfg
* Stop using legacy bdist_wininst
* fix: reorder brackets to avoid an n^2 array
* Minor fix to multioutput regression example, to clarify code + typo.
* added missing import
* corrected typo in function name
* fixed docstring and added more explanation
* changed ordering of explanation to get to the point fast and provide additional details after
* self.num_data and self.input_dim are set dynamically in class GP() based on the shape of X. In MRD, the user-specified values are passed around until X is defined.
* fixed technical description of gradients_X()
* brushed up wording
* fix normalizer
* fix ImportError in likelihood.py
in function log_predictive_density_sampling
* Update setup.py
bump min require version of scipy to 1.3.0
* Add cython into installation requirement
* Coregionalized regression bugfix (#824)
* route default arg W_rank correctly (Addresses #823)
* Drop Python 2.7 support (fix #833)
* travis, appveyor: Add Python 3.8 build
* README: Fix scipy version number
* setup.py: Install scipy < 1.5.0 when using Python 3.5
* plotting_tests.py: Use os.makedirs instead of matplotlib.cbook.mkdirs (fix #844)
* Use super().__init__ consistently, instead of sometimes calling base class __init__ directly
* README.md: Source formatting, one badge per line
* README.md: Remove broken landscape badge (fix #831)
* README.md: Badges for devel and deploy (fix #830)
* ignore intermediary sphinx restructured text
* ignore vs code project settings file
* add yml config for readthedocs
* correct path
* drop epub and pdf builds (as per main GPy)
* typo
* headings and structure
* update copyright
* restructuring and smartening
* remove dead links
* reorder package docs
* rst "markup"
* change rst syntax
* makes sense for core to go first
* add placeholder
* initial core docs, class diagram
* lower level detail
* higher res diagrams
* layout changes for diagrams
resolve conflict
* better syntax
* redundant block
* introduction
* inheritance diagrams
* more on models
* kernel docs to kern.src
* moved doc back from kern.src to kern
* kern not kern.src in index
* better kernel description
* likelihoods
* placeholder
* add plotting to docs index
* summarise plotting
* clarification
* neater contents
* architecture diagram
* using pods
* build with dot
* more on examples
* introduction for utils package
* compromise formatting for sphinx
* correct likelihood definition
* parameterization of priors
* latent function inference intro and format
* maint: Remove tabs (and some trailing spaces)
* dpgplvm.py: Wrap long line + remove tabs
* dpgplvm.py: Fix typo in the header
* maint: Wrap very long lines (> 450 chars)
* maint: Wrap very long lines (> 400 chars)
* Add the link to the api doc on the readme page.
* remove deprecated parameter
* Update README.md
* new: Added to_dict() method to Ornstein-Uhlenbeck (OU) kernel
* fix: minor typos in README !minor
* added python 3.9 build following 4aa2ea9f5e to address https://github.com/SheffieldML/GPy/issues/881
* updated cython-generated c files for python 3.9 via `pyenv virtualenv 3.9.1 gpy391 && pyenv activate gpy391 && python setup.py build --force`
* updated osx to macOS 10.15.7, JDK to 14.0.2, and XCode to Xcode 12.2 (#904)
The CI was broken. This commit fixes the CI. The root cause is reported in more detail in issue #905.
In short, the default macOS version (10.13, see the TravisCI docs) used in TravisCI isn't supported by brew which caused the brew install pandoc in the download_miniconda.sh pre-install script to hang and time out the build. It failed even on inert PRs (adding a line to README, e.g.). Now, with the updated macOS version (from 10.13 to 10.15), brew is supported and the brew install pandoc command succeeds and allows the remainder of the CI build and test sequence to succeed.
* incremented version
Co-authored-by: Masha Naslidnyk 🦉 <naslidny@amazon.co.uk>
Co-authored-by: Zhenwen Dai <zhenwendai@users.noreply.github.com>
Co-authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
Co-authored-by: Mark McLeod <mark.mcleod@mindfoundry.ai>
Co-authored-by: Sigrid Passano Hellan <sighellan@gmail.com>
Co-authored-by: Antoine Blanchard <antoine@sand-lab-gpu.mit.edu>
Co-authored-by: kae_mihara <rukamihara@outlook.com>
Co-authored-by: lagph <49130858+lagph@users.noreply.github.com>
Co-authored-by: Julien Bect <julien.bect@centralesupelec.fr>
Co-authored-by: Neil Lawrence <ndl21@cam.ac.uk>
Co-authored-by: bobturneruk <bob.turner.uk@gmail.com>
Co-authored-by: bobturneruk <r.d.turner@sheffield.ac.uk>
Co-authored-by: gehbiszumeis <16896724+gehbiszumeis@users.noreply.github.com>
This commit is contained in: parent 92f2e87e7b, commit fa909768bd.
72 changed files with 8568 additions and 14545 deletions
@@ -64,7 +64,7 @@ if on_rtd:
     print(out)

     #Lets regenerate our rst files from the source, -P adds private modules (i.e kern._src)
-    proc = subprocess.Popen("sphinx-apidoc -P -f -o . ../../GPy", stdout=subprocess.PIPE, shell=True)
+    proc = subprocess.Popen("sphinx-apidoc -M -P -f -o . ../../GPy", stdout=subprocess.PIPE, shell=True)
     (out, err) = proc.communicate()
     print("$ Apidoc:")
     print(out)
@@ -83,8 +83,13 @@ extensions = [
     #'sphinx.ext.coverage',
    'sphinx.ext.mathjax',
    'sphinx.ext.viewcode',
+    'sphinx.ext.graphviz',
+    'sphinx.ext.inheritance_diagram',
]

+#---sphinx.ext.inheritance_diagram config
+inheritance_graph_attrs = dict(rankdir="LR", dpi=1200)
+
#----- Autodoc
#import sys
#try:
@@ -134,7 +139,7 @@ master_doc = 'index'
project = u'GPy'
#author = u'`Humans <https://github.com/SheffieldML/GPy/graphs/contributors>`_'
author = 'GPy Authors, see https://github.com/SheffieldML/GPy/graphs/contributors'
-copyright = u'2015, '+author
+copyright = u'2020, '+author

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
@@ -245,6 +250,10 @@ html_theme = 'sphinx_rtd_theme'
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

+html_css_files = [
+    'wide.css',
+]
+
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
@ -1,48 +1,90 @@
|
|||
.. GPy documentation master file, created by
|
||||
sphinx-quickstart on Fri Sep 18 18:16:28 2015.
|
||||
You can adapt this file completely to your liking, but it should at least
|
||||
contain the root `toctree` directive.
|
||||
GPy - A Gaussian Process (GP) framework in Python
|
||||
=================================================
|
||||
|
||||
Welcome to GPy's documentation!
|
||||
===============================
|
||||
Introduction
|
||||
------------
|
||||
|
||||
`GPy <http://sheffieldml.github.io/GPy/>`_ is a Gaussian Process (GP) framework written in Python, from the Sheffield machine learning group.
|
||||
`GPy <http://sheffieldml.github.io/GPy/>`_ is a Gaussian Process (GP) framework written in Python, from the Sheffield machine learning group. It includes support for basic GP regression, multiple output GPs (using coregionalization), various noise models, sparse GPs, non-parametric regression and latent variables.
|
||||
|
||||
The `GPy homepage <http://sheffieldml.github.io/GPy/>`_ contains tutorials for users and further information on the project, including installation instructions.
|
||||
This documentation is mostly aimed at developers interacting closely with the code-base.
|
||||
|
||||
The documentation hosted here is mostly aimed at developers interacting closely with the code-base.
|
||||
|
||||
Source Code
|
||||
-----------
|
||||
|
||||
The code can be found on our `Github project page <https://github.com/SheffieldML/GPy>`_. It is open source and provided under the BSD license.
|
||||
|
||||
For developers:
|
||||
Installation
|
||||
------------
|
||||
|
||||
- `Writing new models <tuto_creating_new_models.html>`_
|
||||
- `Writing new kernels <tuto_creating_new_kernels.html>`_
|
||||
- `Write a new plotting routine using gpy_plot <tuto_plotting.html>`_
|
||||
- `Parameterization handles <tuto_parameterized.html>`_
|
||||
Installation instructions can currently be found on our `Github project page <https://github.com/SheffieldML/GPy>`_.
|
||||
|
||||
Contents:
|
||||
Tutorials
|
||||
---------
|
||||
|
||||
Several tutorials have been developed in the form of `Jupyter Notebooks <https://nbviewer.jupyter.org/github/SheffieldML/notebook/blob/master/GPy/index.ipynb>`_.
|
||||
|
||||
Architecture
|
||||
------------
|
||||
|
||||
GPy is a big, powerful package, with many features. The concept of how to use GPy in general terms is roughly as follows. A model (:py:class:`GPy.models`) is created - this is at the heart of GPy from a user perspective. A kernel (:py:class:`GPy.kern`), data and, usually, a representation of noise are assigned to the model. Specific models require, or can make use of, additional information. The kernel and noise are controlled by hyperparameters - calling the optimize (:py:class:`GPy.core.gp.GP.optimize`) method against the model invokes an iterative process which seeks optimal hyperparameter values. The model object can be used to make plots and predictions (:py:class:`GPy.core.gp.GP.predict`).
|
||||
|
||||
.. graphviz::
|
||||
|
||||
digraph GPy_Arch {
|
||||
|
||||
rankdir=LR
|
||||
node[shape="rectangle" style="rounded,filled" fontname="Arial"]
|
||||
edge [color="#006699" len=2.5]
|
||||
|
||||
Data->Model
|
||||
Hyperparameters->Kernel
|
||||
Hyperparameters->Noise
|
||||
Kernel->Model
|
||||
Noise->Model
|
||||
|
||||
Model->Optimize
|
||||
Optimize->Hyperparameters
|
||||
|
||||
Model->Predict
|
||||
Model->Plot
|
||||
|
||||
Optimize [shape="ellipse"]
|
||||
Predict [shape="ellipse"]
|
||||
Plot [shape="ellipse"]
|
||||
|
||||
subgraph cluster_0 {
|
||||
Data
|
||||
Kernel
|
||||
Noise
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 1
|
||||
:caption: For developers
|
||||
|
||||
tuto_creating_new_models
|
||||
tuto_creating_new_kernels
|
||||
tuto_plotting
|
||||
tuto_parameterized
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 1
|
||||
:caption: API Documentation
|
||||
|
||||
GPy.core
|
||||
GPy.core.parameterization
|
||||
GPy.models
|
||||
GPy.kern
|
||||
GPy.likelihoods
|
||||
GPy.mappings
|
||||
GPy.examples
|
||||
GPy.util
|
||||
GPy.plotting.gpy_plot
|
||||
GPy.plotting.matplot_dep
|
||||
GPy.core
|
||||
GPy.core.parameterization
|
||||
GPy.plotting
|
||||
GPy.inference.optimization
|
||||
GPy.inference.latent_function_inference
|
||||
GPy.inference.mcmc
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
* :ref:`genindex`
|
||||
* :ref:`modindex`
|
||||
* :ref:`search`
|
||||
|
||||
|
|
|
|||
|
|
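The architecture walkthrough in the new index page (data and a kernel feed a model; hyperparameters control kernel and noise; the model then predicts) can be sketched without GPy at all. The numpy-only toy below is not GPy's API — `rbf_kernel` and `gp_predict` are names invented for this sketch — but it mirrors the same Data/Kernel/Noise/Model/Predict flow with fixed hyperparameters (the Optimize step is omitted):

```python
import numpy as np

def rbf_kernel(X, X2, variance=1.0, lengthscale=1.0):
    # X, X2 are (n, 1) arrays; classic RBF covariance with two hyperparameters
    d2 = np.square((X - X2.T) / lengthscale)
    return variance * np.exp(-0.5 * d2)

def gp_predict(X, Y, Xnew, noise=1e-6, **hyp):
    # "Model" = kernel + data + noise; prediction is the usual GP posterior
    K = rbf_kernel(X, X, **hyp) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xnew, X, **hyp)
    Kss = rbf_kernel(Xnew, Xnew, **hyp)
    alpha = np.linalg.solve(K, Y)          # K^{-1} y
    mean = Ks @ alpha                      # posterior mean at Xnew
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)[:, None]     # mean and pointwise variance
```

In GPy itself this whole sketch collapses to `GPy.models.GPRegression(X, Y, GPy.kern.RBF(1))` followed by `m.optimize()` and `m.predict(Xnew)`.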
@@ -53,13 +53,15 @@ your code. The parameters have to be added by calling
 :py:class:`~GPy.core.parameterization.param.Param` objects as
 arguments::

     from .core.parameterization import Param

     def __init__(self,input_dim,variance=1.,lengthscale=1.,power=1.,active_dims=None):
         super(RationalQuadratic, self).__init__(input_dim, active_dims, 'rat_quad')
         assert input_dim == 1, "For this kernel we assume input_dim=1"
         self.variance = Param('variance', variance)
         self.lengthscale = Param('lengthscale', lengthscale)
         self.power = Param('power', power)
-        self.add_parameters(self.variance, self.lengthscale, self.power)
+        self.link_parameters(self.variance, self.lengthscale, self.power)

 From now on you can use the parameters ``self.variance,
 self.lengthscale, self.power`` as normal numpy ``array-like`` s in your
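As a toy illustration of the pattern the hunk above documents (parameters are stored as named attributes and also registered with the kernel, so an optimizer can see them in one flat array), here is a minimal stand-in. `MiniKern`, `_link` and `param_array` are invented for this sketch and are not GPy's `Param`/`link_parameters` machinery:

```python
import numpy as np

class MiniKern:
    """Toy stand-in (not GPy's API) for the link-parameters pattern:
    each parameter is both an attribute and an entry in a registry that
    an optimizer could read and write as one flat vector."""

    def __init__(self, variance=1.0, lengthscale=1.0, power=1.0):
        self._params = {}
        # mirrors self.variance = Param('variance', variance); self.link_parameters(...)
        self.variance = self._link('variance', variance)
        self.lengthscale = self._link('lengthscale', lengthscale)
        self.power = self._link('power', power)

    def _link(self, name, value):
        # register the parameter and return the array stored for it
        self._params[name] = np.asarray(float(value))
        return self._params[name]

    def param_array(self):
        # flat view, analogous to m.param_array in GPy
        return np.array([self._params[k] for k in ('variance', 'lengthscale', 'power')])
```

The point of the real mechanism is the same: after `link_parameters`, the kernel's parameters behave as numpy arrays in kernel code while remaining visible to the enclosing model's optimizer.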
@@ -71,13 +73,13 @@ automatically.

 The implementation of this function is optional.

-This functions deals as a callback for each optimization iteration. If
-one optimization step was successfull and the parameters (added by
+This function is called as a callback upon each successful change to the parameters. If
+one optimization step was successful and the parameters (linked by
 :py:func:`~GPy.core.parameterization.parameterized.Parameterized.link_parameters`
-``(*parameters)``) this callback function will be called to be able to
-update any precomputations for the kernel. Do not implement the
-gradient updates here, as those are being done by the model enclosing
-the kernel::
+``(*parameters)``) are changed, this callback function will be called. This callback may be used to
+update precomputations for the kernel. Do not implement the
+gradient updates here, as gradient updates are performed by the model enclosing
+the kernel. In this example, we issue a no-op::

     def parameters_changed(self):
         # nothing to do here
@@ -90,8 +92,9 @@ the kernel::

 The implementation of this function is mandatory.

 This function is used to compute the covariance matrix associated with
-the inputs X, X2 (np.arrays with arbitrary number of line (say
-:math:`n_1`, :math:`n_2`) and ``self.input_dim`` columns). ::
+the inputs X, X2 (np.arrays with arbitrary number of lines,
+:math:`n_1`, :math:`n_2`, corresponding to the number of samples over which to calculate covariance)
+and ``self.input_dim`` columns. ::

     def K(self,X,X2):
         if X2 is None: X2 = X
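A plain-numpy sketch of such a `K(X, X2)`: the function below builds the :math:`n_1 \times n_2` covariance matrix for a one-dimensional rational quadratic kernel. It assumes the common form `variance * (1 + d2/2)**(-power)` (the tutorial's own snippet divides by an extra lengthscale factor inside the bracket) and is a stand-alone illustration, not GPy's implementation:

```python
import numpy as np

def rq_K(X, X2=None, variance=1.0, lengthscale=1.0, power=1.0):
    """Rational quadratic covariance for 1-D inputs.

    X: (n1, 1) array, X2: (n2, 1) array or None (then X2 = X).
    Returns an (n1, n2) covariance matrix."""
    if X2 is None:
        X2 = X
    # scaled squared distances between every pair of rows
    d2 = np.square((X - X2.T) / lengthscale)
    return variance * (1.0 + d2 / 2.0) ** (-power)
```

Calling it with `X2=None` reproduces the symmetric train-train covariance; with a second array it gives the rectangular cross-covariance used at prediction time.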
@@ -171,16 +174,24 @@ is set to each ``param``. ::

 This function is required for GPLVM, BGPLVM, sparse models and uncertain inputs.

 Computes the derivative of the likelihood with respect to the inputs
-``X`` (a :math:`n \times q` np.array). The result is returned by the
-function which is a :math:`n \times q` np.array. ::
+``X`` (a :math:`n \times q` np.array), that is, it calculates the quantity:
+
+.. math::
+
+   \frac{\partial L}{\partial K} \frac{\partial K}{\partial X}
+
+The partial derivative matrix, in this case, comes out as an :math:`n \times q` np.array. ::

     def gradients_X(self,dL_dK,X,X2):
-        """derivative of the covariance matrix with respect to X."""
+        """derivative of the likelihood with respect to X, calculated using dL_dK*dK_dX"""
         if X2 is None: X2 = X
         dist2 = np.square((X-X2.T)/self.lengthscale)

-        dX = -self.variance*self.power * (X-X2.T)/self.lengthscale**2 * (1 + dist2/2./self.lengthscale)**(-self.power-1)
-        return np.sum(dL_dK*dX,1)[:,None]
+        dK_dX = -self.variance*self.power * (X-X2.T)/self.lengthscale**2 * (1 + dist2/2./self.lengthscale)**(-self.power-1)
+        return np.sum(dL_dK*dK_dX,1)[:,None]

+Were the number of parameters larger than 1, or the number of dimensions likewise larger than 1,
+the calculated partial derivative would be a 3- or 4-tensor.

 :py:func:`~GPy.kern.src.kern.Kern.gradients_X_diag` ``(self,dL_dKdiag,X)``
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
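The chain-rule identity documented above is easy to check numerically: build K for a rational quadratic kernel, treat an arbitrary matrix as ``dL_dK``, and compare the analytic ``gradients_X`` against central finite differences of :math:`L = \sum_{ij} (dL/dK)_{ij} K_{ij}`. The names `rq_cov` and `rq_grad_X` below are invented for this self-contained sketch; it uses the standard rational-quadratic form `(1 + d2/2)**(-power)` rather than GPy's code:

```python
import numpy as np

def rq_cov(X, X2, variance=1.5, lengthscale=0.8, power=2.0):
    # (n1, n2) rational quadratic covariance for 1-D inputs
    d2 = np.square((X - X2.T) / lengthscale)
    return variance * (1.0 + d2 / 2.0) ** (-power)

def rq_grad_X(dL_dK, X, X2, variance=1.5, lengthscale=0.8, power=2.0):
    # analytic dL/dX = sum_j dL_dK[i, j] * dK[i, j]/dX[i]; returns (n1, 1)
    d2 = np.square((X - X2.T) / lengthscale)
    dK_dX = (-variance * power * (X - X2.T) / lengthscale**2
             * (1.0 + d2 / 2.0) ** (-power - 1))
    return np.sum(dL_dK * dK_dX, 1)[:, None]
```

With distinct `X` and `X2` each input appears only in the rows of K, so the row-sum in `rq_grad_X` is exactly the chain rule; the `X2 is None` case in the tutorial needs extra care because each input then also appears in a column.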