commit 1d7712ecc8 (parent 3a150198e8)
Author: Homer Strong
Date:   2015-07-19 14:30:27 -07:00

    update kernel tutorial

372 changed files with 92313 additions and 121 deletions

GPy.core.parameterization package
=================================
Submodules
----------
GPy.core.parameterization.domains module
----------------------------------------
.. automodule:: GPy.core.parameterization.domains
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.index_operations module
-------------------------------------------------
.. automodule:: GPy.core.parameterization.index_operations
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.lists_and_dicts module
------------------------------------------------
.. automodule:: GPy.core.parameterization.lists_and_dicts
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.observable_array module
-------------------------------------------------
.. automodule:: GPy.core.parameterization.observable_array
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.param module
--------------------------------------
.. automodule:: GPy.core.parameterization.param
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.parameter_core module
-----------------------------------------------
.. automodule:: GPy.core.parameterization.parameter_core
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.parameterized module
----------------------------------------------
.. automodule:: GPy.core.parameterization.parameterized
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.priors module
---------------------------------------
.. automodule:: GPy.core.parameterization.priors
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.ties_and_remappings module
----------------------------------------------------
.. automodule:: GPy.core.parameterization.ties_and_remappings
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.transformations module
------------------------------------------------
.. automodule:: GPy.core.parameterization.transformations
:members:
:undoc-members:
:show-inheritance:
GPy.core.parameterization.variational module
--------------------------------------------
.. automodule:: GPy.core.parameterization.variational
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.core.parameterization
:members:
:undoc-members:
:show-inheritance:

File: doc/_build/html/_sources/GPy.core.txt
GPy.core package
================
Subpackages
-----------
.. toctree::

   GPy.core.parameterization
Submodules
----------
GPy.core.gp module
------------------
.. automodule:: GPy.core.gp
:members:
:undoc-members:
:show-inheritance:
GPy.core.mapping module
-----------------------
.. automodule:: GPy.core.mapping
:members:
:undoc-members:
:show-inheritance:
GPy.core.model module
---------------------
.. automodule:: GPy.core.model
:members:
:undoc-members:
:show-inheritance:
GPy.core.sparse_gp module
-------------------------
.. automodule:: GPy.core.sparse_gp
:members:
:undoc-members:
:show-inheritance:
GPy.core.sparse_gp_mpi module
-----------------------------
.. automodule:: GPy.core.sparse_gp_mpi
:members:
:undoc-members:
:show-inheritance:
GPy.core.svigp module
---------------------
.. automodule:: GPy.core.svigp
:members:
:undoc-members:
:show-inheritance:
GPy.core.symbolic module
------------------------
.. automodule:: GPy.core.symbolic
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.core
:members:
:undoc-members:
:show-inheritance:

GPy.examples package
====================
Submodules
----------
GPy.examples.classification module
----------------------------------
.. automodule:: GPy.examples.classification
:members:
:undoc-members:
:show-inheritance:
GPy.examples.coreg_example module
---------------------------------
.. automodule:: GPy.examples.coreg_example
:members:
:undoc-members:
:show-inheritance:
GPy.examples.dimensionality_reduction module
--------------------------------------------
.. automodule:: GPy.examples.dimensionality_reduction
:members:
:undoc-members:
:show-inheritance:
GPy.examples.non_gaussian module
--------------------------------
.. automodule:: GPy.examples.non_gaussian
:members:
:undoc-members:
:show-inheritance:
GPy.examples.regression module
------------------------------
.. automodule:: GPy.examples.regression
:members:
:undoc-members:
:show-inheritance:
GPy.examples.stochastic module
------------------------------
.. automodule:: GPy.examples.stochastic
:members:
:undoc-members:
:show-inheritance:
GPy.examples.tutorials module
-----------------------------
.. automodule:: GPy.examples.tutorials
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.examples
:members:
:undoc-members:
:show-inheritance:

GPy.inference.latent_function_inference package
===============================================
Submodules
----------
GPy.inference.latent_function_inference.dtc module
--------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.dtc
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.exact_gaussian_inference module
-----------------------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.exact_gaussian_inference
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.expectation_propagation module
----------------------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.expectation_propagation
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.expectation_propagation_dtc module
--------------------------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.expectation_propagation_dtc
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.fitc module
---------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.fitc
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.inferenceX module
---------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.inferenceX
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.laplace module
------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.laplace
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.posterior module
--------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.posterior
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.var_dtc module
------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.var_dtc
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.var_dtc_gpu module
----------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.var_dtc_gpu
:members:
:undoc-members:
:show-inheritance:
GPy.inference.latent_function_inference.var_dtc_parallel module
---------------------------------------------------------------
.. automodule:: GPy.inference.latent_function_inference.var_dtc_parallel
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.inference.latent_function_inference
:members:
:undoc-members:
:show-inheritance:
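The exact_gaussian_inference module listed above computes, among other things, the Gaussian log marginal likelihood :math:`\log p(y|X) = -\tfrac{1}{2} y^\top K_\sigma^{-1} y - \tfrac{1}{2}\log|K_\sigma| - \tfrac{n}{2}\log 2\pi` with :math:`K_\sigma = K + \sigma^2 I`. A minimal numpy sketch of that quantity via a Cholesky factorisation (an illustration only, not GPy's actual implementation):

```python
import numpy as np

def log_marginal_likelihood(K, y, noise):
    """log N(y | 0, K + noise*I), computed stably via Cholesky."""
    n = len(y)
    L = np.linalg.cholesky(K + noise * np.eye(n))
    # alpha = (K + noise*I)^{-1} y, via two triangular solves
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # log|K + noise*I| = 2 * sum(log(diag(L)))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))
```

Sanity check: with ``K = 0`` and unit noise this reduces to the log density of a standard normal, :math:`-\tfrac{1}{2}\|y\|^2 - \tfrac{n}{2}\log 2\pi`.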

GPy.inference.mcmc package
==========================
Submodules
----------
GPy.inference.mcmc.hmc module
-----------------------------
.. automodule:: GPy.inference.mcmc.hmc
:members:
:undoc-members:
:show-inheritance:
GPy.inference.mcmc.samplers module
----------------------------------
.. automodule:: GPy.inference.mcmc.samplers
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.inference.mcmc
:members:
:undoc-members:
:show-inheritance:

GPy.inference.optimization package
==================================
Submodules
----------
GPy.inference.optimization.conjugate_gradient_descent module
------------------------------------------------------------
.. automodule:: GPy.inference.optimization.conjugate_gradient_descent
:members:
:undoc-members:
:show-inheritance:
GPy.inference.optimization.gradient_descent_update_rules module
---------------------------------------------------------------
.. automodule:: GPy.inference.optimization.gradient_descent_update_rules
:members:
:undoc-members:
:show-inheritance:
GPy.inference.optimization.optimization module
----------------------------------------------
.. automodule:: GPy.inference.optimization.optimization
:members:
:undoc-members:
:show-inheritance:
GPy.inference.optimization.scg module
-------------------------------------
.. automodule:: GPy.inference.optimization.scg
:members:
:undoc-members:
:show-inheritance:
GPy.inference.optimization.sgd module
-------------------------------------
.. automodule:: GPy.inference.optimization.sgd
:members:
:undoc-members:
:show-inheritance:
GPy.inference.optimization.stochastics module
---------------------------------------------
.. automodule:: GPy.inference.optimization.stochastics
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.inference.optimization
:members:
:undoc-members:
:show-inheritance:

GPy.inference package
=====================
Subpackages
-----------
.. toctree::

   GPy.inference.latent_function_inference
   GPy.inference.mcmc
   GPy.inference.optimization
Module contents
---------------
.. automodule:: GPy.inference
:members:
:undoc-members:
:show-inheritance:

GPy.kern._src.psi_comp package
==============================
Submodules
----------
GPy.kern._src.psi_comp.linear_psi_comp module
---------------------------------------------
.. automodule:: GPy.kern._src.psi_comp.linear_psi_comp
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.psi_comp.rbf_psi_comp module
------------------------------------------
.. automodule:: GPy.kern._src.psi_comp.rbf_psi_comp
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.psi_comp.rbf_psi_gpucomp module
---------------------------------------------
.. automodule:: GPy.kern._src.psi_comp.rbf_psi_gpucomp
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.psi_comp.sslinear_psi_comp module
-----------------------------------------------
.. automodule:: GPy.kern._src.psi_comp.sslinear_psi_comp
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.psi_comp.ssrbf_psi_comp module
--------------------------------------------
.. automodule:: GPy.kern._src.psi_comp.ssrbf_psi_comp
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.psi_comp.ssrbf_psi_gpucomp module
-----------------------------------------------
.. automodule:: GPy.kern._src.psi_comp.ssrbf_psi_gpucomp
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.kern._src.psi_comp
:members:
:undoc-members:
:show-inheritance:

GPy.kern._src package
=====================
Subpackages
-----------
.. toctree::

   GPy.kern._src.psi_comp
Submodules
----------
GPy.kern._src.ODE_UY module
---------------------------
.. automodule:: GPy.kern._src.ODE_UY
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.ODE_UYC module
----------------------------
.. automodule:: GPy.kern._src.ODE_UYC
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.ODE_st module
---------------------------
.. automodule:: GPy.kern._src.ODE_st
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.ODE_t module
--------------------------
.. automodule:: GPy.kern._src.ODE_t
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.add module
------------------------
.. automodule:: GPy.kern._src.add
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.brownian module
-----------------------------
.. automodule:: GPy.kern._src.brownian
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.coregionalize module
----------------------------------
.. automodule:: GPy.kern._src.coregionalize
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.hierarchical module
---------------------------------
.. automodule:: GPy.kern._src.hierarchical
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.independent_outputs module
----------------------------------------
.. automodule:: GPy.kern._src.independent_outputs
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.kern module
-------------------------
.. automodule:: GPy.kern._src.kern
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.kernel_slice_operations module
--------------------------------------------
.. automodule:: GPy.kern._src.kernel_slice_operations
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.linear module
---------------------------
.. automodule:: GPy.kern._src.linear
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.mlp module
------------------------
.. automodule:: GPy.kern._src.mlp
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.periodic module
-----------------------------
.. automodule:: GPy.kern._src.periodic
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.poly module
-------------------------
.. automodule:: GPy.kern._src.poly
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.prod module
-------------------------
.. automodule:: GPy.kern._src.prod
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.rbf module
------------------------
.. automodule:: GPy.kern._src.rbf
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.splitKern module
------------------------------
.. automodule:: GPy.kern._src.splitKern
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.static module
---------------------------
.. automodule:: GPy.kern._src.static
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.stationary module
-------------------------------
.. automodule:: GPy.kern._src.stationary
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.symbolic module
-----------------------------
.. automodule:: GPy.kern._src.symbolic
:members:
:undoc-members:
:show-inheritance:
GPy.kern._src.trunclinear module
--------------------------------
.. automodule:: GPy.kern._src.trunclinear
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.kern._src
:members:
:undoc-members:
:show-inheritance:
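The add and prod modules above implement kernel composition: the sum and the elementwise (Schur) product of two positive semi-definite covariance functions are again valid covariances. A minimal numpy sketch of the idea (GPy's implementation additionally tracks gradients and parameter slices):

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    """RBF covariance: exp(-0.5 * ||x - x'||^2 / lengthscale^2)."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale ** 2)

def linear(X1, X2):
    """Linear covariance: x . x'."""
    return X1 @ X2.T

def add(k1, k2):
    """Sum kernel: K(X, X') = K1(X, X') + K2(X, X')."""
    return lambda X1, X2: k1(X1, X2) + k2(X1, X2)

def prod(k1, k2):
    """Product kernel: elementwise product of the two covariance matrices."""
    return lambda X1, X2: k1(X1, X2) * k2(X1, X2)

X = np.random.default_rng(0).normal(size=(6, 2))
for k in (add(rbf, linear), prod(rbf, linear)):
    K = k(X, X)
    # a valid covariance matrix is symmetric with nonnegative eigenvalues
    assert np.allclose(K, K.T)
    assert np.linalg.eigvalsh(K).min() > -1e-9
```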

File: doc/_build/html/_sources/GPy.kern.txt
GPy.kern package
================
Subpackages
-----------
.. toctree::

   GPy.kern._src
Module contents
---------------
.. automodule:: GPy.kern
:members:
:undoc-members:
:show-inheritance:

GPy.likelihoods package
=======================
Submodules
----------
GPy.likelihoods.bernoulli module
--------------------------------
.. automodule:: GPy.likelihoods.bernoulli
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.exponential module
----------------------------------
.. automodule:: GPy.likelihoods.exponential
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.gamma module
----------------------------
.. automodule:: GPy.likelihoods.gamma
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.gaussian module
-------------------------------
.. automodule:: GPy.likelihoods.gaussian
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.likelihood module
---------------------------------
.. automodule:: GPy.likelihoods.likelihood
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.link_functions module
-------------------------------------
.. automodule:: GPy.likelihoods.link_functions
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.mixed_noise module
----------------------------------
.. automodule:: GPy.likelihoods.mixed_noise
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.poisson module
------------------------------
.. automodule:: GPy.likelihoods.poisson
:members:
:undoc-members:
:show-inheritance:
GPy.likelihoods.student_t module
--------------------------------
.. automodule:: GPy.likelihoods.student_t
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.likelihoods
:members:
:undoc-members:
:show-inheritance:

GPy.mappings package
====================
Submodules
----------
GPy.mappings.additive module
----------------------------
.. automodule:: GPy.mappings.additive
:members:
:undoc-members:
:show-inheritance:
GPy.mappings.kernel module
--------------------------
.. automodule:: GPy.mappings.kernel
:members:
:undoc-members:
:show-inheritance:
GPy.mappings.linear module
--------------------------
.. automodule:: GPy.mappings.linear
:members:
:undoc-members:
:show-inheritance:
GPy.mappings.mlp module
-----------------------
.. automodule:: GPy.mappings.mlp
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.mappings
:members:
:undoc-members:
:show-inheritance:

File: doc/_build/html/_sources/GPy.models.txt
GPy.models package
==================
Submodules
----------
GPy.models.bayesian_gplvm module
--------------------------------
.. automodule:: GPy.models.bayesian_gplvm
:members:
:undoc-members:
:show-inheritance:
GPy.models.bayesian_gplvm_minibatch module
------------------------------------------
.. automodule:: GPy.models.bayesian_gplvm_minibatch
:members:
:undoc-members:
:show-inheritance:
GPy.models.bcgplvm module
-------------------------
.. automodule:: GPy.models.bcgplvm
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_classification module
-----------------------------------
.. automodule:: GPy.models.gp_classification
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_coregionalized_regression module
----------------------------------------------
.. automodule:: GPy.models.gp_coregionalized_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_heteroscedastic_regression module
-----------------------------------------------
.. automodule:: GPy.models.gp_heteroscedastic_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_kronecker_gaussian_regression module
--------------------------------------------------
.. automodule:: GPy.models.gp_kronecker_gaussian_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_multioutput_regression module
-------------------------------------------
.. automodule:: GPy.models.gp_multioutput_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_regression module
-------------------------------
.. automodule:: GPy.models.gp_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.gp_var_gauss module
------------------------------
.. automodule:: GPy.models.gp_var_gauss
:members:
:undoc-members:
:show-inheritance:
GPy.models.gplvm module
-----------------------
.. automodule:: GPy.models.gplvm
:members:
:undoc-members:
:show-inheritance:
GPy.models.gradient_checker module
----------------------------------
.. automodule:: GPy.models.gradient_checker
:members:
:undoc-members:
:show-inheritance:
GPy.models.mrd module
---------------------
.. automodule:: GPy.models.mrd
:members:
:undoc-members:
:show-inheritance:
GPy.models.sparse_gp_classification module
------------------------------------------
.. automodule:: GPy.models.sparse_gp_classification
:members:
:undoc-members:
:show-inheritance:
GPy.models.sparse_gp_coregionalized_regression module
-----------------------------------------------------
.. automodule:: GPy.models.sparse_gp_coregionalized_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.sparse_gp_minibatch module
-------------------------------------
.. automodule:: GPy.models.sparse_gp_minibatch
:members:
:undoc-members:
:show-inheritance:
GPy.models.sparse_gp_multioutput_regression module
--------------------------------------------------
.. automodule:: GPy.models.sparse_gp_multioutput_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.sparse_gp_regression module
--------------------------------------
.. automodule:: GPy.models.sparse_gp_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.sparse_gplvm module
------------------------------
.. automodule:: GPy.models.sparse_gplvm
:members:
:undoc-members:
:show-inheritance:
GPy.models.ss_gplvm module
--------------------------
.. automodule:: GPy.models.ss_gplvm
:members:
:undoc-members:
:show-inheritance:
GPy.models.ss_mrd module
------------------------
.. automodule:: GPy.models.ss_mrd
:members:
:undoc-members:
:show-inheritance:
GPy.models.svigp_regression module
----------------------------------
.. automodule:: GPy.models.svigp_regression
:members:
:undoc-members:
:show-inheritance:
GPy.models.warped_gp module
---------------------------
.. automodule:: GPy.models.warped_gp
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.models
:members:
:undoc-members:
:show-inheritance:

GPy.plotting.matplot_dep.latent_space_visualizations.controllers package
========================================================================
Submodules
----------
GPy.plotting.matplot_dep.latent_space_visualizations.controllers.axis_event_controller module
---------------------------------------------------------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.latent_space_visualizations.controllers.axis_event_controller
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.latent_space_visualizations.controllers.imshow_controller module
-----------------------------------------------------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.latent_space_visualizations.controllers.imshow_controller
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.plotting.matplot_dep.latent_space_visualizations.controllers
:members:
:undoc-members:
:show-inheritance:

GPy.plotting.matplot_dep.latent_space_visualizations package
============================================================
Subpackages
-----------
.. toctree::

   GPy.plotting.matplot_dep.latent_space_visualizations.controllers
Module contents
---------------
.. automodule:: GPy.plotting.matplot_dep.latent_space_visualizations
:members:
:undoc-members:
:show-inheritance:

GPy.plotting.matplot_dep package
================================
Subpackages
-----------
.. toctree::

   GPy.plotting.matplot_dep.latent_space_visualizations
Submodules
----------
GPy.plotting.matplot_dep.Tango module
-------------------------------------
.. automodule:: GPy.plotting.matplot_dep.Tango
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.base_plots module
------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.base_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.dim_reduction_plots module
---------------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.dim_reduction_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.img_plots module
-----------------------------------------
.. automodule:: GPy.plotting.matplot_dep.img_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.inference_plots module
-----------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.inference_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.kernel_plots module
--------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.kernel_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.mapping_plots module
---------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.mapping_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.maps module
------------------------------------
.. automodule:: GPy.plotting.matplot_dep.maps
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.models_plots module
--------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.models_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.netpbmfile module
------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.netpbmfile
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.priors_plots module
--------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.priors_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.ssgplvm module
---------------------------------------
.. automodule:: GPy.plotting.matplot_dep.ssgplvm
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.svig_plots module
------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.svig_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.variational_plots module
-------------------------------------------------
.. automodule:: GPy.plotting.matplot_dep.variational_plots
:members:
:undoc-members:
:show-inheritance:
GPy.plotting.matplot_dep.visualize module
-----------------------------------------
.. automodule:: GPy.plotting.matplot_dep.visualize
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.plotting.matplot_dep
:members:
:undoc-members:
:show-inheritance:

GPy.plotting package
====================
Subpackages
-----------
.. toctree::

   GPy.plotting.matplot_dep
Module contents
---------------
.. automodule:: GPy.plotting
:members:
:undoc-members:
:show-inheritance:

File: doc/_build/html/_sources/GPy.testing.txt
GPy.testing package
===================
Submodules
----------
GPy.testing.examples_tests module
---------------------------------
.. automodule:: GPy.testing.examples_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.fitc module
-----------------------
.. automodule:: GPy.testing.fitc
:members:
:undoc-members:
:show-inheritance:
GPy.testing.index_operations_tests module
-----------------------------------------
.. automodule:: GPy.testing.index_operations_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.inference_tests module
----------------------------------
.. automodule:: GPy.testing.inference_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.kernel_tests module
-------------------------------
.. automodule:: GPy.testing.kernel_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.likelihood_tests module
-----------------------------------
.. automodule:: GPy.testing.likelihood_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.model_tests module
------------------------------
.. automodule:: GPy.testing.model_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.observable_tests module
-----------------------------------
.. automodule:: GPy.testing.observable_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.parameterized_tests module
--------------------------------------
.. automodule:: GPy.testing.parameterized_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.pickle_tests module
-------------------------------
.. automodule:: GPy.testing.pickle_tests
:members:
:undoc-members:
:show-inheritance:
GPy.testing.prior_tests module
------------------------------
.. automodule:: GPy.testing.prior_tests
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.testing
:members:
:undoc-members:
:show-inheritance:

File: doc/_build/html/_sources/GPy.txt
GPy package
===========
Subpackages
-----------
.. toctree::

   GPy.core
   GPy.examples
   GPy.inference
   GPy.kern
   GPy.likelihoods
   GPy.mappings
   GPy.models
   GPy.plotting
   GPy.testing
   GPy.util
Module contents
---------------
.. automodule:: GPy
:members:
:undoc-members:
:show-inheritance:

File: doc/_build/html/_sources/GPy.util.txt
GPy.util package
================
Submodules
----------
GPy.util.block_matrices module
------------------------------
.. automodule:: GPy.util.block_matrices
:members:
:undoc-members:
:show-inheritance:
GPy.util.caching module
-----------------------
.. automodule:: GPy.util.caching
:members:
:undoc-members:
:show-inheritance:
GPy.util.classification module
------------------------------
.. automodule:: GPy.util.classification
:members:
:undoc-members:
:show-inheritance:
GPy.util.config module
----------------------
.. automodule:: GPy.util.config
:members:
:undoc-members:
:show-inheritance:
GPy.util.datasets module
------------------------
.. automodule:: GPy.util.datasets
:members:
:undoc-members:
:show-inheritance:
GPy.util.debug module
---------------------
.. automodule:: GPy.util.debug
:members:
:undoc-members:
:show-inheritance:
GPy.util.decorators module
--------------------------
.. automodule:: GPy.util.decorators
:members:
:undoc-members:
:show-inheritance:
GPy.util.diag module
--------------------
.. automodule:: GPy.util.diag
:members:
:undoc-members:
:show-inheritance:
GPy.util.erfcx module
---------------------
.. automodule:: GPy.util.erfcx
:members:
:undoc-members:
:show-inheritance:
GPy.util.functions module
-------------------------
.. automodule:: GPy.util.functions
:members:
:undoc-members:
:show-inheritance:
GPy.util.gpu_init module
------------------------
.. automodule:: GPy.util.gpu_init
:members:
:undoc-members:
:show-inheritance:
GPy.util.initialization module
------------------------------
.. automodule:: GPy.util.initialization
:members:
:undoc-members:
:show-inheritance:
GPy.util.linalg module
----------------------
.. automodule:: GPy.util.linalg
:members:
:undoc-members:
:show-inheritance:
GPy.util.linalg_gpu module
--------------------------
.. automodule:: GPy.util.linalg_gpu
:members:
:undoc-members:
:show-inheritance:
GPy.util.ln_diff_erfs module
----------------------------
.. automodule:: GPy.util.ln_diff_erfs
:members:
:undoc-members:
:show-inheritance:
GPy.util.misc module
--------------------
.. automodule:: GPy.util.misc
:members:
:undoc-members:
:show-inheritance:
GPy.util.mocap module
---------------------
.. automodule:: GPy.util.mocap
:members:
:undoc-members:
:show-inheritance:
GPy.util.mpi module
-------------------
.. automodule:: GPy.util.mpi
:members:
:undoc-members:
:show-inheritance:
GPy.util.multioutput module
---------------------------
.. automodule:: GPy.util.multioutput
:members:
:undoc-members:
:show-inheritance:
GPy.util.netpbmfile module
--------------------------
.. automodule:: GPy.util.netpbmfile
:members:
:undoc-members:
:show-inheritance:
GPy.util.normalizer module
--------------------------
.. automodule:: GPy.util.normalizer
:members:
:undoc-members:
:show-inheritance:
GPy.util.parallel module
------------------------
.. automodule:: GPy.util.parallel
:members:
:undoc-members:
:show-inheritance:
GPy.util.pca module
-------------------
.. automodule:: GPy.util.pca
:members:
:undoc-members:
:show-inheritance:
GPy.util.squashers module
-------------------------
.. automodule:: GPy.util.squashers
:members:
:undoc-members:
:show-inheritance:
GPy.util.subarray_and_sorting module
------------------------------------
.. automodule:: GPy.util.subarray_and_sorting
:members:
:undoc-members:
:show-inheritance:
GPy.util.univariate_Gaussian module
-----------------------------------
.. automodule:: GPy.util.univariate_Gaussian
:members:
:undoc-members:
:show-inheritance:
GPy.util.warping_functions module
---------------------------------
.. automodule:: GPy.util.warping_functions
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: GPy.util
:members:
:undoc-members:
:show-inheritance:
.. GPy documentation master file, created by
sphinx-quickstart on Fri Jan 18 17:36:01 2013.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to GPy's documentation!
===============================
`GPy <http://sheffieldml.github.io/GPy/>`_ is a Gaussian Process (GP) framework written in Python, from the Sheffield machine learning group.
The `GPy homepage <http://sheffieldml.github.io/GPy/>`_ contains tutorials for users and further information on the project, including installation instructions.
This documentation is mostly aimed at developers interacting closely with the code-base.
The code can be found on our `Github project page <https://github.com/SheffieldML/GPy>`_. It is open source and provided under the BSD license.
.. * `Basic Gaussian process regression <tuto_GP_regression.html>`_
.. * `Interacting with models <tuto_interacting_with_models.html>`_
.. * `A kernel overview <tuto_kernel_overview.html>`_
.. * `Writing new kernels <tuto_creating_new_kernels.html>`_
.. * `Writing new models <tuto_creating_new_models.html>`_
.. * `Parameterization handles <tuto_parameterized.html>`_
.. You may also be interested by some examples in the GPy/examples folder.
Contents:
.. toctree::
:maxdepth: 2
GPy
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
==============
Installation
==============
Linux
============
Windows
======================
One easy way to get a Python distribution with the required packages is to use the Anaconda environment from Continuum Analytics.
* Download and install the free version of Anaconda according to your operating system from `their website <https://store.continuum.io>`_.
* Open a (new) terminal window:
* Navigate to Applications/Accessories/cmd, or
* open the *Anaconda Command Prompt* from the Windows *Start* menu
You should now be able to launch a Python interpreter by typing *ipython* in the terminal. In the ipython prompt, you can check your installation by importing the libraries we will need later:
::
import numpy
import pylab
To install the latest version of GPy, *git* is required. A *git* client for Windows can be found `here <http://git-scm.com/download/win>`_. It is recommended to install with the option "*Use Git from the Windows Command Prompt*". Then GPy can be installed with the following command
::
pip install git+https://github.com/SheffieldML/GPy.git@devel
MacOSX
===================================
***************************
List of implemented kernels
***************************
The following table lists the kernels implemented in GPy and details which functions are implemented for each kernel.
==================== =========== ===== =========== ====== ======= =========== =============== ======= =========== ====== ====== =======
NAME Dimension ARD get/set K Kdiag dK_dtheta dKdiag_dtheta dK_dX dKdiag_dX psi0 psi1 psi2
==================== =========== ===== =========== ====== ======= =========== =============== ======= =========== ====== ====== =======
bias n |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
Brownian 1 |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
exponential n yes |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
finite_dimensional n |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
linear n yes |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
Matern32 n yes |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
Matern52 n yes |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
periodic_exponential 1 |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
periodic_Matern32 1 |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
periodic_Matern52 1 |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
rational quadratic 1 |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
rbf n yes |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
spline 1 |tick| |tick| |tick| |tick| |tick| |tick|
-------------------- ----------- ----- ----------- ------ ------- ----------- --------------- ------- ----------- ------ ------ -------
white n |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick| |tick|
==================== =========== ===== =========== ====== ======= =========== =============== ======= =========== ====== ====== =======
Depending on the use case, not all functions are required:
* ``get/set, K, Kdiag``: compulsory
* ``dK_dtheta``: necessary to optimize the model
* ``dKdiag_dtheta``: sparse models, BGPLVM, GPs with uncertain inputs
* ``dK_dX``: sparse models, GPLVM, BGPLVM, GPs with uncertain inputs
* ``dKdiag_dX``: sparse models, BGPLVM, GPs with uncertain inputs
* ``psi0, psi1, psi2``: BGPLVM, GPs with uncertain inputs
.. |tick| image:: Figures/tick.png
GPy
===
.. toctree::
:maxdepth: 4
GPy
*************************************
Gaussian process regression tutorial
*************************************
In this tutorial we will cover the basics of building 1-dimensional and 2-dimensional Gaussian process regression models, also known as kriging models. The code shown in this tutorial can be found in GPy/examples/tutorials.py, or run via ``GPy.examples.tutorials.tuto_GP_regression()``.
We first import the libraries we will need: ::
import pylab as pb
pb.ion()
import numpy as np
import GPy
1-dimensional model
===================
For this toy example, we assume we have the following inputs and outputs::
X = np.random.uniform(-3.,3.,(20,1))
Y = np.sin(X) + np.random.randn(20,1)*0.05
Note that the observations Y include some noise.
The first step is to define the covariance kernel we want to use for the model. Here we choose a Gaussian kernel (also known as RBF or squared exponential)::
kernel = GPy.kern.RBF(input_dim=1, variance=1., lengthscale=1.)
The parameter ``input_dim`` stands for the dimension of the input space. The parameters ``variance`` and ``lengthscale`` are optional. Many other kernels are implemented such as:
* linear (:py:class:`~GPy.kern.Linear`)
* exponential kernel (:py:class:`GPy.kern.Exponential`)
* Matern 3/2 (:py:class:`GPy.kern.Matern32`)
* Matern 5/2 (:py:class:`GPy.kern.Matern52`)
* spline (:py:class:`GPy.kern.Spline`)
* and many others...
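To make the kernel parameters concrete, here is what the RBF covariance computes, sketched with plain numpy (GPy's own implementation is more general and numerically careful; the helper name ``rbf_cov`` is ours, not GPy's):

```python
import numpy as np

def rbf_cov(X, X2=None, variance=1.0, lengthscale=1.0):
    """k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    if X2 is None:
        X2 = X
    # pairwise squared Euclidean distances between rows of X and X2
    d2 = np.sum((X[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

X = np.array([[0.0], [1.0], [2.0]])
K = rbf_cov(X, variance=2.0, lengthscale=1.0)
# K is symmetric positive definite, with `variance` on the diagonal
```

The ``lengthscale`` controls how quickly the covariance decays with distance; the ``variance`` sets the overall scale of the function values.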
The inputs required for building the model are the observations and the kernel::
m = GPy.models.GPRegression(X,Y,kernel)
By default, some observation noise is added to the model. The functions ``print`` and ``plot`` give an insight into the model we have just built. The code::
print m
m.plot()
gives the following output: ::
Name : GP regression
Log-likelihood : -22.8178418808
Number of Parameters : 3
Parameters:
GP_regression. | Value | Constraint | Prior | Tied to
rbf.variance | 1.0 | +ve | |
rbf.lengthscale | 1.0 | +ve | |
Gaussian_noise.variance | 1.0 | +ve | |
.. figure:: Figures/tuto_GP_regression_m1.png
:align: center
:height: 350px
GP regression model before optimization of the parameters. The shaded region corresponds to ~95% confidence intervals (i.e. +/- 2 standard deviations).
The default values of the kernel parameters may not be relevant for
the current data (for example, the confidence intervals seem too wide
in the previous figure). A common approach is to find the values of
the parameters that maximize the likelihood of the data. In GPy this
is as easy as calling ``m.optimize``::
m.optimize()
If we want to perform several restarts to try to improve the result of the optimization, we can use the ``optimize_restarts`` function::
m.optimize_restarts(num_restarts = 10)
Once again, we can use ``print(m)`` and ``m.plot()`` to look at the resulting model::
Name : GP regression
Log-likelihood : 11.947469082
Number of Parameters : 3
Parameters:
GP_regression. | Value | Constraint | Prior | Tied to
rbf.variance | 0.74229417323 | +ve | |
rbf.lengthscale | 1.43020495724 | +ve | |
Gaussian_noise.variance | 0.00325654460991 | +ve | |
.. figure:: Figures/tuto_GP_regression_m2.png
:align: center
:height: 350px
GP regression model after optimization of the parameters.
2-dimensional example
=====================
Here is a 2 dimensional example::
import pylab as pb
pb.ion()
import numpy as np
import GPy
# sample inputs and outputs
X = np.random.uniform(-3.,3.,(50,2))
Y = np.sin(X[:,0:1]) * np.sin(X[:,1:2])+np.random.randn(50,1)*0.05
# define kernel
ker = GPy.kern.Matern52(2,ARD=True) + GPy.kern.White(2)
# create simple GP model
m = GPy.models.GPRegression(X,Y,ker)
# optimize and plot
m.optimize(max_f_eval = 1000)
m.plot()
print(m)
The flag ``ARD=True`` in the definition of the Matern kernel specifies that we want one lengthscale parameter per dimension (i.e. the GP is not isotropic). The output of the last two lines is::
Name : GP regression
Log-likelihood : 26.787156248
Number of Parameters : 5
Parameters:
GP_regression. | Value | Constraint | Prior | Tied to
add.Mat52.variance | 0.385463739076 | +ve | |
add.Mat52.lengthscale | (2,) | +ve | |
add.white.variance | 0.000835329608514 | +ve | |
Gaussian_noise.variance | 0.000835329608514 | +ve | |
If you want to see the ``ARD`` parameters explicitly print them
directly::
>>> print m.add.Mat52.lengthscale
Index | GP_regression.add.Mat52.lengthscale | Constraint | Prior | Tied to
[0] | 1.9575587 | +ve | | N/A
[1] | 1.9689948 | +ve | | N/A
.. figure:: Figures/tuto_GP_regression_m3.png
:align: center
:height: 350px
Contour plot of the best predictor (posterior mean).
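The "best predictor" in the caption is the GP posterior mean :math:`k(x_*, X)(K + \sigma^2 I)^{-1} y`. As a self-contained numpy sketch of that formula (not using GPy; an RBF kernel and a small jitter are assumed here):

```python
import numpy as np

def rbf(X, X2, variance=1.0, lengthscale=1.0):
    d2 = np.sum((X[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

rng = np.random.RandomState(0)
X = rng.uniform(-3., 3., (50, 2))
Y = np.sin(X[:, 0:1]) * np.sin(X[:, 1:2])    # noise-free for this sketch

noise = 1e-6                                  # small jitter / noise variance
K = rbf(X, X) + noise * np.eye(len(X))
alpha = np.linalg.solve(K, Y)                 # (K + noise*I)^{-1} y

Xstar = X[:1]                                 # predict back at a training input
mean = rbf(Xstar, X).dot(alpha)
# with a tiny jitter, the posterior mean nearly interpolates the training data
```

GPy performs this computation internally (with the optimized kernel and noise parameters) whenever you call ``m.plot()`` or ``m.predict()``.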
********************
Creating new kernels
********************
We will see in this tutorial how to create new kernels in GPy. We will also give details on how to implement each function of the kernel and illustrate with a running example: the rational quadratic kernel.
Structure of a kernel in GPy
============================
In GPy a kernel object is made of a list of kernpart objects, which correspond to symmetric positive definite functions. More precisely, the kernel should be understood as the sum of its kernparts. In order to implement a new covariance, the following steps must be followed:
1. implement the new covariance as a kernpart object
2. update the constructors so that the kernpart can be used as a kern object
3. update the __init__.py file
These three steps are detailed below.
Implementing a kernpart object
==============================
We advise the reader to start by copy-pasting an existing kernel and modifying the new file. We will now give a description of the various functions that can be found in a kernpart object.
**Header**
The header is similar to all kernels: ::
from kernpart import kernpart
import numpy as np
class rational_quadratic(kernpart):
**__init__(self,input_dim, param1, param2, ...)**
The implementation of this function is mandatory.
For all kernparts the first parameter ``input_dim`` corresponds to the dimension of the input space, and the following parameters stand for the parameterization of the kernel.
You have to call ``super(<class_name>, self).__init__(input_dim,
name)`` to make sure the input dimension and name of the kernel are
stored in the right place. These attributes are available as
``self.input_dim`` and ``self.name`` at runtime.
.. The following attributes are compulsory: ``self.input_dim`` (the dimension, integer), ``self.name`` (name of the kernel, string), ``self.num_params`` (number of parameters, integer). ::
Parameterization is done by adding :py:class:`GPy.core.parameter.Param` objects to ``self`` and using them as normal numpy array-likes in your code. The parameters have to be added by calling :py:func:`GPy.core.parameterized.Parameterized.add_parameters` with the :py:class:`GPy.core.parameter.Param` objects as arguments. ::
def __init__(self, input_dim, variance=1., lengthscale=1., power=1.):
    super(rational_quadratic, self).__init__(input_dim, 'rat_quad')
    assert input_dim == 1, "For this kernel we assume input_dim=1"
    self.variance = Param('variance', variance)
    self.lengthscale = Param('lengthscale', lengthscale)
    self.power = Param('power', power)
    self.add_parameters(self.variance, self.lengthscale, self.power)
From now on you can use the parameters ``self.variance``, ``self.lengthscale`` and ``self.power`` as normal numpy array-likes in your code. Updates from the optimization routine will be applied automatically.
**parameters_changed(self)**
The implementation of this function is optional.
This function serves as a callback for each optimization iteration: whenever an optimization step was successful and the parameters (added via :py:func:`GPy.core.parameterized.Parameterized.add_parameters`) have changed, this callback is invoked so that the kernel can update any precomputations. ::
def parameters_changed(self):
    pass  # nothing to do here
.. **_get_params(self)**
.. The implementation of this function in mandatory.
.. This function returns a one dimensional array of length ``self.num_params`` containing the value of the parameters. ::
.. def _get_params(self):
.. return np.hstack((self.variance,self.lengthscale,self.power))
.. **_set_params(self,x)**
.. The implementation of this function in mandatory.
.. The input is a one dimensional array of length ``self.num_params`` containing the value of the parameters. The function has no output but it updates the values of the attribute associated to the parameters (such as ``self.variance``, ``self.lengthscale``, ...). ::
.. def _set_params(self,x):
.. self.variance = x[0]
.. self.lengthscale = x[1]
.. self.power = x[2]
.. **_get_param_names(self)**
.. The implementation of this function in mandatory.
.. It returns a list of strings of length ``self.num_params`` corresponding to the parameter names. ::
.. def _get_param_names(self):
.. return ['variance','lengthscale','power']
**K(self,X,X2,target)**
The implementation of this function is mandatory.
This function computes the covariance matrix associated with the inputs X, X2 (np.arrays with an arbitrary number of rows, say :math:`n_1` and :math:`n_2`, and ``self.input_dim`` columns). It does not return anything; instead, it adds the :math:`n_1 \times n_2` covariance matrix of the kernpart to the object ``target`` (a :math:`n_1 \times n_2` np.array). This trick makes it possible to compute the covariance matrix of a kernel containing many kernparts with limited memory use. ::
def K(self, X, X2, target):
    if X2 is None: X2 = X
    dist2 = np.square((X - X2.T) / self.lengthscale)
    target += self.variance * (1 + dist2/2.) ** (-self.power)
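The accumulation into ``target`` can be illustrated with plain numpy (these functions are hypothetical stand-ins for kernparts, not GPy's actual classes):

```python
import numpy as np

# two hypothetical kernparts, each adding its covariance into a shared buffer
def bias_K(X, X2, target, variance=0.5):
    target += variance                           # constant covariance everywhere

def white_K(X, X2, target, variance=0.1):
    if X is X2:
        target += variance * np.eye(X.shape[0])  # noise on the diagonal only

X = np.zeros((4, 1))
target = np.zeros((4, 4))        # one buffer for the whole (additive) kernel
for K_part in (bias_K, white_K):
    K_part(X, X, target)         # the kern object sums the parts in place
```

Because every part writes into the same array, no per-part temporary covariance matrices are allocated.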
**Kdiag(self,X,target)**
The implementation of this function is mandatory.
This function is similar to ``K`` but computes only the values of the kernel on the diagonal. Thus, ``target`` is a 1-dimensional np.array of length :math:`n_1`. ::
def Kdiag(self, X, target):
    target += self.variance
**dK_dtheta(self,dL_dK,X,X2,target)**
This function is required for the optimization of the parameters.
Computes the derivative of the likelihood with respect to the parameters. As previously, the values are added to ``target``, which is a 1-dimensional np.array of length ``self.num_params``. For example, if the kernel is parameterized by :math:`\sigma^2,\ \theta`, then :math:`\frac{dL}{d\sigma^2} = \frac{dL}{d K} \frac{dK}{d\sigma^2}` is added to the first element of ``target`` and :math:`\frac{dL}{d\theta} = \frac{dL}{d K} \frac{dK}{d\theta}` to the second. ::
def dK_dtheta(self, dL_dK, X, X2, target):
    if X2 is None: X2 = X
    dist2 = np.square((X - X2.T) / self.lengthscale)
    dvar = (1 + dist2/2.) ** (-self.power)
    dl = self.power * self.variance * dist2 / self.lengthscale * (1 + dist2/2.) ** (-self.power - 1)
    dp = -self.variance * np.log(1 + dist2/2.) * (1 + dist2/2.) ** (-self.power)
    target[0] += np.sum(dvar * dL_dK)
    target[1] += np.sum(dl * dL_dK)
    target[2] += np.sum(dp * dL_dK)
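When implementing such gradients, a finite-difference check is invaluable. Here is a standalone numpy sketch for the variance gradient of the rational quadratic above (re-implemented scalar-wise outside GPy; GPy's ``Model.checkgrad`` automates this idea):

```python
import numpy as np

def rq(x, x2, variance, lengthscale, power):
    d2 = ((x - x2) / lengthscale) ** 2
    return variance * (1 + d2 / 2.) ** (-power)

x, x2 = 0.3, 1.1
var, ell, p = 2.0, 0.7, 1.5

# analytic dK/dvariance, from K = variance * (1 + d2/2)^(-power)
d2 = ((x - x2) / ell) ** 2
analytic = (1 + d2 / 2.) ** (-p)

# central finite difference in the variance parameter
eps = 1e-6
numeric = (rq(x, x2, var + eps, ell, p) - rq(x, x2, var - eps, ell, p)) / (2 * eps)
```

If the two quantities disagree beyond finite-difference accuracy, the analytic gradient is wrong.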
**dKdiag_dtheta(self,dL_dKdiag,X,target)**
This function is required for BGPLVM, sparse models and uncertain inputs.
As previously, ``target`` is an array of length ``self.num_params`` and :math:`\frac{dL}{d Kdiag} \frac{dKdiag}{dparam}` is added to each element. ::
def dKdiag_dtheta(self, dL_dKdiag, X, target):
    target[0] += np.sum(dL_dKdiag)
    # self.lengthscale and self.power have no influence on Kdiag, so target[1:] is unchanged
**dK_dX(self,dL_dK,X,X2,target)**
This function is required for GPLVM, BGPLVM, sparse models and uncertain inputs.
Computes the derivative of the likelihood with respect to the inputs ``X`` (a :math:`n \times d` np.array). The result is added to ``target``, which is also a :math:`n \times d` np.array. ::
def dK_dX(self, dL_dK, X, X2, target):
    """Derivative of the covariance matrix with respect to X."""
    if X2 is None: X2 = X
    dist2 = np.square((X - X2.T) / self.lengthscale)
    dX = -self.variance * self.power * (X - X2.T) / self.lengthscale**2 * (1 + dist2/2.) ** (-self.power - 1)
    target += np.sum(dL_dK * dX, 1)[:, np.newaxis]
**dKdiag_dX(self,dL_dKdiag,X,target)**
This function is required for BGPLVM, sparse models and uncertain inputs. As for ``dKdiag_dtheta``, :math:`\frac{dL}{d Kdiag} \frac{dKdiag}{dX}` is added to each element of target. ::
def dKdiag_dX(self, dL_dKdiag, X, target):
    pass
**Psi statistics**
The psi statistics and their derivatives are required for BGPLVM and GPs with uncertain inputs.
The expressions of the psi statistics are:
TODO
For the rational quadratic we have:
TODO
Update the constructor
======================
Once the required functions have been implemented in a kernpart object, the file GPy/kern/constructors.py has to be updated so that a kernel can be built from the kernpart object.
The following line should be added in the preamble of the file::
from rational_quadratic import rational_quadratic as rational_quadratic_part
as well as the following block ::
def rational_quadratic(input_dim, variance=1., lengthscale=1., power=1.):
    part = rational_quadratic_part(input_dim, variance, lengthscale, power)
    return kern(input_dim, [part])
Update initialization
=====================
The last step is to update the list of kernels imported from constructor in GPy/kern/__init__.py.
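Concretely, this is a single import line; a sketch, assuming the constructor defined above (the exact contents of ``GPy/kern/__init__.py`` may differ between versions):

```python
# hypothetical excerpt from GPy/kern/__init__.py
from constructors import rational_quadratic
```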
.. _creating_new_models:
*******************
Creating new Models
*******************
In GPy all models inherit from the base class :py:class:`~GPy.core.parameterized.Parameterized`. :py:class:`~GPy.core.parameterized.Parameterized` is a class which allows for parameterization of objects. All it holds is functionality for tying, bounding and fixing of parameters. It also provides the functionality of searching and manipulating parameters by regular expression syntax. See :py:class:`~GPy.core.parameterized.Parameterized` for more information.
The :py:class:`~GPy.core.model.Model` class provides parameter introspection, objective function and optimization.
In order to fully use all functionality of
:py:class:`~GPy.core.model.Model`, some methods need to be implemented
or overridden. The model also needs to be told its parameters, so
that it can provide optimized parameter distribution and handling.
In order to explain the functionality of those methods
we will use a wrapper around the ``scipy.optimize.rosen`` function, which holds
input parameters :math:`\mathbf{X}`, where
:math:`\mathbf{X}\in\mathbb{R}^{N\times 1}`.
Obligatory methods
==================
:py:func:`~GPy.core.model.Model.__init__` :
Initialize the model with the given parameters. These need to
be added to the model by calling
`self.add_parameter(<param>)`, where param needs to be a
parameter handle (See parameterized_ for details).::
self.X = GPy.Param("input", X)
self.add_parameter(self.X)
:py:meth:`~GPy.core.model.Model.log_likelihood` :
Returns the log-likelihood of the new model. For our example
this is just the call to ``rosen`` and as we want to minimize
it, we need to negate the objective.::
return -scipy.optimize.rosen(self.X)
:py:meth:`~GPy.core.model.Model.parameters_changed` :
Updates the internal state of the model and sets the gradient of
each parameter handle in the hierarchy with respect to the
log_likelihood. Thus here we need to set the negative derivative of
the rosenbrock function for the parameters. In this case it is the
gradient for self.X.::
self.X.gradient = -scipy.optimize.rosen_der(self.X)
Here is the full code for the ``Rosen`` class::
from GPy import Model, Param
import scipy.optimize
class Rosen(Model):
    def __init__(self, X, name='rosenbrock'):
        super(Rosen, self).__init__(name=name)
        self.X = Param("input", X)
        self.add_parameter(self.X)
    def log_likelihood(self):
        return -scipy.optimize.rosen(self.X)
    def parameters_changed(self):
        self.X.gradient = -scipy.optimize.rosen_der(self.X)
In order to test the newly created model, we can check the gradients
and optimize a standard rosenbrock run::
>>> m = Rosen(np.array([-1,-1]))
>>> print m
Name : rosenbrock
Log-likelihood : -404.0
Number of Parameters : 2
Parameters:
rosenbrock. | Value | Constraint | Prior | Tied to
input | (2,) | | |
>>> m.checkgrad(verbose=True)
Name | Ratio | Difference | Analytical | Numerical
------------------------------------------------------------------------------------------
rosenbrock.input[[0]] | 1.000000 | 0.000000 | -804.000000 | -804.000000
rosenbrock.input[[1]] | 1.000000 | 0.000000 | -400.000000 | -400.000000
>>> m.optimize()
>>> print m
Name : rosenbrock
Log-likelihood : -6.52150088871e-15
Number of Parameters : 2
Parameters:
rosenbrock. | Value | Constraint | Prior | Tied to
input | (2,) | | |
>>> print m.input
Index | rosenbrock.input | Constraint | Prior | Tied to
[0] | 0.99999994 | | | N/A
[1] | 0.99999987 | | | N/A
>>> print m.gradient
[ -1.91169809e-06, 1.01852309e-06]
This is the optimum of the 2D Rosenbrock function, as expected, and
the gradients of the inputs are almost zero.
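For reference, the objective being wrapped is simple enough to restate directly (a numpy re-statement used here only to check the numbers in the transcript above; GPy is not involved):

```python
import numpy as np

def rosen(x):
    """Rosenbrock function: sum_i 100*(x[i+1] - x[i]^2)^2 + (1 - x[i])^2."""
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

start = rosen([-1.0, -1.0])   # matches the initial log-likelihood of -404 above
best = rosen([1.0, 1.0])      # the global minimum is 0, at (1, 1)
```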
Optional methods
================
Currently none.
.. _interacting_with_models:
*************************************
Interacting with models
*************************************
The GPy model class has a set of features which are
designed to make it simple to explore the parameter
space of the model. By default, the scipy optimisers
are used to fit GPy models (via model.optimize()),
for which we provide mechanisms for 'free' optimisation:
GPy can ensure that naturally positive parameters
(such as variances) remain positive. But these mechanisms
are much more powerful than simple reparameterisation,
as we shall see.
Throughout this tutorial we'll use a sparse GP regression model
as an example. This example can be found in ``GPy.examples.regression``.
All of the examples included in GPy return an instance
of a model class, and therefore they can be called in
the following way: ::
import numpy as np
import pylab as pb
pb.ion()
import GPy
m = GPy.examples.regression.sparse_GP_regression_1D()
Examining the model using print
===============================
To see the current state of the model parameters,
and the model's (marginal) likelihood just print the model ::
print m
The first thing displayed on the screen is the log-likelihood
value of the model with its current parameters. Below the
log-likelihood, a table with all the model's parameters
is shown. For each parameter, the table contains the name
of the parameter, its current value, and, where defined, the
associated constraints, ties and prior distributions. ::
Name : sparse gp
Log-likelihood : 588.947189413
Number of Parameters : 8
Parameters:
sparse_gp. | Value | Constraint | Prior | Tied to
inducing inputs | (5, 1) | | |
rbf.variance | 1.91644016819 | +ve | |
rbf.lengthscale | 2.62103621347 | +ve | |
Gaussian_noise.variance | 0.00269870373421 | +ve | |
In this case the kernel parameters (``rbf.variance``,
``rbf.lengthscale``) as well as
the likelihood noise parameter (``Gaussian_noise.variance``), are constrained
to be positive, while the inducing inputs have no
constraints associated. Also there are no ties or prior defined.
You can also print all subparts of the model, by printing the
subcomponents individually::
print m.rbf
This will print the details of this particular parameter handle::
rbf. | Value | Constraint | Prior | Tied to
variance | 1.91644016819 | +ve | |
lengthscale | 2.62103621347 | +ve | |
When you want to get a closer look into
multivalue parameters, print them directly::
print m.inducing_inputs
Index | sparse_gp.inducing_inputs | Constraint | Prior | Tied to
[0 0] | 2.7189499 | | | N/A
[1 0] | 0.02006533 | | | N/A
[2 0] | -1.5299386 | | | N/A
[3 0] | -2.7001675 | | | N/A
[4 0] | 1.4654162 | | | N/A
Interacting with Parameters
===========================
The preferred way of interacting with parameters is to act on the
parameter handle itself.
Interacting with parameter handles is simple. The names printed by ``print m``
are accessible interactively and programmatically. For example, try to
set the kernel's (``rbf``) ``lengthscale`` to ``0.2`` and print the result::
m.rbf.lengthscale = .2
print m
You should see this::
Name : sparse gp
Log-likelihood : 588.947189413
Number of Parameters : 8
Parameters:
sparse_gp. | Value | Constraint | Prior | Tied to
inducing inputs | (5, 1) | | |
rbf.variance | 1.91644016819 | +ve | |
rbf.lengthscale | 0.2 | +ve | |
Gaussian_noise.variance | 0.00269870373421 | +ve | |
This will already have updated the model's inner state, so you can
plot it or see the changes in the posterior `m.posterior` of the model.
Regular expressions
-------------------
The model's parameters can also be accessed through regular
expressions, by 'indexing' the model with a regular expression,
matching the parameter name. Through indexing by regular expression,
you can only retrieve leaves of the hierarchy, and you can retrieve the
values matched by calling `values()` on the returned object::
>>> print m['.*var']
Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to
[0] | 2.1500132 | | | N/A
----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------
[0] | 0.0024268215 | | | N/A
>>> print m['.*var'].values()
[ 2.1500132 0.00242682]
>>> print m['rbf']
Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to
[0] | 2.1500132 | | | N/A
----- | sparse_gp.rbf.lengthscale | ---------- | ---------- | -------
[0] | 2.6782803 | | | N/A
Parameters can also be set by regular expression. Here are a few examples::
>>> m['.*var'] = .1
>>> print m['.*var']
Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to
[0] | 0.1 | | | N/A
----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------
[0] | 0.1 | | | N/A
>>> m['.*var'] = [.1, .2]
>>> print m['.*var']
Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to
[0] | 0.1 | | | N/A
----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------
[0] | 0.2 | | | N/A
Since only leaf nodes can be accessed this way, we can print all
parameters in a flattened view by matching every object with the
empty regular expression::
>>> print m['']
Index | sparse_gp.inducing_inputs | Constraint | Prior | Tied to
[0 0] | -2.6716041 | | | N/A
[1 0] | -1.4665111 | | | N/A
[2 0] | -0.031010293 | | | N/A
[3 0] | 1.4563711 | | | N/A
[4 0] | 2.6803046 | | | N/A
----- | sparse_gp.rbf.variance | ---------- | ---------- | -------
[0] | 0.1 | | | N/A
----- | sparse_gp.rbf.lengthscale | ---------- | ---------- | -------
[0] | 2.6782803 | | | N/A
----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------
[0] | 0.2 | | | N/A
Setting and fetching parameters with ``parameter_array``
--------------------------------------------------------
Another way to interact with the model's parameters is through the
``parameter_array``. The parameter array holds all the parameters of the
model in one place and is editable. It can be accessed by indexing
the model; for example, you can set all the parameters through this
mechanism::
>>> new_params = np.r_[[-4,-2,0,2,4], [.5,2], [.3]]
>>> new_params
array([-4. , -2. ,  0. ,  2. ,  4. ,  0.5,  2. ,  0.3])
>>> m[:] = new_params
>>> print m
Name : sparse gp
Log-likelihood : -147.561160209
Number of Parameters : 8
Parameters:
sparse_gp. | Value | Constraint | Prior | Tied to
inducing inputs | (5, 1) | | |
rbf.variance | 0.5 | +sq | |
rbf.lengthscale | 2.0 | +ve | |
Gaussian_noise.variance | 0.3 | +sq | |
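A minimal numpy sketch of the idea behind the parameter array: each parameter handle is a view into one flat buffer, so writing to the buffer updates every handle at once (GPy's real implementation is far richer than this):

```python
import numpy as np

flat = np.zeros(8)                  # the "parameter_array"
inducing = flat[:5].reshape(5, 1)   # views share memory with `flat`
rbf_params = flat[5:7]              # [variance, lengthscale]
noise = flat[7:]

flat[:] = np.r_[[-4, -2, 0, 2, 4], [.5, 2], [.3]]
# the views now reflect the new values without any copying
```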
Parameters themselves (leaves of the hierarchy) can be indexed and used
the same way as numpy arrays. First let us set a slice of the
``inducing_inputs``::
>>> m.inducing_inputs[2:, 0] = [1,3,5]
>>> print m.inducing_inputs
Index | sparse_gp.inducing_inputs | Constraint | Prior | Tied to
[0 0] | -4 | | | N/A
[1 0] | -2 | | | N/A
[2 0] | 1 | | | N/A
[3 0] | 3 | | | N/A
[4 0] | 5 | | | N/A
Or you can use the parameters like normal numpy arrays in calculations::
>>> precision = 1./m.Gaussian_noise.variance
>>> precision
array([ 3.33333333])
Getting the model's log likelihood
=============================================
Apart from printing the model, the marginal log-likelihood can be
obtained using the function ``log_likelihood()``::
>>> m.log_likelihood()
array([-152.83377316])
If you want the log likelihood as a plain float, wrap it in
``float()``::
>>> float(m.log_likelihood())
-152.83377316356177
Getting the model parameters' gradients
=======================================
The gradients of a model can shed light on understanding the
(possibly hard) optimization process. The gradients of each parameter
handle can be accessed through their `gradient` field.::
>>> print m.gradient
[ 5.51170031 9.71735112 -4.20282106 -3.45667035 -1.58828165
-2.11549358 12.40292787 -627.75467803]
>>> print m.rbf.gradient
[ -2.11549358 12.40292787]
>>> m.optimize()
>>> print m.gradient
[ -5.98046560e-04 -3.64576085e-04 1.98005930e-04 3.43381219e-04
-6.85685104e-04 -1.28800748e-05 1.08552429e-03 2.74058081e-01]
Adjusting the model's constraints
=================================
When we initially called the example, the model had already been
optimized and hence the log-likelihood gradients were close to zero.
However, since we have been changing the parameters, the gradients are
now far from zero.
Next we are going to show how to optimize the model setting different
restrictions on the parameters.
Once a constraint has been set on a parameter, it is possible to remove
it with the command ``unconstrain()``, which can be called on any
parameter handle of the model. The methods `constrain()` and
`unconstrain()` return the indices which were actually unconstrained,
relative to the parameter handle the method was called on. This is
particularly handy for reporting which parameters where reconstrained,
when reconstraining a parameter, which was already constrained::
>>> m.rbf.variance.unconstrain()
array([0])
>>> m.unconstrain()
array([6, 7])
If you want to remove only a specific constraint, you can pass it
as an argument to ``unconstrain(Transformation)`` (:py:class:`~GPy.constraints.Transformation`), or call
the respective method, such as ``unconstrain_fixed()`` (or
``unfix()``), to unfix only the fixed parameters::
>>> m.inducing_inputs[0].fix()
>>> m.unfix()
>>> m.rbf.constrain_positive()
>>> print m
Name : sparse gp
Log-likelihood : 620.741066698
Number of Parameters : 8
Parameters:
sparse_gp. | Value | Constraint | Prior | Tied to
inducing inputs | (5, 1) | | |
rbf.variance | 1.48329711218 | +ve | |
rbf.lengthscale | 2.5430947048 | +ve | |
Gaussian_noise.variance | 0.00229714444128 | | |
As you can see, ``unfix()`` only unfixed the ``inducing_inputs``, and did
not change the positive constraint of the kernel.
The parameter handles come with default constraints, so you will
rarely need to adjust the constraints of a model. In the rare
case that you do need to adjust the constraints of a model, or need to
fix some parameters, you can do so with the functions
``constrain_{positive|negative|bounded|fixed}()``::
m['.*var'].constrain_positive()
Available Constraints
=====================
* :py:meth:`~GPy.constraints.Logexp`
* :py:meth:`~GPy.constraints.Exponent`
* :py:meth:`~GPy.constraints.Square`
* :py:meth:`~GPy.constraints.Logistic`
* :py:meth:`~GPy.constraints.LogexpNeg`
* :py:meth:`~GPy.constraints.NegativeExponent`
* :py:meth:`~GPy.constraints.NegativeLogexp`
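The positive constraint shown above (``+ve``) is, by default, the ``Logexp`` transform, which lets the optimizer work on an unconstrained variable while the model always sees a strictly positive value. A minimal numpy sketch of that mapping (the transform itself is the standard softplus; GPy's internal implementation details may differ)::

```python
import numpy as np

def logexp(theta):
    """Map an unconstrained value to a strictly positive one:
    lam = log(1 + exp(theta))."""
    return np.log1p(np.exp(theta))

def logexp_inverse(lam):
    """Recover the unconstrained value from a positive one."""
    return np.log(np.expm1(lam))

theta = np.array([-2.0, 0.0, 3.0])  # unconstrained optimizer space
lam = logexp(theta)                 # always positive, as the model requires
```

The inverse mapping is what is used when a constrained parameter is read back into the optimizer's unconstrained space.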
Tying Parameters
================
Not yet implemented as of GPy version 0.6.0.
Optimizing the model
====================
Once we have finished defining the constraints,
we can optimize the model with the function
``optimize()``::
m.Gaussian_noise.constrain_positive()
m.rbf.constrain_positive()
m.optimize()
By default, GPy uses the ``lbfgsb`` optimizer.
Some optional parameters are discussed below:
* ``optimizer``: which optimizer to use. Currently there are ``lbfgsb``,
  ``fmin_tnc``, ``scg`` and ``simplex``, or any string uniquely
  identifying an optimizer. Thus, you can write ``m.optimize('bfgs')``
  to use the ``lbfgsb`` optimizer
* ``messages``: whether the optimizer is verbose. Each optimizer has its
  own way of printing, so do not be confused by differing messages from
  different optimizers
* ``max_iters``: maximum number of iterations to take. Some optimizers
  count function calls as iterations, others count iterations of the
  algorithm. If the number of iterations matters to you, please consult
  the ``scipy.optimize`` documentation, so that you can pass the right
  parameters to ``optimize()``
* ``gtol``: only for some optimizers. The tolerance on the gradient
  that determines the convergence criterion for ending the optimization.
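To illustrate what ``gtol`` controls, here is a small, self-contained gradient descent loop (plain numpy, not GPy code) that stops once the gradient norm drops below the tolerance, which is exactly the kind of convergence criterion the option sets:

```python
import numpy as np

# Minimise f(x) = 0.5 * x^T A x - b^T x with plain gradient descent.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 2.0])

def gradient(x):
    return A @ x - b

x = np.zeros(2)
gtol, max_iters, step = 1e-6, 1000, 0.25
for _ in range(max_iters):
    g = gradient(x)
    if np.linalg.norm(g) < gtol:  # converged: gradient is (almost) zero
        break
    x -= step * g
```

A smaller ``gtol`` demands a flatter point before the optimizer declares convergence, at the price of more iterations.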
Further Reading
===============
All of the mechanisms for dealing
with parameters are baked right into GPy.core.model, from which all of
the classes in GPy.models inherit. To learn how to construct your own
model, you might want to read :ref:`creating_new_models`. If you want
to learn how to create kernels, please refer to
:ref:`creating_new_kernels`.

****************************
tutorial : A kernel overview
****************************
The aim of this tutorial is to give a better understanding of the kernel objects in GPy and to list the ones that are already implemented. The code shown in this tutorial can be obtained at GPy/examples/tutorials.py or by running ``GPy.examples.tutorials.tuto_kernel_overview()``.
First we import the libraries we will need ::
import pylab as pb
import numpy as np
import GPy
pb.ion()
For most kernels, the dimension is the only mandatory parameter needed to define a kernel object. However, it is also possible to specify the values of the other parameters. For example, the three following commands are all valid for defining a squared exponential kernel (i.e. RBF or Gaussian) ::
ker1 = GPy.kern.RBF(1) # Equivalent to ker1 = GPy.kern.RBF(input_dim=1, variance=1., lengthscale=1.)
ker2 = GPy.kern.RBF(input_dim=1, variance = .75, lengthscale=2.)
ker3 = GPy.kern.RBF(1, .5, .5)
``print`` and ``plot`` functions are implemented to represent kernel objects. The commands ::
print ker2
ker1.plot()
ker2.plot()
ker3.plot()
return::
Name | Value | Constraints | Ties
-------------------------------------------------------
rbf_variance | 0.7500 | |
rbf_lengthscale | 2.0000 | |
.. figure:: Figures/tuto_kern_overview_basicplot.png
:align: center
:height: 300px
Implemented kernels
===================
Many kernels are already implemented in GPy. The following figure gives a summary of most of them (a comprehensive list can be found `here <kernel_implementation.html>`_):
.. figure:: Figures/tuto_kern_overview_allkern.png
:align: center
:height: 800px
It is also possible to use the `sympy` package to build new kernels. This will be the subject of another tutorial.
Operations to combine kernels
=============================
In ``GPy``, kernel objects can be added or multiplied. In both cases, two kinds of operations are possible, since the kernels to add/multiply can be assumed to be defined either on the same space or on different subspaces. In other words, it is possible to use two kernels :math:`k_1,\ k_2` over :math:`\mathbb{R} \times \mathbb{R}` to create

* a kernel over :math:`\mathbb{R} \times \mathbb{R}`: :math:`k(x,y) = k_1(x,y) \times k_2(x,y)`, or
* a kernel over :math:`(\mathbb{R} \times \mathbb{R}) \times (\mathbb{R} \times \mathbb{R})`: :math:`k(\mathbf{x},\mathbf{y}) = k_1(x_1,y_1) \times k_2(x_2,y_2)`.
This is available in GPy via the ``add`` and ``prod`` functions. Here is a quick example ::
k1 = GPy.kern.RBF(1,1.,2.)
k2 = GPy.kern.Matern32(1, 0.5, 0.2)
# Product of kernels
k_prod = k1.prod(k2)
# Sum of kernels
k_add = k1.add(k2)
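The effect of these two operations on kernels defined over the same space can be sketched with plain numpy callables (toy covariance functions standing in for the GPy objects above):

```python
import numpy as np

def k_rbf(x, y, variance=1.0, lengthscale=2.0):
    # squared exponential covariance, cf. k1 above
    return variance * np.exp(-0.5 * (x - y) ** 2 / lengthscale ** 2)

def k_mat32(x, y, variance=0.5, lengthscale=0.2):
    # Matern 3/2 covariance, cf. k2 above
    r = np.abs(x - y) / lengthscale
    return variance * (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def k_add(x, y):
    # sum of kernels: still a valid covariance function
    return k_rbf(x, y) + k_mat32(x, y)

def k_prod(x, y):
    # product of kernels: valid by the Schur product theorem
    return k_rbf(x, y) * k_mat32(x, y)
```

Both combinations remain positive semi-definite, which is why GPy can use them as covariance functions directly.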
.. # plots
pb.figure(figsize=(8,4))
pb.subplot(1,2,1)
k_prod.plot()
pb.title('prod')
pb.subplot(1,2,2)
k_add.plot()
pb.title('sum')
pb.subplots_adjust(wspace=0.3)
.. figure:: Figures/tuto_kern_overview_multadd.png
:align: center
:height: 500px
A shortcut for ``add`` and ``prod`` is provided by the usual ``+`` and ``*`` operators. Here is another example where we create a periodic kernel with some decay ::
k1 = GPy.kern.RBF(1,1.,2)
k2 = GPy.kern.PeriodicMatern52(1,variance=1e3, lengthscale=1, period = 1.5, lower=-5., upper = 5)
k = k1 * k2 # equivalent to k = k1.prod(k2)
print k
# Simulate sample paths
X = np.linspace(-5,5,501)[:,None]
Y = np.random.multivariate_normal(np.zeros(501),k.K(X),1)
.. # plot
pb.figure(figsize=(10,4))
pb.subplot(1,2,1)
k.plot()
pb.subplot(1,2,2)
pb.plot(X,Y.T)
pb.ylabel("Sample path")
pb.subplots_adjust(wspace=0.3)
.. figure:: Figures/tuto_kern_overview_multperdecay.png
:align: center
:height: 300px
In general, ``kern`` objects can be seen as a sum of ``kernparts`` objects, where the latter are covariance functions defined on the same space. For example, the following code ::
k = (k1+k2)*(k1+k2)
print k.parts[0].name, '\n', k.parts[1].name, '\n', k.parts[1].parts[0].name, '\n', k.parts[1].parts[1].name, '\n'
returns ::
add_1
add_2
rbf
periodic_Matern52
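The parts tree printed above can be mimicked with a small, hypothetical ``Part`` class (a sketch of the structure only, not GPy's actual classes):

```python
# Each combination kernel holds a list of parts; leaves are plain
# covariance functions, inner nodes are sums or products.
class Part:
    def __init__(self, name, parts=()):
        self.name = name
        self.parts = list(parts)

rbf = Part("rbf")
per = Part("periodic_Matern52")
k = Part("mul", [Part("add_1", [rbf, per]), Part("add_2", [rbf, per])])
```

Walking ``k.parts`` recursively reproduces the names printed in the example.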
Constraining the parameters
===========================
Various constraints can be applied to the parameters of a kernel:
* ``constrain_fixed`` to fix the value of a parameter (the value will not be modified during optimisation)
* ``constrain_positive`` to make sure the parameter is greater than 0.
* ``constrain_bounded`` to impose the parameter to be in a given range.
* ``tie_params`` to impose the value of two (or more) parameters to be equal.
When calling one of these functions, the parameters to constrain can be specified either by a regular expression that matches their names or by a number that corresponds to the rank of the parameter. Here is an example ::
k1 = GPy.kern.RBF(1)
k2 = GPy.kern.Matern32(1)
k3 = GPy.kern.White(1)
k = k1 + k2 + k3
print k
k.constrain_positive('.*var')
k.constrain_fixed(np.array([1]),1.75)
k.tie_params('.*len')
k.unconstrain('white')
k.constrain_bounded('white',lower=1e-5,upper=.5)
print k
with output::
Name | Value | Constraints | Ties
---------------------------------------------------------
rbf_variance | 1.0000 | |
rbf_lengthscale | 1.0000 | |
Mat32_variance | 1.0000 | |
Mat32_lengthscale | 1.0000 | |
white_variance | 1.0000 | |
Name | Value | Constraints | Ties
----------------------------------------------------------
rbf_variance | 1.0000 | (+ve) |
rbf_lengthscale | 1.7500 | Fixed | (0)
Mat32_variance | 1.0000 | (+ve) |
Mat32_lengthscale | 1.7500 | | (0)
white_variance | 0.3655 | (1e-05, 0.5) |
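The name matching behind calls such as ``k.constrain_positive('.*var')`` can be sketched with Python's ``re`` module (a simplification of what GPy does internally):

```python
import re

param_names = ["rbf_variance", "rbf_lengthscale",
               "Mat32_variance", "Mat32_lengthscale",
               "white_variance"]

# Select every parameter whose name matches the regular expression,
# as '.*var' does in the constrain call above.
selected = [name for name in param_names if re.match(".*var", name)]
# selected now holds the three *_variance parameters
```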
Example : Building an ANOVA kernel
==================================
In two dimensions ANOVA kernels have the following form:
.. math::
k_{ANOVA}(x,y) = \prod_{i=1}^2 (1 + k_i(x_i,y_i)) = 1 + k_1(x_1,y_1) + k_2(x_2,y_2) + k_1(x_1,y_1) \times k_2(x_2,y_2).
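As a quick numerical sanity check, the product form and its expansion above agree. Here is a small numpy sketch using a Matern 5/2 covariance for each coordinate (standard formula, with illustrative parameter values):

```python
import numpy as np

def k_mat52(x, y, variance=1.0, lengthscale=3.0):
    # one-dimensional Matern 5/2 covariance
    r = np.abs(x - y) / lengthscale
    return variance * (1 + np.sqrt(5) * r + 5 * r ** 2 / 3) * np.exp(-np.sqrt(5) * r)

def k_anova_prod(x, y):
    # product form: prod_i (1 + k_i(x_i, y_i))
    return (1 + k_mat52(x[0], y[0])) * (1 + k_mat52(x[1], y[1]))

def k_anova_sum(x, y):
    # expanded form: 1 + k_1 + k_2 + k_1 * k_2
    k1, k2 = k_mat52(x[0], y[0]), k_mat52(x[1], y[1])
    return 1 + k1 + k2 + k1 * k2
```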
Let us assume that we want to define an ANOVA kernel with a Matern 5/2 kernel for :math:`k_i`. As seen previously, we can define this kernel as follows ::
k_cst = GPy.kern.Bias(1,variance=1.)
k_mat = GPy.kern.Matern52(1,variance=1.,lengthscale=3)
Kanova = (k_cst + k_mat).prod(k_cst + k_mat)
print Kanova
Printing the resulting kernel outputs the following ::
Name | Value | Constraints | Ties
---------------------------------------------------------------------------
bias<times>bias_bias_variance | 1.0000 | | (0)
bias<times>bias_bias_variance | 1.0000 | | (3)
bias<times>Mat52_bias_variance | 1.0000 | | (0)
bias<times>Mat52_Mat52_variance | 1.0000 | | (4)
bias<times>Mat52_Mat52_lengthscale | 3.0000 | | (5)
Mat52<times>bias_Mat52_variance | 1.0000 | | (1)
Mat52<times>bias_Mat52_lengthscale | 3.0000 | | (2)
Mat52<times>bias_bias_variance | 1.0000 | | (3)
Mat52<times>Mat52_Mat52_variance | 1.0000 | | (1)
Mat52<times>Mat52_Mat52_lengthscale | 3.0000 | | (2)
Mat52<times>Mat52_Mat52_variance | 1.0000 | | (4)
Mat52<times>Mat52_Mat52_lengthscale | 3.0000 | | (5)
Note the ties between the parameters of ``Kanova`` that reflect the links between the parameters of the kernparts objects. We can illustrate the use of this kernel on a toy example::
# sample inputs and outputs
X = np.random.uniform(-3.,3.,(40,2))
Y = 0.5*X[:,:1] + 0.5*X[:,1:] + 2*np.sin(X[:,:1]) * np.sin(X[:,1:])
# Create GP regression model
m = GPy.models.GPRegression(X,Y,Kanova)
m.plot()
.. figure:: Figures/tuto_kern_overview_mANOVA.png
:align: center
:height: 300px
As :math:`k_{ANOVA}` corresponds to the sum of 4 kernels, the best predictor can be split into a sum of 4 functions
.. math::
bp(x) & = k(x)^t K^{-1} Y \\
& = (1 + k_1(x_1) + k_2(x_2) + k_1(x_1)k_2(x_2))^t K^{-1} Y \\
& = 1^t K^{-1} Y + k_1(x_1)^t K^{-1} Y + k_2(x_2)^t K^{-1} Y + (k_1(x_1)k_2(x_2))^t K^{-1} Y
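The decomposition can be checked numerically with toy RBF components (plain numpy, with hypothetical kernel choices standing in for ``Kanova``):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, (10, 2))
Y = 0.5 * X[:, :1] + 0.5 * X[:, 1:] + 2 * np.sin(X[:, :1]) * np.sin(X[:, 1:])

def k1(A, B):
    # toy RBF covariance acting on the first input dimension
    return np.exp(-0.5 * (A[:, None, 0] - B[None, :, 0]) ** 2)

def k2(A, B):
    # toy RBF covariance acting on the second input dimension
    return np.exp(-0.5 * (A[:, None, 1] - B[None, :, 1]) ** 2)

# Gram matrix of the ANOVA kernel (plus jitter for numerical stability)
K = 1 + k1(X, X) + k2(X, X) + k1(X, X) * k2(X, X) + 1e-6 * np.eye(len(X))
alpha = np.linalg.solve(K, Y)  # K^{-1} Y

# Predictions from the full kernel ...
Xs = rng.uniform(-3.0, 3.0, (5, 2))
k_full = 1 + k1(Xs, X) + k2(Xs, X) + k1(Xs, X) * k2(Xs, X)
bp_full = k_full @ alpha

# ... equal the sum of the four per-term predictors.
bp_terms = (np.ones((len(Xs), len(X))) @ alpha
            + k1(Xs, X) @ alpha
            + k2(Xs, X) @ alpha
            + (k1(Xs, X) * k2(Xs, X)) @ alpha)
```

Each term of ``bp_terms`` corresponds to one of the four submodels plotted below via ``which_parts``.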
The submodels can be represented with the option ``which_parts`` of ``plot``: ::
pb.figure(figsize=(20,3))
pb.subplots_adjust(wspace=0.5)
axs = pb.subplot(1,5,1)
m.plot(ax=axs)
pb.subplot(1,5,2)
pb.ylabel("= ",rotation='horizontal',fontsize='30')
axs = pb.subplot(1,5,3)
m.plot(ax=axs, which_parts=[False,True,False,False])
pb.ylabel("cst +",rotation='horizontal',fontsize='30')
axs = pb.subplot(1,5,4)
m.plot(ax=axs, which_parts=[False,False,True,False])
pb.ylabel("+ ",rotation='horizontal',fontsize='30')
axs = pb.subplot(1,5,5)
pb.ylabel("+ ",rotation='horizontal',fontsize='30')
m.plot(ax=axs, which_parts=[False,False,False,True])
.. pb.savefig('tuto_kern_overview_mANOVAdec.png',bbox_inches='tight')
.. figure:: Figures/tuto_kern_overview_mANOVAdec.png
:align: center
:height: 250px
.. # code
import pylab as pb
import numpy as np
import GPy
pb.ion()
ker1 = GPy.kern.RBF(input_dim=1) # Equivalent to ker1 = GPy.kern.RBF(input_dim=1, variance=1., lengthscale=1.)
ker2 = GPy.kern.RBF(input_dim=1, variance = .75, lengthscale=2.)
ker3 = GPy.kern.RBF(1, .5, .5)
ker1.plot()
ker2.plot()
ker3.plot()
#pb.savefig("Figures/tuto_kern_overview_basicdef.png")
kernels = [GPy.kern.RBF(1), GPy.kern.Exponential(1), GPy.kern.Matern32(1), GPy.kern.Matern52(1), GPy.kern.Brownian(1), GPy.kern.Bias(1), GPy.kern.Linear(1), GPy.kern.PeriodicExponential(1), GPy.kern.PeriodicMatern32(1), GPy.kern.PeriodicMatern52(1), GPy.kern.White(1)]
kernel_names = ["GPy.kern.RBF", "GPy.kern.Exponential", "GPy.kern.Matern32", "GPy.kern.Matern52", "GPy.kern.Brownian", "GPy.kern.Bias", "GPy.kern.Linear", "GPy.kern.PeriodicExponential", "GPy.kern.PeriodicMatern32", "GPy.kern.PeriodicMatern52", "GPy.kern.White"]
pb.figure(figsize=(16,12))
pb.subplots_adjust(wspace=.5, hspace=.5)
for i, kern in enumerate(kernels):
pb.subplot(3,4,i+1)
kern.plot(x=7.5,plot_limits=[0.00001,15.])
pb.title(kernel_names[i]+ '\n')
#pb.axes([.1,.1,.8,.7])
# actual plot for the noise
i = 11
X = np.linspace(0.,15.,201)
WN = 0*X
WN[100] = 1.
pb.subplot(3,4,i+1)
pb.plot(X,WN,'b')

.. _parameterized:
*************************
Parameterization handling
*************************
Parameterization in GPy is done through so-called parameter handles. The parameter handles are handles to parameters of a model of any kind. A parameter handle can be constrained, fixed, randomized and more. All parameters in GPy have a name, by which they can be accessed in the model. The most common way of accessing a parameter programmatically, though, is by variable name.
Parameter handles
=================
A parameter handle in GPy is, as the name suggests, a handle on a parameter. A parameter can be constrained, fixed, randomized and more (see e.g. `working with models`). This gives the model the freedom to handle parameter distribution and model updates as efficiently as possible. All parameter handles share a common memory space, which is just a flat numpy array stored in the highest parent of a model hierarchy.
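That shared flat memory can be sketched with numpy views (a simplification of GPy's actual layout, with made-up slice positions):

```python
import numpy as np

# One flat array owns all parameter memory, as in a model hierarchy.
flat = np.zeros(8)

# Each "handle" is just a view into a slice of that memory.
inducing_inputs = flat[0:5]
rbf = flat[5:7]            # variance, lengthscale
noise_variance = flat[7:8]

rbf[:] = [0.5, 2.0]        # writing through a handle ...
noise_variance[:] = 0.3    # ... updates the shared flat array in place
```

Because every handle is a view, an optimizer can write the whole flat array at once and every part of the model sees the new values immediately.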
In the following we will introduce and elucidate the different parameter handles which exist in GPy.
:py:class:`~GPy.core.parameterization.parameterized.Parameterized`
======================================================================
A parameterized object itself holds parameter handles and is just a summarization of the parameters below it. It can use those parameters to change the internal state of the model, and GPy ensures that those parameters always hold the right value during an optimization routine or any other update.
:py:class:`~GPy.core.parameterization.param.Param`
====================================================
The lowest level of parameter is a numpy array. The Param class inherits all the functionality of a numpy array and can simply be used as if it were a numpy array. These parameters can be indexed and accessed in the same way a numpy array is indexed.
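A toy illustration of that idea: an ``ndarray`` subclass that behaves exactly like a numpy array while carrying a name (a sketch only; GPy's real ``Param`` does considerably more):

```python
import numpy as np

class Param(np.ndarray):
    """A toy named parameter: a numpy array carrying a name attribute."""

    def __new__(cls, name, values):
        obj = np.asarray(values, dtype=float).view(cls)
        obj.name = name
        return obj

    def __array_finalize__(self, obj):
        # Propagate the name through views and slices.
        self.name = getattr(obj, "name", None)

lengthscale = Param("rbf_lengthscale", [2.0])
lengthscale[0] = 3.0   # behaves exactly like a numpy array
```

Slicing such a parameter yields another named view, which is why parameter handles in GPy can be indexed like numpy arrays without losing their identity.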