GPy.inference.latent_function_inference package

Submodules
GPy.inference.latent_function_inference.dtc module

class GPy.inference.latent_function_inference.dtc.DTC[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    An object for inference when the likelihood is Gaussian, but we want to do sparse inference.

    The function self.inference returns a Posterior object, which summarizes the posterior.

    NB: using this class is not recommended; it is kept for historical purposes.
GPy.inference.latent_function_inference.exact_gaussian_inference module

class GPy.inference.latent_function_inference.exact_gaussian_inference.ExactGaussianInference[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    An object for inference when the likelihood is Gaussian.

    The function self.inference returns a Posterior object, which summarizes the posterior.

    For efficiency, we sometimes work with the Cholesky of Y*Y.T. To save repeatedly recomputing it, we cache it.
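As a minimal NumPy sketch (an illustration of the computation, not GPy's actual implementation), exact Gaussian inference amounts to factorizing K + sigma^2 I once and deriving the log marginal likelihood plus the two "Woodbury" quantities that the Posterior object stores:

```python
import numpy as np

def exact_gaussian_inference(K, Y, noise_var):
    """Sketch of exact GP inference with a Gaussian likelihood.

    Returns the log marginal likelihood and the two Woodbury quantities
    a Posterior stores: (K + s2*I)^{-1} Y and (K + s2*I)^{-1}.
    """
    N = K.shape[0]
    Ky = K + noise_var * np.eye(N)                # prior covariance plus noise
    L = np.linalg.cholesky(Ky)                    # Ky = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))          # Woodbury vector
    W_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(N)))  # Woodbury inv
    # log|Ky| = 2 * sum(log(diag(L)))
    log_marginal = (-0.5 * Y.T @ alpha
                    - np.log(np.diag(L)).sum()
                    - 0.5 * N * np.log(2 * np.pi))
    return log_marginal, alpha, W_inv
```

The Cholesky-based solves here are the standard numerically stable alternative to forming (K + sigma^2 I)^{-1} directly.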
GPy.inference.latent_function_inference.expectation_propagation module

class GPy.inference.latent_function_inference.expectation_propagation.EP(epsilon=1e-06, eta=1.0, delta=1.0)[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference
GPy.inference.latent_function_inference.expectation_propagation_dtc module

class GPy.inference.latent_function_inference.expectation_propagation_dtc.EPDTC(epsilon=1e-06, eta=1.0, delta=1.0, limit=1)[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    const_jitter = 1e-06
GPy.inference.latent_function_inference.fitc module

class GPy.inference.latent_function_inference.fitc.FITC[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    An object for inference when the likelihood is Gaussian, but we want to do sparse inference.

    The function self.inference returns a Posterior object, which summarizes the posterior.

    const_jitter = 1e-06
GPy.inference.latent_function_inference.inferenceX module

class GPy.inference.latent_function_inference.inferenceX.InferenceX(model, Y, name='inferenceX', init='L2')[source]
    Bases: GPy.core.model.Model

    The class for inference of new X given new Y (do_test_latent).

    Parameters:
        - model (GPy.core.Model) – the GPy model used in inference
        - Y (numpy.ndarray) – the new observed data for inference
GPy.inference.latent_function_inference.inferenceX.infer_newX(model, Y_new, optimize=True, init='L2')[source]
    Infer the distribution of X for the new observed data Y_new.

    Parameters:
        - model (GPy.core.Model) – the GPy model used in inference
        - Y_new (numpy.ndarray) – the new observed data for inference
        - optimize (boolean) – whether to optimize the location of the new X (True by default)

    Returns: a tuple containing the estimated posterior distribution of X and the model that optimizes X
    Return type: (GPy.core.parameterization.variational.VariationalPosterior, GPy.core.Model)
GPy.inference.latent_function_inference.laplace module

class GPy.inference.latent_function_inference.laplace.Laplace[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    inference(kern, X, likelihood, Y, Y_metadata=None)[source]
        Returns a Posterior class containing essential quantities of the posterior.

    mode_computations(f_hat, Ki_f, K, Y, likelihood, kern, Y_metadata)[source]
        At the mode, compute the Hessian and effective covariance matrix.

        Returns:
            - logZ: approximation to the marginal likelihood
            - woodbury_inv: variable required for calculating the approximation to the covariance matrix
            - dL_dthetaL: array of derivatives (1 x num_kernel_params)
            - dL_dthetaL: array of derivatives (1 x num_likelihood_params)

    rasm_mode(K, Y, likelihood, Ki_f_init, Y_metadata=None)[source]
        Rasmussen's numerically stable mode finding. For nomenclature see Rasmussen & Williams (2006). Influenced by the GPML (BSD) code; all errors are our own.

        Parameters:
            - K (NxN matrix) – covariance matrix evaluated at locations X
            - Y (np.ndarray) – the data
            - likelihood (a GPy.likelihood object) – the likelihood of the latent function value for the given data
            - Ki_f_init (np.ndarray) – the initial guess at the mode
            - Y_metadata (np.ndarray | None) – information about the data, e.g. which likelihood to take from a multi-likelihood object

        Returns: f_hat, the mode on which to make the Laplace approximation
        Return type: np.ndarray
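The stable mode search that rasm_mode performs can be sketched in NumPy for one concrete case. This follows the Newton scheme of Rasmussen & Williams (2006), Algorithm 3.1; the Bernoulli-logit likelihood here is an illustrative choice (GPy's rasm_mode works with any GPy.likelihood object), and this sketch is not GPy's actual implementation:

```python
import numpy as np

def sigmoid(f):
    return 1.0 / (1.0 + np.exp(-f))

def laplace_mode(K, y, max_iter=100, tol=1e-10):
    """Find the mode f_hat of p(f | y) for a Bernoulli-logit likelihood,
    using the numerically stable Newton iteration (R&W 2006, Alg. 3.1)."""
    N = K.shape[0]
    f = np.zeros(N)
    for _ in range(max_iter):
        pi = sigmoid(f)
        W = pi * (1.0 - pi)                        # -d^2 log p(y|f)/df^2 (diagonal)
        sW = np.sqrt(W)
        # B = I + W^{1/2} K W^{1/2} is well conditioned even when K is not
        B = np.eye(N) + sW[:, None] * K * sW[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + (y - pi)                       # (y - pi) is the gradient of log p(y|f)
        a = b - sW * np.linalg.solve(L.T, np.linalg.solve(L, sW * (K @ b)))
        f_new = K @ a                              # Newton step, f_new = (K^{-1}+W)^{-1} b
        if np.max(np.abs(f_new - f)) < tol:
            return f_new
        f = f_new
    return f
```

At convergence, the mode satisfies the self-consistency condition f_hat = K * grad log p(y | f_hat).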
GPy.inference.latent_function_inference.posterior module

class GPy.inference.latent_function_inference.posterior.Posterior(woodbury_chol=None, woodbury_vector=None, K=None, mean=None, cov=None, K_chol=None, woodbury_inv=None)[source]
    Bases: object

    An object representing a Gaussian posterior over latent function values, p(f|D). This may be computed exactly for Gaussian likelihoods, or approximated for non-Gaussian likelihoods.

    The purpose of this class is to serve as an interface between the inference schemes and the model classes. The model class can make predictions for the function at any new point x_* by integrating over this posterior.

    K_chol
        Cholesky decomposition of the prior covariance K.

    covariance
        Posterior covariance: $K_{xx} - K_{xx} W_{xx} K_{xx}$, where $W_{xx}$ is the Woodbury inverse.

    mean
        Posterior mean: $K_{xx} v$, where $v$ is the Woodbury vector.

    precision
        Inverse of the posterior covariance.

    woodbury_chol
        Returns $L_{W}$, the lower-triangular Cholesky factor of the Woodbury matrix: $L_{W} L_{W}^\top = W_{xx}$, where $W_{xx}$ is the Woodbury inverse.

    woodbury_inv
        The Woodbury inverse. In the Gaussian likelihood case it is defined as $(K_{xx} + \Sigma_{xx})^{-1}$, where $\Sigma_{xx}$ is the likelihood variance (or the approximate likelihood covariance).

    woodbury_vector
        The Woodbury vector. In the Gaussian likelihood case it is defined as $(K_{xx} + \Sigma_{xx})^{-1} Y$, where $\Sigma_{xx}$ is the likelihood variance (or the approximate likelihood covariance).
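The point of storing the Woodbury quantities is that prediction at new inputs x_* needs only kernel cross-covariances plus two matrix products. A hedged NumPy sketch of that interface (the rbf helper here is a hypothetical stand-in for a GPy.kern covariance, and this is an illustration rather than GPy's code):

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """Toy 1-D RBF kernel, a stand-in for any GPy.kern covariance."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def predict(Xnew, X, woodbury_vector, woodbury_inv):
    """Predictive mean/covariance at Xnew from the stored Woodbury quantities."""
    Ksx = rbf(Xnew, X)                        # K_{*x}
    Kss = rbf(Xnew, Xnew)                     # K_{**}
    mu = Ksx @ woodbury_vector                # K_{*x} v
    cov = Kss - Ksx @ woodbury_inv @ Ksx.T    # K_{**} - K_{*x} W K_{x*}
    return mu, cov
```

Note that X enters only through kernel evaluations, which is what lets one Posterior object serve every inference scheme uniformly.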
GPy.inference.latent_function_inference.var_dtc module

class GPy.inference.latent_function_inference.var_dtc.VarDTC(limit=1)[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    An object for inference when the likelihood is Gaussian, but we want to do sparse inference.

    The function self.inference returns a Posterior object, which summarizes the posterior.

    For efficiency, we sometimes work with the Cholesky of Y*Y.T. To save repeatedly recomputing it, we cache it.

    const_jitter = 1e-06
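The "sparse inference" these classes perform works through M inducing inputs Z, so that only M x M systems are solved instead of the N x N system of exact inference. A hedged NumPy sketch of the DTC-style predictive mean (which the variational formulation shares; this is an illustration under a toy RBF kernel, not GPy's implementation):

```python
import numpy as np

def rbf(A, B):
    """Toy 1-D RBF kernel (illustration only)."""
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2)

def dtc_mean(Xs, X, Z, y, noise_var):
    """Sketch of the DTC predictive mean with M inducing inputs Z (M << N)."""
    Kmm = rbf(Z, Z)                 # M x M, covariance among inducing inputs
    Kmn = rbf(Z, X)                 # M x N, cross-covariance to the data
    Ksm = rbf(Xs, Z)                # test-to-inducing cross-covariance
    # mean = s^{-2} K_{*m} (Kmm + s^{-2} Kmn Knm)^{-1} Kmn y  -- only M x M solves
    rhs = np.linalg.solve(Kmm + Kmn @ Kmn.T / noise_var, Kmn @ y)
    return Ksm @ rhs / noise_var
```

When Z coincides with the full data X, this reduces exactly to the standard GP regression mean K_{*x}(K_{xx} + sigma^2 I)^{-1} y.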
GPy.inference.latent_function_inference.var_dtc_gpu module

GPy.inference.latent_function_inference.var_dtc_parallel module

class GPy.inference.latent_function_inference.var_dtc_parallel.VarDTC_minibatch(batchsize=None, limit=1, mpi_comm=None)[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference

    An object for inference when the likelihood is Gaussian, but we want to do sparse inference.

    The function self.inference returns a Posterior object, which summarizes the posterior.

    For efficiency, we sometimes work with the Cholesky of Y*Y.T. To save repeatedly recomputing it, we cache it.

    const_jitter = 1e-06

    inference_likelihood(kern, X, Z, likelihood, Y)[source]
        The first phase of inference. Computes the log-likelihood and dL_dKmm.

        Cached intermediate results: Kmm, KmmInv.
Module contents

Inference over Gaussian process latent functions.

In all our GP models, the consistency property means that we have a Gaussian prior over a finite set of points f. This prior is

    N(f | 0, K)

where K is the kernel matrix.

We also have a likelihood (see GPy.likelihoods) which defines how the data are related to the latent function: p(y | f). If the likelihood is also Gaussian, inference over f is tractable (see exact_gaussian_inference.py).

If the likelihood object is something other than Gaussian, then exact inference is not tractable. We then resort to a Laplace approximation (laplace.py) or expectation propagation (ep.py).

The inference methods return a Posterior instance, a simple structure which contains a summary of the posterior. The model classes can then use this posterior object for making predictions, optimizing hyper-parameters, etc.
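Because the prior over the finite set of points f is an ordinary multivariate Gaussian N(f | 0, K), it can be sampled directly once K is built. A small NumPy illustration (the RBF kernel here is a stand-in for any GPy.kern covariance; this is not GPy's sampling code):

```python
import numpy as np

def rbf_kernel(X, variance=1.0, lengthscale=1.0):
    """Toy 1-D RBF kernel matrix over the inputs X."""
    d2 = (X[:, None] - X[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0, 5, 50)
K = rbf_kernel(X) + 1e-8 * np.eye(50)   # small jitter keeps the Cholesky stable
L = np.linalg.cholesky(K)               # K = L L^T
f = L @ rng.standard_normal(50)         # one draw from the prior N(f | 0, K)
```

Repeated draws of `f` have (empirical) covariance K, which is exactly the consistency property the paragraph above describes.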
class GPy.inference.latent_function_inference.InferenceMethodList[source]
    Bases: GPy.inference.latent_function_inference.LatentFunctionInference, list