GPy.examples package
Submodules
GPy.examples.classification module
Gaussian process classification examples
GPy.examples.classification.crescent_data(model_type='Full', num_inducing=10, seed=10000, kernel=None, optimize=True, plot=True)[source]
Run a Gaussian process classification on the crescent data. The demonstration calls the basic GP classification model and uses EP to approximate the likelihood.
Parameters:
- model_type – type of model to fit [‘Full’, ‘FITC’, ‘DTC’].
- num_inducing (int) – number of inducing variables (only used for ‘FITC’ or ‘DTC’).
- seed (int) – seed value for data generation.
- kernel (a GPy kernel) – kernel to use in the model.
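As a rough illustration of the kind of two-class data this demo operates on, here is a minimal NumPy sketch of interleaving crescent ("two moons") point clouds with ±1 labels. The construction below is generic and assumed for illustration; it is not GPy's own data generator:

```python
import numpy as np

def make_crescent_data(n_per_class=100, noise=0.1, seed=10000):
    """Generate two interleaving crescent-shaped clouds with labels -1/+1.

    A generic 'two moons' construction, not GPy's own generator.
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, np.pi, size=n_per_class)
    # Upper crescent (class +1) and a shifted, flipped crescent (class -1).
    upper = np.column_stack([np.cos(theta), np.sin(theta)])
    lower = np.column_stack([1.0 - np.cos(theta), -np.sin(theta) + 0.5])
    X = np.vstack([upper, lower]) + noise * rng.standard_normal((2 * n_per_class, 2))
    y = np.concatenate([np.ones(n_per_class), -np.ones(n_per_class)])
    return X, y

X, y = make_crescent_data()
```

A dataset of this shape is not linearly separable, which is why the demo needs a nonlinear GP classifier with an approximate (EP) likelihood.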
GPy.examples.classification.oil(num_inducing=50, max_iters=100, kernel=None, optimize=True, plot=True)[source]
Run a Gaussian process classification on the three-phase oil data. The demonstration calls the basic GP classification model and uses EP to approximate the likelihood.
GPy.examples.classification.sparse_toy_linear_1d_classification(num_inducing=10, seed=10000, optimize=True, plot=True)[source]
Sparse 1D classification example.
Parameters:
- seed (int) – seed value for data generation.
GPy.examples.classification.toy_heaviside(seed=10000, max_iters=100, optimize=True, plot=True)[source]
Simple 1D classification example using a Heaviside GP transformation.
Parameters:
- seed (int) – seed value for data generation.
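The Heaviside link in this example thresholds a latent GP draw at zero to produce binary labels. A minimal NumPy sketch of that construction (illustrative only; the kernel parameters are assumed, and GPy builds the actual model internally):

```python
import numpy as np

def rbf_kernel(X1, X2, variance=1.0, lengthscale=0.2):
    """Squared-exponential (RBF) covariance between two sets of 1D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

rng = np.random.default_rng(10000)
X = np.linspace(0, 1, 50)
K = rbf_kernel(X, X) + 1e-8 * np.eye(50)      # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(50), K)  # latent GP sample
y = np.where(f > 0, 1, 0)                     # Heaviside link: threshold at zero
```

Because the step function is non-Gaussian, the posterior over f is intractable, which is what motivates the approximate inference used by the classification demos.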
GPy.examples.coreg_example module
GPy.examples.dimensionality_reduction module
GPy.examples.dimensionality_reduction.bcgplvm_linear_stick(kernel=None, optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.bcgplvm_stick(kernel=None, optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.bgplvm_oil(optimize=True, verbose=1, plot=True, N=200, Q=7, num_inducing=40, max_iters=1000, **k)[source]
GPy.examples.dimensionality_reduction.bgplvm_simulation(optimize=True, verbose=1, plot=True, plot_sim=False, max_iters=20000.0)[source]
GPy.examples.dimensionality_reduction.bgplvm_simulation_missing_data(optimize=True, verbose=1, plot=True, plot_sim=False, max_iters=20000.0, percent_missing=0.1)[source]
GPy.examples.dimensionality_reduction.bgplvm_test_model(optimize=False, verbose=1, plot=False, output_dim=200, nan=False)[source]
Model for testing purposes. Samples from a GP with an RBF kernel and learns the samples with a new kernel. Normally not used for optimization, just for model checking.
GPy.examples.dimensionality_reduction.brendan_faces(optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.cmu_mocap(subject='35', motion=['01'], in_place=True, optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.mrd_simulation(optimize=True, verbose=True, plot=True, plot_sim=True, **kw)[source]
GPy.examples.dimensionality_reduction.mrd_simulation_missing_data(optimize=True, verbose=True, plot=True, plot_sim=True, **kw)[source]
GPy.examples.dimensionality_reduction.olivetti_faces(optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.robot_wireless(optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.sparse_gplvm_oil(optimize=True, verbose=0, plot=True, N=100, Q=6, num_inducing=15, max_iters=50)[source]
GPy.examples.dimensionality_reduction.ssgplvm_oil(optimize=True, verbose=1, plot=True, N=200, Q=7, num_inducing=40, max_iters=1000, **k)[source]
GPy.examples.dimensionality_reduction.ssgplvm_simulation(optimize=True, verbose=1, plot=True, plot_sim=False, max_iters=20000.0, useGPU=False)[source]
GPy.examples.dimensionality_reduction.stick(kernel=None, optimize=True, verbose=True, plot=True)[source]
GPy.examples.dimensionality_reduction.stick_bgplvm(model=None, optimize=True, verbose=True, plot=True)[source]
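Latent-variable models such as the GPLVM and BGPLVM examples above need an initial latent embedding of the data, and a PCA projection onto Q directions is a common choice. A minimal NumPy sketch of such an initialisation (illustrative only; the function name is assumed and this is not GPy's own initialisation code):

```python
import numpy as np

def pca_initialise(Y, Q):
    """Project centred data Y (N x D) onto its top-Q principal directions.

    GPLVM-style models are commonly initialised this way; this sketch is
    illustrative, not GPy's implementation.
    """
    Yc = Y - Y.mean(axis=0)                      # centre the data
    U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
    return U[:, :Q] * S[:Q]                      # N x Q latent coordinates

rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 12))               # toy high-dimensional data
X_init = pca_initialise(Y, Q=2)
```

The optimizer then refines these coordinates (and the kernel hyperparameters) under the GP likelihood, so the initialisation only needs to be a reasonable starting point.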
GPy.examples.non_gaussian module
GPy.examples.regression module
Gaussian process regression examples
GPy.examples.regression.coregionalization_sparse(optimize=True, plot=True)[source]
A simple demonstration of coregionalization on two sinusoidal functions using sparse approximations.
GPy.examples.regression.coregionalization_toy(optimize=True, plot=True)[source]
A simple demonstration of coregionalization on two sinusoidal functions.
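The coregionalization demos combine a covariance over inputs with a coregionalization matrix over outputs. In the intrinsic coregionalization model (ICM) this takes the Kronecker form K = B ⊗ Kx with B = W Wᵀ + diag(κ). A minimal NumPy sketch of that structure (generic ICM; the parameter values are assumed, and this is not GPy's implementation):

```python
import numpy as np

def icm_covariance(Kx, W, kappa):
    """ICM joint covariance: B = W W^T + diag(kappa), K = kron(B, Kx)."""
    B = W @ W.T + np.diag(kappa)
    return np.kron(B, Kx)

# Toy RBF covariance over 4 inputs, rank-1 coregionalisation over 2 outputs.
X = np.linspace(0, 1, 4)
Kx = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / 0.3**2)
W = np.array([[1.0], [0.5]])
kappa = np.array([0.1, 0.1])
K = icm_covariance(Kx, W, kappa)
```

Because B and Kx are both positive semi-definite, so is their Kronecker product, and the off-diagonal blocks of B let observations of one output inform predictions of the other.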
GPy.examples.regression.epomeo_gpx(max_iters=200, optimize=True, plot=True)[source]
Perform Gaussian process regression on the latitude and longitude data from the Mount Epomeo runs. Requires gpxpy to be installed in order to load the data.
GPy.examples.regression.multiple_optima(gene_number=937, resolution=80, model_restarts=10, seed=10000, max_iters=300, optimize=True, plot=True)[source]
Show an example of a multimodal error surface for Gaussian process regression. Gene 939 has bimodal behaviour where the noisy mode is higher.
GPy.examples.regression.olympic_100m_men(optimize=True, plot=True)[source]
Run a standard Gaussian process regression on the Rogers and Girolami Olympics data.
GPy.examples.regression.olympic_marathon_men(optimize=True, plot=True)[source]
Run a standard Gaussian process regression on the Olympic marathon data.
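Both Olympic demos fit a standard GP regression model. The exact posterior they compute can be sketched in a few lines of NumPy with an RBF kernel (a textbook implementation for illustration; the examples themselves build the equivalent model with GPy.models.GPRegression, and the hyperparameter values here are assumed):

```python
import numpy as np

def gp_posterior(X, y, Xstar, variance=1.0, lengthscale=0.2, noise=1e-6):
    """Exact GP regression posterior mean/variance with an RBF kernel (1D inputs)."""
    def k(A, B):
        return variance * np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / lengthscale**2)

    K = k(X, X) + noise * np.eye(len(X))          # training covariance + noise
    Ks = k(Xstar, X)                              # test/train cross-covariance
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    var = variance - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

X = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * X)
mean, var = gp_posterior(X, y, X)                 # predict back at the training inputs
```

With near-zero noise the posterior mean interpolates the training targets, and the predictive variance collapses at observed inputs while growing away from them.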
GPy.examples.regression.robot_wireless(max_iters=100, kernel=None, optimize=True, plot=True)[source]
Predict the location of a robot given wireless signal strength readings.
GPy.examples.regression.silhouette(max_iters=100, optimize=True, plot=True)[source]
Predict the pose of a figure given a silhouette. This is a task from the Agarwal and Triggs 2004 ICML paper.
GPy.examples.regression.sparse_GP_regression_1D(num_samples=400, num_inducing=5, max_iters=100, optimize=True, plot=True, checkgrad=False)[source]
Run a 1D example of a sparse GP regression.
GPy.examples.regression.sparse_GP_regression_2D(num_samples=400, num_inducing=50, max_iters=100, optimize=True, plot=True, nan=False)[source]
Run a 2D example of a sparse GP regression.
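The sparse regression examples rest on a low-rank approximation of the kernel matrix built from a small set of inducing inputs. A minimal NumPy sketch of that Nyström-style structure, which underlies FITC/DTC-type approximations (illustrative only; the inducing locations and lengthscale are assumed, and this is not GPy's implementation):

```python
import numpy as np

def nystrom_approximation(X, Z, lengthscale=0.5):
    """Low-rank approximation K ~ Knm Kmm^-1 Kmn from inducing inputs Z."""
    def k(A, B):
        return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / lengthscale**2)

    Kmm = k(Z, Z) + 1e-8 * np.eye(len(Z))   # jitter for numerical stability
    Knm = k(X, Z)
    return Knm @ np.linalg.solve(Kmm, Knm.T)

X = np.linspace(0, 1, 100)
Z = np.linspace(0, 1, 15)                   # 15 inducing inputs
K_full = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / 0.5**2)
K_low = nystrom_approximation(X, Z)
err = np.abs(K_full - K_low).max()
```

For a smooth kernel, a handful of well-placed inducing inputs reproduces the full N×N covariance closely while reducing the dominant inference cost from O(N³) toward O(NM²).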
GPy.examples.regression.toy_ARD(max_iters=1000, kernel_type='linear', num_samples=300, D=4, optimize=True, plot=True)[source]
GPy.examples.regression.toy_ARD_sparse(max_iters=1000, kernel_type='linear', num_samples=300, D=4, optimize=True, plot=True)[source]
GPy.examples.regression.toy_poisson_rbf_1d_laplace(optimize=True, plot=True)[source]
Run a simple demonstration of Gaussian process inference with a Poisson likelihood, using the Laplace approximation, on data sampled from an RBF covariance.
GPy.examples.regression.toy_rbf_1d(optimize=True, plot=True)[source]
Run a simple demonstration of a standard Gaussian process fitting it to data sampled from an RBF covariance.