update kernel tutorial

This commit is contained in:
Homer Strong 2015-07-19 14:30:27 -07:00
parent 3a150198e8
commit 1d7712ecc8
372 changed files with 92313 additions and 121 deletions

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Interacting with models &mdash; GPy documentation</title>
<link rel="stylesheet" href="_static/default.css" type="text/css" />
<link rel="stylesheet" href="_static/pygments.css" type="text/css" />
<script type="text/javascript">
var DOCUMENTATION_OPTIONS = {
URL_ROOT: './',
VERSION: '',
COLLAPSE_INDEX: false,
FILE_SUFFIX: '.html',
HAS_SOURCE: true
};
</script>
<script type="text/javascript" src="_static/jquery.js"></script>
<script type="text/javascript" src="_static/underscore.js"></script>
<script type="text/javascript" src="_static/doctools.js"></script>
<link rel="top" title="GPy documentation" href="index.html" />
</head>
<body role="document">
<div class="related" role="navigation" aria-label="related navigation">
<h3>Navigation</h3>
<ul>
<li class="right" style="margin-right: 10px">
<a href="genindex.html" title="General Index"
accesskey="I">index</a></li>
<li class="right" >
<a href="py-modindex.html" title="Python Module Index"
>modules</a> |</li>
<li class="nav-item nav-item-0"><a href="index.html">GPy documentation</a> &raquo;</li>
</ul>
</div>
<div class="document">
<div class="documentwrapper">
<div class="bodywrapper">
<div class="body" role="main">
<div class="section" id="interacting-with-models">
<span id="id1"></span><h1>Interacting with models<a class="headerlink" href="#interacting-with-models" title="Permalink to this headline"></a></h1>
<p>The GPy model class has a set of features designed to make it
simple to explore the parameter space of a model. By default, the
scipy optimisers are used to fit GPy models (via
<code class="docutils literal"><span class="pre">model.optimize()</span></code>),
and we provide mechanisms for &#8216;free&#8217; optimisation:
GPy can ensure that naturally positive parameters
(such as variances) remain positive. But these mechanisms
are much more powerful than simple reparameterisation,
as we shall see.</p>
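<p>As a rough illustration of how a positivity constraint can work via reparameterisation (a sketch of the general idea, not GPy&#8217;s internal code; the function names are my own), the optimiser moves an unconstrained value which is mapped through a softplus-style transformation:</p>

```python
import numpy as np

def softplus(theta):
    # Map an unconstrained value to a strictly positive one:
    # log(1 + exp(theta)) > 0 for every real theta.
    return np.log1p(np.exp(theta))

def softplus_inverse(x):
    # Recover the unconstrained value from a positive one.
    return np.log(np.expm1(x))

# The optimiser is free to move theta anywhere on the real line;
# the model only ever sees the positive variance.
theta = -3.0
variance = softplus(theta)
assert variance > 0
# The round trip recovers theta (up to floating point).
assert np.isclose(softplus_inverse(variance), theta)
```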
<p>Throughout this tutorial we&#8217;ll use a sparse GP regression model
as a running example; it can be found in <code class="docutils literal"><span class="pre">GPy.examples.regression</span></code>.
All of the examples included in GPy return an instance
of a model class, so they can be called in
the following way:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="kn">import</span> <span class="nn">numpy</span> <span class="kn">as</span> <span class="nn">np</span>
<span class="kn">import</span> <span class="nn">pylab</span> <span class="kn">as</span> <span class="nn">pb</span>
<span class="n">pb</span><span class="o">.</span><span class="n">ion</span><span class="p">()</span>
<span class="kn">import</span> <span class="nn">GPy</span>
<span class="n">m</span> <span class="o">=</span> <span class="n">GPy</span><span class="o">.</span><span class="n">examples</span><span class="o">.</span><span class="n">regression</span><span class="o">.</span><span class="n">sparse_GP_regression_1D</span><span class="p">()</span>
</pre></div>
</div>
<div class="section" id="examining-the-model-using-print">
<h2>Examining the model using print<a class="headerlink" href="#examining-the-model-using-print" title="Permalink to this headline"></a></h2>
<p>To see the current state of the model parameters
and the model&#8217;s (marginal) likelihood, simply print the model:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="k">print</span> <span class="n">m</span>
</pre></div>
</div>
<p>The first thing displayed on the screen is the log-likelihood
value of the model with its current parameters. Below the
log-likelihood, a table with all the model&#8217;s parameters
is shown. For each parameter, the table contains its name,
its current value, and, where defined, the constraints, ties
and prior distributions associated with it.</p>
<div class="highlight-python"><div class="highlight"><pre>Name : sparse gp
Log-likelihood : 588.947189413
Number of Parameters : 8
Parameters:
sparse_gp. | Value | Constraint | Prior | Tied to
inducing inputs | (5, 1) | | |
rbf.variance | 1.91644016819 | +ve | |
rbf.lengthscale | 2.62103621347 | +ve | |
Gaussian_noise.variance | 0.00269870373421 | +ve | |
</pre></div>
</div>
<p>In this case the kernel parameters (<code class="docutils literal"><span class="pre">rbf.variance</span></code>,
<code class="docutils literal"><span class="pre">rbf.lengthscale</span></code>) as well as
the likelihood noise parameter (<code class="docutils literal"><span class="pre">Gaussian_noise.variance</span></code>) are constrained
to be positive, while the inducing inputs have no
associated constraints. No ties or priors are defined either.</p>
<p>You can also inspect subparts of the model by printing the
subcomponents individually:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="k">print</span> <span class="n">m</span><span class="o">.</span><span class="n">rbf</span>
</pre></div>
</div>
<p>This will print the details of this particular parameter handle:</p>
<div class="highlight-python"><div class="highlight"><pre>rbf. | Value | Constraint | Prior | Tied to
variance | 1.91644016819 | +ve | |
lengthscale | 2.62103621347 | +ve | |
</pre></div>
</div>
<p>To get a closer look at multivalued
parameters, print them directly:</p>
<div class="highlight-python"><div class="highlight"><pre>print m.inducing_inputs
Index | sparse_gp.inducing_inputs | Constraint | Prior | Tied to
[0 0] | 2.7189499 | | | N/A
[1 0] | 0.02006533 | | | N/A
[2 0] | -1.5299386 | | | N/A
[3 0] | -2.7001675 | | | N/A
[4 0] | 1.4654162 | | | N/A
</pre></div>
</div>
</div>
<div class="section" id="interacting-with-parameters">
<h2>Interacting with Parameters:<a class="headerlink" href="#interacting-with-parameters" title="Permalink to this headline"></a></h2>
<p>The preferred way of interacting with parameters is to act on the
parameter handle itself.
Interacting with parameter handles is simple: the names printed by <cite>print m</cite>
are accessible both interactively and programmatically. For example, try
setting the kernel&#8217;s (<cite>rbf</cite>) <cite>lengthscale</cite> to <cite>0.2</cite> and printing the result:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">m</span><span class="o">.</span><span class="n">rbf</span><span class="o">.</span><span class="n">lengthscale</span> <span class="o">=</span> <span class="o">.</span><span class="mi">2</span>
<span class="k">print</span> <span class="n">m</span>
</pre></div>
</div>
<p>You should see this:</p>
<div class="highlight-python"><div class="highlight"><pre>Name : sparse gp
Log-likelihood : 588.947189413
Number of Parameters : 8
Parameters:
sparse_gp. | Value | Constraint | Prior | Tied to
inducing inputs | (5, 1) | | |
rbf.variance | 1.91644016819 | +ve | |
rbf.lengthscale | 0.2 | +ve | |
Gaussian_noise.variance | 0.00269870373421 | +ve | |
</pre></div>
</div>
<p>This will already have updated the model&#8217;s inner state, so you can
plot it or see the changes in the posterior <cite>m.posterior</cite> of the model.</p>
<div class="section" id="regular-expressions">
<h3>Regular expressions<a class="headerlink" href="#regular-expressions" title="Permalink to this headline"></a></h3>
<p>The model&#8217;s parameters can also be accessed through regular
expressions, by &#8216;indexing&#8217; the model with a regular expression
matching the parameter name. Indexing by regular expression
can only retrieve leaves of the hierarchy, and you can retrieve the
matched values by calling <cite>values()</cite> on the returned object:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span>
<span class="go"> Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to</span>
<span class="go"> [0] | 2.1500132 | | | N/A</span>
<span class="go"> ----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------</span>
<span class="go"> [0] | 0.0024268215 | | | N/A</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span><span class="o">.</span><span class="n">values</span><span class="p">()</span>
<span class="go">[ 2.1500132 0.00242682]</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="p">[</span><span class="s">&#39;rbf&#39;</span><span class="p">]</span>
<span class="go"> Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to</span>
<span class="go"> [0] | 2.1500132 | | | N/A</span>
<span class="go"> ----- | sparse_gp.rbf.lengthscale | ---------- | ---------- | -------</span>
<span class="go"> [0] | 2.6782803 | | | N/A</span>
</pre></div>
</div>
<p>Parameters can be set by regular expression as well.
Here are a few examples:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span> <span class="o">=</span> <span class="o">.</span><span class="mi">1</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span>
<span class="go"> Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to</span>
<span class="go"> [0] | 0.1 | | | N/A</span>
<span class="go"> ----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------</span>
<span class="go"> [0] | 0.1 | | | N/A</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span> <span class="o">=</span> <span class="p">[</span><span class="o">.</span><span class="mi">1</span><span class="p">,</span> <span class="o">.</span><span class="mi">2</span><span class="p">]</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span>
<span class="go"> Index | sparse_gp.rbf.variance | Constraint | Prior | Tied to</span>
<span class="go"> [0] | 0.1 | | | N/A</span>
<span class="go"> ----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------</span>
<span class="go"> [0] | 0.2 | | | N/A</span>
</pre></div>
</div>
<p>Because only leaf nodes can be accessed this way, we can print all
parameters in a flattened view by indexing with a regular expression
that matches everything:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="p">[</span><span class="s">&#39;&#39;</span><span class="p">]</span>
<span class="go"> Index | sparse_gp.inducing_inputs | Constraint | Prior | Tied to</span>
<span class="go"> [0 0] | -2.6716041 | | | N/A</span>
<span class="go"> [1 0] | -1.4665111 | | | N/A</span>
<span class="go"> [2 0] | -0.031010293 | | | N/A</span>
<span class="go"> [3 0] | 1.4563711 | | | N/A</span>
<span class="go"> [4 0] | 2.6803046 | | | N/A</span>
<span class="go"> ----- | sparse_gp.rbf.variance | ---------- | ---------- | -------</span>
<span class="go"> [0] | 0.1 | | | N/A</span>
<span class="go"> ----- | sparse_gp.rbf.lengthscale | ---------- | ---------- | -------</span>
<span class="go"> [0] | 2.6782803 | | | N/A</span>
<span class="go"> ----- | sparse_gp.Gaussian_noise.variance | ---------- | ---------- | -------</span>
<span class="go"> [0] | 0.2 | | | N/A</span>
</pre></div>
</div>
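<p>The selection mechanism itself is ordinary regular-expression matching against the full parameter names. A plain-Python sketch (independent of GPy) of how a pattern such as <cite>'.*var'</cite> picks out leaf parameters, consistent with the behaviour shown above:</p>

```python
import re

# Full names of the leaf parameters, as printed by the flattened view.
leaf_names = [
    "sparse_gp.inducing_inputs",
    "sparse_gp.rbf.variance",
    "sparse_gp.rbf.lengthscale",
    "sparse_gp.Gaussian_noise.variance",
]

def match_params(pattern, names):
    # re.search matches anywhere in the name, which is why the
    # pattern 'rbf' also selects 'sparse_gp.rbf.variance'.
    return [n for n in names if re.search(pattern, n)]

# Both variance parameters match; lengthscale and inducing inputs do not.
print(match_params(".*var", leaf_names))
```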
</div>
<div class="section" id="setting-and-fetching-parameters-parameter-array">
<h3>Setting and fetching parameters <cite>parameter_array</cite><a class="headerlink" href="#setting-and-fetching-parameters-parameter-array" title="Permalink to this headline"></a></h3>
<p>Another way to interact with the model&#8217;s parameters is through the
<cite>parameter_array</cite>. The parameter array holds all the parameters of the
model in one place and is editable. It can be accessed by
indexing the model; for example, you can set all the parameters at once through
this mechanism:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">new_params</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">r_</span><span class="p">[[</span><span class="o">-</span><span class="mi">4</span><span class="p">,</span><span class="o">-</span><span class="mi">2</span><span class="p">,</span><span class="mi">0</span><span class="p">,</span><span class="mi">2</span><span class="p">,</span><span class="mi">4</span><span class="p">],</span> <span class="p">[</span><span class="o">.</span><span class="mi">5</span><span class="p">,</span><span class="mi">2</span><span class="p">],</span> <span class="p">[</span><span class="o">.</span><span class="mi">3</span><span class="p">]]</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">new_params</span>
<span class="go">array([-4. , -2. , 0. , 2. , 4. , 0.5, 2. , 0.3])</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="p">[:]</span> <span class="o">=</span> <span class="n">new_params</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span>
<span class="go">Name : sparse gp</span>
<span class="go">Log-likelihood : -147.561160209</span>
<span class="go">Number of Parameters : 8</span>
<span class="go">Parameters:</span>
<span class="go"> sparse_gp. | Value | Constraint | Prior | Tied to</span>
<span class="go"> inducing inputs | (5, 1) | | |</span>
<span class="go"> rbf.variance | 0.5 | +sq | |</span>
<span class="go"> rbf.lengthscale | 2.0 | +ve | |</span>
<span class="go"> Gaussian_noise.variance | 0.3 | +sq | |</span>
</pre></div>
</div>
<p>Parameters themselves (leaves of the hierarchy) can be indexed and used
in the same way as numpy arrays. First let us set a slice of the
<cite>inducing_inputs</cite>:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">inducing_inputs</span><span class="p">[</span><span class="mi">2</span><span class="p">:,</span> <span class="mi">0</span><span class="p">]</span> <span class="o">=</span> <span class="p">[</span><span class="mi">1</span><span class="p">,</span><span class="mi">3</span><span class="p">,</span><span class="mi">5</span><span class="p">]</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="o">.</span><span class="n">inducing_indputs</span>
<span class="go"> Index | sparse_gp.inducing_inputs | Constraint | Prior | Tied to</span>
<span class="go"> [0 0] | -4 | | | N/A</span>
<span class="go"> [1 0] | -2 | | | N/A</span>
<span class="go"> [2 0] | 1 | | | N/A</span>
<span class="go"> [3 0] | 3 | | | N/A</span>
<span class="go"> [4 0] | 5 | | | N/A</span>
</pre></div>
</div>
<p>Or you can use the parameters like normal numpy arrays in calculations:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">precision</span> <span class="o">=</span> <span class="mf">1.</span><span class="o">/</span><span class="n">m</span><span class="o">.</span><span class="n">Gaussian_noise</span><span class="o">.</span><span class="n">variance</span>
<span class="go">array([ 3.33333333])</span>
</pre></div>
</div>
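<p>The behaviour above can be mimicked with plain numpy: a single flat array holds every value, and the named handles are views into it, so writing through either side updates both. This is a sketch of the idea only, not GPy&#8217;s implementation:</p>

```python
import numpy as np

# One flat array for all 8 parameters, playing the role of m[:].
param_array = np.array([-4., -2., 0., 2., 4., 0.5, 2., 0.3])

# Named handles are slices, i.e. views into the same memory.
inducing_inputs = param_array[0:5]
rbf_variance = param_array[5:6]

# Writing through a view updates the flat array...
inducing_inputs[2:] = [1., 3., 5.]
assert param_array[4] == 5.0

# ...and writing the flat array is visible through the views.
param_array[5] = 2.5
assert rbf_variance[0] == 2.5
```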
</div>
</div>
<div class="section" id="getting-the-model-s-log-likelihood">
<h2>Getting the model&#8217;s log likelihood<a class="headerlink" href="#getting-the-model-s-log-likelihood" title="Permalink to this headline"></a></h2>
<p>Apart from printing the model, the marginal
log-likelihood can be obtained using the function
<code class="docutils literal"><span class="pre">log_likelihood()</span></code>:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">log_likelihood</span><span class="p">()</span>
<span class="go">array([-152.83377316])</span>
</pre></div>
</div>
<p>If you want the log likelihood as a plain float, wrap the call
in <cite>float()</cite>:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="nb">float</span><span class="p">(</span><span class="n">m</span><span class="o">.</span><span class="n">log_likelihood</span><span class="p">())</span>
<span class="go">-152.83377316356177</span>
</pre></div>
</div>
</div>
<div class="section" id="getting-the-model-parameter-s-gradients">
<h2>Getting the model parameter&#8217;s gradients<a class="headerlink" href="#getting-the-model-parameter-s-gradients" title="Permalink to this headline"></a></h2>
<p>The gradients of a model can shed light on a
(possibly difficult) optimization process. The gradient of each parameter
handle can be accessed through its <cite>gradient</cite> field:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="o">.</span><span class="n">gradient</span>
<span class="go">[ 5.51170031 9.71735112 -4.20282106 -3.45667035 -1.58828165</span>
<span class="go"> -2.11549358 12.40292787 -627.75467803]</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="o">.</span><span class="n">rbf</span><span class="o">.</span><span class="n">gradient</span>
<span class="go">[ -2.11549358 12.40292787]</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">optimize</span><span class="p">()</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span><span class="o">.</span><span class="n">gradient</span>
<span class="go">[ -5.98046560e-04 -3.64576085e-04 1.98005930e-04 3.43381219e-04</span>
<span class="go">-6.85685104e-04 -1.28800748e-05 1.08552429e-03 2.74058081e-01]</span>
</pre></div>
</div>
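<p>When gradients look suspicious, a standard diagnostic is to compare the analytic gradient against finite differences (GPy provides <cite>m.checkgrad()</cite> for this purpose; the toy objective below is only a standalone illustration of the idea):</p>

```python
import numpy as np

def f(x):
    # Toy objective standing in for a negative log-likelihood.
    return np.sum(x ** 2) + np.sin(x[0])

def analytic_grad(x):
    g = 2.0 * x
    g[0] += np.cos(x[0])
    return g

def numeric_grad(f, x, eps=1e-6):
    # Central finite differences, one coordinate at a time.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

x = np.array([0.3, -1.2])
# Analytic and numeric gradients should agree to roughly 1e-5.
assert np.allclose(analytic_grad(x), numeric_grad(f, x), atol=1e-5)
```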
</div>
<div class="section" id="adjusting-the-model-s-constraints">
<h2>Adjusting the model&#8217;s constraints<a class="headerlink" href="#adjusting-the-model-s-constraints" title="Permalink to this headline"></a></h2>
<p>When we initially called the example, the model had already been
optimized, hence the log-likelihood gradients were close to zero. However, since
we have been changing the parameters, the gradients are now far from zero.
Next we show how to optimize the model while setting different
restrictions on the parameters.</p>
<p>Once a constraint has been set on a parameter, it can be removed
with <code class="docutils literal"><span class="pre">unconstrain()</span></code>, which can be called on any
parameter handle of the model. The methods <cite>constrain()</cite> and
<cite>unconstrain()</cite> return the indices that were actually (un)constrained,
relative to the parameter handle the method was called on. This is
particularly handy for reporting which parameters were affected
when re-constraining a parameter that was already constrained:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">rbf</span><span class="o">.</span><span class="n">variance</span><span class="o">.</span><span class="n">unconstrain</span><span class="p">()</span>
<span class="go">array([0])</span>
<span class="go">&gt;&gt;&gt;m.unconstrain()</span>
<span class="go">array([6, 7])</span>
</pre></div>
</div>
<p>If you want to remove only a specific constraint, you can pass it
as an argument to <code class="docutils literal"><span class="pre">unconstrain(Transformation)</span></code> (<code class="xref py py-class docutils literal"><span class="pre">Transformation</span></code>), or call
the respective method, such as <code class="docutils literal"><span class="pre">unconstrain_fixed()</span></code> (or
<code class="docutils literal"><span class="pre">unfix()</span></code>), to unfix only the fixed parameters:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">inducing_input</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span><span class="o">.</span><span class="n">fix</span><span class="p">()</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">unfix</span><span class="p">()</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">m</span><span class="o">.</span><span class="n">rbf</span><span class="o">.</span><span class="n">constrain_positive</span><span class="p">()</span>
<span class="gp">&gt;&gt;&gt; </span><span class="k">print</span> <span class="n">m</span>
<span class="go">Name : sparse gp</span>
<span class="go">Log-likelihood : 620.741066698</span>
<span class="go">Number of Parameters : 8</span>
<span class="go">Parameters:</span>
<span class="go"> sparse_gp. | Value | Constraint | Prior | Tied to</span>
<span class="go"> inducing inputs | (5, 1) | | |</span>
<span class="go"> rbf.variance | 1.48329711218 | +ve | |</span>
<span class="go"> rbf.lengthscale | 2.5430947048 | +ve | |</span>
<span class="go"> Gaussian_noise.variance | 0.00229714444128 | | |</span>
</pre></div>
</div>
<p>As you can see, <code class="docutils literal"><span class="pre">unfix()</span></code> only unfixed the <cite>inducing_inputs</cite>, and did
not change the positive constraint of the kernel.</p>
<p>The parameter handles come with sensible default constraints, so you will
rarely need to adjust the constraints of a model. In the rare
cases where you do need to adjust constraints, or need to
fix some parameters, you can do so with the functions
<code class="docutils literal"><span class="pre">constrain_{positive|negative|bounded|fixed}()</span></code>:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">m</span><span class="p">[</span><span class="s">&#39;.*var&#39;</span><span class="p">]</span><span class="o">.</span><span class="n">constrain_positive</span><span class="p">()</span>
</pre></div>
</div>
</div>
<div class="section" id="available-constraints">
<h2>Available Constraints<a class="headerlink" href="#available-constraints" title="Permalink to this headline"></a></h2>
<ul class="simple">
<li><code class="xref py py-meth docutils literal"><span class="pre">Logexp()</span></code></li>
<li><code class="xref py py-meth docutils literal"><span class="pre">Exponent()</span></code></li>
<li><code class="xref py py-meth docutils literal"><span class="pre">Square()</span></code></li>
<li><code class="xref py py-meth docutils literal"><span class="pre">Logistic()</span></code></li>
<li><code class="xref py py-meth docutils literal"><span class="pre">LogexpNeg()</span></code></li>
<li><code class="xref py py-meth docutils literal"><span class="pre">NegativeExponent()</span></code></li>
<li><code class="xref py py-meth docutils literal"><span class="pre">NegativeLogexp()</span></code></li>
</ul>
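<p>Each constraint corresponds to a smooth map from an unconstrained optimisation variable to the constrained domain. The forms below are hypothetical sketches of the usual choices for a few of the constraints listed above (not GPy source): a softplus for <cite>Logexp</cite>, the exponential for <cite>Exponent</cite>, the square for <cite>Square</cite>, and a scaled sigmoid for <cite>Logistic</cite>:</p>

```python
import numpy as np

# Hypothetical sketches of the transformations listed above; each
# maps an unconstrained theta into its constrained domain.
def logexp(theta):
    return np.log1p(np.exp(theta))      # positive

def exponent(theta):
    return np.exp(theta)                # positive

def square(theta):
    return theta ** 2                   # non-negative

def logistic(theta, lo, hi):
    return lo + (hi - lo) / (1.0 + np.exp(-theta))  # bounded in (lo, hi)

theta = -0.7
assert logexp(theta) > 0
assert exponent(theta) > 0
assert square(theta) >= 0
assert 1.0 < logistic(theta, 1.0, 3.0) < 3.0
```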
</div>
<div class="section" id="tying-parameters">
<h2>Tying Parameters<a class="headerlink" href="#tying-parameters" title="Permalink to this headline"></a></h2>
<p>Not yet implemented for GPy version 0.6.0</p>
</div>
<div class="section" id="optimizing-the-model">
<h2>Optimizing the model<a class="headerlink" href="#optimizing-the-model" title="Permalink to this headline"></a></h2>
<p>Once we have finished defining the constraints,
we can optimize the model with the function
<code class="docutils literal"><span class="pre">optimize()</span></code>:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">m</span><span class="o">.</span><span class="n">Gaussian_noise</span><span class="o">.</span><span class="n">constrain_positive</span><span class="p">()</span>
<span class="n">m</span><span class="o">.</span><span class="n">rbf</span><span class="o">.</span><span class="n">constrain_positive</span><span class="p">()</span>
<span class="n">m</span><span class="o">.</span><span class="n">optimize</span><span class="p">()</span>
</pre></div>
</div>
<p>By default, GPy uses the lbfgsb optimizer.</p>
<p>The most useful optional parameters of <code class="docutils literal"><span class="pre">optimize()</span></code> are:</p>
<ul class="simple">
<li><code class="docutils literal"><span class="pre">optimizer</span></code>: which optimizer to use. Currently <code class="docutils literal"><span class="pre">lbfgsb</span></code>,
<code class="docutils literal"><span class="pre">fmin_tnc</span></code>, <code class="docutils literal"><span class="pre">scg</span></code> and <code class="docutils literal"><span class="pre">simplex</span></code> are available, or any
prefix uniquely identifying an optimizer. Thus, you can write
<code class="docutils literal"><span class="pre">m.optimize('bfgs')</span></code> to use the <code class="docutils literal"><span class="pre">lbfgsb</span></code> optimizer</li>
<li><code class="docutils literal"><span class="pre">messages</span></code>: whether the optimizer should be verbose. Each optimizer has its
own way of printing, so do not be confused by differing messages from
different optimizers</li>
<li><code class="docutils literal"><span class="pre">max_iters</span></code>: maximum number of iterations to take. Some optimizers
count function calls as iterations, others count iterations of the
algorithm. If the exact number of iterations matters, consult the
<code class="docutils literal"><span class="pre">scipy.optimize</span></code> documentation so that you can
pass the right parameters to <code class="docutils literal"><span class="pre">optimize()</span></code></li>
<li><code class="docutils literal"><span class="pre">gtol</span></code>: only for some optimizers. Sets the convergence
criterion: the tolerance on the gradient at which the optimization finishes.</li>
</ul>
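<p>The roles of <cite>max_iters</cite> and <cite>gtol</cite> can be illustrated with a miniature stand-in for <cite>optimize()</cite>. This sketch uses plain gradient descent rather than lbfgsb, purely to show what the two parameters control; it is not GPy&#8217;s code:</p>

```python
import numpy as np

def toy_optimize(grad, x0, max_iters=100, gtol=1e-6, step=0.1, messages=False):
    # Gradient descent: stop after max_iters steps, or earlier once
    # the largest gradient entry drops below gtol.
    x = np.asarray(x0, dtype=float)
    for it in range(max_iters):
        g = grad(x)
        if np.max(np.abs(g)) < gtol:   # convergence test (gtol)
            if messages:
                print("converged after", it, "iterations")
            return x
        x = x - step * g               # one descent step
    return x                           # hit the iteration budget (max_iters)

# Toy quadratic objective with its minimum at (1, -2).
grad = lambda x: 2.0 * (x - np.array([1.0, -2.0]))
x_opt = toy_optimize(grad, np.zeros(2))
assert np.allclose(x_opt, [1.0, -2.0], atol=1e-5)
```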
</div>
<div class="section" id="further-reading">
<h2>Further Reading<a class="headerlink" href="#further-reading" title="Permalink to this headline"></a></h2>
<p>All of the mechanisms for dealing
with parameters are baked right into GPy.core.model, from which all of
the classes in GPy.models inherit. To learn how to construct your own
model, you might want to read <a class="reference internal" href="tuto_creating_new_models.html#creating-new-models"><span>Creating new Models</span></a>. If you want
to learn how to create kernels, please refer to
<span class="xref std std-ref">creating_new_kernels</span>.</p>
</div>
</div>
</div>
</div>
</div>
<div class="sphinxsidebar" role="navigation" aria-label="main navigation">
<div class="sphinxsidebarwrapper">
<h3><a href="index.html">Table Of Contents</a></h3>
<ul>
<li><a class="reference internal" href="#">Interacting with models</a><ul>
<li><a class="reference internal" href="#examining-the-model-using-print">Examining the model using print</a></li>
<li><a class="reference internal" href="#interacting-with-parameters">Interacting with Parameters:</a><ul>
<li><a class="reference internal" href="#regular-expressions">Regular expressions</a></li>
<li><a class="reference internal" href="#setting-and-fetching-parameters-parameter-array">Setting and fetching parameters <cite>parameter_array</cite></a></li>
</ul>
</li>
<li><a class="reference internal" href="#getting-the-model-s-log-likelihood">Getting the model&#8217;s log likelihood</a></li>
<li><a class="reference internal" href="#getting-the-model-parameter-s-gradients">Getting the model parameter&#8217;s gradients</a></li>
<li><a class="reference internal" href="#adjusting-the-model-s-constraints">Adjusting the model&#8217;s constraints</a></li>
<li><a class="reference internal" href="#available-constraints">Available Constraints</a></li>
<li><a class="reference internal" href="#tying-parameters">Tying Parameters</a></li>
<li><a class="reference internal" href="#optimizing-the-model">Optimizing the model</a></li>
<li><a class="reference internal" href="#further-reading">Further Reading</a></li>
</ul>
</li>
</ul>
<div role="note" aria-label="source link">
<h3>This Page</h3>
<ul class="this-page-menu">
<li><a href="_sources/tuto_interacting_with_models.txt"
rel="nofollow">Show Source</a></li>
</ul>
</div>
<div id="searchbox" style="display: none" role="search">
<h3>Quick search</h3>
<form class="search" action="search.html" method="get">
<input type="text" name="q" />
<input type="submit" value="Go" />
<input type="hidden" name="check_keywords" value="yes" />
<input type="hidden" name="area" value="default" />
</form>
<p class="searchtip" style="font-size: 90%">
Enter search terms or a module, class or function name.
</p>
</div>
<script type="text/javascript">$('#searchbox').show(0);</script>
</div>
</div>
<div class="clearer"></div>
</div>
<div class="related" role="navigation" aria-label="related navigation">
<h3>Navigation</h3>
<ul>
<li class="right" style="margin-right: 10px">
<a href="genindex.html" title="General Index"
>index</a></li>
<li class="right" >
<a href="py-modindex.html" title="Python Module Index"
>modules</a> |</li>
<li class="nav-item nav-item-0"><a href="index.html">GPy documentation</a> &raquo;</li>
</ul>
</div>
<div class="footer" role="contentinfo">
&copy; Copyright 2013, Author.
Created using <a href="http://sphinx-doc.org/">Sphinx</a> 1.3.1.
</div>
</body>
</html>