diff --git a/doc/index.rst b/doc/index.rst
index 29b4cf43..4d0833a4 100644
--- a/doc/index.rst
+++ b/doc/index.rst
@@ -12,6 +12,7 @@ For a quick start, you can have a look at one of the tutorials:
 
 * `A kernel overview `_
 * `Writing new kernels `_
 * `Writing new models `_
+* `Parameterization handles `_
 
 You may also be interested by some examples in the GPy/examples folder.
diff --git a/doc/tuto_creating_new_models.rst b/doc/tuto_creating_new_models.rst
index 021b4950..5c51cdad 100644
--- a/doc/tuto_creating_new_models.rst
+++ b/doc/tuto_creating_new_models.rst
@@ -8,57 +8,44 @@ In GPy all models inherit from the base class :py:class:`~GPy.core.parameterized
 The :py:class:`~GPy.core.model.Model` class provides parameter introspection, objective function and optimization.
 
-In order to fully use all functionality of :py:class:`~GPy.core.model.Model` some methods need to be implemented / overridden. In order to explain the functionality of those methods we will use a wrapper to the numpy ``rosen`` function, which holds input parameters :math:`\mathbf{X}`. Where :math:`\mathbf{X}\in\mathbb{R}^{N\times 1}`.
+In order to fully use all functionality of
+:py:class:`~GPy.core.model.Model` some methods need to be
+implemented or overridden. In addition, the model needs to be told
+about its parameters, so that it can handle parameter distribution
+and updates efficiently. In order to explain the functionality of
+those methods we will use a wrapper to the scipy ``rosen`` function,
+which holds input parameters :math:`\mathbf{X}`, where
+:math:`\mathbf{X}\in\mathbb{R}^{N\times 1}`.
 
 Obligatory methods
 ==================
 
 :py:meth:`~GPy.core.model.Model.__init__` :
-    Initialize the model with the given parameters. In our example we have to store shape information of :math:`\mathbf X` and the parameters themselves::
+    Initialize the model with the given parameters.
+    These need to be added to the model by calling
+    ``self.add_parameter(param)``, where ``param`` needs to be a
+    parameter handle (see parameterized_ for details)::
 
-        self.X = X
-        self.num_inputs = self.X.shape[0]
-        assert self.X.ndim == 1, only vector inputs allowed
-
-:py:meth:`~GPy.core.model.Model._get_params` :
-    Return parameters of the model as a flattened numpy array-like. So, in our example we have to return the input parameters::
-
-        return self.X.flatten()
-
-:py:meth:`~GPy.core.model.Model._set_params` :
-    Set parameters, which have been fetched through :py:meth:`~GPy.core.model.Model._get_params`. In other words, "invert" the functionality of :py:meth:`~GPy.core.model.Model._get_params`::
-
-        self.X = params[:self.num_inputs*self.input_dim].reshape(self.num_inputs)
-
+        self.X = GPy.core.Param("input", X)
+        self.add_parameter(self.X)
+
 :py:meth:`~GPy.core.model.Model.log_likelihood` :
-    Returns the log-likelihood of the new model. For our example this is just the call to ``rosen``::
+    Returns the log-likelihood of the new model. For our example
+    this is just the call to ``rosen``; as we want to minimize
+    it, we need to negate the objective::
 
-        return scipy.optimize.rosen(self.X)
+        return -scipy.optimize.rosen(self.X)
 
-:py:meth:`~GPy.core.model.Model._log_likelihood_gradients` :
-    Returns the gradients with respect to all parameters::
+:py:meth:`~GPy.core.model.Model.parameters_changed` :
+    Updates the internal state of the model and sets the gradient of
+    each parameter handle in the hierarchy with respect to the
+    log_likelihood.
+    Thus here we need to put the negative derivative of
+    the Rosenbrock function::
 
-        return scipy.optimize.rosen_der(self.X)
+        self.X.gradient = -scipy.optimize.rosen_der(self.X)
 
 Optional methods
 ================
 
-If you want some special functionality please provide the following methods:
-
-Using the pickle functionality
-------------------------------
-
-To be able to use the pickle functionality ``m.pickle()`` the methods ``getstate(self)`` and ``setstate(self, state)`` have to be provided. The convention for a ``state`` in ``GPy`` is a list of all parameters, which are needed to restore the model. All classes provided in ``GPy`` follow this convention, thus you can just append to the state of the inherited class and call the inherited class' ``setstate`` with the appropriate state.
-
-:py:meth:`~GPy.core.model.Model.getstate` :
-    This method returns a state of the model, following the memento pattern. As we are inheriting from :py:class:`~GPy.core.model.Model`, we have to return the state of Model as well. In out example we have `X` and `num_inputs` as state::
-
-        return Model.getstate(self) + [self.X, self.num_inputs]
-
-:py:meth:`~GPy.core.model.Model.setstate` :
-    This method restores this model with the given ``state``::
-
-        self.num_inputs = state.pop()
-        self.X = state.pop()
-        return Model.setstate(self, state)
\ No newline at end of file
+Currently none.
diff --git a/doc/tuto_parameterized.rst b/doc/tuto_parameterized.rst
new file mode 100644
index 00000000..507ec109
--- /dev/null
+++ b/doc/tuto_parameterized.rst
@@ -0,0 +1,23 @@
+.. _parameterized:
+
+*************************
+Parameterization handling
+*************************
+
+Parameterization in GPy is done through so-called parameter handles. A parameter handle is a handle to a parameter of a model of any kind. A parameter handle can be constrained, fixed, randomized and more. All parameters in GPy have a name, by which they can be accessed in the model.
+The most common way of accessing a parameter programmatically, though, is by variable name.
+
+Parameter handles
+=================
+
+A parameter handle in GPy is, as the name suggests, a handle on a parameter. A parameter can be constrained, fixed, randomized and more (see e.g. `working with models`). This gives the model the freedom to handle parameter distribution and model updates as efficiently as possible. All parameter handles share a common memory space, which is just a flat numpy array, stored in the highest parent of a model hierarchy.
+In the following we will introduce the different parameter handles which exist in GPy.
+
+:py:class:`~GPy.core.parameterization.parameterized.Parameterized`
+==================================================================
+
+A parameterized object itself holds parameter handles and is just a summarization of the parameters below it. It can use those parameters to change the internal state of the model, and GPy ensures that those parameters always hold the right values during an optimization routine or any other update.
+
+:py:class:`~GPy.core.parameterization.param.Param`
+==================================================
+
+The lowest level of parameter is a numpy array. The Param class inherits all functionality of a numpy array and can simply be used as if it were a numpy array. These parameters can be indexed in the same way as a numpy array.
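The new tutorial text above boils the model down to a ``log_likelihood`` that negates ``rosen`` and a ``parameters_changed`` that negates ``rosen_der``. The sketch below mirrors that pattern without depending on GPy: ``RosenbrockSketch`` is a hypothetical stand-in (a real model would subclass ``GPy.core.Model`` and register ``self.X`` via ``self.add_parameter``, with the gradient stored on the ``Param`` handle), and the local ``rosen``/``rosen_der`` are numpy-only equivalents of ``scipy.optimize.rosen``/``rosen_der``.

```python
import numpy as np


def rosen(x):
    """Rosenbrock function (equivalent to scipy.optimize.rosen)."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                        + (1.0 - x[:-1]) ** 2))


def rosen_der(x):
    """Gradient of the Rosenbrock function (equivalent to scipy.optimize.rosen_der)."""
    x = np.asarray(x, dtype=float)
    der = np.zeros_like(x)
    der[1:-1] = (200.0 * (x[1:-1] - x[:-2] ** 2)
                 - 400.0 * x[1:-1] * (x[2:] - x[1:-1] ** 2)
                 - 2.0 * (1.0 - x[1:-1]))
    der[0] = -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0])
    der[-1] = 200.0 * (x[-1] - x[-2] ** 2)
    return der


class RosenbrockSketch:
    """Hypothetical stand-in for the GPy model in the tutorial (not the real API).

    A real implementation would subclass GPy.core.Model and wrap X in a
    parameter handle: self.X = GPy.core.Param("input", X);
    self.add_parameter(self.X).
    """

    def __init__(self, X):
        self.X = np.asarray(X, dtype=float)
        self.X_gradient = np.zeros_like(self.X)
        self.parameters_changed()

    def log_likelihood(self):
        # Negate rosen so that maximizing the likelihood minimizes the function.
        return -rosen(self.X)

    def parameters_changed(self):
        # Gradient of the log-likelihood w.r.t. the parameters, hence also negated.
        self.X_gradient = -rosen_der(self.X)
```

At the minimum of ``rosen``, :math:`\mathbf{X} = (1, \dots, 1)`, the log-likelihood is 0 and the gradient vanishes; at other points the stored gradient agrees with a finite-difference estimate of the negated objective.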