Low-level interface

The class SVGPFAModelFactory creates an svGPFA model, and an instance of the class SVEM optimizes its parameters. Please refer to the svGPFA class diagram.
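For orientation, the following is a minimal usage sketch of this interface. It assumes kernels, spike times, and initialization/optimization parameters have been prepared elsewhere, and the argument names are illustrative assumptions rather than the package's exact signatures.

    # sketch only: argument names are illustrative, not the exact signatures
    import stats.svGPFA.svGPFAModelFactory
    import stats.svGPFA.svEM

    # build an svGPFA model for point-process observations with an
    # exponential link function and a linear embedding
    model = stats.svGPFA.svGPFAModelFactory.SVGPFAModelFactory.buildModel(
        conditionalDist=stats.svGPFA.svGPFAModelFactory.PointProcess,
        linkFunction=stats.svGPFA.svGPFAModelFactory.ExponentialLink,
        embeddingType=stats.svGPFA.svGPFAModelFactory.LinearEmbedding,
        kernels=kernels)

    # fit the model by maximizing the sparse variational lower bound
    # (SVLowerBound, described below)
    svEM = stats.svGPFA.svEM.SVEM()
    lowerBoundHist, elapsedTimeHist = svEM.maximize(
        model=model, measurements=spikesTimes, initialParams=initialParams,
        quadParams=quadParams, optimParams=optimParams)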

There is a one-to-one mapping between classes in the stats.svGPFA package and equations in Duncker and Sahani, 2018.

  • Class SVLowerBound corresponds to the right-hand side of Eq.4. This class uses the ExpectedLogLikelihood and KLDivergence classes, described next.

  • The abstract class ExpectedLogLikelihood corresponds to the first term of the right-hand side of Eq.4.

    1. The abstract subclass PointProcessELL implements the functionality of ExpectedLogLikelihood for point-process observations, and corresponds to Eq.7. If the link function (i.e., g in Eq.7) is the exponential function, then the one-dimensional integral in the first term of Eq.7 can be solved analytically (concrete subclass PointProcessELLExpLink). For other link functions this integral can be solved using Gaussian quadrature (concrete subclass PointProcessELLQuad). Both strategies are illustrated in the sketch after this list.

    2. Similarly, the subclasses PoissonELL, PoissonELLExpLink and PoissonELLQuad implement the functionality of ExpectedLogLikelihood for Poisson observations.
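    The computational distinction between the ExpLink and Quad subclasses can be seen on the one-dimensional Gaussian expectation \(\mathbb{E}[g(h)]\) inside the first term of Eq.7. A minimal sketch (not the package's implementation):

        import numpy as np

        mu, sigma2 = 0.3, 0.5  # embedding mean and variance at one time point

        # exponential link: with h ~ N(mu, sigma2), E[exp(h)] has the closed
        # form exp(mu + sigma2 / 2) (the mean of a log-normal), which is what
        # makes the analytic subclasses possible
        analytic = np.exp(mu + sigma2 / 2)

        # arbitrary link g: approximate E[g(h)] with Gauss-Hermite quadrature,
        # the strategy of the quadrature-based subclasses
        def expected_link(g, mu, sigma2, n_quad=50):
            x, w = np.polynomial.hermite.hermgauss(n_quad)
            return np.sum(w * g(mu + np.sqrt(2.0 * sigma2) * x)) / np.sqrt(np.pi)

        print(analytic)                           # 1.7332...
        print(expected_link(np.exp, mu, sigma2))  # matches the closed form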

  • The concrete class KLDivergence corresponds to the second term of the right-hand side of Eq.4 and implements the KL divergence between the prior on inducing points, \(p(\mathbf{u}_k^{(r)})\), and the posterior on inducing points, \(q(\mathbf{u}_k^{(r)})\).
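    Because both distributions are Gaussian, this term has a standard closed form. A minimal sketch of \(KL\left[q(\mathbf{u}_k^{(r)})\,\|\,p(\mathbf{u}_k^{(r)})\right]\) with \(q = \mathcal{N}(\mathbf{m}, S)\) and \(p = \mathcal{N}(\mathbf{0}, K_{zz})\); variable names follow this section, not the package's internals:

        import numpy as np

        def gaussian_kl(m, S, Kzz):
            """KL(N(m, S) || N(0, Kzz)) via Cholesky factors."""
            M = m.shape[0]
            Lp = np.linalg.cholesky(Kzz)  # prior covariance factor
            Lq = np.linalg.cholesky(S)    # posterior covariance factor
            trace_term = np.sum(np.linalg.solve(Lp, Lq) ** 2)  # tr(Kzz^{-1} S)
            mahalanobis = np.sum(np.linalg.solve(Lp, m) ** 2)  # m^T Kzz^{-1} m
            log_det_ratio = 2.0 * (np.log(np.diag(Lp)).sum()
                                   - np.log(np.diag(Lq)).sum())
            return 0.5 * (trace_term + mahalanobis - M + log_det_ratio)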

  • ExpectedLogLikelihood uses SVEmbedding, which calculates the mean and variance of the svGPFA embedding (\(h_n^{(r)}\) in Eq.1), given in Eq.5 (see the sketch after this list). SVEmbedding is an abstract class, which has LinearSVEmbedding as an abstract subclass. Two concrete subclasses of LinearSVEmbedding are provided, which optimize the calculation of the embedding for two different uses in Eq.7.

    1. The first term in the right-hand side of Eq.7 requires the calculation of the embedding at sample times in a grid, which are the same for all neurons. This calculation is implemented in LinearSVEmbeddingAllTimes.

    2. The second term in the right-hand side of Eq.7 requires the calculation of the embedding at spike times, which are different for each neuron. This calculation is implemented in LinearSVEmbeddingAssocTimes.
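    For the linear embedding of Eq.1, \(h_n^{(r)}(t) = \sum_k c_{nk}x_k^{(r)}(t) + d_n\), these moments reduce to an affine map of the latent moments. A minimal sketch under the assumption of independent latents:

        import numpy as np

        def embedding_moments(C, d, latent_means, latent_vars):
            """C: (n_neurons, n_latents); d: (n_neurons,);
            latent_means, latent_vars: (n_latents, n_times)."""
            mean_h = C @ latent_means + d[:, None]  # E[h_n(t)]
            var_h = (C ** 2) @ latent_vars          # Var[h_n(t)], latents independent
            return mean_h, var_h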

  • SVEmbedding uses SVPosteriorOnLatents, which calculates the mean and variance of the latent variables, \(x_k^{(r)}\) in Eq.1. These means and variances are not described by their own equations in Duncker and Sahani, 2018, but are embedded in Eq.5. They are

    \[
    \begin{aligned}
    \nu_k^{(r)}(t) &= \mathbf{\kappa}_k(t,\mathbf{z}_k)K_{zz}^{(k)^{-1}}\mathbf{m}_k^{(r)}\\
    \sigma_k^{(r)}(t) &= \kappa_k(t,t)+\mathbf{\kappa}_k(t,\mathbf{z}_k)\left(K_{zz}^{(k)^{-1}}S_k^{(r)}K_{zz}^{(k)^{-1}}-K_{zz}^{(k)^{-1}}\right)\mathbf{\kappa}_k(\mathbf{z}_k,t)
    \end{aligned}
    \]

    SVPosteriorOnLatents is an abstract class. As above, two concrete subclasses are provided. SVPosteriorOnLatentsAllTimes computes the means and variances in a grid of time points and SVPosteriorOnLatentsAssocTimes calculates these statistics at spike times.
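    A minimal sketch evaluating these two moments for one latent \(k\) and one trial \(r\), assuming \(K_{tz}\), the diagonal of \(K_{tt}\), and \(K_{zz}\) have been built elsewhere (illustrative names, not the package's API):

        import numpy as np

        def latent_moments(Ktz, ktt_diag, Kzz, m, S):
            """Ktz: (n_times, n_ind); ktt_diag: (n_times,);
            m: (n_ind,); S: (n_ind, n_ind)."""
            L = np.linalg.cholesky(Kzz)
            # A = Kzz^{-1} Ktz^T enters both moments
            A = np.linalg.solve(L.T, np.linalg.solve(L, Ktz.T))
            nu = A.T @ m  # mean: Ktz Kzz^{-1} m
            # variance: ktt + diag(Ktz (Kzz^{-1} S Kzz^{-1} - Kzz^{-1}) Ktz^T)
            var = ktt_diag + np.sum(A * (S @ A), axis=0) - np.sum(Ktz.T * A, axis=0)
            return nu, var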

  • SVPosteriorOnLatents uses KernelMatricesStore, which stores kernel matrices between inducing points, \(K_{zz}\), between time points, \(K_{tt}\), and between time points and inducing points, \(K_{tz}\). KernelMatricesStore is an abstract class with two subclasses. IndPointsLocsKMS is a concrete subclass of KernelMatricesStore that stores kernel matrices between inducing points, and their Cholesky decompositions. IndPointsLocsAndTimesKMS is an abstract subclass of KernelMatricesStore which stores covariance matrices between time points and between time points and inducing points. As above, IndPointsLocsAndAllTimesKMS and IndPointsLocsAndAssocTimesKMS are concrete subclasses of IndPointsLocsAndTimesKMS for time points in a grid and for spike times, respectively.
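    The store exists for reuse: \(K_{zz}^{(k)}\) and its Cholesky factor appear in several terms of the lower bound, so they are computed once and cached. A minimal sketch of this idea, using the kernel interface sketched in the next item (illustrative names, not the package's API):

        import numpy as np

        class IndPointsLocsKMSSketch:
            """Caches Kzz and its Cholesky factor for reuse (illustrative)."""
            def __init__(self, kernel, ind_points_locs):
                self._kernel = kernel
                self._Z = ind_points_locs

            def build_kernel_matrices(self, jitter=1e-6):
                # a small jitter on the diagonal stabilizes the factorization
                self._Kzz = (self._kernel.buildKernelMatrix(self._Z, self._Z)
                             + jitter * np.eye(len(self._Z)))
                self._Kzz_chol = np.linalg.cholesky(self._Kzz)

            def get_Kzz_chol(self):
                return self._Kzz_chol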

  • KernelMatricesStore uses Kernel, which is an abstract class for constructing kernel matrices. Concrete subclasses construct kernel matrices for specific types of kernels (e.g., ExponentialQuadraticKernel and PeriodicKernel).
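    A minimal sketch of one such concrete subclass, the exponential quadratic (squared exponential) kernel \(\kappa(t,t') = \sigma^2\exp\left(-\frac{(t-t')^2}{2\ell^2}\right)\); the package's actual parameterization may differ:

        import numpy as np

        class ExponentialQuadraticKernelSketch:
            """kappa(t, t') = scale^2 * exp(-(t - t')^2 / (2 * lengthscale^2))."""
            def __init__(self, scale=1.0, lengthscale=1.0):
                self._scale = scale
                self._lengthscale = lengthscale

            def buildKernelMatrix(self, X1, X2):
                # pairwise squared distances between 1-D time points
                sq_dists = (X1[:, None] - X2[None, :]) ** 2
                return self._scale ** 2 * np.exp(
                    -0.5 * sq_dists / self._lengthscale ** 2)

        # e.g., the Kzz block used throughout this section:
        Z = np.linspace(0.0, 1.0, 10)  # inducing-point locations
        Kzz = ExponentialQuadraticKernelSketch().buildKernelMatrix(Z, Z)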