MultiQuantityGPs

Documentation for MultiQuantityGPs.

MultiQuantityGPs.BoundsType
struct NamedTuple{(:lower, :upper), Tuple{Vector{Float64}, Vector{Float64}}}

The bounds of the region. Consists of the lower and upper bounds, each a list of floating-point values.
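For instance, a unit-square region could be written as the following NamedTuple (hypothetical values):

```julia
# Bounds for a unit square: lower and upper corners as Float64 vectors.
bounds = (lower = [0.0, 0.0], upper = [1.0, 1.0])
```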

source
MultiQuantityGPs.LinearModelType

A multivariate linear model of $Y$ dependent on $X$ with parameters $a$ and $b$ of the form

\[Y = a + b^T X\]

X and Y are matrices containing the points as columns.

source
MultiQuantityGPs.LinearModelMethod

Returns a linear model of a set of variables $Y$ conditioned on a set $X$. Requires the full mean vector and covariance matrix of the joint normal distribution.
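The standard Gaussian conditioning identities give $b^T = \Sigma_{YX} \Sigma_{XX}^{-1}$ and $a = \mu_Y - b^T \mu_X$. A minimal sketch of this derivation (the helper name and indexing are illustrative, not the package's internal code):

```julia
using LinearAlgebra

# Derive the linear model Y | X from a joint normal N(μ, Σ).
# iy and ix are the indices of the Y and X variables within the joint.
function conditional_linear_model(μ, Σ, iy, ix)
    ΣXX = Σ[ix, ix]
    ΣYX = Σ[iy, ix]
    bT = ΣYX / ΣXX            # b' = Σ_YX Σ_XX⁻¹
    a = μ[iy] - bT * μ[ix]    # a  = μ_Y − b' μ_X
    return a, bT
end

μ = [1.0, 2.0, 3.0]
Σ = [2.0 0.5 0.3;
     0.5 1.0 0.2;
     0.3 0.2 1.5]
a, bT = conditional_linear_model(μ, Σ, [1], [2, 3])
y = a + bT * [2.0, 3.0]       # predicted mean of Y at x = μ_X, so y ≈ μ_Y
```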

source
MultiQuantityGPs.MQGPType
struct MQGP{T}

Belief model struct and function for multiple quantities with 2D inputs.

Designed on top of a Multi-Quantity Gaussian Process, but can still be used with a single quantity.

Its interface: X -> μ, σ (SampleInputs -> means, standard deviations)

source
MultiQuantityGPs.MQGPMethod
MQGP(
    samples,
    bounds::@NamedTuple{lower::Vector{Float64}, upper::Vector{Float64}};
    N,
    kernel,
    means,
    noise,
    use_cond_pdf
) -> MQGP{typeof(MultiQuantityGPs.Kernels.multiKernel)}

Creates and returns an MQGP with hyperparameters trained and conditioned on the given samples. The lower and upper bounds are used to initialize one of the hyperparameters.

A noise standard deviation can optionally be passed in, either as a single scalar value applied to all samples or as a vector of values, one for each sample.

Examples

# create a MQGP
beliefModel = MQGP([M.prior_samples; samples], bounds)
source
MultiQuantityGPs.MQGPMethod

Inputs:

  • X: a single sample input or an array of multiple
  • full_cov: (optional) if this is true, returns the full covariance matrix in place of the vector of standard deviations

Outputs:

  • μ, σ: a pair of expected values and uncertainties (standard deviations) for the given point(s)

Examples

X = [([.1, .2], 1),
     ([.2, .1], 2)]
μ, σ = beliefModel(X) # result: [μ1, μ2], [σ1, σ2]
source
MultiQuantityGPs.createLossFuncMethod
createLossFunc(
    X,
    Y_vals,
    Y_errs,
    kernel,
    use_cond_pdf
) -> MultiQuantityGPs.var"#15#16"

This function creates the loss function for training the GP. The negative log marginal likelihood is used.
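The negative log marginal likelihood of a GP with kernel matrix $K$ (noise included) and observations $y$ is $\tfrac{1}{2} y^T K^{-1} y + \tfrac{1}{2} \log|K| + \tfrac{n}{2}\log 2\pi$. A self-contained sketch of that quantity (illustrative, not the package's exact loss code):

```julia
using LinearAlgebra

# Negative log marginal likelihood of a zero-mean GP.
function nlml(K, y)
    n = length(y)
    C = cholesky(Symmetric(K))   # K must be positive definite
    α = C \ y
    return 0.5 * dot(y, α) + 0.5 * logdet(C) + 0.5 * n * log(2π)
end

# With K = I and y = 0 the likelihood is a product of standard normals,
# so the NLML reduces to (n/2)·log(2π) = log(2π) for n = 2.
nlml(Matrix{Float64}(I, 2, 2), [0.0, 0.0])
```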

source
MultiQuantityGPs.fullCovMethod
fullCov(
    bm::MQGP,
    X::AbstractArray{Tuple{Vector{Float64}, Int64}}
) -> Any

Returns the full covariance matrix for the belief model.

source
MultiQuantityGPs.latentCovMatMethod
latentCovMat(
    bm::MQGP{typeof(MultiQuantityGPs.Kernels.multiKernel)}
) -> Any

Gives the covariance matrix between all latent functions from the hyperparameters.

source
MultiQuantityGPs.meanDerivAndVarMethod
meanDerivAndVar(
    bm::MQGP,
    x::Tuple{Vector{Float64}, Int64}
) -> Tuple{Any, Any}

Returns the normed gradient of the mean of the belief model and its variance.

source
MultiQuantityGPs.optimizeLossMethod
optimizeLoss(lossFunc, θ0; solver, iterations) -> Any

Routine to optimize the lossFunc and return the optimal parameters θ.

Can pass in a different solver. NelderMead is chosen as the default for better speed with about the same performance as LBFGS.

source
MultiQuantityGPs.quantityCovMatMethod
quantityCovMat(bm::MQGP) -> Any

Gives the covariance matrix between all quantities from the hyperparameters. The model of the quantities is the latent functions plus their measurement noise.
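That relationship can be sketched directly: the quantity covariance is the latent covariance plus each quantity's measurement noise variance on the diagonal (illustrative values and names, not the package's internals):

```julia
using LinearAlgebra

latent_cov = [1.0 0.4;        # covariance between the latent functions
              0.4 0.8]
noise_sd   = [0.1, 0.2]       # per-quantity measurement noise std devs

# Measurement noise only affects each quantity's own variance.
quantity_cov = latent_cov + Diagonal(noise_sd .^ 2)
```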

source
MultiQuantityGPs.Kernels.SLFMMOKernelType
SLFMMOKernel(g::AbstractVector{<:Kernel}, A::AbstractMatrix)

Kernel associated with the semiparametric latent factor model.

Definition

For inputs $x, x'$ and output dimensions $p, p'$, the kernel is defined as

\[k\big((x, p), (x', p')\big) = \sum^{Q}_{q=1} A_{p q}g_q(x, x')A_{p' q},\]

where $g_1, \ldots, g_Q$ are $Q$ kernels, one for each latent process, and $A$ is a matrix of weights for the kernels of size $m \times Q$.
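The definition above can be evaluated directly. A hand-rolled sketch with $Q = 2$ squared-exponential latent kernels (not the package's SLFMMOKernel type):

```julia
# Squared-exponential base kernel with length-scale ℓ.
sqexp(ℓ) = (x, x′) -> exp(-sum(abs2, x .- x′) / (2ℓ^2))

g = [sqexp(1.0), sqexp(2.0)]            # Q latent kernels
A = [1.0 0.5;                           # m × Q weight matrix
     0.3 2.0]

# k((x,p), (x′,p′)) = Σ_q A[p,q] · g_q(x,x′) · A[p′,q]
slfm_k(x, p, x′, p′) = sum(A[p, q] * g[q](x, x′) * A[p′, q] for q in eachindex(g))

# At x = x′ every g_q is 1, so the variance of output 1 is 1² + 0.5² = 1.25.
slfm_k([0.1, 0.2], 1, [0.1, 0.2], 1)
```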

source
MultiQuantityGPs.Kernels.customKernelMethod
customKernel(
    θ
) -> MultiQuantityGPs.Kernels.CustomMOKernel{_A, <:AbstractMatrix{T}} where {_A, T}

Creates a custom kernel function for the GP similar to the slfmKernel but with matrices of length-scales and amplitudes.

This one does not work and is likely not theoretically valid.

source
MultiQuantityGPs.Kernels.fullyConnectedCovMatMethod
fullyConnectedCovMat(a) -> Any

Creates an output covariance matrix from an array of parameters by filling a lower triangular matrix.

Inputs:

  • a: parameter vector, must hold (N+1)*N/2 parameters, where N = number of outputs
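A minimal sketch of the construction: fill a lower-triangular matrix $L$ from the parameter vector, then form $L L^T$ to get a valid covariance matrix (illustrative; the package's fill order may differ):

```julia
using LinearAlgebra

# Build an N×N output covariance from (N+1)N/2 parameters.
function lower_tri_cov(a, N)
    @assert length(a) == (N + 1) * N ÷ 2
    L = zeros(N, N)
    k = 1
    for j in 1:N, i in j:N   # fill the lower triangle column by column
        L[i, j] = a[k]
        k += 1
    end
    return L * L'            # symmetric positive semidefinite by construction
end

# L = [1.0 0.0; 0.5 2.0]  →  L·L' = [1.0 0.5; 0.5 4.25]
lower_tri_cov([1.0, 0.5, 2.0], 2)
```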
source
MultiQuantityGPs.Kernels.initHyperparamsMethod
initHyperparams(
    X,
    Y_vals,
    bounds,
    N,
    ::typeof(MultiQuantityGPs.Kernels.mtoKernel);
    kwargs...
) -> NamedTuple{(:σ, :ℓ), <:Tuple{Any, Any}}

Creates the structure of hyperparameters for a MTGP and gives them initial values. This is for a specialized quantity covariance matrix with separation.

source
MultiQuantityGPs.Kernels.initHyperparamsMethod
initHyperparams(
    X,
    Y_vals,
    bounds,
    N,
    ::typeof(MultiQuantityGPs.Kernels.multiKernel);
    kwargs...
) -> NamedTuple{(:σ, :ℓ), <:Tuple{Any, Any}}

Creates the structure of hyperparameters for a MTGP and gives them initial values.

source
MultiQuantityGPs.Kernels.initHyperparamsMethod
initHyperparams(
    X,
    Y_vals,
    bounds,
    N,
    ::typeof(MultiQuantityGPs.Kernels.slfmKernel);
    kwargs...
) -> NamedTuple{(:σ, :ℓ), <:Tuple{Any, Any}}

Creates the structure of hyperparameters for a SLFM and gives them initial values.

source
MultiQuantityGPs.Kernels.manyToOneCovMatMethod
manyToOneCovMat(a) -> Any

Creates an output covariance matrix from an array of parameters by filling the first column and diagonal of a lower triangular matrix.

Inputs:

  • a: parameter vector, must hold 2N-1 parameters, where N = number of outputs
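A sketch of the many-to-one structure described above: only the first column and the diagonal of the lower-triangular factor are filled, consuming $2N-1$ parameters, so every output couples to output 1 but not directly to each other (illustrative; details may differ from the package's implementation):

```julia
using LinearAlgebra

# Build an N×N output covariance from 2N−1 parameters.
function many_to_one_cov(a, N)
    @assert length(a) == 2N - 1
    L = zeros(N, N)
    L[:, 1] = a[1:N]               # first column: couples all outputs to output 1
    for i in 2:N
        L[i, i] = a[N + i - 1]     # remaining diagonal entries
    end
    return L * L'
end

many_to_one_cov([1.0, 0.5, 0.3, 2.0, 1.5], 3)
```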
source
MultiQuantityGPs.Kernels.multiKernelMethod
multiKernel(θ) -> KernelFunctions.IntrinsicCoregionMOKernel

A multi-task GP kernel: a variety of multi-output GP kernel based on the Intrinsic Coregionalization Model, with a squared-exponential base kernel and an output matrix formed from a lower-triangular matrix.

This function creates the kernel function used within the GP.
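Under the ICM, the kernel factorizes as $k\big((x,p),(x',p')\big) = B_{p p'} \, k_{\mathrm{SE}}(x, x')$, with $B = L L^T$ from a lower-triangular factor. A hand-rolled evaluation of that idea (a sketch only; the actual function returns a KernelFunctions.IntrinsicCoregionMOKernel):

```julia
using LinearAlgebra

# Squared-exponential base kernel with length-scale ℓ.
k_se(x, x′; ℓ = 1.0) = exp(-sum(abs2, x .- x′) / (2ℓ^2))

L = [1.0 0.0;
     0.5 2.0]
B = L * L'                      # output (coregionalization) matrix

# ICM kernel: output covariance scaled by the base kernel.
icm(x, p, x′, p′) = B[p, p′] * k_se(x, x′)

# At identical inputs, k_se = 1, so the value is just B[1,2] = 0.5.
icm([0.0, 0.0], 1, [0.0, 0.0], 2)
```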

source
MultiQuantityGPs.Kernels.multiMeanMethod
multiMean(
    θ
) -> AbstractGPs.CustomMean{Tf} where Tf<:MultiQuantityGPs.Kernels.var"#13#14"

Creates a quantity-specific constant mean function from the GP hyperparameters.

source
MultiQuantityGPs.Kernels.slfmKernelMethod
slfmKernel(
    θ
) -> MultiQuantityGPs.Kernels.SLFMMOKernel{_A, <:AbstractMatrix{T}} where {_A, T}

Creates a semi-parametric latent factor model (SLFM) kernel function for the GP.

source