MultiQuantityGPs
Documentation for MultiQuantityGPs.
MultiQuantityGPs.MultiQuantityGPs
MultiQuantityGPs.Bounds
MultiQuantityGPs.Kernels.SLFMMOKernel
MultiQuantityGPs.LinearModel
MultiQuantityGPs.LinearModel
MultiQuantityGPs.Location
MultiQuantityGPs.MQGP
MultiQuantityGPs.MQGP
MultiQuantityGPs.MQGP
MultiQuantityGPs.SampleInput
MultiQuantityGPs.Kernels.customKernel
MultiQuantityGPs.Kernels.fullyConnectedCovMat
MultiQuantityGPs.Kernels.fullyConnectedCovNum
MultiQuantityGPs.Kernels.initHyperparams
MultiQuantityGPs.Kernels.initHyperparams
MultiQuantityGPs.Kernels.initHyperparams
MultiQuantityGPs.Kernels.manyToOneCovMat
MultiQuantityGPs.Kernels.manyToOneCovNum
MultiQuantityGPs.Kernels.mtoKernel
MultiQuantityGPs.Kernels.multiKernel
MultiQuantityGPs.Kernels.multiMean
MultiQuantityGPs.Kernels.singleKernel
MultiQuantityGPs.Kernels.slfmKernel
MultiQuantityGPs.createLossFunc
MultiQuantityGPs.fullCov
MultiQuantityGPs.latentCovMat
MultiQuantityGPs.meanDerivAndVar
MultiQuantityGPs.optimizeLoss
MultiQuantityGPs.quantityCorMat
MultiQuantityGPs.quantityCovMat
MultiQuantityGPs.MultiQuantityGPs — Module
This module contains everything to do with what is inferred about values in the environment. In practical terms: means, variances, and correlations. This is all built on Gaussian Processes.
Main public types and functions:
MultiQuantityGPs.Bounds — Type
struct NamedTuple{(:lower, :upper), Tuple{Vector{Float64}, Vector{Float64}}}
The bounds of the region. Consists of the lower and upper bounds, each a list of floating-point values.
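For illustration, a hypothetical 2D region could be specified as:
bounds = (lower = [0.0, 0.0], upper = [1.0, 1.0])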
MultiQuantityGPs.LinearModel — Type
A multivariate linear model of $y$ dependent on $x$ with parameters $a$ and $b$ of the form
\[Y = a + b^T X\]
X and Y are matrices containing the points as columns.
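A minimal numerical sketch of this form with arbitrary a, b, and X (it only illustrates the equation, not the LinearModel constructor itself):
a = [1.0, 2.0]           # one entry per output dimension
b = [0.5 0.0; 0.0 2.0]   # weight matrix
X = [0.1 0.3; 0.2 0.4]   # two input points as columns
Y = a .+ b' * X          # each column of Y is a + bᵀx for the matching column of X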
MultiQuantityGPs.LinearModel — Method
Returns a linear model of a set of variables Y conditioned on a set of variables X. Requires the full mean vector and covariance matrix of the joint normal distribution.
MultiQuantityGPs.Location — Type
mutable struct Array{Float64, 1} <: DenseVector{Float64}
Location of a sample.
MultiQuantityGPs.MQGP — Type
struct MQGP{T}
Belief model struct and function for multiple quantities with 2D inputs.
Designed on top of a Multi-Quantity Gaussian Process, but can still be used with a single quantity.
Its interface: X -> μ, σ (SampleInputs -> means, standard deviations)
MultiQuantityGPs.MQGP — Method
MQGP(
samples,
bounds::@NamedTuple{lower::Vector{Float64}, upper::Vector{Float64}};
N,
kernel,
means,
noise,
use_cond_pdf
) -> MQGP{typeof(MultiQuantityGPs.Kernels.multiKernel)}
Creates and returns a MQGP with hyperparameters trained and conditioned on the samples given. Lower and upper bounds are used to initialize one of the hyperparameters.
A noise standard deviation can optionally be passed in either as a single scalar value for all samples or a vector of values, one for each sample.
Examples
# create a MQGP
beliefModel = MQGP([M.prior_samples; samples], bounds)
MultiQuantityGPs.MQGP — Method
Inputs:
X: a single sample input or an array of multiple sample inputs
full_cov: (optional) if true, returns the full covariance matrix in place of the vector of standard deviations
Outputs:
μ, σ: a pair of expected value(s) and uncertainty(s) for the given point(s)
Examples
X = [([.1, .2], 1),
([.2, .1], 2)]
μ, σ = beliefModel(X) # result: [μ1, μ2], [σ1, σ2]
MultiQuantityGPs.SampleInput — Type
struct Tuple{Vector{Float64}, Int64}
Sample input, the combination of: (Location, sensor index)
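For example, a single sample input for sensor 2 at a hypothetical location:
x = ([0.4, 0.6], 2)   # (Location, sensor index)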
MultiQuantityGPs.createLossFunc — Method
createLossFunc(
X,
Y_vals,
Y_errs,
kernel,
use_cond_pdf
) -> MultiQuantityGPs.var"#15#16"
This function creates the loss function for training the GP. The negative log marginal likelihood is used.
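A rough sketch of how this fits into training, using placeholder data and the optimizeLoss routine documented below:
lossFunc = createLossFunc(X, Y_vals, Y_errs, kernel, use_cond_pdf)   # negative log marginal likelihood of the data
θ = optimizeLoss(lossFunc, θ0)                                       # hyperparameters minimizing the loss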
MultiQuantityGPs.fullCov — Method
fullCov(
bm::MQGP,
X::AbstractArray{Tuple{Vector{Float64}, Int64}}
) -> Any
Returns the full covariance matrix for the belief model.
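Continuing the earlier example, where beliefModel and X were defined:
Σ = fullCov(beliefModel, X)   # full covariance matrix over the sample inputs in X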
MultiQuantityGPs.latentCovMat — Method
latentCovMat(
bm::MQGP{typeof(MultiQuantityGPs.Kernels.multiKernel)}
) -> Any
Gives the covariance matrix between all latent functions from the hyperparameters.
MultiQuantityGPs.meanDerivAndVar — Method
meanDerivAndVar(
bm::MQGP,
x::Tuple{Vector{Float64}, Int64}
) -> Tuple{Any, Any}
Returns the normed gradient of the mean of the belief model and its variance.
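For example, at a hypothetical sample input:
deriv_norm, var = meanDerivAndVar(beliefModel, ([0.1, 0.2], 1))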
MultiQuantityGPs.optimizeLoss — Method
optimizeLoss(lossFunc, θ0; solver, iterations) -> Any
Routine to optimize the lossFunc and return the optimal parameters θ.
A different solver can be passed in. NelderMead is the default, chosen for better speed with about the same performance as LBFGS.
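A hedged sketch of passing a different solver, assuming the solver types come from Optim.jl (an assumption, not stated on this page):
using Optim
θ = optimizeLoss(lossFunc, θ0; solver=LBFGS(), iterations=500)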
MultiQuantityGPs.quantityCorMat — Method
quantityCorMat(beliefModel::MQGP)
Gives the correlation matrix between all quantities from the hyperparameters.
MultiQuantityGPs.quantityCovMat — Method
quantityCovMat(bm::MQGP) -> Any
Gives the covariance matrix between all quantities from the hyperparameters. The model of the quantities is the latent functions plus their measurement noise.
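For example, given a trained beliefModel:
C = quantityCorMat(beliefModel)    # correlations between quantities
Σq = quantityCovMat(beliefModel)   # covariances, latent functions plus measurement noise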
MultiQuantityGPs.Kernels.SLFMMOKernel — Type
SLFMMOKernel(g::AbstractVector{<:Kernel}, A::AbstractMatrix)
Kernel associated with the semiparametric latent factor model.
Definition
For inputs $x, x'$ and output dimensions $p, p'$, the kernel is defined as
\[k\big((x, p), (x', p')\big) = \sum^{Q}_{q=1} A_{p q}g_q(x, x')A_{p' q},\]
where $g_1, \ldots, g_Q$ are $Q$ kernels, one for each latent process, and $A$ is a matrix of weights for the kernels of size $m \times Q$.
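A minimal construction sketch, assuming base kernels from KernelFunctions.jl, arbitrary weights, and the usual multi-output calling convention of (input, output index) pairs:
using KernelFunctions, MultiQuantityGPs
g = [SqExponentialKernel(), Matern52Kernel()]     # Q = 2 latent kernels
A = [1.0 0.5; 0.2 1.0; 0.3 0.3]                   # m = 3 outputs, Q = 2 weights each
k = MultiQuantityGPs.Kernels.SLFMMOKernel(g, A)
k(([0.1, 0.2], 1), ([0.3, 0.4], 2))               # covariance between outputs 1 and 2 at the two locations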
MultiQuantityGPs.Kernels.customKernel — Method
customKernel(
θ
) -> MultiQuantityGPs.Kernels.CustomMOKernel{_A, <:AbstractMatrix{T}} where {_A, T}
Creates a custom kernel function for the GP similar to the slfmKernel but with matrices of length-scales and amplitudes.
This one does not work and is likely not theoretically valid.
MultiQuantityGPs.Kernels.fullyConnectedCovMat — Method
fullyConnectedCovMat(a) -> Any
Creates an output covariance matrix from an array of parameters by filling a lower triangular matrix.
Inputs:
a: parameter vector, must hold (N+1)*N/2 parameters, where N = number of outputs
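For instance, with N = 3 outputs the parameter vector holds (3+1)*3/2 = 6 values (arbitrary here):
a = [1.0, 0.2, 1.0, 0.1, 0.3, 1.0]   # 6 parameters
A = fullyConnectedCovMat(a)          # 3×3 output covariance matrix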
MultiQuantityGPs.Kernels.fullyConnectedCovNum — Method
fullyConnectedCovNum(num_outputs) -> Any
Gives the number of hyperparameters needed to fill the fullyConnectedCovMat.
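For example:
fullyConnectedCovNum(3)   # should give 6, since (3+1)*3/2 = 6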
MultiQuantityGPs.Kernels.initHyperparams — Method
initHyperparams(
X,
Y_vals,
bounds,
N,
::typeof(MultiQuantityGPs.Kernels.mtoKernel);
kwargs...
) -> NamedTuple{(:σ, :ℓ), <:Tuple{Any, Any}}
Creates the structure of hyperparameters for a MTGP and gives them initial values. This is for a specialized quantity covariance matrix with separation.
MultiQuantityGPs.Kernels.initHyperparams — Method
initHyperparams(
X,
Y_vals,
bounds,
N,
::typeof(MultiQuantityGPs.Kernels.multiKernel);
kwargs...
) -> NamedTuple{(:σ, :ℓ), <:Tuple{Any, Any}}
Creates the structure of hyperparameters for a MTGP and gives them initial values.
MultiQuantityGPs.Kernels.initHyperparams — Method
initHyperparams(
X,
Y_vals,
bounds,
N,
::typeof(MultiQuantityGPs.Kernels.slfmKernel);
kwargs...
) -> NamedTuple{(:σ, :ℓ), <:Tuple{Any, Any}}
Creates the structure of hyperparameters for a SLFM and gives them initial values.
MultiQuantityGPs.Kernels.manyToOneCovMat — Method
manyToOneCovMat(a) -> Any
Creates an output covariance matrix from an array of parameters by filling the first column and diagonal of a lower triangular matrix.
Inputs:
a: parameter vector, must hold 2N-1 parameters, where N = number of outputs
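For instance, with N = 3 outputs the parameter vector holds 2*3 - 1 = 5 values (arbitrary here):
a = [1.0, 0.4, 0.3, 0.9, 1.1]   # 5 parameters
B = manyToOneCovMat(a)          # 3×3 output covariance with first-column and diagonal structure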
MultiQuantityGPs.Kernels.manyToOneCovNum — Method
manyToOneCovNum(num_outputs) -> Any
Gives the number of hyperparameters needed to fill the manyToOneCovMat.
MultiQuantityGPs.Kernels.mtoKernel — Method
mtoKernel(θ) -> KernelFunctions.IntrinsicCoregionMOKernel
Creates a kernel function for the GP, which is similar to a multiKernel but instead uses a many-to-one quantity covariance matrix.
MultiQuantityGPs.Kernels.multiKernel — Method
multiKernel(θ) -> KernelFunctions.IntrinsicCoregionMOKernel
A multi-task GP kernel, a variety of multi-output GP kernel based on the Intrinsic Coregionalization Model with a Squared Exponential base kernel and an output matrix formed from a lower triangular matrix.
This function creates the kernel function used within the GP.
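A hedged sketch of pairing this with initHyperparams (documented above), with placeholder data:
θ = initHyperparams(X, Y_vals, bounds, N, multiKernel)   # NamedTuple of σ and ℓ
k = multiKernel(θ)                                       # IntrinsicCoregionMOKernel for the GP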
MultiQuantityGPs.Kernels.multiMean — Method
multiMean(
θ
) -> AbstractGPs.CustomMean{Tf} where Tf<:MultiQuantityGPs.Kernels.var"#13#14"
Creates a quantity-specific constant mean function from the GP hyperparameters.
MultiQuantityGPs.Kernels.singleKernel — Method
singleKernel(θ) -> Any
A simple squared exponential kernel for the GP with parameters θ.
This function creates the kernel function used within the GP.
MultiQuantityGPs.Kernels.slfmKernel — Method
slfmKernel(
θ
) -> MultiQuantityGPs.Kernels.SLFMMOKernel{_A, <:AbstractMatrix{T}} where {_A, T}
Creates a semi-parametric latent factor model (SLFM) kernel function for the GP.