This paper proposes a novel ensemble learning algorithm that coordinates global and local Gaussian process regression (GPR) models, enabling it to capture both global and local process behaviours for a ...
A new class of parameter estimation algorithms is introduced for Gaussian process regression (GPR) models. It is shown that the integration of the GPR model with two probability distance measures, (i) the integrated square error and (ii) the Kullback–Leibler (K–L) divergence, is analytically ...
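For reference, the two distance measures named above have the following standard forms for univariate Gaussians p = N(μ_p, σ_p²) and q = N(μ_q, σ_q²); this is generic notation rather than the paper's own derivation:
\[
\mathrm{ISE}(p, q) = \int \bigl(p(x) - q(x)\bigr)^2 \, dx,
\qquad
\mathrm{KL}(p \,\|\, q) = \log\frac{\sigma_q}{\sigma_p} + \frac{\sigma_p^2 + (\mu_p - \mu_q)^2}{2\sigma_q^2} - \frac{1}{2}.
\]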
Utilising this adaptive strategy, the Gaussian process based stochastic model predictive control (GP-SMPC) algorithm is designed by applying the adaptive tightened constraints in all prediction horizons. To reduce the computation load, the one-step GP-SMPC algorithm is further developed by imposing the...
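As background on constraint tightening of this kind (a generic form under a Gaussian uncertainty assumption, not necessarily the exact tightening used in the paper), a chance constraint on a linear function of a Gaussian-distributed predicted state can be rewritten as a deterministic, tightened constraint on the predicted mean:
\[
\Pr\bigl(a^{\top} x_k \le b\bigr) \ge 1 - \varepsilon
\;\Longleftrightarrow\;
a^{\top} \mu_k + \Phi^{-1}(1-\varepsilon)\,\sqrt{a^{\top} \Sigma_k\, a} \le b,
\]
where μ_k and Σ_k are the GP-predicted mean and covariance at prediction step k and Φ^{-1} is the standard normal quantile function.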
Then, local GPR models are built for each of the LDs and further integrated as finite mixture Gaussian process regression (FMGPR) models through FMM. Next, a Genetic Algorithm (GA) based ensemble pruning strategy is used to select the highly influential FMGPR models. When an estimation is ...
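A minimal sketch of the general idea of combining local GPR models as a finite mixture, assuming scikit-learn and a Gaussian-mixture gate over the input space (the names and the gating choice here are illustrative, not the paper's FMM pipeline, and the GA-based pruning step is omitted):

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(-5.0, 5.0, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

# Partition the input space into local domains (LDs) with a Gaussian mixture;
# the same mixture later supplies the gating weights.
n_domains = 3
gate = GaussianMixture(n_components=n_domains, random_state=0).fit(X)
labels = gate.predict(X)

# One local GPR per local domain.
local_gprs = [
    GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.1)).fit(
        X[labels == k], y[labels == k]
    )
    for k in range(n_domains)
]

def fmgpr_predict(X_new):
    """Mixture prediction: responsibility-weighted sum of local GPR means."""
    w = gate.predict_proba(X_new)                                   # (n, K) gating weights
    mu = np.column_stack([g.predict(X_new) for g in local_gprs])    # (n, K) local means
    return np.sum(w * mu, axis=1)

print(fmgpr_predict(np.array([[0.0], [3.0]])))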
evaluation, the proposed MA-GPR model significantly improved the prediction accuracy, compared with the conjugate gradient method and the genetic algorithm optimization process. Keywords: Gaussian process; hyper-parameters optimization; memetic algorithm; regression model. 1 Introduction Being a new kernel method developed on the basis ...
(time-varying) covariates, nonlinear and non-stationary effects, and model inference. We present LonGP, an additive Gaussian process regression model that is specifically designed for statistical analysis of longitudinal data and that addresses these commonly faced challenges. LonGP can model time-varying ...
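For context on the additive structure referred to here (standard additive GP notation; the specific component kernels used by LonGP are not reproduced from this excerpt):
\[
f(\mathbf{x}) = \sum_{j=1}^{J} f_j(\mathbf{x}_{(j)}), \qquad f_j \sim \mathcal{GP}\bigl(0, k_j(\mathbf{x}_{(j)}, \mathbf{x}'_{(j)})\bigr),
\]
so the covariance of f is the sum of the component kernels, k = Σ_j k_j, with each component depending on a subset of the covariates (e.g. shared time, individual-specific, or other covariate effects).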
most transductive approaches consider the case of classification only. In this paper we introduce a transductive variant of Gaussian process regression with automatic model selection, based on approximate moment matching between training and test data. Empirical results show the feasibility and competitiveness ...
An algorithm for training a Gaussian process regression model would generally estimate the following parameters: β, σ², and θ. Such an algorithm would also allow one to specify the kernel (k) and the basis function (b) to use for model training [175], [176]. In the present work, two ...
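A minimal sketch of such a training routine, assuming scikit-learn (an illustrative choice; scikit-learn uses a zero or constant mean, so explicit basis-function coefficients β would have to be handled separately, e.g. by regressing out a trend before fitting):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

# Toy data for the sketch.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(50, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

# Kernel k: signal variance * RBF(length-scale) plus noise variance sigma^2.
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)

gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X, y)  # maximises the log marginal likelihood over theta and sigma^2

print(gpr.kernel_)  # fitted hyperparameters theta and noise level sigma^2
mean, std = gpr.predict(np.array([[5.0]]), return_std=True)
print(mean, std)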
Finally, we combine the three concepts (LCCM, GP and kernels) to define the Gaussian Process – Latent Class Choice Model (GP-LCCM) and derive an Expectation-Maximization (EM) algorithm for estimation (Section 3.4). 3.1. Latent class choice model LCCM consists of two components, a class ...
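In the standard LCCM formulation, the two components referred to above are a class-membership model and a set of class-specific choice models, combined as a finite mixture (generic notation; the paper's own symbols may differ):
\[
P(y_n = i \mid x_n, z_n) = \sum_{k=1}^{K} P(q_n = k \mid z_n)\, P(y_n = i \mid x_n, q_n = k),
\]
where q_n is the latent class of decision-maker n, z_n are the class-membership covariates, and x_n are the choice attributes; both the class-membership and the class-specific choice probabilities are typically specified as logit models.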