Here is some R code to check this in an example:

```r
set.seed(100)

# simulate some data
X <- data.frame(x1 = rnorm(1000), x2 = rnorm(1000))
X$y <- X$x1 + X$x2 + rnorm(1000)

# fit model
model <- lm(y ~ ., data = X)

# find R^2
summary(model)

# make predictions
yhat <- predict(model)
```
BigML plots your test data and the model's forecasts in a chart, so you can visually assess the goodness of fit of your Time Series models. You will also see several performance metrics, such as the Mean Absolute Error (MAE), the Mean Squared Error (MSE), R squared, the ...
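As a sketch of what those metrics measure (plain R, not BigML's implementation; the vectors `actual` and `forecast` below are made-up example data):

```r
# hypothetical test values and model forecasts
actual   <- c(10, 12, 13, 12, 15, 16)
forecast <- c(11, 12, 12.5, 13, 14, 16.5)

err <- actual - forecast
mae <- mean(abs(err))   # Mean Absolute Error
mse <- mean(err^2)      # Mean Squared Error
r2  <- 1 - sum(err^2) / sum((actual - mean(actual))^2)  # R squared
```

Lower MAE and MSE indicate a better fit, while R squared closer to 1 means the forecasts explain more of the variance in the test data.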
The departures from normality are curious because they indicate clustering of values around $3$ and $6$ in the squared Mahalanobis distance. You need to investigate that and decide to what extent this clustering might affect whatever analyses you are interested in.
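One way to inspect this (a minimal sketch with simulated data, not the asker's dataset): for multivariate normal data with $p$ variables, squared Mahalanobis distances approximately follow a chi-squared distribution with $p$ degrees of freedom, so a QQ plot against those quantiles makes any clustering visible.

```r
set.seed(1)

# simulated bivariate normal data (p = 2 variables)
X <- matrix(rnorm(200 * 2), ncol = 2)

# squared Mahalanobis distances from the sample mean
d2 <- mahalanobis(X, center = colMeans(X), cov = cov(X))

# QQ plot against chi-squared quantiles (df = number of variables);
# clusters of points away from the reference line show up as flat runs
qqplot(qchisq(ppoints(length(d2)), df = 2), d2,
       xlab = "Chi-squared quantiles (df = 2)",
       ylab = "Squared Mahalanobis distance")
abline(0, 1)
```

With real data, flat horizontal runs around particular values (here, $3$ and $6$) in that plot would correspond to the clustering described above.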
```r
lrn <- makeFilterWrapper(learner = makeLearner(cl = lrnName, par.vals = lrnPars))
ps <- makeParamSet(
  makeDiscreteParam("fw.abs", values = seq(3, 5, 1)),
  makeDiscreteParam("fw.method", values = c("chi.squared", "information.gain"))
)
if ("minsplit" %in% names(getParamSet(lrn)$pars))
  ps$pars$minsplit <- makeIntegerParam(...
```