Hellinger distance based univariate regression for proportions
Hellinger distance based univariate regression for proportions.
prophelling.reg(y, x, cov = FALSE, tol = 1e-07, maxiters = 100)
y | The dependent variable, a numerical vector with proportions (values between 0 and 1). |
x | A numerical matrix with the independent variables. The first column of ones is added internally. |
cov | Should the sandwich covariance matrix and the standard errors be returned? If yes, set this equal to TRUE. |
tol | The tolerance value at which the Newton-Raphson algorithm terminates; a minimal sketch of this stopping rule follows the argument list. |
maxiters | The maximum number of iterations that can take place in each regression. |
We minimise the Jensen-Shannon divergence instead of the ordinarily used Kullback-Leibler divergence. Both fall within the class of φ-divergences, hence this divergence also produces asymptotically normal regression coefficients.
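As an illustration only, the two quantities reported in the output (the Jensen-Shannon divergence and the Hellinger distance) could be computed along the following lines, assuming a logit link for the fitted proportions and treating each observation as a two-part composition (y, 1 - y). This is a sketch, not the package's internal implementation.

js.divergence <- function(be, y, x) {
  ## x is assumed to already contain the column of ones
  est <- exp(x %*% be) / ( 1 + exp(x %*% be) )   ## fitted proportions (logit link)
  p <- cbind(y, 1 - y)        ## observed two-part compositions
  q <- cbind(est, 1 - est)    ## fitted two-part compositions
  m <- 0.5 * (p + q)          ## mixture of observed and fitted
  ## Jensen-Shannon divergence, summed over the observations
  0.5 * sum( p * log(p / m) ) + 0.5 * sum( q * log(q / m) )
}

hellinger.dist <- function(y, est) {
  ## Hellinger distance between observed and fitted two-part compositions
  sqrt( 0.5 * sum( ( sqrt( cbind(y, 1 - y) ) - sqrt( cbind(est, 1 - est) ) )^2 ) )
}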
A list including:
be | The regression coefficients. |
seb | The sandwich standard errors of the beta coefficients, if the cov argument was set to TRUE. |
covb | The sandwich covariance matrix of the beta coefficients, if the cov argument was set to TRUE. |
js | The final Jensen-Shannon divergence. |
H | The final Hellinger distance. |
iters | The number of iterations required by the Newton-Raphson algorithm. |
Michail Tsagris
R implementation and documentation: Michail Tsagris <mtsagris@uoc.gr>
Tsagris, Michail (2015). A novel, divergence based, regression for compositional data. Proceedings of the 28th Panhellenic Statistics Conference, 15-18/4/2015, Athens, Greece. https://arxiv.org/pdf/1511.07600.pdf
y <- rbeta(150, 3, 4)
x <- as.matrix( iris[, 1:4] )
a <- prophelling.reg(y, x)
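The components of the returned list can then be inspected as follows; note that seb and covb are only returned when cov = TRUE is requested.

b <- prophelling.reg(y, x, cov = TRUE)
b$be     ## regression coefficients
b$seb    ## sandwich standard errors
b$js     ## final Jensen-Shannon divergence
b$H      ## final Hellinger distance
b$iters  ## Newton-Raphson iterations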