The KL estimator for the inverse Gaussian regression model

Bibliographic Details
Published in: Concurrency and Computation, Vol. 33, no. 13
Main Authors: Lukman, Adewale F., Algamal, Zakariya Y., Kibria, B. M. Golam, Ayinde, Kayode
Format: Journal Article
Language:English
Published: Hoboken, USA: John Wiley & Sons, Inc. (Wiley Subscription Services, Inc.), 10-07-2021
Description
Summary: Multicollinearity has an undesirable effect on the efficiency of the maximum likelihood estimator (MLE) in both Gaussian and non‐Gaussian regression models. The ridge and Liu estimators have been developed as alternatives to the MLE; both possess smaller mean squared error (MSE) than the MLE. Recently, Kibria and Lukman developed the KL estimator, which was found to outperform the ridge and Liu estimators in the linear regression model. With this expectation, we develop the KL estimator for the inverse Gaussian regression model. We compare the proposed estimator's performance with some existing estimators through theoretical comparisons, a simulation study, and a real‐life application. By the smaller-MSE criterion, the proposed estimator with one of its shrinkage parameters performs best.
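To illustrate the estimator family the abstract refers to, the following is a minimal sketch of the original linear-regression KL estimator of Kibria and Lukman, β̂_KL = (X'X + kI)⁻¹(X'X − kI)β̂_OLS, applied to a simulated collinear design. The design, true coefficients, and the fixed choice of k below are illustrative assumptions; the article itself develops the analogous estimator for the inverse Gaussian regression model and proposes data-driven shrinkage parameters.

```python
import numpy as np

# Simulate a strongly collinear design (equicorrelated predictors).
# All values here are illustrative, not from the paper.
rng = np.random.default_rng(0)
n, p, rho = 50, 4, 0.99
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
beta_true = np.ones(p)
y = X @ beta_true + rng.normal(scale=1.0, size=n)

XtX = X.T @ X
I = np.eye(p)

# Ordinary least squares (the MLE under Gaussian errors).
beta_ols = np.linalg.solve(XtX, X.T @ y)

k = 1.0  # shrinkage parameter (fixed here for illustration)

# Ridge estimator: (X'X + kI)^{-1} X'y.
beta_ridge = np.linalg.solve(XtX + k * I, X.T @ y)

# KL estimator: (X'X + kI)^{-1} (X'X - kI) beta_OLS.
beta_kl = np.linalg.solve(XtX + k * I, (XtX - k * I) @ beta_ols)

print("||beta_ols||   =", np.linalg.norm(beta_ols))
print("||beta_ridge|| =", np.linalg.norm(beta_ridge))
print("||beta_kl||    =", np.linalg.norm(beta_kl))
```

In the eigenbasis of X'X, the KL estimator scales each OLS component by (λᵢ − k)/(λᵢ + k), whose magnitude is below 1 for k > 0, so it always shrinks the OLS solution; the bias this introduces is traded off against a reduction in variance, which is where the MSE gains under multicollinearity come from.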
ISSN:1532-0626
1532-0634
DOI:10.1002/cpe.6222