In this chapter, we focus on two topics: the estimation of linear mixed model parameters and the solution of the generalized estimating equations. Regression models are the most frequently used models in statistical data analysis. In linear regression, the objective is to study the relationship between a response variable (the explained variable) and one or more explanatory variables, using linear models (LM). The estimation of the model parameters is one of the most crucial steps in the statistical analysis process. Maximum likelihood and least squares are the two most widely used estimation techniques. When the data are Gaussian, neither approach is preferred over the other, as the two procedures are equivalent. However, if the data are not Gaussian, this equivalence no longer holds. Moreover, if the normal equations are not linear, iterative methods are required (the Newton-Raphson algorithm, Fisher scoring, etc.). In this work, we consider a particular case in which the data are not normal and the estimating equations are nonlinear, yet an equivalence still holds between maximum likelihood and a modified least squares method. We conclude by applying this modified method to solve the equations of Liang and Zeger. Finally, we show the relationship between maximum likelihood estimation for generalized linear models (GLM) and the method used to solve the generalized estimating equations (GEE), namely iteratively reweighted least squares (IRLS).
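The closing point of the abstract, that maximum likelihood estimation for a GLM reduces to iteratively reweighted least squares, can be illustrated with a minimal sketch. The example below fits a logistic regression (a GLM with logit link) by IRLS; it is a generic textbook implementation for illustration, not the authors' own code, and the data are synthetic.

```python
import numpy as np

def irls_logistic(X, y, n_iter=25, tol=1e-8):
    """Fit logistic regression by iteratively reweighted least squares.

    Each iteration solves a weighted least squares problem
    (X'WX) beta = X'Wz, where W holds the variance-function
    weights and z is the 'working response'.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta                        # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))       # mean via the logit link
        w = mu * (1.0 - mu)                   # weights = variance function
        z = eta + (y - mu) / w                # working (adjusted) response
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, X.T @ (w * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Synthetic data: intercept -0.5, slope 1.5 (values chosen for illustration).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 1.5])
p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))
y = (rng.random(200) < p).astype(float)
beta_hat = irls_logistic(X, y)
```

Each IRLS step is exactly a weighted least squares fit, which is the "modified least squares" connection the chapter develops; the same scheme underlies the solution of the Liang and Zeger estimating equations, with the working covariance replacing the diagonal weights.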
Author(s) Details:
Ahsene Lanani, Department of Mathematics, Mentouri Brothers University of Constantine 1, Algeria.
Rahima Benchabi, Department of Mechanical Engineering, Mentouri Brothers University of Constantine 1, Algeria.
Please see the link here: https://stm.bookpi.org/RUMCS-V5/article/view/14213