MAP20230012444
Chai Fung, Tsz
Fitting censored and truncated regression data using the mixture of experts models / Tsz Chai Fung, Andrei L. Badescu & X. Sheldon Lin
Summary: The logit-weighted reduced mixture of experts model (LRMoE) is a flexible yet analytically tractable non-linear regression model. Although it has proven useful for modeling insurance loss frequencies and severities, model calibration becomes challenging when censored and truncated data are involved, which is common in actuarial practice. In this article, we present an extended expectation-conditional maximization (ECM) algorithm that efficiently fits the LRMoE to randomly censored and randomly truncated regression data. The effectiveness of the proposed algorithm is empirically examined through a simulation study. Using real automobile insurance data sets, the usefulness and importance of the proposed algorithm are demonstrated through two actuarial applications: individual claim reserving and deductible ratemaking.
In: North American actuarial journal. - Schaumburg : Society of Actuaries, 1997- = ISSN 1092-0277. - 05/12/2022, Volume 26, Number 4, 2022, p. 496-520
1. Actuarial science. 2. Non-linear regression. 3. Databases. 4. Data analysis. 5. Algorithms. I. Badescu, Andrei L. II. Sheldon Lin, X. III. Title.
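The abstract's core idea is that censored observations enter the likelihood through survival functions rather than densities, and an EM/ECM-type scheme imputes the missing information in its expectation step. The sketch below illustrates that idea on a deliberately simple case: a two-component exponential mixture with right-censored data fitted by plain EM in Python. It is not the authors' ECM algorithm for the LRMoE, which additionally has covariate-dependent logit gating weights and handles random truncation; all names and starting values here are hypothetical.

```python
# Illustrative sketch only: EM for a two-component exponential mixture with
# right-censored observations. Not the LRMoE ECM algorithm from the article;
# it only shows how censored points contribute survival functions in the
# E-step and expected values in the M-step.
import numpy as np

def em_censored_exp_mixture(y, censored, n_iter=200, seed=0):
    """Fit mixing weights pi_k and rates for a 2-component exponential mixture.

    y        : exact value if uncensored, right-censoring point if censored
    censored : boolean array, True where y is a censoring point
    """
    rng = np.random.default_rng(seed)
    pi = np.array([0.5, 0.5])
    rates = rng.uniform(0.5, 2.0, size=2)  # arbitrary starting values

    for _ in range(n_iter):
        # E-step: densities f_k(y_i) for exact observations,
        # survival functions S_k(c_i) = exp(-rate_k * c_i) for censored ones.
        dens = rates * np.exp(-np.outer(y, rates))
        surv = np.exp(-np.outer(y, rates))
        lik = np.where(censored[:, None], surv, dens) * pi
        resp = lik / lik.sum(axis=1, keepdims=True)

        # Expected complete-data value of Y_i under component k:
        # y_i if observed, c_i + 1/rate_k if censored (memorylessness).
        y_hat = np.where(censored[:, None], y[:, None] + 1.0 / rates, y[:, None])

        # M-step: closed-form updates for weights and exponential rates.
        pi = resp.mean(axis=0)
        rates = resp.sum(axis=0) / (resp * y_hat).sum(axis=0)

    return pi, rates

# Small synthetic check with artificial right censoring at c = 3.0.
rng = np.random.default_rng(1)
true_losses = np.concatenate([rng.exponential(1 / 2.0, 500),
                              rng.exponential(1 / 0.3, 500)])
c = 3.0
y = np.minimum(true_losses, c)
censored = true_losses > c
print(em_censored_exp_mixture(y, censored))
```

In the full LRMoE setting described in the article, the M-step is replaced by conditional maximization steps (hence ECM), and truncated observations require an additional expectation over losses that never entered the data; this sketch omits both refinements.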