Section: Articles
Title: Fitting censored and truncated regression data using the mixture of experts models / Tsz Chai Fung, Andrei L. Badescu & X. Sheldon Lin
Author: Chai Fung, Tsz
Notes: Summary: The logit-weighted reduced mixture of experts model (LRMoE) is a flexible yet analytically tractable non-linear regression model. Although it has proven useful for modeling insurance loss frequencies and severities, model calibration becomes challenging when censored and truncated data are involved, which is common in actuarial practice. In this article, we present an extended expectation-conditional maximization (ECM) algorithm that efficiently fits the LRMoE to random censored and random truncated regression data. The effectiveness of the proposed algorithm is empirically examined through a simulation study. Using real automobile insurance data sets, the usefulness and importance of the proposed algorithm are demonstrated through two actuarial applications: individual claim reserving and deductible ratemaking.
Related records: In: North American Actuarial Journal. - Schaumburg : Society of Actuaries, 1997- = ISSN 1092-0277. - 05/12/2022, Volume 26, Number 4 - 2022, p. 496-520
Subject / place / event: Actuarial calculation; Non-linear regression; Databases; Data analysis; Algorithms
Other authors: Badescu, Andrei L.; Sheldon Lin, X.
Other classifications: 6
Rights: In Copyright (InC)
External references:
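The abstract describes fitting a logit-gated mixture of experts to censored and truncated loss data. As a rough illustration only, and not the authors' extended ECM algorithm, the sketch below fits a two-component logit-gated mixture of exponential experts to right-censored, left-truncated losses by directly maximizing the observed-data log-likelihood. The expert family, the component count, the simulated data, and all variable names are assumptions made for this example.

# Minimal sketch, assuming a two-component logit-gated mixture of exponential
# experts; this is NOT the paper's extended ECM algorithm, just an illustration
# of how censoring and truncation enter the likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import expon

def neg_loglik(params, x, y, cens, trunc):
    """x: covariate, y: observed loss, cens: 1 if right-censored,
    trunc: left-truncation (deductible-style) point for each observation."""
    a0, a1, log_r1, log_r2 = params              # gating and expert rate parameters
    p1 = expit(a0 + a1 * x)                      # logit gate: weight of expert 1
    r1, r2 = np.exp(log_r1), np.exp(log_r2)      # exponential rates, kept positive

    # mixture density and survival function evaluated at the observed value
    f = p1 * expon.pdf(y, scale=1/r1) + (1 - p1) * expon.pdf(y, scale=1/r2)
    S = p1 * expon.sf(y, scale=1/r1) + (1 - p1) * expon.sf(y, scale=1/r2)
    # survival at the truncation point normalizes the truncated likelihood
    S_t = p1 * expon.sf(trunc, scale=1/r1) + (1 - p1) * expon.sf(trunc, scale=1/r2)

    # censored observations contribute S(y), uncensored contribute f(y);
    # both are divided by S(truncation point) because of left truncation
    ll = np.where(cens == 1, np.log(S), np.log(f)) - np.log(S_t)
    return -np.sum(ll)

# simulate illustrative data: latent expert membership driven by the gate
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
z = rng.random(n) < expit(0.5 + 1.0 * x)
y_full = np.where(z, rng.exponential(1.0, n), rng.exponential(5.0, n))
trunc = np.full(n, 0.1)                          # deductible-style truncation
keep = y_full > trunc
y, x, trunc = y_full[keep], x[keep], trunc[keep]
cens = (y > 8.0).astype(int)                     # policy-limit-style censoring
y = np.minimum(y, 8.0)

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0, -1.0],
               args=(x, y, cens, trunc), method="Nelder-Mead")
print(res.x)

The paper instead calibrates the LRMoE with an extended expectation-conditional maximization scheme, which scales better than direct numerical maximization as the number of experts and covariates grows; the likelihood structure for censored and truncated observations, however, is the same as in the sketch above.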