To cite the meteorits package in a publication, please use the following reference. To cite the paper corresponding to a specific model implemented in meteorits (e.g., NMoE, SNMoE, tMoE, StMoE), please choose the appropriate reference(s) from the list below; a short R sketch for reproducing these entries follows the list.

Chamroukhi F, Lecocq F, Bartcus M (2019). meteorits: Mixtures-of-Experts Modeling for Complex and Non-Normal Distributions ('MEteorits'). R package version 0.1.1, https://github.com/fchamroukhi/MEteorits.

Huynh B, Chamroukhi F (2019). “Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models.” Journal de la Société Française de Statistique. https://chamroukhi.com/papers/Chamroukhi_Huynh_jsfds-published.pdf.

Chamroukhi F, Huynh B (2019). “Regularized Maximum Likelihood Estimation and Feature Selection in Mixtures-of-Experts Models.” Journal de la Société Française de Statistique, 160(1), 57–85.

Nguyen H, Chamroukhi F (2018). “Practical and theoretical aspects of mixture-of-experts modeling: An overview.” Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, e1246–n/a. doi:10.1002/widm.1246, https://chamroukhi.com/papers/Nguyen-Chamroukhi-MoE-DMKD-2018.

Chamroukhi F (2017). “Skew t mixture of experts.” Neurocomputing - Elsevier, 266, 390–408. https://chamroukhi.com/papers/STMoE.pdf.

Chamroukhi F (2016). “Robust mixture of experts modeling using the t-distribution.” Neural Networks - Elsevier, 79, 20–36. https://chamroukhi.com/papers/TMoE.pdf.

Chamroukhi F (2016). “Skew-Normal Mixture of Experts.” In The International Joint Conference on Neural Networks (IJCNN). https://chamroukhi.com/papers/Chamroukhi-SNMoE-IJCNN2016.pdf.

Chamroukhi F (2015). Statistical learning of latent data models for complex data analysis. Habilitation Thesis (HDR), Université de Toulon. https://chamroukhi.com/Dossier/FChamroukhi-Habilitation.pdf.

Chamroukhi F (2010). Hidden process regression for curve modeling, classification and tracking. Ph.D. Thesis, Université de Technologie de Compiègne. https://chamroukhi.com/papers/FChamroukhi-Thesis.pdf.

Chamroukhi F, Samé A, Govaert G, Aknin P (2009). “Time series modeling by a regression approach based on a latent process.” Neural Networks Elsevier Science Ltd., 22(5-6), 593–602.
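
If the package is installed, the reference information above can be reproduced from within R using the base utils function citation(); a minimal sketch, assuming meteorits 0.1.1 is installed from CRAN or GitHub:

  # Print the citation entries for the installed meteorits package
  citation(package = "meteorits")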

Corresponding BibTeX entries:

  @Manual{,
    title = {meteorits: Mixtures-of-Experts Modeling for Complex and
      Non-Normal Distributions ('MEteorits')},
    author = {F. Chamroukhi and F. Lecocq and M. Bartcus},
    year = {2019},
    note = {R package version 0.1.1},
    url = {https://github.com/fchamroukhi/MEteorits},
  }
  @Article{,
    author = {B-T. Huynh and F. Chamroukhi},
    journal = {Journal de la Soci\'{e}t\'{e} Fran\c{c}aise de
      Statistique},
    title = {Estimation and Feature Selection in Mixtures of
      Generalized Linear Experts Models},
    year = {2019},
    url =
      {https://chamroukhi.com/papers/Chamroukhi_Huynh_jsfds-published.pdf},
  }
  @Article{,
    title = {Regularized Maximum Likelihood Estimation and Feature
      Selection in Mixtures-of-Experts Models},
    author = {F. Chamroukhi and Bao T. Huynh},
    journal = {Journal de la Soci\'{e}t\'{e} Fran\c{c}aise de
      Statistique},
    volume = {160},
    number = {1},
    pages = {57--85},
    year = {2019},
  }
  @Article{,
    title = {Practical and theoretical aspects of mixture-of-experts
      modeling: An overview},
    author = {Hien D. Nguyen and F. Chamroukhi},
    journal = {Wiley Interdisciplinary Reviews: Data Mining and
      Knowledge Discovery},
    publisher = {Wiley Periodicals, Inc},
    year = {2018},
    pages = {e1246--n/a},
    doi = {10.1002/widm.1246},
    url =
      {https://chamroukhi.com/papers/Nguyen-Chamroukhi-MoE-DMKD-2018},
  }
  @Article{,
    title = {Skew t mixture of experts},
    author = {F. Chamroukhi},
    journal = {Neurocomputing - Elsevier},
    year = {2017},
    volume = {266},
    pages = {390--408},
    url = {https://chamroukhi.com/papers/STMoE.pdf},
  }
  @Article{,
    title = {Robust mixture of experts modeling using the
      t-distribution},
    author = {F. Chamroukhi},
    journal = {Neural Networks - Elsevier},
    year = {2016},
    volume = {79},
    pages = {20--36},
    url = {https://chamroukhi.com/papers/TMoE.pdf},
  }
  @InProceedings{,
    title = {Skew-Normal Mixture of Experts},
    author = {F. Chamroukhi},
    booktitle = {The International Joint Conference on Neural Networks
      (IJCNN)},
    year = {2016},
    url =
      {https://chamroukhi.com/papers/Chamroukhi-SNMoE-IJCNN2016.pdf},
  }
  @PhdThesis{,
    title = {Statistical learning of latent data models for complex
      data analysis},
    author = {F. Chamroukhi},
    school = {Universit\'{e} de Toulon},
    year = {2015},
    type = {Habilitation Thesis (HDR)},
    url =
      {https://chamroukhi.com/Dossier/FChamroukhi-Habilitation.pdf},
  }
  @PhdThesis{,
    title = {Hidden process regression for curve modeling,
      classification and tracking},
    author = {F. Chamroukhi},
    school = {Universit\'{e} de Technologie de Compi\`{e}gne},
    year = {2010},
    type = {Ph.D. Thesis},
    url = {https://chamroukhi.com/papers/FChamroukhi-Thesis.pdf},
  }
  @Article{,
    title = {Time series modeling by a regression approach based on a
      latent process},
    author = {F. Chamroukhi and A. Sam\'{e} and G. Govaert and P.
      Aknin},
    journal = {Neural Networks Elsevier Science Ltd.},
    year = {2009},
    volume = {22},
    number = {5-6},
    pages = {593--602},
  }
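
The BibTeX entries above can also be exported programmatically with the base utils functions toBibtex() and writeLines(); a minimal sketch, where the file name meteorits.bib is an arbitrary choice:

  # Convert the citation entries to BibTeX and write them to a .bib file
  bib <- toBibtex(citation(package = "meteorits"))
  writeLines(bib, con = "meteorits.bib")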
