mixture of experts
Recently Published Documents


TOTAL DOCUMENTS: 336 (FIVE YEARS: 93)

H-INDEX: 23 (FIVE YEARS: 4)

2021 ◽  
Author(s):  
Xiangnan Xu ◽  
Michal Lubomski ◽  
Andrew J Holmes ◽  
Carolyn M Sue ◽  
Ryan L Davis ◽  
...  

The microbiome plays a fundamental role in human health, and diet is one of the strongest modulators of the gut microbiome. However, interactions between the microbiota and host health are complex and diverse. Understanding the interplay between diet, the microbiome, and health state could enable the design of personalized intervention strategies and improve the health and wellbeing of affected individuals. A common approach is to divide the study population into smaller cohorts based on dietary preferences, in the hope of identifying specific microbial signatures. However, classifying patients on diet alone is unlikely to reflect the microbiome-host health relationship or the taxonomic makeup of the microbiome. To this end, we present a novel approach, the Nutrition-Ecotype Mixture of Experts (NEMoE) model, for establishing associations between the gut microbiota and health state that accounts for diet-specific cohort variability. NEMoE uses a regularized mixture-of-experts framework with an integrated parameter-sharing strategy to ensure that data-driven diet-cohort identification is consistent across taxonomic levels. The success of our approach was demonstrated through a series of simulation studies, in which NEMoE showed robustness to parameter selection and to varying degrees of data heterogeneity. Further application to real-world microbiome data from a Parkinson's disease cohort showed that NEMoE not only improves predictive performance for Parkinson's disease but also identifies diet-specific microbiome markers of disease. Our results indicate that NEMoE can be used to uncover diet-specific relationships between nutritional ecotype and patient health, and to contextualize precision nutrition for different diseases.
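The mixture-of-experts framework underlying NEMoE combines expert predictors through input-dependent softmax gating weights. The sketch below is a generic, minimal MoE illustration (linear experts, softmax gate), not the authors' NEMoE implementation; class and variable names are invented for this example.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Minimal MoE sketch: a softmax gate assigns each input a set of
    mixing weights, which combine the outputs of K linear experts.
    This is an illustration of the general framework only."""

    def __init__(self, n_features, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        # Gating parameters: one weight vector per expert.
        self.W_gate = 0.1 * rng.normal(size=(n_features, n_experts))
        # Expert parameters: each expert is a linear predictor.
        self.W_exp = 0.1 * rng.normal(size=(n_experts, n_features))

    def predict(self, X):
        gate = softmax(X @ self.W_gate)         # (n, K) mixing weights, rows sum to 1
        expert_out = X @ self.W_exp.T           # (n, K) per-expert predictions
        return (gate * expert_out).sum(axis=1)  # gated combination, shape (n,)

moe = MixtureOfExperts(n_features=5, n_experts=3)
X = np.ones((4, 5))
y = moe.predict(X)
```

In NEMoE the gate plays the role of soft, data-driven cohort assignment (the "nutrition ecotypes"), with regularization and parameter sharing added on top of this basic structure.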


2021 ◽  
Author(s):  
Pai Peng ◽  
Keke Geng ◽  
Shangjie Li ◽  
Ziwei Wang ◽  
Min Qian ◽  
...  

Stat ◽  
2021 ◽  
Author(s):  
Afsaneh Sepahdar ◽  
Mohsen Madadi ◽  
Narayanaswamy Balakrishnan ◽  
Ahad Jamalizadeh

2021 ◽  
pp. 100071
Author(s):  
Kodai Minoura ◽  
Ko Abe ◽  
Hyunha Nam ◽  
Hiroyoshi Nishikawa ◽  
Teppei Shimamura

Author(s):  
Hien Duy Nguyen ◽  
TrungTin Nguyen ◽  
Faicel Chamroukhi ◽  
Geoffrey John McLachlan

Abstract
Mixture-of-experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces when the input and output variables are both compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the richness of the softmax gating function class and its relationship to the class of Gaussian gating functions.
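The MoE conditional densities studied in such denseness results are typically written as a softmax-gated convex combination of Gaussian expert densities. The formulation below is the standard one; the paper's exact notation and assumptions may differ.

```latex
f(y \mid x) \;=\; \sum_{k=1}^{K} \mathrm{Gate}_k(x;\gamma)\,
  \phi\!\left(y;\, \mu_k(x),\, \sigma_k^2\right),
\qquad
\mathrm{Gate}_k(x;\gamma) \;=\;
  \frac{\exp\!\left(\gamma_{k0} + \gamma_k^{\top} x\right)}
       {\sum_{l=1}^{K} \exp\!\left(\gamma_{l0} + \gamma_l^{\top} x\right)},
```

where \(\phi(\cdot;\mu,\sigma^2)\) denotes a Gaussian density. The denseness results assert that, as the number of experts \(K\) grows, such mixtures can approximate any target conditional density in the relevant Lebesgue space over the compact support.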

