2024
DOI: 10.21203/rs.3.rs-3978298/v1
Preprint

Merging Mixture of Experts and Retrieval Augmented Generation for Enhanced Information Retrieval and Reasoning

Xingyu Xiong, Mingliang Zheng

Abstract: This study investigates the integration of Retrieval Augmented Generation (RAG) into the Mistral 8x7B Large Language Model (LLM), which already uses a Mixture of Experts (MoE) architecture, to address its existing limitations in complex information retrieval and reasoning tasks. By leveraging the Google BIG-Bench dataset, we conducted extensive quantitative and qualitative analyses to evaluate the augmented model's performance. The results demonstrate significant improvements in accuracy, precision, recall, and F1 score, hig…
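The abstract describes a retrieve-then-generate pipeline layered on top of an MoE language model: retrieve passages relevant to the query, prepend them to the prompt, and let the model answer from that augmented context. The sketch below illustrates that general pattern only; the toy corpus, the bag-of-words retriever, and the call_mixtral() stub are assumptions made for illustration, not the authors' implementation, which is not reproduced on this page.

```python
# Minimal retrieve-then-generate (RAG) sketch. The corpus, the retriever,
# and call_mixtral() are illustrative placeholders, not the paper's setup.
from collections import Counter
import math

CORPUS = [
    "Mixture of Experts routes each token to a small subset of expert FFNs.",
    "Retrieval Augmented Generation prepends retrieved passages to the prompt.",
    "BIG-Bench is a collaborative benchmark of diverse reasoning tasks.",
]

def _bow(text: str) -> Counter:
    # Bag-of-words term counts; a stand-in for a real embedding model.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank the corpus by similarity to the query and keep the top k passages.
    q = _bow(query)
    ranked = sorted(CORPUS, key=lambda doc: _cosine(q, _bow(doc)), reverse=True)
    return ranked[:k]

def call_mixtral(prompt: str) -> str:
    # Placeholder for a call to the MoE model (e.g. Mixtral 8x7B via an
    # inference API); echoes the prompt so the sketch runs stand-alone.
    return f"[model answer conditioned on]\n{prompt}"

def rag_answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    return call_mixtral(prompt)

if __name__ == "__main__":
    print(rag_answer("How does retrieval augmented generation work?"))
```

In the study itself, the retrieved context would presumably come from an external knowledge source and the generation step from the MoE model, with accuracy, precision, recall, and F1 measured over BIG-Bench tasks.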

Cited by 0 publications
References 13 publications