MoA

1. Mixture of Agents Enhances Large Language Model Capabilities

2. MoA is All You Need: Building LLM Research Team using Mixture of Agents

3. SMoA: Improving Multi-agent Large Language Models with Sparse Mixture-of-Agents

4. CoMM: Collaborative Multi-Agent, Multi-Reasoning-Path Prompting for Complex Problem Solving

5. Rethinking Mixture-of-Agents: Is Mixing Different Large Language Models Beneficial?

6. Distributed Mixture-of-Agents for Edge Inference with Large Language Models

7. Multi-LLM Collaborative Search for Complex Problem Solving