mirror of https://github.com/FoundationAgents/MetaGPT.git
synced 2026-04-25 00:36:55 +02:00
update readme format
This commit is contained in:
parent 1b165f060e
commit 6f278cf362

6 changed files with 19 additions and 44 deletions
docs/ACADEMIC_WORK.md (new file, 60 lines)
@@ -0,0 +1,60 @@
```bibtex
@inproceedings{hong2024metagpt,
  title={Meta{GPT}: Meta Programming for A Multi-Agent Collaborative Framework},
  author={Sirui Hong and Mingchen Zhuge and Jonathan Chen and Xiawu Zheng and Yuheng Cheng and Jinlin Wang and Ceyao Zhang and Zili Wang and Steven Ka Shing Yau and Zijuan Lin and Liyang Zhou and Chenyu Ran and Lingfeng Xiao and Chenglin Wu and J{\"u}rgen Schmidhuber},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=VtmBAGCN7o}
}

@misc{teng2025atom,
  title={Atom of Thoughts for Markov LLM Test-Time Scaling},
  author={Fengwei Teng and Zhaoyang Yu and Quan Shi and Jiayi Zhang and Chenglin Wu and Yuyu Luo},
  year={2025},
  eprint={2502.12018},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2502.12018}
}

@misc{xiang2025self,
  title={Self-Supervised Prompt Optimization},
  author={Jinyu Xiang and Jiayi Zhang and Zhaoyang Yu and Fengwei Teng and Jinhao Tu and Xinbing Liang and Sirui Hong and Chenglin Wu and Yuyu Luo},
  year={2025},
  eprint={2502.06855},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2502.06855}
}

@inproceedings{wang2025fact,
  title={FACT: Examining the Effectiveness of Iterative Context Rewriting for Multi-fact Retrieval},
  author={Jinlin Wang and Suyuchen Wang and Ziwen Xia and Sirui Hong and Yun Zhu and Bang Liu and Chenglin Wu},
  booktitle={The 2025 Annual Conference of the Nations of the Americas Chapter of the ACL},
  year={2025},
  url={https://openreview.net/forum?id=VXOircx5h3}
}

@misc{chi2024sela,
  title={SELA: Tree-Search Enhanced LLM Agents for Automated Machine Learning},
  author={Yizhou Chi and Yizhang Lin and Sirui Hong and Duyi Pan and Yaying Fei and Guanghao Mei and Bangbang Liu and Tianqi Pang and Jacky Kwok and Ceyao Zhang and Bang Liu and Chenglin Wu},
  year={2024},
  eprint={2410.17238},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2410.17238}
}

@inproceedings{zhang2025aflow,
  title={{AF}low: Automating Agentic Workflow Generation},
  author={Jiayi Zhang and Jinyu Xiang and Zhaoyang Yu and Fengwei Teng and Xiong-Hui Chen and Jiaqi Chen and Mingchen Zhuge and Xin Cheng and Sirui Hong and Jinlin Wang and Bingnan Zheng and Bang Liu and Yuyu Luo and Chenglin Wu},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=z5uVAKwmjf}
}

@misc{hong2024data,
  title={Data Interpreter: An LLM Agent For Data Science},
  author={Sirui Hong and Yizhang Lin and Bang Liu and Bangbang Liu and Binhao Wu and Danyang Li and Jiaqi Chen and Jiayi Zhang and Jinlin Wang and Li Zhang and Lingyao Zhang and Min Yang and Mingchen Zhuge and Taicheng Guo and Tuo Zhou and Wei Tao and Wenyi Wang and Xiangru Tang and Xiangtao Lu and Xiawu Zheng and Xinbing Liang and Yaying Fei and Yuheng Cheng and Zongze Xu and Chenglin Wu},
  year={2024},
  eprint={2402.18679},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2402.18679}
}
```

docs/NEWS.md (new file, 22 lines)
@@ -0,0 +1,22 @@
## Earlier news

🚀 Oct. 29, 2024: We introduced three papers: [AFLOW](https://arxiv.org/abs/2410.10762), [FACT](https://arxiv.org/abs/2410.21012), and [SELA](https://arxiv.org/abs/2410.17238). Check out the [code](examples)!

🚀 Mar. 29, 2024: [v0.8.0](https://github.com/geekan/MetaGPT/releases/tag/v0.8.0) released. Now you can use Data Interpreter ([arxiv](https://arxiv.org/abs/2402.18679), [example](https://docs.deepwisdom.ai/main/en/DataInterpreter/), [code](https://github.com/geekan/MetaGPT/tree/main/examples/di)) via the PyPI package import. Meanwhile, we integrated the RAG module and added support for multiple new LLMs.

🚀 Feb. 08, 2024: [v0.7.0](https://github.com/geekan/MetaGPT/releases/tag/v0.7.0) released, supporting the assignment of different LLMs to different Roles. We also introduced [Data Interpreter](https://github.com/geekan/MetaGPT/blob/main/examples/di/README.md), a powerful agent capable of solving a wide range of real-world problems.

🚀 Jan. 16, 2024: Our paper [MetaGPT: Meta Programming for A Multi-Agent Collaborative Framework](https://openreview.net/forum?id=VtmBAGCN7o) was accepted for **oral presentation (top 1.2%)** at ICLR 2024, **ranking #1** in the LLM-based Agent category.

🚀 Jan. 03, 2024: [v0.6.0](https://github.com/geekan/MetaGPT/releases/tag/v0.6.0) released. New features include serialization, an upgraded OpenAI package, support for multiple LLMs, a [minimal example for debate](https://github.com/geekan/MetaGPT/blob/main/examples/debate_simple.py), etc.

🚀 Dec. 15, 2023: [v0.5.0](https://github.com/geekan/MetaGPT/releases/tag/v0.5.0) released, introducing experimental features such as incremental development, multilingual support, multiple programming languages, etc.

🔥 Nov. 08, 2023: MetaGPT was selected for [Open100: Top 100 Open Source achievements](https://www.benchcouncil.org/evaluation/opencs/annual.html).

🔥 Sep. 01, 2023: MetaGPT topped GitHub Trending Monthly for the **17th time** in August 2023.

🌟 Jun. 30, 2023: MetaGPT is now open source.

🌟 Apr. 24, 2023: First line of MetaGPT code committed.

docs/README_CN.md
@@ -5,6 +5,10 @@ # MetaGPT: 多智能体框架
</p>

<p align="center">
[ <a href="../README.md"> En </a> |
<b> 中 </b> |
<a href="README_FR.md"> Fr </a> |
<a href="README_JA.md"> 日 </a> ]
<b>使 GPTs 组成软件公司,协作处理更复杂的任务</b>
</p>

@@ -12,19 +16,8 @@ # MetaGPT: 多智能体框架
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"></a>
<a href="https://discord.gg/DYn29wFk9z"><img src="https://dcbadge.vercel.app/api/server/DYn29wFk9z?style=flat" alt="Discord Follow"></a>
<a href="https://twitter.com/MetaGPT_"><img src="https://img.shields.io/twitter/follow/MetaGPT?style=social" alt="Twitter Follow"></a>
<a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/geekan/MetaGPT"><img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode" alt="Open in Dev Containers"></a>
<a href="https://codespaces.new/geekan/MetaGPT"><img src="https://img.shields.io/badge/Github_Codespace-Open-blue?logo=github" alt="Open in GitHub Codespaces"></a>
</p>

<h4 align="center">
<p>
<a href="../README.md">English</a> |
<b>简体中文</b> |
<a href="README_FR.md">Français</a> |
<a href="README_JA.md">日本語</a>
</p>
</h4>

1. MetaGPT输入**一句话的老板需求**,输出**用户故事 / 竞品分析 / 需求 / 数据结构 / APIs / 文件等**
2. MetaGPT内部包括**产品经理 / 架构师 / 项目经理 / 工程师**,它提供了一个**软件公司**的全过程与精心调配的SOP
   1. `Code = SOP(Team)` 是核心哲学。我们将SOP具象化,并且用于LLM构成的团队

docs/README_FR.md
@@ -6,6 +6,10 @@ # MetaGPT: Architecture Multi-Agent
</p>

<p align="center">
[ <a href="../README.md"> En </a> |
<a href="README_CN.md"> 中 </a> |
<b> Fr </b> |
<a href="README_JA.md"> 日 </a> ]
<b>Assigner différents rôles aux GPTs pour former une entité collaborative capable de gérer des tâches complexes.</b>
</p>

@@ -13,19 +17,8 @@ # MetaGPT: Architecture Multi-Agent
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"></a>
<a href="https://discord.gg/DYn29wFk9z"><img src="https://dcbadge.vercel.app/api/server/DYn29wFk9z?style=flat" alt="Discord Follow"></a>
<a href="https://twitter.com/MetaGPT_"><img src="https://img.shields.io/twitter/follow/MetaGPT?style=social" alt="Twitter Follow"></a>
<a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/geekan/MetaGPT"><img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode" alt="Open in Dev Containers"></a>
<a href="https://codespaces.new/geekan/MetaGPT"><img src="https://img.shields.io/badge/Github_Codespace-Open-blue?logo=github" alt="Open in GitHub Codespaces"></a>
</p>

<h4 align="center">
<p>
<a href="../README.md">English</a> |
<a href="README_CN.md">简体中文</a> |
<b>Français</b> |
<a href="README_JA.md">日本語</a>
</p>
</h4>

## Nouveautés
🚀 29 mars 2024: La version [v0.8.0](https://github.com/geekan/MetaGPT/releases/tag/v0.8.0) a été publiée. Vous pouvez désormais utiliser le Data Interpreter ([arxiv](https://arxiv.org/abs/2402.18679), [example](https://docs.deepwisdom.ai/main/en/DataInterpreter/), [code](https://github.com/geekan/MetaGPT/tree/main/examples/di)) via l'importation du package PyPI. De plus, le module RAG (Génération Augmentée par Récupération) a été intégré, et plusieurs nouveaux modèles de LLMs sont désormais pris en charge.

docs/README_JA.md
@@ -5,6 +5,10 @@ # MetaGPT: マルチエージェントフレームワーク
</p>

<p align="center">
[ <a href="../README.md"> En </a> |
<a href="README_CN.md"> 中 </a> |
<a href="README_FR.md"> Fr </a> |
<b> 日 </b> ]
<b>GPT にさまざまな役割を割り当てることで、複雑なタスクのための共同ソフトウェアエンティティを形成します。</b>
</p>

@@ -12,18 +16,6 @@ # MetaGPT: マルチエージェントフレームワーク
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"></a>
<a href="https://discord.gg/DYn29wFk9z"><img src="https://dcbadge.vercel.app/api/server/DYn29wFk9z?style=flat" alt="Discord Follow"></a>
<a href="https://twitter.com/MetaGPT_"><img src="https://img.shields.io/twitter/follow/MetaGPT?style=social" alt="Twitter Follow"></a>
<a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/geekan/MetaGPT"><img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode" alt="Open in Dev Containers"></a>
<a href="https://codespaces.new/geekan/MetaGPT"><img src="https://img.shields.io/badge/Github_Codespace-Open-blue?logo=github" alt="Open in GitHub Codespaces"></a>
</p>

<h4 align="center">
<p>
<a href="../README.md">English</a> |
<a href="README_CN.md">简体中文</a> |
<a href="README_FR.md">Français</a> |
<b>日本語</b>
</p>
</h4>

1. MetaGPT は、**1 行の要件** を入力とし、**ユーザーストーリー / 競合分析 / 要件 / データ構造 / API / 文書など** を出力します。
2. MetaGPT には、**プロダクト マネージャー、アーキテクト、プロジェクト マネージャー、エンジニア** が含まれています。MetaGPT は、**ソフトウェア会社のプロセス全体を、慎重に調整された SOP とともに提供します。**