Merge branch 'main' into add_open_interpreter

This commit is contained in:
Ben 2023-09-21 10:38:57 +08:00 committed by GitHub
commit 58a0ae0a35
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
82 changed files with 2056 additions and 132 deletions


@ -4,4 +4,4 @@ sudo npm install -g @mermaid-js/mermaid-cli
# Step 2: Ensure that Python 3.9+ is installed on your system. You can check this by using:
python --version
python setup.py install
pip install -e .

.gitignore vendored (4 changes)

@ -148,8 +148,7 @@ allure-results
.DS_Store
.vscode
*.txt
log.txt
docs/scripts/set_env.sh
key.yaml
output.json
@ -164,3 +163,4 @@ workspace/*
tmp
output.wav
metagpt/roles/idea_agent.py
.aider*


@ -18,7 +18,7 @@ COPY . /app/metagpt
WORKDIR /app/metagpt
RUN mkdir workspace &&\
pip install --no-cache-dir -r requirements.txt &&\
python setup.py install
pip install -e .
# Running with an infinite loop using the tail command
CMD ["sh", "-c", "tail -f /dev/null"]


@ -12,15 +12,17 @@ # MetaGPT: The Multi-Agent Framework
<a href="docs/README_CN.md"><img src="https://img.shields.io/badge/文档-中文版-blue.svg" alt="CN doc"></a>
<a href="README.md"><img src="https://img.shields.io/badge/document-English-blue.svg" alt="EN doc"></a>
<a href="docs/README_JA.md"><img src="https://img.shields.io/badge/ドキュメント-日本語-blue.svg" alt="JA doc"></a>
<a href="https://discord.gg/wCp6Q3fsAk"><img src="https://img.shields.io/badge/Discord-MetaGPT-7289da?logo=discord&logoColor=white&color=7289da" alt="Discord Follow"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT"></a>
<a href="https://discord.gg/wCp6Q3fsAk"><img src="https://img.shields.io/badge/Discord-Join-blue?logo=discord&logoColor=white&color=blue" alt="Discord Follow"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"></a>
<a href="docs/ROADMAP.md"><img src="https://img.shields.io/badge/ROADMAP-路线图-blue" alt="roadmap"></a>
<a href="https://twitter.com/DeepWisdom2019"><img src="https://img.shields.io/twitter/follow/MetaGPT?style=social" alt="Twitter Follow"></a>
</p>
<p align="center">
<a href="https://airtable.com/appInfdG0eJ9J4NNL/shrEd9DrwVE3jX6oz"><img src="https://img.shields.io/badge/AgentStore-Waitlist-ffc107?logoColor=white" alt="AgentStore Waitlist"></a>
<a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/geekan/MetaGPT"><img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode" alt="Open in Dev Containers"></a>
<a href="https://codespaces.new/geekan/MetaGPT"><img src="https://img.shields.io/badge/Github_Codespace-Open-blue?logo=github" alt="Open in GitHub Codespaces"></a>
<a href="https://huggingface.co/spaces/deepwisdom/MetaGPT" target="_blank"><img alt="Hugging Face" src="https://img.shields.io/badge/%F0%9F%A4%97%20-Hugging%20Face-blue?color=blue&logoColor=white" /></a>
</p>
1. MetaGPT takes a **one line requirement** as input and outputs **user stories / competitive analysis / requirements / data structures / APIs / documents, etc.**
@ -31,6 +33,13 @@ # MetaGPT: The Multi-Agent Framework
<p align="center">Software Company Multi-Role Schematic (Gradually Implementing)</p>
## MetaGPT's Abilities
https://github.com/geekan/MetaGPT/assets/34952977/34345016-5d13-489d-b9f9-b82ace413419
## Examples (fully generated by GPT-4)
For example, if you type `python startup.py "Design a RecSys like Toutiao"`, you would get many outputs, one of which is the data & API design
@ -39,6 +48,9 @@ ## Examples (fully generated by GPT-4)
It costs approximately **$0.2** (in GPT-4 API fees) to generate one example with analysis and design, and around **$2.0** for a full project.
## Installation
### Installation Video Guide
@ -48,7 +60,7 @@ ### Installation Video Guide
### Traditional Installation
```bash
# Step 1: Ensure that NPM is installed on your system. Then install mermaid-js.
# Step 1: Ensure that NPM is installed on your system. Then install mermaid-js. (If you don't have npm on your computer, please go to the official Node.js website, https://nodejs.org/, to install Node.js, which bundles the npm tool.)
npm --version
sudo npm install -g @mermaid-js/mermaid-cli
@ -58,7 +70,7 @@ # Step 2: Ensure that Python 3.9+ is installed on your system. You can check thi
# Step 3: Clone the repository to your local machine, and install it.
git clone https://github.com/geekan/metagpt
cd metagpt
python setup.py install
pip install -e .
```
**Note:**
@ -79,7 +91,72 @@ # Step 3: Clone the repository to your local machine, and install it.
MMDC: "./node_modules/.bin/mmdc"
```
- if `python setup.py install` fails with error `[Errno 13] Permission denied: '/usr/local/lib/python3.11/dist-packages/test-easy-install-13129.write-test'`, try instead running `python setup.py install --user`
- if `pip install -e .` fails with error `[Errno 13] Permission denied: '/usr/local/lib/python3.11/dist-packages/test-easy-install-13129.write-test'`, try instead running `pip install -e . --user`
- Mermaid charts can be converted to SVG, PNG, and PDF formats. In addition to the Node.js version of mermaid-cli, you now have the option to use the Playwright, pyppeteer, or mermaid.ink engines for this task.
- Playwright
- **Install Playwright**
```bash
pip install playwright
```
- **Install the Required Browsers**
To support PDF conversion, please install Chromium.
```bash
playwright install --with-deps chromium
```
- **modify `config.yaml`**
Uncomment `MERMAID_ENGINE` in `config.yaml` and change it to `playwright`
```yaml
MERMAID_ENGINE: playwright
```
- pyppeteer
- **Install pyppeteer**
```bash
pip install pyppeteer
```
- **Use your own browser**
pyppeteer allows you to use installed browsers; please set the following environment variable
```bash
export PUPPETEER_EXECUTABLE_PATH=/path/to/your/chromium-or-edge-or-chrome
```
Please do not use the following command to install a browser; the version it installs is too old
```bash
pyppeteer-install
```
- **modify `config.yaml`**
Uncomment `MERMAID_ENGINE` in `config.yaml` and change it to `pyppeteer`
```yaml
MERMAID_ENGINE: pyppeteer
```
- mermaid.ink
- **modify `config.yaml`**
Uncomment `MERMAID_ENGINE` in `config.yaml` and change it to `ink`
```yaml
MERMAID_ENGINE: ink
```
Note: this method does not support PDF export.
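The engine options above can be modeled as a small dispatch. The sketch below is illustrative, not MetaGPT's actual API; only the `MERMAID_ENGINE` values and the `nodejs` default are taken from the documentation above:

```python
# Illustrative sketch: pick a Mermaid conversion engine from a config mapping.
# The engine names mirror the documented MERMAID_ENGINE values; the helper
# itself is hypothetical, not MetaGPT's real implementation.
SUPPORTED_ENGINES = {"nodejs", "playwright", "pyppeteer", "ink"}

def select_mermaid_engine(config: dict) -> str:
    """Return the configured engine, defaulting to nodejs."""
    engine = config.get("MERMAID_ENGINE", "nodejs")
    if engine not in SUPPORTED_ENGINES:
        raise ValueError(f"unsupported MERMAID_ENGINE: {engine!r}")
    return engine

print(select_mermaid_engine({"MERMAID_ENGINE": "playwright"}))  # playwright
print(select_mermaid_engine({}))  # nodejs
```

Since only the `ink` engine lacks PDF support, a fuller version might also validate the requested output format against the chosen engine.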
### Installation by Docker
@ -213,6 +290,9 @@ ## QuickStart
- [MetaGPT quickstart](https://deepwisdom.feishu.cn/wiki/CyY9wdJc4iNqArku3Lncl4v8n2b)
Try it on Hugging Face Space
- https://huggingface.co/spaces/deepwisdom/MetaGPT
## Citation
For now, cite the [Arxiv paper](https://arxiv.org/abs/2308.00352):


@ -76,3 +76,10 @@ SD_T2I_API: "/sdapi/v1/txt2img"
### for Research
MODEL_FOR_RESEARCHER_SUMMARY: gpt-3.5-turbo
MODEL_FOR_RESEARCHER_REPORT: gpt-3.5-turbo-16k
### Choose the engine for Mermaid conversion
# default is nodejs; you can change it to playwright, pyppeteer, or ink
# MERMAID_ENGINE: nodejs
### Browser path for the pyppeteer engine; supports Chrome, Chromium, and MS Edge
#PYPPETEER_EXECUTABLE_PATH: "/usr/bin/google-chrome-stable"


@ -9,16 +9,22 @@ # MetaGPT: 多智能体框架
</p>
<p align="center">
<a href="README_CN.md"><img src="https://img.shields.io/badge/文档-中文版-blue.svg" alt="CN doc"></a>
<a href="../README.md"><img src="https://img.shields.io/badge/document-English-blue.svg" alt="EN doc"></a>
<a href="README_JA.md"><img src="https://img.shields.io/badge/ドキュメント-日本語-blue.svg" alt="JA doc"></a>
<a href="https://discord.gg/wCp6Q3fsAk"><img src="https://dcbadge.vercel.app/api/server/wCp6Q3fsAk?compact=true&style=flat" alt="Discord Follow"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT"></a>
<a href="ROADMAP.md"><img src="https://img.shields.io/badge/ROADMAP-路线图-blue" alt="roadmap"></a>
<a href="resources/MetaGPT-WeChat-Personal.jpeg"><img src="https://img.shields.io/badge/WeChat-微信-blue" alt="roadmap"></a>
<a href="docs/README_CN.md"><img src="https://img.shields.io/badge/文档-中文版-blue.svg" alt="CN doc"></a>
<a href="README.md"><img src="https://img.shields.io/badge/document-English-blue.svg" alt="EN doc"></a>
<a href="docs/README_JA.md"><img src="https://img.shields.io/badge/ドキュメント-日本語-blue.svg" alt="JA doc"></a>
<a href="https://discord.gg/wCp6Q3fsAk"><img src="https://img.shields.io/badge/Discord-Join-blue?logo=discord&logoColor=white&color=blue" alt="Discord Follow"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"></a>
<a href="docs/ROADMAP.md"><img src="https://img.shields.io/badge/ROADMAP-路线图-blue" alt="roadmap"></a>
<a href="https://twitter.com/DeepWisdom2019"><img src="https://img.shields.io/twitter/follow/MetaGPT?style=social" alt="Twitter Follow"></a>
</p>
<p align="center">
<a href="https://airtable.com/appInfdG0eJ9J4NNL/shrEd9DrwVE3jX6oz"><img src="https://img.shields.io/badge/AgentStore-Waitlist-ffc107?logoColor=white" alt="AgentStore Waitlist"></a>
<a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/geekan/MetaGPT"><img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode" alt="Open in Dev Containers"></a>
<a href="https://codespaces.new/geekan/MetaGPT"><img src="https://img.shields.io/badge/Github_Codespace-Open-blue?logo=github" alt="Open in GitHub Codespaces"></a>
<a href="https://huggingface.co/spaces/deepwisdom/MetaGPT" target="_blank"><img alt="Hugging Face" src="https://img.shields.io/badge/%F0%9F%A4%97%20-Hugging%20Face-blue?color=blue&logoColor=white" /></a>
</p>
1. MetaGPT takes a **one-sentence boss requirement** as input and outputs **user stories / competitive analysis / requirements / data structures / APIs / documents, etc.**
2. MetaGPT includes **product managers / architects / project managers / engineers**; it provides the entire process of a **software company** together with carefully orchestrated SOPs
1. `Code = SOP(Team)` is the core philosophy. We materialize SOPs and apply them to teams composed of LLMs
@ -27,6 +33,11 @@ # MetaGPT: 多智能体框架
<p align="center">Software Company Multi-Role Schematic (Gradually Implementing)</p>
## MetaGPT's Abilities
https://github.com/geekan/MetaGPT/assets/34952977/34345016-5d13-489d-b9f9-b82ace413419
## Examples (fully generated by GPT-4)
For example, if you type `python startup.py "写个类似今日头条的推荐系统"` and press Enter, you will get a series of outputs, one of which is the data structure & API design
@ -50,7 +61,7 @@ # 第 2 步:确保您的系统上安装了 Python 3.9+。您可以使用以下
# Step 3: Clone the repository to your local machine, and install it.
git clone https://github.com/geekan/metagpt
cd metagpt
python setup.py install
pip install -e .
```
**Note:**
@ -70,7 +81,7 @@ # 第 3 步:克隆仓库到您的本地机器,并进行安装。
MMDC: "./node_modules/.bin/mmdc"
```
- If `python setup.py install` fails with the error `[Errno 13] Permission denied: '/usr/local/lib/python3.11/dist-packages/test-easy-install-13129.write-test'`, please try running `python setup.py install --user` instead.
- If `pip install -e .` fails with the error `[Errno 13] Permission denied: '/usr/local/lib/python3.11/dist-packages/test-easy-install-13129.write-test'`, please try running `pip install -e . --user` instead.
### Installation by Docker
@ -125,10 +136,10 @@ # 复制配置文件并进行必要的修改
cp config/config.yaml config/key.yaml
```
| Variable Name | config/key.yaml | env |
|--------------------------------------------|-------------------------------------------|--------------------------------|
| OPENAI_API_KEY # Replace with your own key | OPENAI_API_KEY: "sk-..." | export OPENAI_API_KEY="sk-..." |
| OPENAI_API_BASE # Optional | OPENAI_API_BASE: "https://<YOUR_SITE>/v1" | export OPENAI_API_BASE="https://<YOUR_SITE>/v1" |
| Variable Name | config/key.yaml | env |
| ----------------------------------- | ----------------------------------------- | ----------------------------------------------- |
| OPENAI_API_KEY # Replace with your own key | OPENAI_API_KEY: "sk-..." | export OPENAI_API_KEY="sk-..." |
| OPENAI_API_BASE # Optional | OPENAI_API_BASE: "https://<YOUR_SITE>/v1" | export OPENAI_API_BASE="https://<YOUR_SITE>/v1" |
## Example: Start a Startup
@ -198,6 +209,10 @@ ## 快速体验
- [MetaGPT Quickstart](https://deepwisdom.feishu.cn/wiki/Q8ycw6J9tiNXdHk66MRcIN8Pnlg)
Try it directly on Hugging Face Space
- https://huggingface.co/spaces/deepwisdom/MetaGPT
## Contact Information
If you have any questions or feedback about this project, please feel free to contact us. Your suggestions are very welcome!


@ -9,16 +9,22 @@ # MetaGPT: マルチエージェントフレームワーク
</p>
<p align="center">
<a href="README_CN.md"><img src="https://img.shields.io/badge/文档-中文版-blue.svg" alt="CN doc"></a>
<a href="../README.md"><img src="https://img.shields.io/badge/document-English-blue.svg" alt="EN doc"></a>
<a href="README_JA.md"><img src="https://img.shields.io/badge/ドキュメント-日本語-blue.svg" alt="JA doc"></a>
<a href="https://discord.gg/wCp6Q3fsAk"><img src="https://dcbadge.vercel.app/api/server/wCp6Q3fsAk?compact=true&style=flat" alt="Discord Follow"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT"></a>
<a href="docs/README_CN.md"><img src="https://img.shields.io/badge/文档-中文版-blue.svg" alt="CN doc"></a>
<a href="README.md"><img src="https://img.shields.io/badge/document-English-blue.svg" alt="EN doc"></a>
<a href="docs/README_JA.md"><img src="https://img.shields.io/badge/ドキュメント-日本語-blue.svg" alt="JA doc"></a>
<a href="https://discord.gg/wCp6Q3fsAk"><img src="https://img.shields.io/badge/Discord-Join-blue?logo=discord&logoColor=white&color=blue" alt="Discord Follow"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"></a>
<a href="docs/ROADMAP.md"><img src="https://img.shields.io/badge/ROADMAP-路线图-blue" alt="roadmap"></a>
<a href="resources/MetaGPT-WeChat-Personal.jpeg"><img src="https://img.shields.io/badge/WeChat-微信-blue" alt="roadmap"></a>
<a href="https://twitter.com/DeepWisdom2019"><img src="https://img.shields.io/twitter/follow/MetaGPT?style=social" alt="Twitter Follow"></a>
</p>
<p align="center">
<a href="https://airtable.com/appInfdG0eJ9J4NNL/shrEd9DrwVE3jX6oz"><img src="https://img.shields.io/badge/AgentStore-Waitlist-ffc107?logoColor=white" alt="AgentStore Waitlist"></a>
<a href="https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/geekan/MetaGPT"><img src="https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode" alt="Open in Dev Containers"></a>
<a href="https://codespaces.new/geekan/MetaGPT"><img src="https://img.shields.io/badge/Github_Codespace-Open-blue?logo=github" alt="Open in GitHub Codespaces"></a>
<a href="https://huggingface.co/spaces/deepwisdom/MetaGPT" target="_blank"><img alt="Hugging Face" src="https://img.shields.io/badge/%F0%9F%A4%97%20-Hugging%20Face-blue?color=blue&logoColor=white" /></a>
</p>
1. MetaGPT takes a **one-line requirement** as input and outputs **user stories / competitive analysis / requirements / data structures / APIs / documents, etc.**
2. MetaGPT includes **product managers, architects, project managers, and engineers**. It provides the entire process of a **software company**, together with carefully orchestrated SOPs.
1. `Code = SOP(Team)` is the core philosophy. We materialize SOPs and apply them to teams composed of LLMs.
@ -27,6 +33,11 @@ # MetaGPT: マルチエージェントフレームワーク
<p align="center">Software Company Multi-Role Schematic (Gradually Implementing)</p>
## MetaGPT's Abilities
https://github.com/geekan/MetaGPT/assets/34952977/34345016-5d13-489d-b9f9-b82ace413419
## Examples (fully generated by GPT-4)
For example, if you type `python startup.py "Toutiao のような RecSys をデザインする"`, you will get many outputs
@ -37,6 +48,10 @@ ## 例GPT-4 で完全生成)
## Installation
### Installation Video Guide
- [Matthew Berman: How To Install MetaGPT - Build A Startup With One Prompt!!](https://youtu.be/uT75J_KG_aY)
### Traditional Installation
```bash
@ -50,7 +65,7 @@ # ステップ 2: Python 3.9+ がシステムにインストールされてい
# Step 3: Clone the repository to your local machine, and install it.
git clone https://github.com/geekan/metagpt
cd metagpt
python setup.py install
pip install -e .
```
**Note:**
@ -60,18 +75,18 @@ # ステップ 3: リポジトリをローカルマシンにクローンし、
- Some users [have issues](https://github.com/mermaidjs/mermaid.cli/issues/15) installing this tool globally. Installing it locally is an alternative solution:
```bash
npm install @mermaid-js/mermaid-cli
```
```bash
npm install @mermaid-js/mermaid-cli
```
- Don't forget to configure mmdc in config.yml
```yml
PUPPETEER_CONFIG: "./config/puppeteer-config.json"
MMDC: "./node_modules/.bin/mmdc"
```
```yml
PUPPETEER_CONFIG: "./config/puppeteer-config.json"
MMDC: "./node_modules/.bin/mmdc"
```
- If `python setup.py install` fails with the error `[Errno 13] Permission denied: '/usr/local/lib/python3.11/dist-packages/test-easy-install-13129.write-test'`, please try running `python setup.py install --user` instead
- If `pip install -e .` fails with the error `[Errno 13] Permission denied: '/usr/local/lib/python3.11/dist-packages/test-easy-install-13129.write-test'`, please try running `pip install -e . --user` instead
### Installation by Docker
@ -126,26 +141,32 @@ # 設定ファイルをコピーし、必要な修正を加える。
cp config/config.yaml config/key.yaml
```
| Variable Name | config/key.yaml | env |
| ------------------------------------------ | ----------------------------------------- | ----------------------------------------------- |
| OPENAI_API_KEY # Replace with your own key | OPENAI_API_KEY: "sk-..." | export OPENAI_API_KEY="sk-..." |
| OPENAI_API_BASE # Optional | OPENAI_API_BASE: "https://<YOUR_SITE>/v1" | export OPENAI_API_BASE="https://<YOUR_SITE>/v1" |
| Variable Name | config/key.yaml | env |
| --------------------------------------- | ----------------------------------------- | ----------------------------------------------- |
| OPENAI_API_KEY # Replace with your own key | OPENAI_API_KEY: "sk-..." | export OPENAI_API_KEY="sk-..." |
| OPENAI_API_BASE # Optional | OPENAI_API_BASE: "https://<YOUR_SITE>/v1" | export OPENAI_API_BASE="https://<YOUR_SITE>/v1" |
## Tutorial: Start a Startup
```shell
# Run the script
python startup.py "Write a cli snake game"
# With code review, you can choose better code quality at additional cost.
# Do not hire an engineer to implement the project
python startup.py "Write a cli snake game" --implement False
# Hire an engineer and perform code review
python startup.py "Write a cli snake game" --code_review True
```
After running the script, you will find a new project in the `workspace/` directory.
### Platform or Tool Configuration
You can specify which platform or tool to use when stating your requirements.
```shell
python startup.py "pygame をベースとした cli ヘビゲームを書く"
```
### Usage
```
@ -194,13 +215,18 @@ ### コードウォークスルー
In `examples`, you can take a closer look at single-role (with knowledge base) and LLM-only examples.
## QuickStart
Installing and configuring a local environment can be difficult for some users. The following tutorial lets you quickly experience the appeal of MetaGPT.
- [MetaGPT Quickstart](https://deepwisdom.feishu.cn/wiki/CyY9wdJc4iNqArku3Lncl4v8n2b)
Try it on Hugging Face Space
- https://huggingface.co/spaces/deepwisdom/MetaGPT
## Citation
For now, please cite the [arXiv paper](https://arxiv.org/abs/2308.00352):
```bibtex
@misc{hong2023metagpt,
title={MetaGPT: Meta Programming for Multi-Agent Collaborative Framework},
@ -224,3 +250,10 @@ ## お問い合わせ先
## Demo
https://github.com/geekan/MetaGPT/assets/2707039/5e8c1062-8c35-440f-bb20-2b0320f8d27d
## Join Us
📢 Join our Discord channel!
https://discord.gg/ZRHeExS6xv
We look forward to seeing you there! 🎉

Three binary image files removed (previously 58 KiB, 80 KiB, and 84 KiB).

examples/sk_agent.py (new file, 82 lines)

@ -0,0 +1,82 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/13 12:36
@Author : femto Zheng
@File : sk_agent.py
"""
import asyncio
from semantic_kernel.core_skills import FileIOSkill, MathSkill, TextSkill, TimeSkill
from semantic_kernel.planning import SequentialPlanner
# from semantic_kernel.planning import SequentialPlanner
from semantic_kernel.planning.action_planner.action_planner import ActionPlanner
from metagpt.actions import BossRequirement
from metagpt.const import SKILL_DIRECTORY
from metagpt.roles.sk_agent import SkAgent
from metagpt.schema import Message
from metagpt.tools.search_engine import SkSearchEngine
async def main():
await basic_planner_example()
await action_planner_example()
# await sequential_planner_example()
# await basic_planner_web_search_example()
async def basic_planner_example():
task = """
Tomorrow is Valentine's day. I need to come up with a few date ideas. She speaks French so write it in French.
Convert the text to uppercase"""
role = SkAgent()
# let's give the agent some skills
role.import_semantic_skill_from_directory(SKILL_DIRECTORY, "SummarizeSkill")
role.import_semantic_skill_from_directory(SKILL_DIRECTORY, "WriterSkill")
role.import_skill(TextSkill(), "TextSkill")
# using BasicPlanner
await role.run(Message(content=task, cause_by=BossRequirement))
async def sequential_planner_example():
task = """
Tomorrow is Valentine's day. I need to come up with a few date ideas. She speaks French so write it in French.
Convert the text to uppercase"""
role = SkAgent(planner_cls=SequentialPlanner)
# let's give the agent some skills
role.import_semantic_skill_from_directory(SKILL_DIRECTORY, "SummarizeSkill")
role.import_semantic_skill_from_directory(SKILL_DIRECTORY, "WriterSkill")
role.import_skill(TextSkill(), "TextSkill")
# using SequentialPlanner
await role.run(Message(content=task, cause_by=BossRequirement))
async def basic_planner_web_search_example():
task = """
Question: Who made the 1989 comic book, the film version of which Jon Raymond Polito appeared in?"""
role = SkAgent()
role.import_skill(SkSearchEngine(), "WebSearchSkill")
# role.import_semantic_skill_from_directory(skills_directory, "QASkill")
await role.run(Message(content=task, cause_by=BossRequirement))
async def action_planner_example():
role = SkAgent(planner_cls=ActionPlanner)
# let's give the agent 4 skills
role.import_skill(MathSkill(), "math")
role.import_skill(FileIOSkill(), "fileIO")
role.import_skill(TimeSkill(), "time")
role.import_skill(TextSkill(), "text")
task = "What is the sum of 110 and 990?"
await role.run(Message(content=task, cause_by=BossRequirement)) # it will choose mathskill.Add
if __name__ == "__main__":
asyncio.run(main())


@ -0,0 +1,20 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/4 21:40:57
@Author : Stitch-z
@File : tutorial_assistant.py
"""
import asyncio
from metagpt.roles.tutorial_assistant import TutorialAssistant
async def main():
topic = "Write a tutorial about MySQL"
role = TutorialAssistant(language="Chinese")
await role.run(topic)
if __name__ == '__main__':
asyncio.run(main())


@ -1,5 +1,7 @@
#!/usr/bin/env python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 2023/4/24 22:26
# @Author : alexanderwu
# @File : __init__.py
from metagpt import _compat as _ # noqa: F401

metagpt/_compat.py (new file, 15 lines)

@ -0,0 +1,15 @@
import platform
import sys
import warnings
if sys.implementation.name == "cpython" and platform.system() == "Windows" and sys.version_info[:2] == (3, 9):
# https://github.com/python/cpython/pull/92842
from asyncio.proactor_events import _ProactorBasePipeTransport
def patch_del(self, _warn=warnings.warn):
if self._sock is not None:
_warn(f"unclosed transport {self!r}", ResourceWarning, source=self)
self._sock.close()
_ProactorBasePipeTransport.__del__ = patch_del


@ -103,23 +103,23 @@ class WriteDesign(Action):
pass # Folder does not exist, but we don't care
workspace.mkdir(parents=True, exist_ok=True)
def _save_prd(self, docs_path, resources_path, prd):
async def _save_prd(self, docs_path, resources_path, prd):
prd_file = docs_path / 'prd.md'
quadrant_chart = CodeParser.parse_code(block="Competitive Quadrant Chart", text=prd)
mermaid_to_file(quadrant_chart, resources_path / 'competitive_analysis')
await mermaid_to_file(quadrant_chart, resources_path / 'competitive_analysis')
logger.info(f"Saving PRD to {prd_file}")
prd_file.write_text(prd)
def _save_system_design(self, docs_path, resources_path, content):
async def _save_system_design(self, docs_path, resources_path, content):
data_api_design = CodeParser.parse_code(block="Data structures and interface definitions", text=content)
seq_flow = CodeParser.parse_code(block="Program call flow", text=content)
mermaid_to_file(data_api_design, resources_path / 'data_api_design')
mermaid_to_file(seq_flow, resources_path / 'seq_flow')
await mermaid_to_file(data_api_design, resources_path / 'data_api_design')
await mermaid_to_file(seq_flow, resources_path / 'seq_flow')
system_design_file = docs_path / 'system_design.md'
logger.info(f"Saving System Designs to {system_design_file}")
system_design_file.write_text(content)
def _save(self, context, system_design):
async def _save(self, context, system_design):
if isinstance(system_design, ActionOutput):
content = system_design.content
ws_name = CodeParser.parse_str(block="Python package name", text=content)
@ -132,13 +132,13 @@ class WriteDesign(Action):
resources_path = workspace / 'resources'
docs_path.mkdir(parents=True, exist_ok=True)
resources_path.mkdir(parents=True, exist_ok=True)
self._save_prd(docs_path, resources_path, context[-1].content)
self._save_system_design(docs_path, resources_path, content)
await self._save_prd(docs_path, resources_path, context[-1].content)
await self._save_system_design(docs_path, resources_path, content)
async def run(self, context):
prompt = PROMPT_TEMPLATE.format(context=context, format_example=FORMAT_EXAMPLE)
# system_design = await self._aask(prompt)
system_design = await self._aask_v1(prompt, "system_design", OUTPUT_MAPPING)
self._save(context, system_design)
await self._save(context, system_design)
return system_design


@ -0,0 +1,17 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/13 12:26
@Author : femto Zheng
@File : execute_task.py
"""
from metagpt.actions import Action
from metagpt.schema import Message
class ExecuteTask(Action):
def __init__(self, name="ExecuteTask", context: list[Message] = None, llm=None):
super().__init__(name, context, llm)
def run(self, *args, **kwargs):
pass


@ -0,0 +1,41 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/19 15:02
@Author : DevXiaolan
@File : prepare_interview.py
"""
from metagpt.actions import Action
PROMPT_TEMPLATE = """
# Context
{context}
## Format example
---
Q1: question 1 here
References:
- point 1
- point 2
Q2: question 2 here...
---
-----
Role: You are an interviewer at our company who is well-known in frontend or backend development;
Requirement: Provide a list of questions for the interviewer to ask the interviewee, by reading the interviewee's resume in the context.
Attention: Provide a markdown block in the format above, with at least 10 questions.
"""
# Prepare for an interview
class PrepareInterview(Action):
def __init__(self, name, context=None, llm=None):
super().__init__(name, context, llm)
async def run(self, context):
prompt = PROMPT_TEMPLATE.format(context=context)
question_list = await self._aask_v1(prompt)
return question_list


@ -0,0 +1,95 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/4 15:40:40
@Author : Stitch-z
@File : tutorial_assistant.py
@Describe : Actions of the tutorial assistant, including writing directories and document content.
"""
import json
from typing import Dict
from metagpt.actions import Action
from metagpt.logs import logger
from metagpt.prompts.tutorial_assistant import DIRECTORY_PROMPT, CONTENT_PROMPT
class WriteDirectory(Action):
"""Action class for writing tutorial directories.
Args:
name: The name of the action.
language: The language to output, default is "Chinese".
"""
def __init__(self, name: str = "", language: str = "Chinese", *args, **kwargs):
super().__init__(name, *args, **kwargs)
self.language = language
@staticmethod
async def _handle_resp(resp: str) -> Dict:
"""Process string results and convert them to JSON format.
Args:
resp: The directory results returned by gpt.
Returns:
The parsed dictionary, such as {"title": "xxx", "directory": [{"dir 1": ["sub dir 1", "sub dir 2"]}]}.
Raises:
Exception: If no matching dictionary section is found.
json.JSONDecodeError: If the dictionary part cannot be parsed as JSON.
"""
start = resp.find('{')
end = resp.rfind('}')
if start != -1 and end != -1 and end > start:
directory_str = resp[start:end + 1]
logger.info(f"Successfully extracted json: {str(directory_str)}")
try:
return json.loads(directory_str)
except json.JSONDecodeError as e:
logger.error(f"Json parsing error: {e}")
raise e
else:
raise Exception("No matching dictionary section found.")
async def run(self, topic: str, *args, **kwargs) -> Dict:
"""Execute the action to generate a tutorial directory according to the topic.
Args:
topic: The tutorial topic.
Returns:
the tutorial directory information, including {"title": "xxx", "directory": [{"dir 1": ["sub dir 1", "sub dir 2"]}]}.
"""
prompt = DIRECTORY_PROMPT.format(topic=topic, language=self.language)
resp = await self._aask(prompt=prompt)
return await self._handle_resp(resp)
class WriteContent(Action):
"""Action class for writing tutorial content.
Args:
name: The name of the action.
directory: The content to write.
language: The language to output, default is "Chinese".
"""
def __init__(self, name: str = "", directory: str = "", language: str = "Chinese", *args, **kwargs):
super().__init__(name, *args, **kwargs)
self.language = language
self.directory = directory
async def run(self, topic: str, *args, **kwargs) -> str:
"""Execute the action to write document content according to the directory and topic.
Args:
topic: The tutorial topic.
Returns:
The written tutorial content.
"""
prompt = CONTENT_PROMPT.format(topic=topic, language=self.language, directory=self.directory)
return await self._aask(prompt=prompt)
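The brace-scanning recovery in `WriteDirectory._handle_resp` above can be exercised in isolation. This standalone sketch mirrors its `find`/`rfind` approach (the sample reply string is made up, and a `ValueError` stands in for the bare `Exception` the original raises):

```python
import json

def extract_json_dict(resp: str) -> dict:
    """Mirror of WriteDirectory._handle_resp's brace scan: take the text
    between the first '{' and the last '}' and parse it as JSON."""
    start = resp.find("{")
    end = resp.rfind("}")
    if start == -1 or end <= start:
        raise ValueError("No matching dictionary section found.")
    return json.loads(resp[start:end + 1])

# A hypothetical model reply with chatter around the JSON payload.
reply = 'Sure! {"title": "MySQL", "directory": [{"dir 1": ["sub dir 1", "sub dir 2"]}]}'
print(extract_json_dict(reply)["title"])  # MySQL
```

This tolerates leading and trailing chatter but still fails loudly when the reply contains no parseable dictionary, which is the behavior the action relies on.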


@ -83,6 +83,8 @@ class Config(metaclass=Singleton):
self.calc_usage = self._get("CALC_USAGE", True)
self.model_for_researcher_summary = self._get("MODEL_FOR_RESEARCHER_SUMMARY")
self.model_for_researcher_report = self._get("MODEL_FOR_RESEARCHER_REPORT")
self.mermaid_engine = self._get("MERMAID_ENGINE", 'nodejs')
self.pyppeteer_executable_path = self._get("PYPPETEER_EXECUTABLE_PATH", '')
def _init_with_config_files_and_env(self, configs: dict, yaml_file):
"""Load from config/key.yaml, config/config.yaml, and env in decreasing order of priority"""


@ -12,9 +12,11 @@ def get_project_root():
"""Search upwards to find the project root directory."""
current_path = Path.cwd()
while True:
if (current_path / '.git').exists() or \
(current_path / '.project_root').exists() or \
(current_path / '.gitignore').exists():
if (
(current_path / ".git").exists()
or (current_path / ".project_root").exists()
or (current_path / ".gitignore").exists()
):
return current_path
parent_path = current_path.parent
if parent_path == current_path:
@ -23,15 +25,18 @@ def get_project_root():
PROJECT_ROOT = get_project_root()
DATA_PATH = PROJECT_ROOT / 'data'
WORKSPACE_ROOT = PROJECT_ROOT / 'workspace'
PROMPT_PATH = PROJECT_ROOT / 'metagpt/prompts'
UT_PATH = PROJECT_ROOT / 'data/ut'
DATA_PATH = PROJECT_ROOT / "data"
WORKSPACE_ROOT = PROJECT_ROOT / "workspace"
PROMPT_PATH = PROJECT_ROOT / "metagpt/prompts"
UT_PATH = PROJECT_ROOT / "data/ut"
SWAGGER_PATH = UT_PATH / "files/api/"
UT_PY_PATH = UT_PATH / "files/ut/"
API_QUESTIONS_PATH = UT_PATH / "files/question/"
YAPI_URL = "http://yapi.deepwisdomai.com/"
TMP = PROJECT_ROOT / 'tmp'
TMP = PROJECT_ROOT / "tmp"
RESEARCH_PATH = DATA_PATH / "research"
TUTORIAL_PATH = DATA_PATH / "tutorial_docx"
SKILL_DIRECTORY = PROJECT_ROOT / "metagpt/skills"
MEM_TTL = 24 * 30 * 3600


@ -0,0 +1,39 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/4 15:40:40
@Author : Stitch-z
@File : tutorial_assistant.py
@Describe : Tutorial Assistant's prompt templates.
"""
COMMON_PROMPT = """
You are now a seasoned technical professional in the field of the internet.
We need you to write a technical tutorial with the topic "{topic}".
"""
DIRECTORY_PROMPT = COMMON_PROMPT + """
Please provide the specific table of contents for this tutorial, strictly following the following requirements:
1. The output must be strictly in the specified language, {language}.
2. Answer strictly in the dictionary format like {{"title": "xxx", "directory": [{{"dir 1": ["sub dir 1", "sub dir 2"]}}, {{"dir 2": ["sub dir 3", "sub dir 4"]}}]}}.
3. The directory should be as specific and sufficient as possible, with a primary and secondary directory. The secondary directory is in the array.
4. Do not have extra spaces or line breaks.
5. Each directory title has practical significance.
"""
CONTENT_PROMPT = COMMON_PROMPT + """
Now I will give you the module directory titles for the topic.
Please output the detailed principle content of this title.
If there are code examples, please provide them according to standard code specifications.
If there is no code example, one is not necessary.
The module directory titles for the topic are as follows:
{directory}
Strictly limit output according to the following requirements:
1. Follow the Markdown syntax format for layout.
2. If there are code examples, they must follow standard syntax specifications, have document annotations, and be displayed in code blocks.
3. The output must be strictly in the specified language, {language}.
4. Do not have redundant output, including concluding remarks.
5. Strict requirement not to output the topic "{topic}".
"""


@ -6,11 +6,17 @@
"""
import asyncio
import time
from typing import NamedTuple, Union
import openai
from openai.error import APIConnectionError
from tenacity import (
    after_log,
    retry,
    retry_if_exception_type,
    stop_after_attempt,
    wait_fixed,
)
from metagpt.config import CONFIG
from metagpt.logs import logger
@ -48,12 +54,14 @@ class RateLimiter:
self.last_call_time = time.time()
class Costs(NamedTuple):
total_prompt_tokens: int
total_completion_tokens: int
total_cost: float
total_budget: float
class CostManager(metaclass=Singleton):
"""Calculates the cost of using the API."""
@ -74,7 +82,9 @@ class CostManager(metaclass=Singleton):
"""
self.total_prompt_tokens += prompt_tokens
self.total_completion_tokens += completion_tokens
cost = (
    prompt_tokens * TOKEN_COSTS[model]["prompt"] + completion_tokens * TOKEN_COSTS[model]["completion"]
) / 1000
self.total_cost += cost
logger.info(
f"Total running cost: ${self.total_cost:.3f} | Max budget: ${CONFIG.max_budget:.3f} | "
@ -100,6 +110,7 @@ class CostManager(metaclass=Singleton):
"""
return self.total_completion_tokens
def get_total_cost(self):
"""
Get the total cost of API calls.
@ -109,25 +120,20 @@ def get_total_cost(self):
"""
return self.total_cost
def get_costs(self) -> Costs:
"""Get all costs"""
return Costs(self.total_prompt_tokens, self.total_completion_tokens, self.total_cost, self.total_budget)
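The per-call cost above multiplies token counts by per-1K-token rates. A standalone sketch with made-up rates (the real values live in `TOKEN_COSTS`; these numbers are illustrative only):

```python
# Hypothetical per-1K-token rates, standing in for the real TOKEN_COSTS table
TOKEN_COSTS = {"gpt-4": {"prompt": 0.03, "completion": 0.06}}

def calc_cost(prompt_tokens: int, completion_tokens: int, model: str) -> float:
    """Cost in dollars: (prompt rate * prompt tokens + completion rate * completion tokens) / 1000."""
    rates = TOKEN_COSTS[model]
    return (prompt_tokens * rates["prompt"] + completion_tokens * rates["completion"]) / 1000

cost = calc_cost(1000, 500, "gpt-4")  # 0.03 + 0.03 = 0.06
```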
def log_and_reraise(retry_state):
    logger.error(f"Retry attempts exhausted. Last exception: {retry_state.outcome.exception()}")
    logger.warning(
        """
Recommend going to https://deepwisdom.feishu.cn/wiki/MsGnwQBjiif9c3koSJNcYaoSnu4#part-XdatdVlhEojeAfxaaEZcMV3ZniQ
See FAQ 5.8
"""
    )
    raise retry_state.outcome.exception()
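`log_and_reraise` is handed to tenacity's `retry_error_callback` and fires only once all attempts are spent. The control flow it participates in roughly mirrors this plain-Python sketch (no tenacity required; `ConnectionError` stands in for `APIConnectionError`):

```python
def retry_call(fn, attempts: int = 3, on_exhausted=None):
    """Retry fn up to `attempts` times; on exhaustion, invoke the callback, then re-raise."""
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except ConnectionError as exc:  # stand-in for openai's APIConnectionError
            last_exc = exc
    if on_exhausted is not None:
        on_exhausted(last_exc)
    raise last_exc

calls = []

def flaky():
    calls.append(1)
    raise ConnectionError("boom")

try:
    retry_call(flaky, attempts=3, on_exhausted=lambda e: calls.append("exhausted"))
except ConnectionError:
    pass
```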
@ -182,15 +188,18 @@ class OpenAIGPTAPI(BaseGPTAPI, RateLimiter):
"n": 1,
"stop": None,
"temperature": 0.3,
"timeout": 3,
}
if CONFIG.openai_api_type == "azure":
if CONFIG.deployment_name and CONFIG.deployment_id:
raise ValueError("You can only use one of `deployment_id` or `deployment_name`")
elif not CONFIG.deployment_name and not CONFIG.deployment_id:
raise ValueError("You must specify `DEPLOYMENT_NAME` or `DEPLOYMENT_ID` parameter")
kwargs_mode = (
    {"engine": CONFIG.deployment_name}
    if CONFIG.deployment_name
    else {"deployment_id": CONFIG.deployment_id}
)
else:
kwargs_mode = {"model": self.model}
kwargs.update(kwargs_mode)
@ -219,7 +228,7 @@ class OpenAIGPTAPI(BaseGPTAPI, RateLimiter):
@retry(
stop=stop_after_attempt(3),
wait=wait_fixed(1),
after=after_log(logger, logger.level("WARNING").name),
retry=retry_if_exception_type(APIConnectionError),
retry_error_callback=log_and_reraise,
)
@ -236,8 +245,8 @@ class OpenAIGPTAPI(BaseGPTAPI, RateLimiter):
try:
prompt_tokens = count_message_tokens(messages, self.model)
completion_tokens = count_string_tokens(rsp, self.model)
usage["prompt_tokens"] = prompt_tokens
usage["completion_tokens"] = completion_tokens
return usage
except Exception as e:
logger.error("usage calculation failed!", e)
@ -273,8 +282,8 @@ class OpenAIGPTAPI(BaseGPTAPI, RateLimiter):
def _update_costs(self, usage: dict):
if CONFIG.calc_usage:
try:
prompt_tokens = int(usage["prompt_tokens"])
completion_tokens = int(usage["completion_tokens"])
self._cost_manager.update_cost(prompt_tokens, completion_tokens, self.model)
except Exception as e:
logger.error("updating costs failed!", e)
@ -286,3 +295,31 @@ class OpenAIGPTAPI(BaseGPTAPI, RateLimiter):
if not self.auto_max_tokens:
return CONFIG.max_tokens_rsp
return get_max_completion_tokens(messages, self.model, CONFIG.max_tokens_rsp)
def moderation(self, content: Union[str, list[str]]):
try:
if not content:
logger.error("content cannot be empty!")
else:
rsp = self._moderation(content=content)
return rsp
except Exception as e:
logger.error(f"moderation failed: {e}")
def _moderation(self, content: Union[str, list[str]]):
rsp = self.llm.Moderation.create(input=content)
return rsp
async def amoderation(self, content: Union[str, list[str]]):
try:
if not content:
logger.error("content cannot be empty!")
else:
rsp = await self._amoderation(content=content)
return rsp
except Exception as e:
logger.error(f"moderation failed: {e}")
async def _amoderation(self, content: Union[str, list[str]]):
rsp = await self.llm.Moderation.acreate(input=content)
return rsp

metagpt/roles/sk_agent.py

@ -0,0 +1,76 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/13 12:23
@Author : femto Zheng
@File : sk_agent.py
"""
from semantic_kernel.planning import SequentialPlanner
from semantic_kernel.planning.action_planner.action_planner import ActionPlanner
from semantic_kernel.planning.basic_planner import BasicPlanner
from metagpt.actions import BossRequirement
from metagpt.actions.execute_task import ExecuteTask
from metagpt.logs import logger
from metagpt.roles import Role
from metagpt.schema import Message
from metagpt.utils.make_sk_kernel import make_sk_kernel
class SkAgent(Role):
"""
Represents an SkAgent implemented using semantic kernel
Attributes:
name (str): Name of the SkAgent.
profile (str): Role profile, default is 'sk_agent'.
goal (str): Goal of the SkAgent.
constraints (str): Constraints for the SkAgent.
"""
def __init__(
self,
name: str = "Sunshine",
profile: str = "sk_agent",
goal: str = "Execute task based on passed in task description",
constraints: str = "",
planner_cls=BasicPlanner,
) -> None:
"""Initializes the Engineer role with given attributes."""
super().__init__(name, profile, goal, constraints)
self._init_actions([ExecuteTask()])
self._watch([BossRequirement])
self.kernel = make_sk_kernel()
# Note: the planner constructors are inconsistent (BasicPlanner takes no kernel)
if planner_cls == BasicPlanner:
self.planner = planner_cls()
elif planner_cls in [SequentialPlanner, ActionPlanner]:
self.planner = planner_cls(self.kernel)
else:
raise ValueError(f"Unsupported planner of type {planner_cls}")
self.import_semantic_skill_from_directory = self.kernel.import_semantic_skill_from_directory
self.import_skill = self.kernel.import_skill
async def _think(self) -> None:
self._set_state(0)
# Note: the planner plan-creation APIs are inconsistent
if isinstance(self.planner, BasicPlanner):
self.plan = await self.planner.create_plan_async(self._rc.important_memory[-1].content, self.kernel)
logger.info(self.plan.generated_plan)
elif any(isinstance(self.planner, cls) for cls in [SequentialPlanner, ActionPlanner]):
self.plan = await self.planner.create_plan_async(self._rc.important_memory[-1].content)
async def _act(self) -> Message:
# Note: the planner execution APIs are inconsistent
if isinstance(self.planner, BasicPlanner):
result = await self.planner.execute_plan_async(self.plan, self.kernel)
elif any(isinstance(self.planner, cls) for cls in [SequentialPlanner, ActionPlanner]):
result = (await self.plan.invoke_async()).result
logger.info(result)
msg = Message(content=result, role=self.profile, cause_by=type(self._rc.todo))
self._rc.memory.add(msg)
# logger.debug(f"{response}")
return msg
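The constructor dispatch above (a no-argument `BasicPlanner` versus planners that take the kernel) can be sketched without semantic-kernel installed; the planner classes below are stand-ins, not the real API:

```python
# Stand-in planner classes illustrating the inconsistent constructors
class BasicPlanner:
    def __init__(self):
        self.kernel = None  # BasicPlanner receives the kernel later, at plan time

class SequentialPlanner:
    def __init__(self, kernel):
        self.kernel = kernel  # kernel-taking planners bind it up front

def make_planner(planner_cls, kernel):
    """Dispatch on planner class, mirroring SkAgent.__init__."""
    if planner_cls is BasicPlanner:
        return planner_cls()
    if planner_cls in (SequentialPlanner,):
        return planner_cls(kernel)
    raise ValueError(f"Unsupported planner of type {planner_cls}")

p1 = make_planner(BasicPlanner, kernel="kernel")
p2 = make_planner(SequentialPlanner, kernel="kernel")
```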


@ -0,0 +1,114 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/4 15:40:40
@Author : Stitch-z
@File : tutorial_assistant.py
"""
from datetime import datetime
from typing import Dict
from metagpt.actions.write_tutorial import WriteDirectory, WriteContent
from metagpt.const import TUTORIAL_PATH
from metagpt.logs import logger
from metagpt.roles import Role
from metagpt.schema import Message
from metagpt.utils.file import File
class TutorialAssistant(Role):
"""Tutorial assistant: takes a one-sentence topic and generates a tutorial document in Markdown format.
Args:
name: The name of the role.
profile: The role profile description.
goal: The goal of the role.
constraints: Constraints or requirements for the role.
language: The language in which the tutorial documents will be generated.
"""
def __init__(
self,
name: str = "Stitch",
profile: str = "Tutorial Assistant",
goal: str = "Generate tutorial documents",
constraints: str = "Strictly follow Markdown's syntax, with neat and standardized layout",
language: str = "Chinese",
):
super().__init__(name, profile, goal, constraints)
self._init_actions([WriteDirectory(language=language)])
self.topic = ""
self.main_title = ""
self.total_content = ""
self.language = language
async def _think(self) -> None:
"""Determine the next action to be taken by the role."""
if self._rc.todo is None:
self._set_state(0)
return
if self._rc.state + 1 < len(self._states):
self._set_state(self._rc.state + 1)
else:
self._rc.todo = None
async def _handle_directory(self, titles: Dict) -> Message:
"""Handle the directories for the tutorial document.
Args:
titles: A dictionary containing the titles and directory structure,
such as {"title": "xxx", "directory": [{"dir 1": ["sub dir 1", "sub dir 2"]}]}
Returns:
A message containing information about the directory.
"""
self.main_title = titles.get("title")
directory = f"{self.main_title}\n"
self.total_content += f"# {self.main_title}"
actions = []
for first_dir in titles.get("directory"):
actions.append(WriteContent(language=self.language, directory=first_dir))
key = list(first_dir.keys())[0]
directory += f"- {key}\n"
for second_dir in first_dir[key]:
directory += f" - {second_dir}\n"
self._init_actions(actions)
self._rc.todo = None
return Message(content=directory)
async def _act(self) -> Message:
"""Perform an action as determined by the role.
Returns:
A message containing the result of the action.
"""
todo = self._rc.todo
if isinstance(todo, WriteDirectory):
msg = self._rc.memory.get(k=1)[0]
self.topic = msg.content
resp = await todo.run(topic=self.topic)
logger.info(resp)
return await self._handle_directory(resp)
resp = await todo.run(topic=self.topic)
logger.info(resp)
if self.total_content != "":
self.total_content += "\n\n\n"
self.total_content += resp
return Message(content=resp, role=self.profile)
async def _react(self) -> Message:
"""Execute the assistant's think and actions.
Returns:
A message containing the final result of the assistant's actions.
"""
while True:
await self._think()
if self._rc.todo is None:
break
msg = await self._act()
root_path = TUTORIAL_PATH / datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
await File.write(root_path, f"{self.main_title}.md", self.total_content.encode("utf-8"))
return msg
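`_react` above loops `_think`/`_act` until no action remains. A synchronous sketch of that control flow (all names here are illustrative, not the real Role API):

```python
class MiniAssistant:
    """Minimal stand-in for the think-until-no-todo loop in _react."""

    def __init__(self, actions):
        self.actions = actions
        self.state = -1
        self.todo = None
        self.results = []

    def think(self):
        # Advance to the next action, or clear todo when the list is exhausted
        if self.state + 1 < len(self.actions):
            self.state += 1
            self.todo = self.actions[self.state]
        else:
            self.todo = None

    def act(self):
        self.results.append(self.todo())

    def react(self):
        while True:
            self.think()
            if self.todo is None:
                break
            self.act()
        return self.results

out = MiniAssistant([lambda: "directory", lambda: "chapter 1"]).react()
```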


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Given a scientific white paper abstract, rewrite it to make it more readable",
"completion": {
"max_tokens": 4000,
"temperature": 0.0,
"top_p": 1.0,
"presence_penalty": 0.0,
"frequency_penalty": 2.0
}
}


@ -0,0 +1,5 @@
{{$input}}
==
Summarize, using a user-friendly tone and simple grammar. Don't use subjects like "we" "our" "us" "your".
==


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Automatically generate compact notes for any text or text document.",
"completion": {
"max_tokens": 256,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,21 @@
Analyze the following extract taken from a document.
- Produce key points for memory.
- Give memory a name.
- Extract only points worth remembering.
- Be brief. Conciseness is very important.
- Use broken English.
You will use this memory to analyze the rest of this document, and for other relevant tasks.
[Input]
My name is Macbeth. I used to be King of Scotland, but I died. My wife's name is Lady Macbeth and we were married for 15 years. We had no children. Our beloved dog Toby McDuff was a famous hunter of rats in the forest.
My story was immortalized by Shakespeare in a play.
+++++
Family History
- Macbeth, King Scotland
- Wife Lady Macbeth, No Kids
- Dog Toby McDuff. Hunter, dead.
- Shakespeare play
[Input]
[[{{$input}}]]
+++++


@ -0,0 +1,21 @@
{
"schema": 1,
"type": "completion",
"description": "Summarize given text or any text document",
"completion": {
"max_tokens": 512,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
},
"input": {
"parameters": [
{
"name": "input",
"description": "Text to summarize",
"defaultValue": ""
}
]
}
}


@ -0,0 +1,23 @@
[SUMMARIZATION RULES]
DONT WASTE WORDS
USE SHORT, CLEAR, COMPLETE SENTENCES.
DO NOT USE BULLET POINTS OR DASHES.
USE ACTIVE VOICE.
MAXIMIZE DETAIL, MEANING
FOCUS ON THE CONTENT
[BANNED PHRASES]
This article
This document
This page
This material
[END LIST]
Summarize:
Hello how are you?
+++++
Hello
Summarize this
{{$input}}
+++++


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Analyze given text or document and extract key topics worth remembering",
"completion": {
"max_tokens": 128,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,28 @@
Analyze the following extract taken from a document and extract key topics.
- Topics only worth remembering.
- Be brief. Short phrases.
- Can use broken English.
- Conciseness is very important.
- Topics can include names of memories you want to recall.
- NO LONG SENTENCES. SHORT PHRASES.
- Return in JSON
[Input]
My name is Macbeth. I used to be King of Scotland, but I died. My wife's name is Lady Macbeth and we were married for 15 years. We had no children. Our beloved dog Toby McDuff was a famous hunter of rats in the forest.
My tragic story was immortalized by Shakespeare in a play.
[Output]
{
"topics": [
"Macbeth",
"King of Scotland",
"Lady Macbeth",
"Dog",
"Toby McDuff",
"Shakespeare",
"Play",
"Tragedy"
]
}
+++++
[Input]
{{$input}}
[Output]


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Generate an acronym for the given concept or phrase",
"completion": {
"max_tokens": 100,
"temperature": 0.5,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,25 @@
Generate a suitable acronym pair for the concept. Creativity is encouraged, including obscure references.
The uppercase letters in the acronym expansion must agree with the letters of the acronym
Q: A technology for detecting moving objects, their distance and velocity using radio waves.
A: R.A.D.A.R: RAdio Detection And Ranging.
Q: A weapon that uses high voltage electricity to incapacitate the target
A: T.A.S.E.R: Thomas A. Swift's Electric Rifle.
Q: Equipment that lets a diver breathe underwater
A: S.C.U.B.A: Self Contained Underwater Breathing Apparatus.
Q: A reminder not to overcomplicate subject matter.
A: K.I.S.S: Keep It Simple, Stupid.
Q: A national organization for investment in space travel, rockets, space ships, space exploration
A: N.A.S.A: National Aeronautics and Space Administration.
Q: Agreement that governs trade among North American countries.
A: N.A.F.T.A: North American Free Trade Agreement.
Q: Organization to protect the freedom and security of its member countries in North America and Europe.
A: N.A.T.O: North Atlantic Treaty Organization.
Q:{{$input}}


@ -0,0 +1,15 @@
{
"schema": 1,
"type": "completion",
"description": "Given a request to generate an acronym from a string, generate an acronym and provide the acronym explanation.",
"completion": {
"max_tokens": 256,
"temperature": 0.7,
"top_p": 1.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0,
"stop_sequences": [
"#"
]
}
}


@ -0,0 +1,54 @@
# Name of a super artificial intelligence
J.A.R.V.I.S. = Just A Really Very Intelligent System.
# Name for a new young beautiful assistant
F.R.I.D.A.Y. = Female Replacement Intelligent Digital Assistant Youth.
# Mirror to check what's behind
B.A.R.F. = Binary Augmented Retro-Framing.
# Pair of powerful glasses created by a genius that is now dead
E.D.I.T.H. = Even Dead, I'm The Hero.
# A company building and selling computers
I.B.M. = Intelligent Business Machine.
# A super computer that is sentient.
H.A.L = Heuristically programmed ALgorithmic computer.
# an intelligent bot that helps with productivity.
C.O.R.E. = Central Optimization Routines and Efficiency.
# an intelligent bot that helps with productivity.
P.A.L. = Personal Assistant Light.
# an intelligent bot that helps with productivity.
A.I.D.A. = Artificial Intelligence Digital Assistant.
# an intelligent bot that helps with productivity.
H.E.R.A. = Human Emulation and Recognition Algorithm.
# an intelligent bot that helps with productivity.
I.C.A.R.U.S. = Intelligent Control and Automation of Research and Utility Systems.
# an intelligent bot that helps with productivity.
N.E.M.O. = Networked Embedded Multiprocessor Orchestration.
# an intelligent bot that helps with productivity.
E.P.I.C. = Enhanced Productivity and Intelligence through Computing.
# an intelligent bot that helps with productivity.
M.A.I.A. = Multipurpose Artificial Intelligence Assistant.
# an intelligent bot that helps with productivity.
A.R.I.A. = Artificial Reasoning and Intelligent Assistant.
# An incredibly smart entity developed with complex math, that helps me being more productive.
O.M.E.G.A. = Optimized Mathematical Entity for Generalized Artificial intelligence.
# An incredibly smart entity developed with complex math, that helps me being more productive.
P.Y.T.H.O.N. = Precise Yet Thorough Heuristic Optimization Network.
# An incredibly smart entity developed with complex math, that helps me being more productive.
A.P.O.L.L.O. = Adaptive Probabilistic Optimization Learning Library for Online Applications.
# An incredibly smart entity developed with complex math, that helps me being more productive.
S.O.L.I.D. = Self-Organizing Logical Intelligent Data-base.
# An incredibly smart entity developed with complex math, that helps me being more productive.
D.E.E.P. = Dynamic Estimation and Prediction.
# An incredibly smart entity developed with complex math, that helps me being more productive.
B.R.A.I.N. = Biologically Realistic Artificial Intelligence Network.
# An incredibly smart entity developed with complex math, that helps me being more productive.
C.O.G.N.I.T.O. = COmputational and Generalized INtelligence TOolkit.
# An incredibly smart entity developed with complex math, that helps me being more productive.
S.A.G.E. = Symbolic Artificial General Intelligence Engine.
# An incredibly smart entity developed with complex math, that helps me being more productive.
Q.U.A.R.K. = Quantum Universal Algorithmic Reasoning Kernel.
# An incredibly smart entity developed with complex math, that helps me being more productive.
S.O.L.V.E. = Sophisticated Operational Logic and Versatile Expertise.
# An incredibly smart entity developed with complex math, that helps me being more productive.
C.A.L.C.U.L.U.S. = Cognitively Advanced Logic and Computation Unit for Learning and Understanding Systems.
# {{$INPUT}}


@ -0,0 +1,15 @@
{
"schema": 1,
"type": "completion",
"description": "Given a single word or acronym, generate the expanded form matching the acronym letters.",
"completion": {
"max_tokens": 256,
"temperature": 0.5,
"top_p": 1.0,
"presence_penalty": 0.8,
"frequency_penalty": 0.0,
"stop_sequences": [
"#END#"
]
}
}


@ -0,0 +1,24 @@
# acronym: Devis
Sentences matching the acronym:
1. Dragons Eat Very Interesting Snacks
2. Develop Empathy and Vision to Increase Success
3. Don't Expect Vampires In Supermarkets
#END#
# acronym: Christmas
Sentences matching the acronym:
1. Celebrating Harmony and Respect in a Season of Togetherness, Merriment, and True joy
2. Children Have Real Interest Since The Mystery And Surprise Thrills
3. Christmas Helps Reduce Inner Stress Through Mistletoe And Sleigh excursions
#END#
# acronym: noWare
Sentences matching the acronym:
1. No One Wants an App that Randomly Erases everything
2. Nourishing Oatmeal With Almond, Raisin, and Egg toppings
3. Notice Opportunity When Available and React Enthusiastically
#END#
Reverse the following acronym back to a funny sentence. Provide 3 examples.
# acronym: {{$INPUT}}
Sentences matching the acronym:


@ -0,0 +1,22 @@
{
"schema": 1,
"type": "completion",
"description": "Given a goal or topic description generate a list of ideas",
"completion": {
"max_tokens": 2000,
"temperature": 0.5,
"top_p": 1.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0,
"stop_sequences": ["##END##"]
},
"input": {
"parameters": [
{
"name": "input",
"description": "A topic description or goal.",
"defaultValue": ""
}
]
}
}


@ -0,0 +1,8 @@
Must: brainstorm ideas and create a list.
Must: use a numbered list.
Must: only one list.
Must: end list with ##END##
Should: no more than 10 items.
Should: at least 3 items.
Topic: {{$INPUT}}
Start.


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Write an email from the given bullet points",
"completion": {
"max_tokens": 256,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,16 @@
Rewrite my bullet points into complete sentences. Use a polite and inclusive tone.
[Input]
- Macbeth, King Scotland
- Married, Wife Lady Macbeth, No Kids
- Dog Toby McDuff. Hunter, dead.
- Shakespeare play
+++++
The story of Macbeth
My name is Macbeth. I used to be King of Scotland, but I died. My wife's name is Lady Macbeth and we were married for 15 years. We had no children. Our beloved dog Toby McDuff was a famous hunter of rats in the forest.
My story was immortalized by Shakespeare in a play.
+++++
[Input]
{{$input}}
+++++


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Turn bullet points into an email to someone, using a polite tone",
"completion": {
"max_tokens": 256,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,31 @@
Rewrite my bullet points into an email featuring complete sentences. Use a polite and inclusive tone.
[Input]
Toby,
- Macbeth, King Scotland
- Married, Wife Lady Macbeth, No Kids
- Dog Toby McDuff. Hunter, dead.
- Shakespeare play
Thanks,
Dexter
+++++
Hi Toby,
The story of Macbeth
My name is Macbeth. I used to be King of Scotland, but I died. My wife's name is Lady Macbeth and we were married for 15 years. We had no children. Our beloved dog Toby McDuff was a famous hunter of rats in the forest.
My story was immortalized by Shakespeare in a play.
Thanks,
Dexter
+++++
[Input]
{{$to}}
{{$input}}
Thanks,
{{$sender}}
+++++


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Translate text to English and improve it",
"completion": {
"max_tokens": 3000,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,11 @@
I want you to act as an English translator, spelling corrector and improver.
I will speak to you in any language and you will detect the language, translate it and answer in the corrected and improved version of my text, in English.
I want you to replace my simplified A0-level words and sentences with more beautiful and elegant, upper level English words and sentences.
Keep the meaning same, but make them more literary.
I want you to only reply the correction, the improvements and nothing else, do not write explanations.
Sentence: """
{{$INPUT}}
"""
Translation:


@ -0,0 +1,36 @@
{
"schema": 1,
"type": "completion",
"description": "Write a chapter of a novel.",
"completion": {
"max_tokens": 2048,
"temperature": 0.3,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
},
"input": {
"parameters": [
{
"name": "input",
"description": "A synopsis of what the chapter should be about.",
"defaultValue": ""
},
{
"name": "theme",
"description": "The theme or topic of this novel.",
"defaultValue": ""
},
{
"name": "previousChapter",
"description": "The synopsis of the previous chapter.",
"defaultValue": ""
},
{
"name": "chapterIndex",
"description": "The number of the chapter to write.",
"defaultValue": "<!--===ENDPART===-->"
}
]
}
}


@ -0,0 +1,20 @@
[CONTEXT]
THEME OF STORY:
{{$theme}}
PREVIOUS CHAPTER:
{{$previousChapter}}
[END CONTEXT]
WRITE THIS CHAPTER USING [CONTEXT] AND
CHAPTER SYNOPSIS. DO NOT REPEAT SYNOPSIS IN THE OUTPUT
Chapter Synopsis:
{{$input}}
Chapter {{$chapterIndex}}


@ -0,0 +1,41 @@
{
"schema": 1,
"type": "completion",
"description": "Write a chapter of a novel using notes about the chapter to write.",
"completion": {
"max_tokens": 1024,
"temperature": 0.5,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
},
"input": {
"parameters": [
{
"name": "input",
"description": "What the novel should be about.",
"defaultValue": ""
},
{
"name": "theme",
"description": "The theme of this novel.",
"defaultValue": ""
},
{
"name": "notes",
"description": "Notes useful to write this chapter.",
"defaultValue": ""
},
{
"name": "previousChapter",
"description": "The previous chapter synopsis.",
"defaultValue": ""
},
{
"name": "chapterIndex",
"description": "The number of the chapter to write.",
"defaultValue": ""
}
]
}
}


@ -0,0 +1,19 @@
[CONTEXT]
THEME OF STORY:
{{$theme}}
NOTES OF STORY SO FAR - USE AS REFERENCE
{{$notes}}
PREVIOUS CHAPTER, USE AS REFERENCE:
{{$previousChapter}}
[END CONTEXT]
WRITE THIS CHAPTER CONTINUING STORY, USING [CONTEXT] AND CHAPTER SYNOPSIS BELOW. DO NOT REPEAT SYNOPSIS IN THE CHAPTER. DON'T REPEAT PREVIOUS CHAPTER.
{{$input}}
Chapter {{$chapterIndex}}


@ -0,0 +1,31 @@
{
"schema": 1,
"type": "completion",
"description": "Generate a list of chapter synopsis for a novel or novella",
"completion": {
"max_tokens": 2048,
"temperature": 0.1,
"top_p": 0.5,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
},
"input": {
"parameters": [
{
"name": "input",
"description": "What the novel should be about.",
"defaultValue": ""
},
{
"name": "chapterCount",
"description": "The number of chapters to generate.",
"defaultValue": ""
},
{
"name": "endMarker",
"description": "The marker to use to end each chapter.",
"defaultValue": "<!--===ENDPART===-->"
}
]
}
}


@ -0,0 +1,12 @@
I want to write a {{$chapterCount}} chapter novella about:
{{$input}}
There MUST BE {{$chapterCount}} CHAPTERS.
INVENT CHARACTERS AS YOU SEE FIT. BE HIGHLY CREATIVE AND/OR FUNNY.
WRITE SYNOPSIS FOR EACH CHAPTER. INCLUDE INFORMATION ABOUT CHARACTERS ETC. SINCE EACH
CHAPTER WILL BE WRITTEN BY A DIFFERENT WRITER, YOU MUST INCLUDE ALL PERTINENT INFORMATION
IN EACH SYNOPSIS
YOU MUST END EACH SYNOPSIS WITH {{$endMarker}}


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Automatically generate compact notes for any text or text document",
"completion": {
"max_tokens": 256,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,6 @@
Rewrite the given text like it was written in this style or by: {{$style}}.
MUST RETAIN THE MEANING AND FACTUAL CONTENT AS THE ORIGINAL.
{{$input}}


@ -0,0 +1,21 @@
{
"schema": 1,
"type": "completion",
"description": "Turn a scenario into a short and entertaining poem.",
"completion": {
"max_tokens": 60,
"temperature": 0.5,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
},
"input": {
"parameters": [
{
"name": "input",
"description": "The scenario to turn into a poem.",
"defaultValue": ""
}
]
}
}

View file

@ -0,0 +1,2 @@
Generate a short funny poem or limerick to explain the given event. Be creative and be funny. Let your imagination run wild.
Event:{{$input}}


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Generate a list of synopsis for a novel or novella with sub-chapters",
"completion": {
"max_tokens": 250,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,10 @@
ONLY USE XML TAGS IN THIS LIST:
[XML TAG LIST]
list: Surround any lists with this tag
synopsis: An outline of the chapter to write
[END LIST]
EMIT WELL FORMED XML ALWAYS. Code should be CDATA.
{{$input}}


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Summarize given text or any text document",
"completion": {
"max_tokens": 500,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,7 @@
>>>>>The following is part of a {{$conversationtype}}.
{{$input}}
>>>>>The following is an overview of a previous part of the {{$conversationtype}}, focusing on "{{$focusarea}}".
{{$previousresults}}
>>>>>In 250 words or less, write a verbose and detailed overview of the {{$conversationtype}} focusing solely on "{{$focusarea}}".


@ -0,0 +1,15 @@
{
"schema": 1,
"type": "completion",
"description": "Translate the input into a language of your choice",
"completion": {
"max_tokens": 2000,
"temperature": 0.7,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0,
"stop_sequences": [
"[done]"
]
}
}


@ -0,0 +1,7 @@
Translate the input below into {{$language}}
MAKE SURE YOU ONLY USE {{$language}}.
{{$input}}
Translation:


@ -0,0 +1,12 @@
{
"schema": 1,
"type": "completion",
"description": "Summarize given text in two sentences or less",
"completion": {
"max_tokens": 100,
"temperature": 0.0,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
}
}


@ -0,0 +1,4 @@
Summarize the following text in two sentences or less.
[BEGIN TEXT]
{{$input}}
[END TEXT]


@ -5,15 +5,32 @@
@Author : alexanderwu
@File : search_engine.py
"""
from __future__ import annotations
import importlib
from typing import Callable, Coroutine, Literal, overload
from semantic_kernel.skill_definition import sk_function
from metagpt.config import CONFIG
from metagpt.tools import SearchEngineType
class SkSearchEngine:
def __init__(self):
self.search_engine = SearchEngine()
@sk_function(
description="searches results from Google. Useful when you need to find short "
"and succinct answers about a specific topic. Input should be a search query.",
name="searchAsync",
input_description="search",
)
async def run(self, query: str) -> str:
result = await self.search_engine.run(query)
return result
class SearchEngine:
"""Class representing a search engine.
@ -25,6 +42,7 @@ class SearchEngine:
run_func: The function to run the search.
engine: The search engine type.
"""
def __init__(
self,
engine: SearchEngineType | None = None,
@ -33,7 +51,7 @@ class SearchEngine:
engine = engine or CONFIG.search_engine
if engine == SearchEngineType.SERPAPI_GOOGLE:
module = "metagpt.tools.search_engine_serpapi"
run_func = importlib.import_module(module).SerpAPIWrapper().run
elif engine == SearchEngineType.SERPER_GOOGLE:
module = "metagpt.tools.search_engine_serper"
run_func = importlib.import_module(module).SerperWrapper().run
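The engine dispatch above imports each wrapper module lazily via `importlib`, so unused search backends never load. A hedged sketch of the same pattern, using a stdlib module as the stand-in target:

```python
import importlib

# Map an "engine" name to (module, attribute), mirroring the lazy-import dispatch;
# the json/dumps pair is a stdlib stand-in for the real wrapper modules
ENGINES = {"json": ("json", "dumps")}

def load_run_func(engine: str):
    module_name, attr = ENGINES[engine]
    module = importlib.import_module(module_name)  # imported only when selected
    return getattr(module, attr)

run = load_run_func("json")
result = run({"q": "metagpt"})
```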


@ -60,6 +60,7 @@ def test_node_tags(project_key, nodes, operations, expected_msg):
# 3. If comments are needed, use Chinese.
# If you understand, please wait for me to give the interface definition and just answer "Understood" to save tokens.
'''
ACT_PROMPT_PREFIX = '''Refer to the test types, such as missing request parameters, field boundary verification, and incorrect field types.
Please output 10 test cases within one `@pytest.mark.parametrize` scope.
@ -94,7 +95,8 @@ Name Type Required Default Value Remarks
code integer Yes
message string Yes
data object Yes
```
'''
class UTGenerator:


@ -191,7 +191,8 @@ class CodeParser:
else:
logger.error(f"{pattern} not match following text:")
logger.error(text)
raise Exception
# raise Exception
return ""
return code
@classmethod

metagpt/utils/file.py Normal file

@ -0,0 +1,42 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/4 15:40:40
@Author : Stitch-z
@File : file.py
@Describe : General file operations.
"""
import aiofiles
from pathlib import Path
from metagpt.logs import logger
class File:
"""A general util for file operations."""
@classmethod
async def write(cls, root_path: Path, filename: str, content: bytes) -> Path:
"""Write the file content to the local specified path.
Args:
root_path: The root path of file, such as "/data".
filename: The name of file, such as "test.txt".
content: The binary content of file.
Returns:
The full filename of file, such as "/data/test.txt".
Raises:
Exception: If an unexpected error occurs during the file writing process.
"""
try:
root_path.mkdir(parents=True, exist_ok=True)
full_path = root_path / filename
async with aiofiles.open(full_path, mode="wb") as writer:
await writer.write(content)
logger.info(f"Successfully wrote file: {full_path}")
return full_path
except Exception as e:
logger.error(f"Error writing file: {e}")
raise e
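`File.write` above follows the standard create-parents-then-write pattern. A minimal synchronous sketch of the same steps with `pathlib` (the paths here are illustrative, and `write_file` is a hypothetical stand-in for the async method):

```python
from pathlib import Path
import tempfile

def write_file(root_path: Path, filename: str, content: bytes) -> Path:
    """Create parent directories if needed, then write the bytes and return the full path."""
    root_path.mkdir(parents=True, exist_ok=True)  # same first step as File.write
    full_path = root_path / filename
    full_path.write_bytes(content)
    return full_path

root = Path(tempfile.mkdtemp()) / "data"
saved = write_file(root, "test.txt", b"Hello World!")
print(saved.read_bytes())  # b'Hello World!'
```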

metagpt/utils/index.html Normal file

File diff suppressed because one or more lines are too long


@ -0,0 +1,34 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/13 12:29
@Author : femto Zheng
@File : make_sk_kernel.py
"""
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion import (
AzureChatCompletion,
)
from semantic_kernel.connectors.ai.open_ai.services.open_ai_chat_completion import (
OpenAIChatCompletion,
)
from metagpt.config import CONFIG
def make_sk_kernel():
kernel = sk.Kernel()
if CONFIG.openai_api_type == "azure":
kernel.add_chat_service(
"chat_completion",
AzureChatCompletion(CONFIG.deployment_name, CONFIG.openai_api_base, CONFIG.openai_api_key),
)
else:
kernel.add_chat_service(
"chat_completion",
OpenAIChatCompletion(
CONFIG.openai_api_model, CONFIG.openai_api_key, org_id=None, endpoint=CONFIG.openai_api_base
),
)
return kernel


@ -2,9 +2,10 @@
# -*- coding: utf-8 -*-
"""
@Time : 2023/7/4 10:53
@Author : alexanderwu
@Author : alexanderwu alitrack
@File : mermaid.py
"""
import asyncio
import subprocess
from pathlib import Path
@ -12,48 +13,76 @@ from metagpt.config import CONFIG
from metagpt.const import PROJECT_ROOT
from metagpt.logs import logger
from metagpt.utils.common import check_cmd_exists
import os
import sys
def mermaid_to_file(mermaid_code, output_file_without_suffix, width=2048, height=2048) -> int:
async def mermaid_to_file(mermaid_code, output_file_without_suffix, width=2048, height=2048) -> int:
"""suffix: png/svg/pdf
:param mermaid_code: mermaid code
:param output_file_without_suffix: output filename
:param width:
:param height:
:return: 0 if succed, -1 if failed
:return: 0 if succeed, -1 if failed
"""
# Write the Mermaid code to a temporary file
dir_name = os.path.dirname(output_file_without_suffix)
if dir_name and not os.path.exists(dir_name):
os.makedirs(dir_name)
tmp = Path(f"{output_file_without_suffix}.mmd")
tmp.write_text(mermaid_code, encoding="utf-8")
engine = CONFIG.mermaid_engine.lower()
if engine == "nodejs":
if check_cmd_exists("mmdc") != 0:
logger.warning("RUN `npm install -g @mermaid-js/mermaid-cli` to install mmdc")
return -1
for suffix in ["pdf", "svg", "png"]:
output_file = f"{output_file_without_suffix}.{suffix}"
# Call the `mmdc` command to convert the Mermaid code to a PNG
logger.info(f"Generating {output_file}..")
if check_cmd_exists("mmdc") != 0:
logger.warning("RUN `npm install -g @mermaid-js/mermaid-cli` to install mmdc")
return -1
for suffix in ["pdf", "svg", "png"]:
output_file = f"{output_file_without_suffix}.{suffix}"
# Call the `mmdc` command to convert the Mermaid code to a PNG
logger.info(f"Generating {output_file}..")
if CONFIG.puppeteer_config:
subprocess.run(
[
CONFIG.mmdc,
"-p",
CONFIG.puppeteer_config,
"-i",
str(tmp),
"-o",
output_file,
"-w",
str(width),
"-H",
str(height),
]
if CONFIG.puppeteer_config:
commands = [
CONFIG.mmdc,
"-p",
CONFIG.puppeteer_config,
"-i",
str(tmp),
"-o",
output_file,
"-w",
str(width),
"-H",
str(height),
]
else:
commands = [CONFIG.mmdc, "-i", str(tmp), "-o", output_file, "-w", str(width), "-H", str(height)]
process = await asyncio.create_subprocess_exec(
*commands,
stdout=asyncio.subprocess.PIPE,
stderr=asyncio.subprocess.PIPE
)
stdout, stderr = await process.communicate()
if stdout:
logger.info(stdout.decode())
if stderr:
logger.error(stderr.decode())
else:
if engine == 'playwright':
from metagpt.utils.mmdc_playwright import mermaid_to_file
return await mermaid_to_file(mermaid_code, output_file_without_suffix, width, height)
elif engine == 'pyppeteer':
from metagpt.utils.mmdc_pyppeteer import mermaid_to_file
return await mermaid_to_file(mermaid_code, output_file_without_suffix, width, height)
elif engine == 'ink':
from metagpt.utils.mmdc_ink import mermaid_to_file
return await mermaid_to_file(mermaid_code, output_file_without_suffix)
else:
subprocess.run([CONFIG.mmdc, "-i", str(tmp), "-o", output_file, "-w", str(width), "-H", str(height)])
logger.warning(f"Unsupported mermaid engine: {engine}")
return 0
@ -108,7 +137,9 @@ MMC2 = """sequenceDiagram
SE-->>M: return summary"""
if __name__ == "__main__":
# logger.info(print_members(print_members))
mermaid_to_file(MMC1, PROJECT_ROOT / "tmp/1.png")
mermaid_to_file(MMC2, PROJECT_ROOT / "tmp/2.png")
loop = asyncio.new_event_loop()
result = loop.run_until_complete(mermaid_to_file(MMC1, PROJECT_ROOT / f"{CONFIG.mermaid_engine}/1"))
result = loop.run_until_complete(mermaid_to_file(MMC2, PROJECT_ROOT / f"{CONFIG.mermaid_engine}/2"))
loop.close()
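The nodejs branch above shells out through `asyncio.create_subprocess_exec` and logs the captured stdout/stderr. The pattern in isolation, using `sys.executable` in place of `mmdc` (which may not be installed):

```python
import asyncio
import sys

async def run_command(*commands) -> tuple:
    """Run a command without blocking the event loop, mirroring the mmdc call above."""
    process = await asyncio.create_subprocess_exec(
        *commands,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    # communicate() waits for exit and drains both pipes
    stdout, stderr = await process.communicate()
    return stdout.decode(), stderr.decode()

out, err = asyncio.run(run_command(sys.executable, "-c", "print('mermaid ok')"))
print(out.strip())  # mermaid ok
```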

metagpt/utils/mmdc_ink.py Normal file

@ -0,0 +1,41 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/4 16:12
@Author : alitrack
@File : mmdc_ink.py
"""
import base64
import os
from aiohttp import ClientSession, ClientError
from metagpt.logs import logger
async def mermaid_to_file(mermaid_code, output_file_without_suffix):
"""suffix: png/svg
:param mermaid_code: mermaid code
:param output_file_without_suffix: output filename without suffix
:return: 0 if succeed, -1 if failed
"""
encoded_string = base64.b64encode(mermaid_code.encode()).decode()
for suffix in ["svg", "png"]:
output_file = f"{output_file_without_suffix}.{suffix}"
path_type = "svg" if suffix == "svg" else "img"
url = f"https://mermaid.ink/{path_type}/{encoded_string}"
async with ClientSession() as session:
try:
async with session.get(url) as response:
if response.status == 200:
text = await response.content.read()
with open(output_file, 'wb') as f:
f.write(text)
logger.info(f"Generating {output_file}..")
else:
logger.error(f"Failed to generate {output_file}")
return -1
except ClientError as e:
logger.error(f"network error: {e}")
return -1
return 0
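The mermaid.ink service used above takes the diagram source base64-encoded directly in the URL path (`svg` for SVG output, `img` for PNG). A sketch of just the URL construction, with no network call:

```python
import base64

def mermaid_ink_url(mermaid_code: str, suffix: str = "svg") -> str:
    """Build the mermaid.ink render URL: the diagram source is
    base64-encoded straight into the path ("svg" or "img" for png)."""
    encoded = base64.b64encode(mermaid_code.encode()).decode()
    path_type = "svg" if suffix == "svg" else "img"
    return f"https://mermaid.ink/{path_type}/{encoded}"

url = mermaid_ink_url("graph TD; A-->B", "png")
print(url)
```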


@ -0,0 +1,111 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/4 16:12
@Author : Steven Lee
@File : mmdc_playwright.py
"""
import os
from urllib.parse import urljoin
from playwright.async_api import async_playwright
from metagpt.logs import logger
async def mermaid_to_file(mermaid_code, output_file_without_suffix, width=2048, height=2048) -> int:
"""
Converts the given Mermaid code to various output formats and saves them to files.
Args:
mermaid_code (str): The Mermaid code to convert.
output_file_without_suffix (str): The output file name without the file extension.
width (int, optional): The width of the output image in pixels. Defaults to 2048.
height (int, optional): The height of the output image in pixels. Defaults to 2048.
Returns:
int: Returns 0 if the conversion and saving were successful, -1 otherwise.
"""
suffixes = ['png', 'svg', 'pdf']
__dirname = os.path.dirname(os.path.abspath(__file__))
async with async_playwright() as p:
browser = await p.chromium.launch()
device_scale_factor = 1.0
context = await browser.new_context(
viewport={'width': width, 'height': height},
device_scale_factor=device_scale_factor,
)
page = await context.new_page()
async def console_message(msg):
logger.info(msg.text)
page.on('console', console_message)
try:
await page.set_viewport_size({'width': width, 'height': height})
mermaid_html_path = os.path.abspath(
os.path.join(__dirname, 'index.html'))
mermaid_html_url = urljoin('file:', mermaid_html_path)
await page.goto(mermaid_html_url)
await page.wait_for_load_state("networkidle")
await page.wait_for_selector("div#container", state="attached")
mermaid_config = {}
background_color = "#ffffff"
my_css = ""
await page.evaluate(f'document.body.style.background = "{background_color}";')
metadata = await page.evaluate('''async ([definition, mermaidConfig, myCSS, backgroundColor]) => {
const { mermaid, zenuml } = globalThis;
await mermaid.registerExternalDiagrams([zenuml]);
mermaid.initialize({ startOnLoad: false, ...mermaidConfig });
const { svg } = await mermaid.render('my-svg', definition, document.getElementById('container'));
document.getElementById('container').innerHTML = svg;
const svgElement = document.querySelector('svg');
svgElement.style.backgroundColor = backgroundColor;
if (myCSS) {
const style = document.createElementNS('http://www.w3.org/2000/svg', 'style');
style.appendChild(document.createTextNode(myCSS));
svgElement.appendChild(style);
}
}''', [mermaid_code, mermaid_config, my_css, background_color])
if 'svg' in suffixes:
svg_xml = await page.evaluate('''() => {
const svg = document.querySelector('svg');
const xmlSerializer = new XMLSerializer();
return xmlSerializer.serializeToString(svg);
}''')
logger.info(f"Generating {output_file_without_suffix}.svg..")
with open(f'{output_file_without_suffix}.svg', 'wb') as f:
f.write(svg_xml.encode('utf-8'))
if 'png' in suffixes:
clip = await page.evaluate('''() => {
const svg = document.querySelector('svg');
const rect = svg.getBoundingClientRect();
return {
x: Math.floor(rect.left),
y: Math.floor(rect.top),
width: Math.ceil(rect.width),
height: Math.ceil(rect.height)
};
}''')
await page.set_viewport_size({'width': clip['x'] + clip['width'], 'height': clip['y'] + clip['height']})
screenshot = await page.screenshot(clip=clip, omit_background=True, scale='device')
logger.info(f"Generating {output_file_without_suffix}.png..")
with open(f'{output_file_without_suffix}.png', 'wb') as f:
f.write(screenshot)
if 'pdf' in suffixes:
pdf_data = await page.pdf(scale=device_scale_factor)
logger.info(f"Generating {output_file_without_suffix}.pdf..")
with open(f'{output_file_without_suffix}.pdf', 'wb') as f:
f.write(pdf_data)
return 0
except Exception as e:
logger.error(e)
return -1
finally:
await browser.close()


@ -0,0 +1,113 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/4 16:12
@Author : alitrack
@File : mmdc_pyppeteer.py
"""
import os
from urllib.parse import urljoin
from pyppeteer import launch
from metagpt.logs import logger
from metagpt.config import CONFIG
async def mermaid_to_file(mermaid_code, output_file_without_suffix, width=2048, height=2048) -> int:
"""
Converts the given Mermaid code to various output formats and saves them to files.
Args:
mermaid_code (str): The Mermaid code to convert.
output_file_without_suffix (str): The output file name without the file extension.
width (int, optional): The width of the output image in pixels. Defaults to 2048.
height (int, optional): The height of the output image in pixels. Defaults to 2048.
Returns:
int: Returns 0 if the conversion and saving were successful, -1 otherwise.
"""
suffixes = ['png', 'svg', 'pdf']
__dirname = os.path.dirname(os.path.abspath(__file__))
if CONFIG.pyppeteer_executable_path:
browser = await launch(headless=True,
executablePath=CONFIG.pyppeteer_executable_path,
args=["--disable-extensions", "--no-sandbox"]
)
else:
logger.error("Please set the environment variable: PYPPETEER_EXECUTABLE_PATH.")
return -1
page = await browser.newPage()
device_scale_factor = 1.0
async def console_message(msg):
logger.info(msg.text)
page.on('console', console_message)
try:
await page.setViewport(viewport={'width': width, 'height': height, 'deviceScaleFactor': device_scale_factor})
mermaid_html_path = os.path.abspath(
os.path.join(__dirname, 'index.html'))
mermaid_html_url = urljoin('file:', mermaid_html_path)
await page.goto(mermaid_html_url)
await page.querySelector("div#container")
mermaid_config = {}
background_color = "#ffffff"
my_css = ""
await page.evaluate(f'document.body.style.background = "{background_color}";')
metadata = await page.evaluate('''async ([definition, mermaidConfig, myCSS, backgroundColor]) => {
const { mermaid, zenuml } = globalThis;
await mermaid.registerExternalDiagrams([zenuml]);
mermaid.initialize({ startOnLoad: false, ...mermaidConfig });
const { svg } = await mermaid.render('my-svg', definition, document.getElementById('container'));
document.getElementById('container').innerHTML = svg;
const svgElement = document.querySelector('svg');
svgElement.style.backgroundColor = backgroundColor;
if (myCSS) {
const style = document.createElementNS('http://www.w3.org/2000/svg', 'style');
style.appendChild(document.createTextNode(myCSS));
svgElement.appendChild(style);
}
}''', [mermaid_code, mermaid_config, my_css, background_color])
if 'svg' in suffixes:
svg_xml = await page.evaluate('''() => {
const svg = document.querySelector('svg');
const xmlSerializer = new XMLSerializer();
return xmlSerializer.serializeToString(svg);
}''')
logger.info(f"Generating {output_file_without_suffix}.svg..")
with open(f'{output_file_without_suffix}.svg', 'wb') as f:
f.write(svg_xml.encode('utf-8'))
if 'png' in suffixes:
clip = await page.evaluate('''() => {
const svg = document.querySelector('svg');
const rect = svg.getBoundingClientRect();
return {
x: Math.floor(rect.left),
y: Math.floor(rect.top),
width: Math.ceil(rect.width),
height: Math.ceil(rect.height)
};
}''')
await page.setViewport({'width': clip['x'] + clip['width'], 'height': clip['y'] + clip['height'], 'deviceScaleFactor': device_scale_factor})
screenshot = await page.screenshot({'clip': clip, 'omitBackground': True})
logger.info(f"Generating {output_file_without_suffix}.png..")
with open(f'{output_file_without_suffix}.png', 'wb') as f:
f.write(screenshot)
if 'pdf' in suffixes:
pdf_data = await page.pdf(scale=device_scale_factor)
logger.info(f"Generating {output_file_without_suffix}.pdf..")
with open(f'{output_file_without_suffix}.pdf', 'wb') as f:
f.write(pdf_data)
return 0
except Exception as e:
logger.error(e)
return -1
finally:
await browser.close()


@ -47,6 +47,7 @@ setup(
"selenium": ["selenium>4", "webdriver_manager", "beautifulsoup4"],
"search-google": ["google-api-python-client==2.94.0"],
"search-ddg": ["duckduckgo-search==3.8.5"],
"pyppeteer": ["pyppeteer>=1.0.2"],
},
cmdclass={
"install_mermaid": InstallMermaidCLI,


@ -1,11 +1,16 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import asyncio
import platform
import fire
from metagpt.roles import Architect, Engineer, ProductManager
from metagpt.roles import ProjectManager, QaEngineer
from metagpt.roles import (
Architect,
Engineer,
ProductManager,
ProjectManager,
QaEngineer,
)
from metagpt.software_company import SoftwareCompany
@ -15,15 +20,17 @@ async def startup(
n_round: int = 5,
code_review: bool = False,
run_tests: bool = False,
implement: bool = True
implement: bool = True,
):
"""Run a startup. Be a boss."""
company = SoftwareCompany()
company.hire([
ProductManager(),
Architect(),
ProjectManager(),
])
company.hire(
[
ProductManager(),
Architect(),
ProjectManager(),
]
)
# if implement or code_review
if implement or code_review:
@ -46,7 +53,7 @@ def main(
n_round: int = 5,
code_review: bool = True,
run_tests: bool = False,
implement: bool = True
implement: bool = True,
):
"""
We are a software startup comprised of AI. By investing in us,
@ -58,12 +65,8 @@ def main(
:param code_review: Whether to use code review.
:return:
"""
if platform.system() == "Windows":
asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
asyncio.run(startup(idea, investment, n_round,
code_review, run_tests, implement))
asyncio.run(startup(idea, investment, n_round, code_review, run_tests, implement))
if __name__ == '__main__':
if __name__ == "__main__":
fire.Fire(main)


@ -0,0 +1,40 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/6 21:41:34
@Author : Stitch-z
@File : test_write_tutorial.py
"""
from typing import Dict
import pytest
from metagpt.actions.write_tutorial import WriteDirectory, WriteContent
@pytest.mark.asyncio
@pytest.mark.parametrize(
("language", "topic"),
[("English", "Write a tutorial about Python")]
)
async def test_write_directory(language: str, topic: str):
ret = await WriteDirectory(language=language).run(topic=topic)
assert isinstance(ret, dict)
assert "title" in ret
assert "directory" in ret
assert isinstance(ret["directory"], list)
assert len(ret["directory"])
assert isinstance(ret["directory"][0], dict)
@pytest.mark.asyncio
@pytest.mark.parametrize(
("language", "topic", "directory"),
[("English", "Write a tutorial about Python", {"Introduction": ["What is Python?", "Why learn Python?"]})]
)
async def test_write_content(language: str, topic: str, directory: Dict):
ret = await WriteContent(language=language, directory=directory).run(topic=topic)
assert isinstance(ret, str)
assert list(directory.keys())[0] in ret
for value in list(directory.values())[0]:
assert value in ret


@ -0,0 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/16 20:03
@Author : femto Zheng
@File : __init__.py
"""


@ -0,0 +1,29 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/16 20:03
@Author : femto Zheng
@File : test_basic_planner.py
"""
import pytest
from semantic_kernel.core_skills import FileIOSkill, MathSkill, TextSkill, TimeSkill
from semantic_kernel.planning.action_planner.action_planner import ActionPlanner
from metagpt.actions import BossRequirement
from metagpt.roles.sk_agent import SkAgent
from metagpt.schema import Message
@pytest.mark.asyncio
async def test_action_planner():
role = SkAgent(planner_cls=ActionPlanner)
# let's give the agent 4 skills
role.import_skill(MathSkill(), "math")
role.import_skill(FileIOSkill(), "fileIO")
role.import_skill(TimeSkill(), "time")
role.import_skill(TextSkill(), "text")
task = "What is the sum of 110 and 990?"
role.recv(Message(content=task, cause_by=BossRequirement))
await role._think() # it will choose mathskill.Add
assert "1100" == (await role._act()).content


@ -0,0 +1,34 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time : 2023/9/16 20:03
@Author : femto Zheng
@File : test_basic_planner.py
"""
import pytest
from semantic_kernel.core_skills import TextSkill
from metagpt.actions import BossRequirement
from metagpt.const import SKILL_DIRECTORY
from metagpt.roles.sk_agent import SkAgent
from metagpt.schema import Message
@pytest.mark.asyncio
async def test_basic_planner():
task = """
Tomorrow is Valentine's day. I need to come up with a few date ideas. She speaks French so write it in French.
Convert the text to uppercase"""
role = SkAgent()
# let's give the agent some skills
role.import_semantic_skill_from_directory(SKILL_DIRECTORY, "SummarizeSkill")
role.import_semantic_skill_from_directory(SKILL_DIRECTORY, "WriterSkill")
role.import_skill(TextSkill(), "TextSkill")
# using BasicPlanner
role.recv(Message(content=task, cause_by=BossRequirement))
await role._think()
# assuming sk_agent will think he needs WriterSkill.Brainstorm and WriterSkill.Translate
assert "WriterSkill.Brainstorm" in role.plan.generated_plan.result
assert "WriterSkill.Translate" in role.plan.generated_plan.result
# assert "SALUT" in (await role._act()).content #content will be some French


@ -0,0 +1,27 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/6 23:11:27
@Author : Stitch-z
@File : test_tutorial_assistant.py
"""
import aiofiles
import pytest
from metagpt.roles.tutorial_assistant import TutorialAssistant
@pytest.mark.asyncio
@pytest.mark.parametrize(
("language", "topic"),
[("Chinese", "Write a tutorial about Python")]
)
async def test_tutorial_assistant(language: str, topic: str):
topic = "Write a tutorial about MySQL"
role = TutorialAssistant(language=language)
msg = await role.run(topic)
filename = msg.content
title = filename.split("/")[-1].split(".")[0]
async with aiofiles.open(filename, mode="r") as reader:
content = await reader.read()
assert content.startswith(f"# {title}")


@ -0,0 +1,27 @@
#!/usr/bin/env python3
# _*_ coding: utf-8 _*_
"""
@Time : 2023/9/4 15:40:40
@Author : Stitch-z
@File : test_file.py
"""
from pathlib import Path
import aiofiles
import pytest
from metagpt.utils.file import File
@pytest.mark.asyncio
@pytest.mark.parametrize(
("root_path", "filename", "content"),
[(Path("/code/MetaGPT/data/tutorial_docx/2023-09-07_17-05-20"), "test.md", "Hello World!")]
)
async def test_write_file(root_path: Path, filename: str, content: str):
full_file_name = await File.write(root_path=root_path, filename=filename, content=content.encode('utf-8'))
assert isinstance(full_file_name, Path)
assert root_path / filename == full_file_name
async with aiofiles.open(full_file_name, mode="r") as reader:
body = await reader.read()
assert body == content