Commit graph

255 commits

Author SHA1 Message Date
better629
4e13eaca6e update zhipu api due to new model and api; repair extra invalid generated output; update its unittest 2024-01-17 16:28:13 +08:00
better629
17479a2360 fix system_prompt param that the llm does not support (issue 725) 2024-01-10 11:26:23 +08:00
shenchucheng
9ce0182fab Log newline character after receiving llm stream response 2024-01-05 16:50:03 +08:00
better629
1d35cab9d7 rm useless code and increase UT ratio 2024-01-02 21:21:10 +08:00
geekan
786f862a8b fix azure 2024-01-02 15:16:59 +08:00
better629
d5d20d8869 fix format 2024-01-02 11:53:32 +08:00
better629
81334b733d fix issue 654 and re-add system_msg judgement 2024-01-02 11:42:43 +08:00
better629
539e1c7dce Merge branch 'dev' into dev 2023-12-29 02:48:39 +08:00
better629
0f047e5693 update provider unittests to update coverage rate 2023-12-29 02:39:00 +08:00
geekan
4e32ee120c fix tests 2023-12-28 18:06:02 +08:00
geekan
fe697ac095 fix openai 2023-12-28 17:42:28 +08:00
better629
255e2d3fa7 update provider uniform name and check tests 2023-12-28 17:18:18 +08:00
better629
1f9234eee8 fix client_kwargs due to previous PR delete sync client 2023-12-28 09:34:51 +08:00
geekan
5c7cdf5ee7 merge main 2023-12-26 22:17:15 +08:00
geekan
0435b1321f refine code 2023-12-26 17:54:52 +08:00
geekan
ba8bf01870 remove code 2023-12-26 16:39:19 +08:00
geekan
4007fc87d6 remove sync api in openai 2023-12-26 16:33:15 +08:00
geekan
bb1b9823d0 remove sync api in openai 2023-12-26 15:59:11 +08:00
geekan
e15de55368 refactor openai api and brain memory 2023-12-26 15:09:37 +08:00
geekan
8351c8ec35 remove generator para in acompletion_text 2023-12-26 14:31:26 +08:00
shenchucheng
b766550a4f update log_llm_stream in log_llm_stream.py/ollama_api.py 2023-12-26 01:05:47 +08:00
shenchucheng
0eef8a8607 add llm stream log 2023-12-26 01:05:44 +08:00
xiaofenggang
a87b5056d7 [Bugfix] Set openai proxy for class ZhiPuAPTAPI
When using ZHIPUAI as the large model provider, ZHIPUAI cannot be reached from behind an HTTP proxy, and the following error is reported:
openai.error.APIConnectionError: Error communicating with OpenAI

So we need to set a proxy for class ZhiPuAPTAPI.
2023-12-25 16:15:52 +00:00
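The fix described in this commit amounts to forwarding the process's HTTP(S) proxy settings to the underlying openai module (the legacy, pre-1.0 SDK honours a module-level `openai.proxy` setting). A minimal sketch of that lookup, using illustrative environment-variable names rather than MetaGPT's actual config keys:

```python
import os


def proxy_settings(env=None):
    """Build a requests-style proxies mapping from the environment.

    The legacy openai SDK (< 1.0) accepts such a mapping via the
    module-level ``openai.proxy`` attribute; a provider class like the
    one patched in this commit could assign it during initialisation.
    """
    env = os.environ if env is None else env
    proxy = env.get("HTTPS_PROXY") or env.get("HTTP_PROXY")
    # Return an empty mapping when no proxy is configured, so callers
    # can skip the assignment entirely in direct-connection environments.
    return {"http": proxy, "https": proxy} if proxy else {}


# Hypothetical use inside a provider's __init__ (not the actual commit code):
#   proxies = proxy_settings()
#   if proxies:
#       openai.proxy = proxies
```

Without such an assignment, the SDK opens a direct connection, which is what raised the `APIConnectionError` in proxied environments.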
莘权 马
ef1bc01c99 Merge branch 'dev' of https://github.com/geekan/MetaGPT into geekan/dev 2023-12-25 22:39:17 +08:00
莘权 马
0fdb552468 fixbug: fix the general-purpose agent Role and its related TalkAction and SkillAction 2023-12-25 22:39:03 +08:00
better629
454e6164fb update provider unittests 2023-12-25 18:00:51 +08:00
shenchucheng
b113aa246f update log_llm_stream in log_llm_stream.py/ollama_api.py 2023-12-25 17:22:30 +08:00
geekan
3b9e20d056 Merge pull request #616 from better629/new_dev_updatellm (update fireworks/open_llm api due to new openai sdk) 2023-12-24 15:37:47 +08:00
better629
7e0c62a7a9 update fireworks/open_llm api due to new openai sdk 2023-12-24 15:34:32 +08:00
莘权 马
ad639b9906 feat: merge geekan:dev 2023-12-24 12:49:08 +08:00
莘权 马
f441c88156 fixbug: timeout & prompt_format 2023-12-24 12:30:08 +08:00
geekan
6c278bcfd6 fix main process 2023-12-24 11:46:05 +08:00
geekan
6465b2eaa9 fix pep8 2023-12-24 10:44:33 +08:00
shenchucheng
b4552938e6 add llm stream log 2023-12-23 22:45:20 +08:00
better629
bd119de2c1 format general_api_requestor params type 2023-12-23 19:48:13 +08:00
geekan
2502dd3651 fix conflict 2023-12-23 19:48:01 +08:00
莘权 马
a90f52d4b6 fixbug: Fix the confusion caused by the merging of _client, client, and async_client in openai_api.py; split Azure LLM and MetaGPT LLM off from OpenAI LLM to reduce the number of variables defined in the Config class, for compatibility. 2023-12-23 18:07:42 +08:00
莘权 马
5d97a20e08 fixbug: OpenAIGPTAPI:_achat_completion_stream 2023-12-22 17:43:59 +08:00
莘权 马
9a1909bb95 feat: merge geekan:main 2023-12-22 16:40:04 +08:00
better629
40d3cc5f81 format general_api_requestor params type 2023-12-22 09:51:26 +08:00
better629
4b0cb0084a add ollama support 2023-12-22 02:20:43 +08:00
geekan
4bf1844022 Merge pull request #596 from orange-crow/update-Message-instance (Message(msg) -> Message(content=msg)) 2023-12-21 15:13:31 +08:00
geekan
a4843cd974 Merge pull request #595 from better629/feat_gemini (Feat gemini) 2023-12-21 15:10:40 +08:00
better629
6af9fecf65 fix format 2023-12-21 15:06:59 +08:00
geekan
d46b7c4018 fix moderation, remove claude from LLM, refine exceptions handler 2023-12-21 14:45:53 +08:00
better629
9bd900452c fix conflicts 2023-12-21 14:29:01 +08:00
better629
bdb427d5b7 add gemini minimal python version warning 2023-12-21 14:18:50 +08:00
刘棒棒
44e648eabf Message(msg) -> Message(content=msg) 2023-12-21 14:17:05 +08:00
better629
f3eb9f638e add other llm for LLMProviderRegistry 2023-12-21 12:55:45 +08:00
seehi
163da9a2e7 format code 2023-12-21 12:44:43 +08:00