It doesn't work, buddy #2

Open
bing0037 opened this issue Dec 8, 2023 · 1 comment
Labels
question Further information is requested

Comments


bing0037 commented Dec 8, 2023

Env setup:

pip install void-terminal

Code used:

import void_terminal as vt
# For more available configurations (including network proxy, api, using chatglm etc.),
# see config.py in the mother project:
# https://github.com/binary-husky/gpt_academic.git
vt.set_conf(key="API_KEY", value="sk-xxxxxxxxxxxxxx")   #key已经改成我的了。
vt.set_conf(key="LLM_MODEL", value="gpt-3.5-turbo")

chat_kwargs = vt.get_chat_default_kwargs()
chat_kwargs['inputs'] = 'Hello, world!'
result = vt.get_chat_handle()(**chat_kwargs)
print('\n*************\n' + result + '\n*************\n' )

Result: Error!

(gptac_venv) C:\Users\libin\Desktop\work\code\LLM\gpt_academic-master\void-terminal>python test_1.py
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
 gpt-3.5-turbo : 0 : Hello, world! ..........
Traceback (most recent call last):
  File "C:\Users\libin\Desktop\work\code\LLM\gpt_academic-master\void-terminal\test_1.py", line 10, in <module>
    result = vt.get_chat_handle()(**chat_kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\libin\anaconda3\envs\gptac_venv\Lib\site-packages\void_terminal\request_llms\bridge_all.py", line 597, in predict_no_ui_long_connection
    return method(inputs, llm_kwargs, history, sys_prompt, observe_window, console_slience)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\libin\anaconda3\envs\gptac_venv\Lib\site-packages\void_terminal\request_llms\bridge_chatgpt.py", line 115, in predict_no_ui_long_connection
    raise RuntimeError("OpenAI rejected the request:" + error_msg)
RuntimeError: OpenAI rejected the request:{"object":"error","message":"Unauthorized.1 bad forward key {sk-xxxxxxx}","code":40002}
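The "bad forward key" wording suggests the request is not reaching api.openai.com directly but some forwarding/relay endpoint, and it is that relay rejecting the key. A minimal sketch to check which endpoint void_terminal actually targets, assuming it exposes get_conf alongside set_conf and that API_URL_REDIRECT / proxy settings are the relevant keys inherited from gpt_academic's config.py:

import void_terminal as vt

# Assumption: void_terminal re-exports get_conf like set_conf; the key names
# below come from gpt_academic's config.py and may differ in your version.
for key in ("API_URL_REDIRECT", "USE_PROXY", "proxies"):
    try:
        print(key, "=", vt.get_conf(key))
    except Exception as e:
        print(key, "-> could not read:", e)

# If API_URL_REDIRECT points at a relay service, either clear it or use a key
# issued by that relay instead of an official OpenAI key.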

The key itself is confirmed to work:

import os
import openai
openai.api_key = "sk-xxxxxx"
# openai.organization = os.getenv("OPENAI_ORGANIZATION") 

response = openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
)
print(response['choices'][0]['message']['content'])

Result:

The 2020 World Series was played in Arlington, Texas at the Globe Life Field, which is the home stadium of the Texas Rangers.
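For reference, the verification snippet above uses the pre-1.0 openai interface; openai.ChatCompletion.create was removed in openai>=1.0, so on a newer install the same check would look roughly like this sketch:

from openai import OpenAI

client = OpenAI(api_key="sk-xxxxxx")  # same key as above

# Equivalent call with the openai>=1.0 client interface.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
    ],
)
print(response.choices[0].message.content)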
binary-husky (Owner) commented
I cannot reproduce this error.

{"object":"error","message":"Unauthorized.1 bad forward key {sk-xxxxxxx}","code":40002}
No matter how I look at it, this error message does not look like OpenAI's official style.
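For comparison, a request with a deliberately invalid key sent straight to api.openai.com comes back as a 401 with OpenAI's standard {"error": {...}} envelope, not the {"object":"error", ..., "code":40002} shape quoted above. A small sketch to confirm this, assuming api.openai.com is directly reachable:

import requests

# Hit the official endpoint with an obviously invalid key to see OpenAI's own
# error format (assumes direct access to api.openai.com, no relay in between).
resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer sk-invalid", "Content-Type": "application/json"},
    json={"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "ping"}]},
    timeout=30,
)
print(resp.status_code)  # typically 401
print(resp.json())       # {"error": {"message": ..., "type": ..., "code": "invalid_api_key"}}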

binary-husky reopened this Dec 9, 2023
binary-husky added the question label (Further information is requested) Dec 9, 2023