I got the error below when I tried to use an OpenAI API key with the OpenAI key constant, but it worked after I switched to Gemini.
Two questions:
I used the snippet below and it worked. Can you provide the same setup for other LLM models? I tried it with ChatGPT, but it did not work and I got the error below.
Also, I see in your YouTube video that you're using a GUI for this. Is that available to use? Thank you for the wonderful work; I'm doing a POC and this helped me a lot.
import os
from google import genai  # google-genai SDK (assumed import for this snippet)

client = genai.Client(
    api_key=os.getenv("GEMINI_API_KEY", ""),
)
model = os.getenv("GEMINI_MODEL", "gemini-2.5-pro-preview-03-25")
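For reference, a minimal OpenAI-based equivalent of the snippet above would look roughly like this (the OPENAI_API_KEY / OPENAI_MODEL names and the default model are my assumptions, not the repo's actual configuration):

import os
from openai import OpenAI  # openai>=1.x client

# Assumed env-var names and default model; adjust to whatever the repo expects.
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY", ""))
model = os.getenv("OPENAI_MODEL", "gpt-4o")

def call_llm(prompt: str) -> str:
    # Standard chat-completions call for the openai>=1.x SDK.
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content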
Below is the error info.
Traceback (most recent call last):
  File "/Users/vyom/Sites/codebase-documentation/main.py", line 97, in <module>
    main()
  File "/Users/vyom/Sites/codebase-documentation/main.py", line 94, in main
    tutorial_flow.run(shared)
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 16, in run
    return self._run(shared)
           ^^^^^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 50, in _run
    def _run(self,shared): p=self.prep(shared); o=self._orch(shared); return self.post(shared,p,o)
                                                  ^^^^^^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 48, in _orch
    while curr: curr.set_params(p); last_action=curr._run(shared); curr=copy.copy(self.get_next_node(curr,last_action))
                                                ^^^^^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 13, in _run
    def _run(self,shared): p=self.prep(shared); e=self._exec(p); return self.post(shared,p,e)
                                                  ^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 33, in _exec
    if self.cur_retry==self.max_retries-1: return self.exec_fallback(prep_res,e)
                                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 28, in exec_fallback
    def exec_fallback(self,prep_res,exc): raise exc
                                          ^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/pocketflow/__init__.py", line 31, in _exec
    try: return self.exec(prep_res)
                ^^^^^^^^^^^^^^^^^^^
  File "/Users/vyom/Sites/codebase-documentation/nodes.py", line 176, in exec
    response = call_llm(prompt, use_cache=(use_cache and self.cur_retry == 0)) # Use cache only if enabled and not retrying
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vyom/Sites/codebase-documentation/utils/call_llm.py", line 55, in call_llm
    client = OpenAI(api_key=api_key)
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/openai/_client.py", line 122, in __init__
    super().__init__(
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 825, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
                                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/vyom/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 723, in __init__
    super().__init__(**kwargs)
TypeError: Client.__init__() got an unexpected keyword argument 'proxies'
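From what I can tell, this 'proxies' TypeError usually comes from an older openai package installed alongside httpx >= 0.28, which removed the proxies keyword that older openai versions still pass to httpx.Client; upgrading openai (or pinning httpx below 0.28) normally makes it go away. A quick diagnostic sketch to check the installed versions (not part of the repo):

import openai
import httpx

# Roughly openai < 1.55.3 combined with httpx >= 0.28 is the pairing known to
# raise "unexpected keyword argument 'proxies'" when constructing the client.
print("openai:", openai.__version__)
print("httpx:", httpx.__version__)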