I've got A0T tokens staked, and the agent-zero API Dashboard says that I have my daily free AI inference credit allowance, currently worth ~$0.70.
I have configured my running agent-zero docker container as follows:
- added the API key to all the relevant places in A0 > dashboard > Settings
- set the provider to Venice.ai
- chose the right model (official model name copied from the venice.ai website, as recommended)
but I'm getting an Authentication error both in the A0 chat and in the Docker logs (the detailed error is below).
Then I curl'ed the venice.ai API directly with:
curl https://api.venice.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $A0_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "venice-uncensored",
    "messages": [{"role": "user", "content": "Hello World!"}]
  }'
and got:
{"error":"Authentication failed"}
Note: The A0_API_KEY is properly exported in the current terminal/bash session.
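To take shell quoting and env-var expansion out of the equation, the same request can also be made from Python. A minimal sketch, assuming the official openai client and the same OpenAI-compatible Venice endpoint as in the curl command above:

```python
# Minimal sketch: same request as the curl test above, via the openai SDK,
# so shell quoting / env-var expansion is ruled out. Assumes `pip install openai`.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.venice.ai/api/v1",  # same endpoint as the curl test
    api_key=os.environ["A0_API_KEY"],         # or paste the key literally to be sure
)

resp = client.chat.completions.create(
    model="venice-uncensored",
    messages=[{"role": "user", "content": "Hello World!"}],
)
print(resp.choices[0].message.content)
```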
I also tested A0 (and curl) with a token generated directly on the Venice.ai site, and got:
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - Error code: 402 - {'error': 'Insufficient USD or Diem balance to complete request'}
This means that both the Venice.ai auth service and the API work fine, and that the Venice.ai API integration in the latest version of A0 (Docker image) is correct.
Therefore it must be the token itself that is somehow invalid.
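For reference, the traceback below shows A0 going through litellm's acompletion, so the call can also be reproduced outside A0 with litellm directly. A minimal sketch, assuming litellm's generic OpenAI-compatible ("openai/") route; the exact provider prefix and settings A0 uses internally may differ:

```python
# Minimal sketch, assuming litellm's generic OpenAI-compatible route;
# the provider prefix/config A0 uses internally may differ.
import os

from litellm import completion

resp = completion(
    model="openai/venice-uncensored",         # "openai/" = OpenAI-compatible passthrough
    api_base="https://api.venice.ai/api/v1",  # same endpoint as the curl test
    api_key=os.environ["A0_API_KEY"],         # swap in the Venice-generated key to compare
    messages=[{"role": "user", "content": "Hello World!"}],
)
print(resp.choices[0].message.content)
```

Swapping the sk-a0- key for the Venice-generated one in either snippet should reproduce the same 401-vs-402 split described above, which would confirm it is the key itself that Venice rejects.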
The token generated on the A0 Dashboard looks like:
sk-a0-5fVAVOznKSaacfInoQqIRRbNDIfMs99AhH
whereas the one generated on the Venice.ai site looks like this:
VENICE-INFERENCE-KEY-uPOVhX2mQmrtg9J0SxGS7OWQjOJjwIbjO7EyMMwuxb
What's going on?
Best Regards
P.S. The detailed A0/Docker Auth error:
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': 'Authentication failed'}
Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 991, in async_streaming
    headers, response = await self.make_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 190, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 454, in make_openai_chat_completion_request
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 436, in make_openai_chat_completion_request
    await openai_aclient.chat.completions.with_raw_response.create(
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_legacy_response.py", line 381, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2589, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_base_client.py", line 1794, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_base_client.py", line 1594, in request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': 'Authentication failed'}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/main.py", line 598, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 1041, in async_streaming
    raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 401 - {'error': 'Authentication failed'}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/a0/agent.py", line 454, in monologue
    agent_response, _reasoning = await self.call_chat_model(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/agent.py", line 808, in call_chat_model
    response, reasoning = await model.unified_call(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/models.py", line 502, in unified_call
    _completion = await acompletion(
                  ^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/utils.py", line 1638, in wrapper_async
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/utils.py", line 1484, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/main.py", line 617, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2323, in exception_type
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 477, in exception_type
    raise AuthenticationError(