Recently announced by Andrew Ng, aisuite aims to provide an OpenAI-like API around the most popular large language models (LLMs) currently available, making it easy for developers to try them out, compare results, or switch from one LLM to another without having to change their code.
According to Andrew Ng, using multiple LLM providers in the same application can be a hassle, whereas aisuite aims to make it as easy as changing a single string when instantiating its main component to select the desired LLM provider. For example, to use OpenAI's GPT-4o, you pass “openai:gpt-4o” as the model argument in the call that creates an aisuite chat completion, as shown in the following code snippet:
import aisuite as ai

client = ai.Client()

# The same conversation is sent to each provider below.
messages = [
    {"role": "system", "content": "Respond in Pirate English."},
    {"role": "user", "content": "Tell me a joke."},
]

# The "provider:model" string selects both the provider and the model.
response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=messages,
    temperature=0.75
)
print(response.choices[0].message.content)

# Switching to Anthropic requires changing only the model string.
response = client.chat.completions.create(
    model="anthropic:claude-3-5-sonnet-20240620",
    messages=messages,
    temperature=0.75
)
print(response.choices[0].message.content)
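Since the provider is selected entirely through the model string, comparing several LLMs reduces to looping over those strings. The following is a minimal sketch reusing the client and messages from the snippet above:

# Compare responses across providers by varying only the model string.
for model in ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20240620"]:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
    print(f"--- {model} ---")
    print(response.choices[0].message.content)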
To install aisuite, you just run pip install aisuite. The library also provides shortcuts to install LLM provider libraries; for example, running pip install 'aisuite[anthropic]' installs the base library plus Anthropic support.
Several X users replied to Andrew Ng's announcement echoing the feeling that aisuite addresses real pain points when deploying LLMs. Reddit users compared proxy libraries such as aisuite to database abstraction layers, which make it possible to switch from, e.g., SQLite in testing to another database in production.
While the overall reception was generally positive, some X and Reddit users highlighted a few limitations of aisuite, including the fact that it does not yet support streaming, nor finer points such as rate limits, token usage monitoring, and so on. Likewise, it is unclear how well aisuite currently supports custom cloud-deployed LLMs. That said, it is worth remembering that the library is still in its infancy and under active development.
aisuite is not the only solution currently available to address LLM cross-compatibility. In particular, LiteLLM appears to be a more mature and feature-complete solution that enables calling multiple LLMs through the same OpenAI-like API, including support for rate and budget limits on a per-project basis. Also worth mentioning is OpenRouter, which additionally provides its own web-based UI.
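For comparison, here is a minimal sketch of the equivalent calls using LiteLLM, whose completion function also encodes the provider in the model string (following LiteLLM's provider/model naming convention; exact model identifiers may vary):

from litellm import completion

messages = [{"role": "user", "content": "Tell me a joke."}]

# LiteLLM mimics the OpenAI response format regardless of the provider.
response = completion(model="openai/gpt-4o", messages=messages)
print(response.choices[0].message.content)

response = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)
print(response.choices[0].message.content)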
aisuite currently supports OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, Hugging Face, and Ollama. The library is written in Python and requires developers to have API keys for any LLM providers they would like to use. To maximize stability, it uses either the API or the SDK published by each LLM provider. Currently, it is mostly focused on chat completions, but its maintainers say new use cases will be covered in the future.
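Since aisuite delegates to each provider's published API or SDK, the simplest way to supply API keys is typically through the environment variables those SDKs conventionally read. A minimal sketch, assuming the standard variable names used by the OpenAI and Anthropic SDKs:

import os
import aisuite as ai

# Assumption: each provider SDK reads its key from its conventional
# environment variable; replace the placeholders with real keys
# (or export the variables in your shell instead).
os.environ["OPENAI_API_KEY"] = "sk-..."         # placeholder
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # placeholder

client = ai.Client()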