A single line of code to supercharge your LLM with integrations.
No complex JSON Schemas required.
Spend minutes, not days, adding integrations to your LLM. With only a few lines of code, your LLM can perform any action in any target app on behalf of the user.
2,200 out-of-the-box integrations. Connect with Google Calendar, Gmail, HubSpot, Salesforce, and more.
Manage authentication profiles so your LLM can perform actions on behalf of your users.
Function calling typically requires the foundation model to be fine-tuned, which can be expensive and can diminish the quality of your output.
Enable function calling with any LLM, even if it's not natively supported. With Astra, you can build a seamless layer of integrations and function execution on top of your LLM, extending its capabilities without altering its core structure.
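One way such a layer can work, sketched below with entirely hypothetical names (this is not Astra's actual implementation): instruct the model through its system prompt to emit tool calls as JSON, then parse that text into a structured call outside the model, so no fine-tuning or native tool support is needed.

```python
import json

# Hypothetical sketch of a function-calling layer for a model with no native
# tool support. None of these names come from Astra's API.

SYSTEM_PROMPT = (
    "When you need to use a tool, reply with only a JSON object of the form "
    '{"name": "<tool_name>", "arguments": {...}}.'
)

def parse_tool_call(model_reply: str):
    """Turn the model's free-text reply into a structured tool call, if any."""
    try:
        call = json.loads(model_reply)
    except json.JSONDecodeError:
        return None  # ordinary text answer, no tool requested
    if isinstance(call, dict) and "name" in call and "arguments" in call:
        return call["name"], call["arguments"]
    return None

# Simulated model reply (a real one would come from any chat completion API):
reply = '{"name": "create_event", "arguments": {"title": "Sync", "date": "2024-06-01"}}'
print(parse_tool_call(reply))
```

Because the layer only needs plain text out of the model, the same pattern works with any LLM that can follow a prompt.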
Leverage our simple user interface to manage integrations instead of complex, hard-to-maintain JSON Schemas.
✓ Automatically generate LLM-optimized field descriptions
✓ Always in sync with API changes
✓ Import tools directly into OpenAI or any LLM
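For context, this is roughly the shape of the JSON Schema a single tool requires in OpenAI's function-calling format. It is a hand-written illustration (`send_email` and its fields are made up, not from Astra); this is the kind of schema Astra generates and keeps in sync for you.

```python
# An illustrative, hand-written tool definition in OpenAI's function-calling
# format. Every field description must be written and maintained by hand.
send_email_tool = {
    "type": "function",
    "function": {
        "name": "send_email",
        "description": "Send an email on behalf of the user.",
        "parameters": {
            "type": "object",
            "properties": {
                "to": {"type": "string", "description": "Recipient address"},
                "subject": {"type": "string", "description": "Subject line"},
                "body": {"type": "string", "description": "Plain-text body"},
            },
            "required": ["to", "subject", "body"],
        },
    },
}
```

Multiply this by dozens of tools, each tracking a third-party API that changes over time, and the maintenance burden becomes clear.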
Only one extra line of code is required to import tools into your LLM. You can select tools by their names or tags.
from astra import Astra
from openai import OpenAI

client = Astra(auth_token = "******************")
llm = OpenAI()

tools = client.get_tools(
    tags = ["project_llm"]
)

assistant = llm.beta.assistants.create(
    ...
    tools=tools
)
Function calling (also known as tools or tool use) allows an LLM to interact with external tools or APIs to perform real-world tasks, such as accessing data, scheduling events, or sending emails. This makes LLMs more interactive and capable of executing actions based on user queries.
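The execution side of that loop can be sketched as a simple dispatch table (hypothetical names; in a real app each entry would call an actual API, and the returned text would be fed back to the model):

```python
# Hypothetical dispatch table mapping tool names to Python callables.
# In production, each function would call a real service (calendar, email, CRM).
TOOLS = {
    "schedule_event": lambda title, date: f"Scheduled '{title}' on {date}",
    "send_email": lambda to, subject: f"Emailed {to}: {subject}",
}

def run_tool_call(name, arguments):
    """Execute the tool the model asked for and return its result as text,
    which is then passed back to the model to continue the conversation."""
    return TOOLS[name](**arguments)

result = run_tool_call("schedule_event", {"title": "Demo", "date": "2024-06-01"})
print(result)  # Scheduled 'Demo' on 2024-06-01
```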