After months of anticipation and requests from developers, OpenAI has finally released the Code Interpreter API. “Code Interpreter is now available today in the API as well,” said Romain Huet, head of developer experience at OpenAI, at the launch of the Assistants API.
The new API is said to make it easier for developers to build GPT-like experiences into their own apps and services. “These experiences are great but they have been hard to build, sometimes taking months, teams of dozens of engineers; there’s a lot to handle to make this custom assistant experience. So today we’re making it a lot easier with our new Assistants API,” said OpenAI CEO Sam Altman.
Currently in beta, the Assistants API can be tried here.
The all-new API provides capabilities such as Code Interpreter and retrieval, alongside function calling, to handle much of the heavy lifting and make it easier for developers to build high-quality AI applications.
In addition, OpenAI has introduced persistent and infinitely long threads, letting developers hand off thread state management to OpenAI and work around context window constraints.
“With the Assistants API, you simply add each new message to an existing thread,” OpenAI said in its blog post.
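As a rough sketch of what that thread model looks like in practice, the snippet below uses the openai Python SDK (v1.x beta endpoints); the prompts and the assumption that an OPENAI_API_KEY environment variable is set are illustrative, not taken from the article. A client creates a thread once and keeps appending messages to it, with OpenAI holding the conversation state server-side:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Create a persistent thread once; store thread.id and reuse it across sessions.
thread = client.beta.threads.create()

# Each new user turn is simply appended to the existing thread;
# there is no need to resend or truncate earlier history.
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Summarise last quarter's sales figures.",
)
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Now break that down by region.",
)

# The full, server-managed history can be listed back at any time.
for message in client.beta.threads.messages.list(thread_id=thread.id):
    print(message.role, message.content)
```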
As far as data safety is concerned, OpenAI claims that data and files passed to the API are never used to train its models, and that developers can delete the data whenever they see fit.
Features of the Assistants API
The Assistants API leverages Code Interpreter, OpenAI’s tool that writes and executes Python code in a sandboxed environment. Originally launched for ChatGPT in March, Code Interpreter can generate graphs and charts and process files. This allows assistants built with the Assistants API to run code iteratively to solve coding and maths problems.
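A minimal sketch of that flow, assuming the openai Python SDK v1.x and a placeholder model name (neither taken from the article), might create a Code Interpreter-enabled assistant and run it against a thread:

```python
import time
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Assistant with Code Interpreter enabled; the name, instructions and model are placeholders.
assistant = client.beta.assistants.create(
    name="Math Tutor",
    instructions="Write and run Python code to answer maths questions.",
    tools=[{"type": "code_interpreter"}],
    model="gpt-4-1106-preview",
)

thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Solve 3x + 11 = 14 and show your working.",
)

# Runs are asynchronous: start one, then poll until it completes.
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

# The assistant's reply (and any files it generated) are appended to the same thread.
for message in client.beta.threads.messages.list(thread_id=thread.id):
    print(message.role, message.content)
```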
Moreover, the API incorporates a retrieval component, enabling developer-created assistants to access knowledge outside OpenAI’s models, such as product information or company documents. It also supports function calling, allowing assistants to trigger developer-defined functions and incorporate their results into responses.
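Function calling could look roughly like the sketch below; the get_current_weather function, its schema, and the handler helper are hypothetical examples for illustration, not something OpenAI ships. When a run needs a function result, it pauses with status "requires_action" until the developer submits the output back:

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A hypothetical weather lookup exposed to the assistant as a function tool.
assistant = client.beta.assistants.create(
    name="Weather Bot",
    instructions="Use get_current_weather to answer weather questions.",
    model="gpt-4-1106-preview",
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
)

def get_current_weather(city: str) -> dict:
    # Placeholder for a real weather-service call.
    return {"city": city, "temperature_c": 24, "condition": "clear"}

def handle_required_action(thread_id: str, run):
    # Called when run.status == "requires_action": execute each requested
    # function locally and return its result to OpenAI so the run can resume.
    tool_outputs = []
    for call in run.required_action.submit_tool_outputs.tool_calls:
        if call.function.name == "get_current_weather":
            args = json.loads(call.function.arguments)
            result = get_current_weather(**args)
            tool_outputs.append({"tool_call_id": call.id, "output": json.dumps(result)})
    return client.beta.threads.runs.submit_tool_outputs(
        thread_id=thread_id, run_id=run.id, tool_outputs=tool_outputs
    )
```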
Beta Release and Usage
The Assistants API is currently in beta and accessible to all developers.
OpenAI will bill tokens used at the chosen model’s per-token rates, where “tokens” are the fragments of text the model processes. In the future, OpenAI plans to let customers bring their own assistant-driving tools to complement the existing Code Interpreter, retrieval, and function calling features on its platform.