OpenAI unveils ChatGPT API at very low prices
The company says it has reduced costs for the AI-powered chatbot by 90% since December.
OpenAI has released APIs for ChatGPT and Whisper (its speech-to-text model) at a price the company says is one-tenth that of its existing models.
“ChatGPT and Whisper models are now available on our API, giving developers access to cutting-edge language (not just chat!) and speech-to-text capabilities,” the company said in a blog post. “Through a series of system-wide optimizations, we’ve achieved 90% cost reduction for ChatGPT since December; we’re now passing through those savings to API users. Developers can now use our open-source Whisper large-v2 model in the API with much faster and cost-effective results.”
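For developers, the new endpoints work much like OpenAI's existing APIs. The sketch below is a minimal illustration, assuming the openai Python package as it worked around launch (the v0.27-era interface) and the model identifiers OpenAI documented for these endpoints (gpt-3.5-turbo and whisper-1); it is not code from OpenAI's announcement.

```python
# Minimal sketch: calling the ChatGPT and Whisper API endpoints with the
# openai Python package (v0.27-era interface). The model names and the
# file name "meeting.mp3" are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # replace with your own key

# ChatGPT: the API takes a list of role-tagged messages and returns a reply.
chat = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful marketing assistant."},
        {"role": "user", "content": "Draft a two-sentence blurb for a CRM tool."},
    ],
)
print(chat.choices[0].message.content)

# Whisper: upload an audio file and get a transcript back.
with open("meeting.mp3", "rb") as audio_file:
    transcript = openai.Audio.transcribe("whisper-1", audio_file)
print(transcript.text)
```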
Why we care. Chatbots for all! That’s good news for marketers, right? If OpenAI really did cut the cost of its product by 90% in less than three months, it did something every bit as revolutionary as the creation of the AI itself. It is also possible the company is simply pricing the API as a loss leader to ward off potential competitors. Either way, keep an eye on usage: even at these prices, a lot of users can quickly add up to a substantial bill.
Dig deeper: FTC warns tech companies about over-hyping AI claims
The cost. The ChatGPT API costs $0.002 per 1,000 tokens; tokens are the chunks of text the model consumes, with 1,000 tokens working out to roughly 750 words. The Whisper large-v2 model is priced at $0.006 per minute of audio. That is a steep reduction given that OpenAI CEO Sam Altman once estimated computing costs at a few cents per chat.
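To put the per-token price in perspective, here is a rough back-of-the-envelope estimate of what a chatbot's bill might look like at $0.002 per 1,000 tokens; the traffic and token figures are hypothetical assumptions, not numbers from OpenAI.

```python
# Back-of-the-envelope cost estimate at ChatGPT API pricing.
# All usage numbers below are hypothetical, for illustration only.
PRICE_PER_1K_TOKENS = 0.002      # ChatGPT API price in USD
TOKENS_PER_CONVERSATION = 2_000  # assumed prompt + response tokens per chat
CONVERSATIONS_PER_DAY = 50_000   # assumed daily traffic

daily_cost = CONVERSATIONS_PER_DAY * TOKENS_PER_CONVERSATION / 1_000 * PRICE_PER_1K_TOKENS
monthly_cost = daily_cost * 30

print(f"Daily cost:   ${daily_cost:,.2f}")    # $200.00
print(f"Monthly cost: ${monthly_cost:,.2f}")  # $6,000.00
```

Cheap per token, in other words, but far from free once real traffic shows up.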
Already in use. Snapchat is using the ChatGPT API to power the My AI feature available to paying Snapchat+ subscribers. Study aid Quizlet is using it for Q-Chat, a virtual tutor. Instacart will soon roll out Ask Instacart, which will answer customer questions about recipes and food purchases with “shoppable” answers informed by product data from the company’s retail partners.
Also announced. OpenAI also clarified some earlier policies for its service. First, data submitted through the API will no longer be used for model training or other service improvements unless organizations opt in. OpenAI also now requires that apps or services using ChatGPT make it clear to customers that they are interacting with a chatbot. This includes identifying all ChatGPT-created content (news, blog posts, etc.) as being written by a bot.