Changelog
- Providers: Added Mistral AI as a new provider.
- Providers: Added Google AI Studio as a new provider.
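  A rough sketch of calling one of the new provider routes through the gateway; the account ID, gateway ID, and the `mistral` route segment below are placeholders, so check the provider docs for the exact path.

  ```ts
  // Minimal sketch: send a Mistral chat completion through an AI Gateway
  // provider route instead of calling the provider directly. All IDs are placeholders.
  const ACCOUNT_ID = "your_account_id";
  const GATEWAY_ID = "your_gateway_id";
  const MISTRAL_API_KEY = "your_mistral_api_key";

  const res = await fetch(
    `https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY_ID}/mistral/v1/chat/completions`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${MISTRAL_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "mistral-small-latest",
        messages: [{ role: "user", content: "Hello!" }],
      }),
    },
  );
  console.log(await res.json());
  ```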
- AI Gateway now supports adding custom metadata to requests, improving tracking and analysis of incoming requests.
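  A minimal sketch of attaching metadata, assuming it is passed as a JSON value in a `cf-aig-metadata` request header; the header name and the example fields are assumptions, so see the metadata docs for the exact format.

  ```ts
  // Sketch: tag a gateway request with custom metadata for later filtering.
  // The "cf-aig-metadata" header name and the key/value fields are assumptions.
  const GATEWAY_URL =
    "https://gateway.ai.cloudflare.com/v1/your_account_id/your_gateway_id/openai/chat/completions";
  const OPENAI_API_KEY = "your_openai_api_key";

  await fetch(GATEWAY_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${OPENAI_API_KEY}`,
      "Content-Type": "application/json",
      "cf-aig-metadata": JSON.stringify({ team: "search", user_id: "12345" }),
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Hi" }],
    }),
  });
  ```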
- Logs are now available for the last 24 hours.
- AI Gateway now supports custom cache key headers.
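  A hedged sketch of what a per-request cache override can look like; the `cf-aig-cache-key` header name below is hypothetical and only illustrates the idea of grouping requests under one cache entry.

  ```ts
  // Hypothetical sketch only: pin a custom cache key so that logically identical
  // prompts share one cache entry. The "cf-aig-cache-key" header name is an
  // assumption; consult the caching docs for the header the gateway actually reads.
  const GATEWAY_URL =
    "https://gateway.ai.cloudflare.com/v1/your_account_id/your_gateway_id/openai/chat/completions";
  const OPENAI_API_KEY = "your_openai_api_key";

  await fetch(GATEWAY_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${OPENAI_API_KEY}`,
      "Content-Type": "application/json",
      "cf-aig-cache-key": "greeting-prompt-v1",
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Hi" }],
    }),
  });
  ```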
- Workers AI now natively supports AI Gateway.
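  A minimal Worker sketch, assuming an AI binding named `AI` and a gateway called `my-gateway`; the options argument to `env.AI.run()` routes the call through the gateway.

  ```ts
  // Sketch of a Worker that routes a Workers AI call through an existing gateway.
  // The binding name "AI", the gateway ID "my-gateway", and the model are placeholders.
  interface Env {
    AI: { run(model: string, inputs: unknown, options?: unknown): Promise<unknown> };
  }

  export default {
    async fetch(_request: Request, env: Env): Promise<Response> {
      const result = await env.AI.run(
        "@cf/meta/llama-2-7b-chat-int8",
        { prompt: "What does AI Gateway do?" },
        { gateway: { id: "my-gateway" } },
      );
      return Response.json(result);
    },
  };
  ```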
- AI Gateway is moving from beta to GA.
- Added new endpoints to the REST API.
- Fixed an LLM side-channel vulnerability.
- Providers: Added Anthropic, Google Vertex, and Perplexity as providers.
- Real-time Logs: Logs are now real-time, covering the last hour. If you need persistent logs, please let the team know on Discord; we are building a persistent logs feature for those who want to store their logs for longer.
- Providers: Azure OpenAI is now supported as a provider!
- Docs: Added Azure OpenAI example.
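  For reference, a gateway-routed Azure OpenAI call can look roughly like the sketch below; the resource name, deployment name, and api-version are placeholders, and the `azure-openai` route segment follows the gateway's provider URL pattern.

  ```ts
  // Sketch: call an Azure OpenAI deployment through the gateway. All names are placeholders.
  const ACCOUNT_ID = "your_account_id";
  const GATEWAY_ID = "your_gateway_id";
  const AZURE_OPENAI_KEY = "your_azure_openai_key";

  const url =
    `https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY_ID}` +
    "/azure-openai/my-resource/my-deployment/chat/completions?api-version=2023-05-15";

  const res = await fetch(url, {
    method: "POST",
    headers: { "api-key": AZURE_OPENAI_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ messages: [{ role: "user", content: "Hello" }] }),
  });
  console.log(await res.json());
  ```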
- Bug Fixes: Errors in cost and token calculations should now be fixed.
- Logs: Logs are now limited to the last 24 hours. If you have a use case that requires longer log retention, please reach out to the team on Discord.
- Dashboard: Logs now refresh automatically.
- Docs: Fixed the Workers AI example in the docs and dashboard.
- Caching: Embedding requests are now cacheable. Rate limits do not apply to cached requests.
- Bug Fixes: Identical requests to different providers are no longer incorrectly served from the cache. Streaming now works as expected, including for the Universal endpoint.
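  For context, the Universal endpoint accepts an ordered list of provider requests and falls back to the next one if a request fails; a rough sketch (IDs, keys, models, and provider route names are placeholders) looks like this.

  ```ts
  // Sketch: Universal endpoint request that tries OpenAI first, then falls back
  // to Workers AI, with streaming enabled. All IDs and keys are placeholders.
  const ACCOUNT_ID = "your_account_id";
  const GATEWAY_ID = "your_gateway_id";
  const OPENAI_API_KEY = "your_openai_api_key";
  const CF_API_TOKEN = "your_cloudflare_api_token";

  await fetch(`https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY_ID}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify([
      {
        provider: "openai",
        endpoint: "chat/completions",
        headers: {
          Authorization: `Bearer ${OPENAI_API_KEY}`,
          "Content-Type": "application/json",
        },
        query: {
          model: "gpt-3.5-turbo",
          stream: true,
          messages: [{ role: "user", content: "Hi" }],
        },
      },
      {
        provider: "workers-ai",
        endpoint: "@cf/meta/llama-2-7b-chat-int8",
        headers: {
          Authorization: `Bearer ${CF_API_TOKEN}`,
          "Content-Type": "application/json",
        },
        query: {
          stream: true,
          messages: [{ role: "user", content: "Hi" }],
        },
      },
    ]),
  });
  ```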
- Known Issues: There's currently a bug with costs that we are investigating.