Mick McGuinness, DBmarlin Co-founder & Product Manager

DBmarlin AI Co-pilot adds support for GPT-4o model


Last week OpenAI launched its new flagship model, GPT-4o, which can reason across audio, vision, and text in real time. GPT-4o ("o" for "omni") is a step towards much more natural human-computer interaction: it accepts any combination of text, audio, image, and video as input and generates any combination of text, audio, and image outputs. It can respond to audio inputs in as little as 232 milliseconds, with an average of 320 milliseconds, which is similar to human response time.

The good news for DBmarlin Co-pilot users is that GPT-4o has already been integrated into our AI Co-pilot and will be available to all customers in version 4.5.0, due to be released in the next few weeks. In testing we have found that GPT-4o matches GPT-4 Turbo in its ability to solve complex database questions, while also being much faster and 50% cheaper to use.

See this recent customer testimonial from Enerj showing how they used the DBmarlin AI Co-pilot to save time tuning their PostgreSQL queries.

DBmarlin Co-pilot is free to use for all DBmarlin customers and already supports the GPT-3.5-turbo and GPT-4-turbo models with GPT-4o coming very soon.

About DBmarlin AI Co-pilot

DBmarlin AI Co-pilot can make many database and SQL tuning recommendations including:

  • Create new index - including the DDL for the CREATE INDEX statement
  • Rewrite SQL query
  • Update table and index statistics
  • Use partitioned tables
  • Use materialised views
  • Tune a database parameter
  • Scale up hardware
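To illustrate the first recommendation type, here is a minimal sketch (using Python's built-in sqlite3 module for portability, not DBmarlin's actual engine or output) of how applying a recommended `CREATE INDEX` DDL statement changes a query's execution plan from a full table scan to an index lookup. The table and column names are hypothetical:

```python
import sqlite3

# Hypothetical schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Before the index: the planner typically falls back to a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# Apply the kind of DDL a "create new index" recommendation would include.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# After the index: the planner can use an index search instead.
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print("before:", before)  # plan detail mentions a SCAN of orders
print("after:", after)    # plan detail mentions a SEARCH using the index
```

The same before/after comparison works in PostgreSQL or other databases with their own `EXPLAIN` syntax; the point is that an index on the filtered column lets the planner avoid reading every row.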


If you would like to find out more about DBmarlin AI Co-pilot, or to try out DBmarlin, head to https://www.dbmarlin.com/dbmarlin-ai-co-pilot where you can find videos and links to a freemium version to get you started.