End-to-end machine learning platform Predibase today announced a $12.2 million extension to last year’s $16.25 million Series A funding round. The company also announced that its low-code, declarative ML platform for developers is now generally available.
During the beta period, which started when the company came out of stealth last year, users trained more than 250 models on the platform. Now that the service is generally available, those users can also turn to Predibase to implement their own large language models (LLMs) instead of relying on an API from, say, OpenAI. Users also get access to Predibase's very own LudwigGPT LLM – named after the suite of machine learning tools Predibase co-founder Piero Molino launched in 2019 (and not the tragic 19th-century Bavarian king).
“Every company wants to gain a competitive advantage by embedding ML into their internal and customer-facing applications. Unfortunately, today’s ML tools are too complex for engineering teams and data science resources are too limited, leaving the developers who work on these projects stuck,” said Piero Molino, co-founder and CEO of Predibase. “Our mission is to make it dead easy for novices and experts alike to build and deploy ML applications with just a few lines of code. And now we’re extending those capabilities to support building and deploying custom LLMs.”
Alongside the general availability launch, the company today announced its Data Science Copilot, a system that provides developers with recommendations on how to improve the performance of their models. Predibase is also launching a two-week free trial of its platform.
Like most startups at this stage, Predibase plans to use the new funding to expand its go-to-market functions and build out its platform.
With low-code/no-code ML platforms from the likes of AWS, Google and Microsoft, plus numerous startups in the space, Predibase operates in an increasingly crowded market. The company argues that what sets it apart is its focus on developers and its ability to provide them with easy escape hatches from the low-code environment.