Tecton, the enterprise feature store company, today announced a partnership with Databricks, the Data and AI Company and inventor of the data lakehouse paradigm, to help businesses build and automate their machine learning (ML) feature pipelines from prototype to production. Through the integration, data teams can use Tecton to quickly develop production-ready ML features on the Databricks Lakehouse Platform.
"We are thrilled to have Tecton available on the Databricks Lakehouse Platform. As a result, Databricks customers now have the option to use Tecton to operationalize features for their ML projects and effectively drive business with production ML applications," said Adam Conway, SVP of Products at Databricks.
Too many businesses hesitate to integrate machine learning (ML) into their core business operations and services because productionizing ML models to support a wide range of predictive applications, such as fraud detection, real-time underwriting, dynamic pricing, recommendations, personalization, and search, presents unique data engineering challenges. It is also difficult to curate, serve, and manage the ML features (predictive data signals) that power these applications. For this reason, Databricks and Tecton have partnered to streamline and automate the many steps involved in converting raw data inputs into ML features and making those features available to power large-scale predictive applications.
Built on an open lakehouse architecture, Databricks enables ML teams to gather and analyze data, collaborate across teams, and standardize the entire ML lifecycle from experimentation to production. With Tecton, these same teams can quickly operationalize ML applications and automate the entire lifecycle of ML features without leaving the Databricks workspace.
"Building on Databricks' powerful and massively scalable foundation for data and AI, Tecton extends the underlying data infrastructure to support ML-specific requirements. This partnership with Databricks enables organizations to embed ML into live, customer-facing applications and business processes, quickly, reliably and at scale."
Mike Del Balso, co-founder and CEO of Tecton
Available on the Databricks Lakehouse Platform, Tecton serves as the central source of truth for ML features, orchestrating, managing, and maintaining the data pipelines that produce them. Data teams write features as code in Python and SQL, and the platform lets ML teams track and share those features through a version-controlled repository. Tecton then automates and orchestrates production-grade ML data pipelines so that feature values are materialized into a single repository. From there, customers can immediately explore, share, and serve features for model training and for batch and real-time predictions across use cases, without having to worry about common obstacles such as training-serving skew or point-in-time correctness.
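The point-in-time correctness that a feature store handles automatically can be illustrated with a minimal sketch. This is plain Python, not Tecton's actual API, and the names are hypothetical: the idea is that a training label observed at some timestamp may only see the latest feature value materialized at or before that timestamp, never a later one, which is what prevents future data from leaking into training.

```python
from bisect import bisect_right

def point_in_time_lookup(feature_log, label_time):
    """Return the latest feature value recorded at or before label_time.

    feature_log: list of (timestamp, value) tuples sorted by timestamp.
    Using a value computed *after* label_time would leak future
    information into training, one source of training-serving skew.
    """
    times = [t for t, _ in feature_log]
    idx = bisect_right(times, label_time)
    if idx == 0:
        return None  # no feature value existed yet at label_time
    return feature_log[idx - 1][1]

# A user's rolling transaction count, materialized at several points in time.
txn_count_log = [(100, 3), (200, 5), (300, 9)]

# A fraud label observed at t=250 must see the value from t=200, not t=300.
print(point_in_time_lookup(txn_count_log, 250))  # 5
print(point_in_time_lookup(txn_count_log, 50))   # None: feature not yet computed
```

A feature store applies this kind of as-of join automatically across every feature and training example, so data teams do not have to implement it by hand.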
Acting as the interface between the Databricks Lakehouse Platform and customers' ML models, Tecton lets customers compute features from real-time and streaming data drawn from a variety of sources. By automatically building the intricate feature engineering pipelines required to process streaming and real-time data, Tecton reduces the need for intensive engineering support and enables customers to significantly improve model performance, accuracy, and outcomes.