Starburst, a data lake analytics company based in Boston, has announced an integration with dbt Cloud, the managed service for the open-source data transformation framework dbt, to help users build data pipelines across multiple data sources. The integration is now live and accessible through a dedicated adapter inside dbt Cloud.
Starburst Galaxy, the cloud-native, fully managed service built on the company’s massively parallel processing query engine, enables users to query multiple data sources and join data across those sources with a single query. Once installed, the integration with dbt Cloud lets users quickly and securely transform all their data assets regardless of location, helping them avoid manually configured ETL pipelines, which are costly and error-prone.
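To illustrate the idea of a single query spanning multiple sources, here is a minimal sketch of what a dbt model running against Starburst might look like. The catalog, schema, and table names below are hypothetical, not from the announcement; they assume one catalog pointing at an on-prem database and another at object storage in a data lake.

```sql
-- Hypothetical dbt model, e.g. models/orders_enriched.sql.
-- All catalog/schema/table names here are illustrative assumptions.
-- Starburst's Trino-based engine resolves each catalog to its own
-- underlying system, so one query can join data across sources.
select
    o.order_id,
    o.order_total,
    c.customer_name
from postgres_crm.public.customers as c   -- catalog backed by an on-prem database
join s3_lake.sales.orders as o            -- catalog backed by object storage
    on o.customer_id = c.customer_id
```

In a setup like this, dbt handles the modeling, dependencies, and materialization, while Starburst routes each part of the query to the right source, which is the behavior the adapter is meant to expose.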
Speaking about the value of the integration, Matt Fuller, Co-Founder and VP of Product at Starburst, said that it is incredibly easy to deploy and get up and running within minutes. He also highlighted the added convenience of being able to write queries as normal and have Starburst automatically send requests to the right place.
Kevin Petrie, VP of Research at Eckerson Group, commented that Starburst’s federated query engine integrated with dbt’s transformation engine will help companies prepare more data for analytics projects. He said this will be especially beneficial for companies with many data assets spread across multiple distributed platforms, such as on-prem databases or object storage.
In addition to this integration, Starburst is hosting an executive event in San Francisco on July 11-12 to discuss how AI investments can be integrated and optimized for success. The event is also an opportunity for attendees to learn about the benefits of the Starburst-dbt integration.
To summarize, Starburst’s integration with dbt Cloud is an effective solution for companies dealing with distributed data environments. The integration lets users leverage Starburst’s query engine to transform data assets across multiple sources, sparing enterprise customers the time, cost, and risk of manually configured ETL pipelines. For executive-level attendees, the upcoming event in San Francisco is an opportunity to learn how to best use AI investments for success.