Qlik and dbt Labs Drive Big Data Integration

Data Integration Acquisitions Signal New Era of Open, Real-Time Data Architectures

Data never stands still, which makes data integration a perennial challenge. That's ultimately what drove Qlik and dbt Labs to announce strategic data integration acquisitions today: Qlik purchased Upsolver, provider of a no-code data ingestion environment for lakehouses and Apache Iceberg, while dbt Labs acquired SDF Labs and its real-time SQL validation tool.

Qlik Acquires Upsolver

Qlik is known mainly for its business intelligence, analytics, and ML/AI tools, but the company also has a significant stable of data integration tools that provide an array of data management, data quality, and data governance capabilities. The acquisition of Upsolver, which was founded by Ori Rafael and Yoni Eini in 2014, bolsters that data integration toolset.

Upsolver devoted itself to automating the data engineering work that typically goes into building data pipelines that load data lakes and lakehouses. The Sunnyvale, California-based company developed a no-code platform that it claims can eliminate much of the tedium of hand-coding SQL-based data transformation routines, such as converting unstructured or semi-structured data into database tables.
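To make that tedium concrete, here is a minimal sketch of the kind of hand-written SQL that flattens semi-structured JSON into a relational table. This is for illustration only, not Upsolver's actual output; the table, columns, and JSON paths are hypothetical, and the JSON functions vary by engine:

    -- Hand-coded flattening of raw JSON events into a relational table.
    -- Names and paths are hypothetical; JSON syntax differs by engine.
    CREATE TABLE clickstream_events AS
    SELECT
        JSON_VALUE(raw, '$.event_id')              AS event_id,
        JSON_VALUE(raw, '$.user.id')               AS user_id,
        CAST(JSON_VALUE(raw, '$.ts') AS TIMESTAMP) AS event_time,
        JSON_VALUE(raw, '$.page.url')              AS page_url
    FROM raw_events;

Multiply that by every source, every schema change, and every new destination, and the appeal of automating it becomes clear.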

In recent years, Upsolver has adapted its tools to write data in the Apache Iceberg table format. Iceberg, as we have discussed, is at the center of a resurgence in the data lakehouse design pattern, as it frees organizations to use a multitude of data processing engines against their tables, without the close coupling to the processing engines that was previously required to achieve accuracy and performance.
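That decoupling is easy to picture. In the rough sketch below, a table is created in the Iceberg format from Spark SQL, and any other engine pointed at the same catalog can then query it; the catalog, schema, and column names are invented for illustration:

    -- Creating an Iceberg table from Spark SQL. Names are illustrative.
    CREATE TABLE lakehouse.sales.orders (
        order_id   BIGINT,
        customer   STRING,
        amount     DECIMAL(10, 2),
        order_date DATE
    )
    USING iceberg
    PARTITIONED BY (days(order_date));

    -- A different engine (Trino, Flink, etc.) wired to the same catalog
    -- can read the table directly; no copy, no engine-specific rewrite.
    SELECT order_date, SUM(amount)
    FROM lakehouse.sales.orders
    GROUP BY order_date;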

Qlik says that the acquisition of Upsolver and its real-time ingestion and optimization technologies will enable it to offer a single scalable platform for delivering analytics and AI insights on open, trusted, and governed data.

dbt Labs Acquires SDF Labs

Meanwhile, dbt Labs also made a strategic acquisition aimed at bolstering the data integration capabilities of its extremely popular data transformation environment.
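For context, a dbt model is essentially a SQL SELECT statement with templated references that dbt resolves into concrete tables and a dependency graph. A minimal sketch, with hypothetical model and column names:

    -- models/orders_enriched.sql: a typical dbt model is just a SELECT.
    -- dbt resolves each ref() to a concrete relation and uses the refs
    -- to order the build. Model and column names are hypothetical.
    SELECT
        o.order_id,
        o.amount,
        c.customer_name
    FROM {{ ref('stg_orders') }} AS o
    JOIN {{ ref('stg_customers') }} AS c
        ON o.customer_id = c.customer_id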

The dbt folks from the Fishtown neighborhood of Philadelphia bought SDF Labs, which isn't even a year old. SDF Labs, which is based in Seattle, Washington, came out of stealth last June with a Rust-based toolset and framework designed to compile and understand the SQL that users write, regardless of platform. It can be used as an alternative to dbt or alongside it.

On its website, SDF Labs says its technology is "a multi-dialect SQL compiler, transformation framework, and analytical database engine packaged into a single CLI. Unlike other data transformation tools like DBT, SDF extracts SQL compilers from their clouds, understanding proprietary dialects of SQL (like Snowflake) so deeply that it can ultimately execute them."
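To illustrate what "understanding proprietary dialects" entails, consider the Snowflake-flavored example below. A multi-dialect compiler has to parse and type-check constructs like these, which a plain ANSI SQL parser would reject; the table and column names are hypothetical:

    -- Snowflake-specific SQL: the ':' path operator, '::' casts, and
    -- LATERAL FLATTEN are proprietary dialect features a multi-dialect
    -- compiler must parse and type-check. Names are hypothetical.
    SELECT
        o.order_id,
        item.value:sku::STRING AS sku,
        item.value:qty::NUMBER AS quantity
    FROM orders AS o,
         LATERAL FLATTEN(input => o.raw:line_items) AS item;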

Conclusion

The acquisitions of Upsolver and SDF Labs by Qlik and dbt Labs, respectively, indicate that data integration is entering a period of accelerated development and change. Out are proprietary databases, closed tools, and brittle data pipelines that lock you in and break. In are open formats, lakehouses, customer choice of query engines, and tools that let you point your data sources at new destinations without the heavy lifting that used to be required.

And of course, that’s great news for customers.

FAQs

Q: What does the acquisition of Upsolver by Qlik mean for the data integration landscape?

A: The acquisition of Upsolver by Qlik bolsters the company’s data integration toolset and enables it to offer a single scalable platform for delivering analytics and AI insights on open, trusted, and governed data.

Q: What does the acquisition of SDF Labs by dbt Labs mean for the data integration landscape?

A: The acquisition of SDF Labs brings SQL understanding into dbt itself, something dbt had always left to the user and the database. That should let dbt improve its user experience with greater speed and accuracy.

Q: What is the significance of Apache Iceberg in the data lakehouse design pattern?

A: Apache Iceberg is at the center of a resurgence in the data lakehouse design pattern, as it frees organizations to use a multitude of data processing engines against their tables, without the close coupling to the processing engines that was previously required to achieve accuracy and performance.

Q: What does the future hold for data integration?

A: The future of data integration is open, real-time architectures that let customers point their data sources at new destinations without the heavy lifting that used to be required.
