Ten Things to Consider When Modernizing Your ETL

A data pipeline platform has many different aspects and capabilities, and it is essential to take a hard look at each one. Here we outline several key areas you should explore, providing a framework for vetting the data pipeline platforms and tools under consideration.

Ebook Background

About Ten Things to Consider When Modernizing Your ETL

The ETL and data integration market is decades old. The initial functionality was focused on extracting data from operational systems, transforming it, then loading it into a data warehouse, data mart, or other analytical data store. For many years, tools in this market served exactly that purpose.

As analytics use cases became more diversified, ad-hoc analytics and data science grew, self-service and speed increased in importance, cloud services became a primary computing platform, and data privacy and governance became critical, the market evolved tremendously. Multiple worker personas have emerged – data engineers, data analysts, data scientists – as have new leaders such as the CDO, CAO, and CDAO.



As the market has grown and become more diverse, different groups or types of data pipeline platforms and tools have emerged:

Traditional ETL Platforms – These typically handle data in the mold of the original extract, transform, and load model. Over time, the traditional ETL vendors have expanded their platforms to support new data sources, data types, user experiences, transformations, and the cloud.

Cloud Vendor Tools – As the cloud has emerged as a data management and processing platform, each major cloud vendor – AWS, Microsoft Azure, and Google Cloud – has brought data integration tools to market on its own platform.

Cloud Data Integration – In the 2000s and 2010s, organizations expanded into the cloud with SaaS applications like Salesforce and Marketo and cloud services such as Google Analytics, Google AdWords, and other marketing and services solutions; cloud data integration tools emerged to move data in and out of these services.

Modern Data Pipeline Platforms – Several new data pipeline tools emerged during the data lake era to deal with the volume, diversity, new formats, and complexity of data many organizations face.


Data pipeline tools and platforms have varying degrees of capabilities. Each tool or platform will always have the basics – they can extract data from sources, transform it in some manner, load it into a destination, and run the data pipelines as needed. Some tools and platforms also try to focus on a particular sweet spot, such as simplicity and ease of use, a wide array of connectors, robust transformations, or scalable DataOps.
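The basics described above, extracting data from a source, transforming it, and loading it into a destination, can be sketched end to end in a few lines. This is a minimal illustration under assumed inputs, not a platform: the in-memory CSV source, the `orders` table, and the region normalization are all hypothetical stand-ins for a real operational system and analytical store.

```python
import csv
import io
import sqlite3

# Hypothetical raw export from an operational system.
RAW_CSV = """order_id,amount,region
1,19.99,us-east
2,5.00,eu-west
3,42.50,us-east
"""

def extract(raw: str) -> list[dict]:
    """Extract: read rows from the (here, in-memory) CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and normalize region codes."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["region"].upper())
        for r in rows
    ]

def load(records: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write transformed records into the analytical store."""
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    conn.commit()

# "Run the pipeline": chain the three stages against a destination.
conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

A real platform wraps these same stages with connectors, scheduling, monitoring, and governance; the point of the sketch is only that every tool in this market implements some version of this extract, transform, load, and run loop.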

You should be wary of buying just a data pipeline tool. Without a robust, scalable, and secure platform underneath it, a data pipeline tool solves just one problem: ease of use. A true data pipeline platform will combine easy-to-use tools, a range of capabilities to solve multiple problems, and an enterprise-grade platform.

As you explore data pipeline platforms for your organization, you should be looking for a suite of capabilities that solves your needs. The solution you choose should also be future-proof, meaning it supports your needs today and in the future and insulates you from technology changes.

Get the Ten Things to Consider When Modernizing Your ETL Ebook
