Managing data for cloud analytics continues to have its challenges, but ad-hoc analytics powers faster delivery in the cloud.
The adoption of cloud BI and analytics continues to grow at a rate of over 22% each year. Organizations are adopting cloud analytics in various ways, including public cloud, private cloud, multi-cloud, and hybrid cloud. And that adoption is taking place across various use cases—like data science, traditional analytics, and data warehousing—and in many industries.
Three key reasons organizations are adopting cloud analytics are scalability, flexibility, and cost. And these are all intertwined. Companies can get the scale they need at any point in time with elasticity that expands and contracts alongside specific needs, paying only for consumed resources. This is especially important for analytics as workloads can continuously vary.
Even with this strong adoption, organizations are wary of moving all of their analytics data to the cloud. In fact, many organizations are taking a hybrid approach to manage their data for analytics. Most organizations are keeping their data on-premises or in hybrid cloud environments today. What’s more, three-quarters of organizations are expected to remain hybrid over the next three years.
[Figure: The traditional delivery cycle (requirements, design, engineering, testing and UAT, deployment) contrasted with ad-hoc and on-demand analytics, which explores the unknown and looks for new answers. The agile approach combines ad-hoc analytics, an ELT data preparation and pipeline platform, and a flexible cloud data warehouse to deliver cloud-based analytics with faster, more cost-effective, and less risky projects.]
Traditional analytics focuses on delivering information via reports and/or dashboards to keep managers up to date on the standard metrics used to track the business. In this approach, reports and dashboards are refreshed in an automated manner using pre-defined datasets rolled up into specified metrics. Source data is typically “ETL’ed” into a well-defined schema in a data warehouse optimized to produce the metrics along various dimensions.
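To make the pattern concrete, here is a minimal sketch of that rollup step. The table and column names are hypothetical, and Python's built-in sqlite3 stands in for the data warehouse:

```python
import sqlite3

# In-memory database as a stand-in for a data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_fact (region TEXT, month TEXT, revenue REAL)")

# The "load" step of ETL: raw source rows land in a well-defined schema.
raw_rows = [
    ("East", "2024-01", 1200.0),
    ("East", "2024-02", 900.0),
    ("West", "2024-01", 1500.0),
]
conn.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)", raw_rows)

# Pre-defined rollup: the revenue metric along the region dimension,
# the kind of query that refreshes a standing report or dashboard.
rollup = conn.execute(
    "SELECT region, SUM(revenue) FROM sales_fact GROUP BY region ORDER BY region"
).fetchall()
print(rollup)  # [('East', 2100.0), ('West', 1500.0)]
```

Because the schema and metrics are fixed upfront, this query can run on a schedule with no analyst involvement.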
As more reports were distributed and as the pace of business accelerated in the digital economy, a new analytics workload emerged: ad-hoc analytics. It arose in response to managers and knowledge workers seeing information in a report or dashboard and wanting to delve further into the details by asking how and why.
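The "how and why" drill-down can be sketched the same way. In this hypothetical example (sqlite3 again standing in for the warehouse, with invented table and column names), a dashboard shows a regional revenue dip, and an analyst runs a one-off query against the detail rows to find its driver:

```python
import sqlite3

# Detail-level order data behind a hypothetical regional revenue dashboard.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (region TEXT, month TEXT, product TEXT, revenue REAL)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [
        ("East", "2024-01", "widgets", 700.0),
        ("East", "2024-01", "gadgets", 500.0),
        ("East", "2024-02", "widgets", 650.0),
        ("East", "2024-02", "gadgets", 250.0),
    ],
)

# Ad-hoc query: not a pre-defined report, but a one-off question about
# which product drove the month-over-month drop in the East region.
detail = conn.execute(
    "SELECT product, month, revenue FROM orders WHERE region = 'East'"
).fetchall()

by_product = {}
for product, month, revenue in detail:
    by_product.setdefault(product, {})[month] = revenue

# Month-over-month change per product pinpoints the driver of the dip.
deltas = {p: m["2024-02"] - m["2024-01"] for p, m in by_product.items()}
print(deltas)  # widgets fell 50, gadgets fell 250
```

Unlike the scheduled rollup, this question was not anticipated when the schema was designed, which is why ad-hoc workloads put a premium on flexible, exploratory access to detail data.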
Each of these types of analytics workloads has different requirements, as shown in Table 1. The main difference is that reporting and dashboards are well-defined upfront for specific, repeatable purposes, whereas ad-hoc analytics involves a great deal of exploring the unknown and looking for new answers.
As previously mentioned, the agile, ad-hoc-analytics-powered approach to cloud analytics requires three key components: a virtualized analytics data platform, a self-service data preparation and pipeline platform, and a modern cloud data warehouse. Datameer offers the first two of these components:
Datameer Spotlight is an integrated self-service analytics data platform that provides greater visibility into and use of enterprise-wide data assets for analytics to speed the delivery of new insights without the cost and risk of moving data.
Datameer Spectrum is a single integrated data preparation and pipeline platform that makes it easy to transform raw data into secure, curated datasets to feed any analytics initiative and discover new insights without IT involvement.