The shift from on-premises server infrastructure to cloud-based and software-as-a-service (SaaS) models has had a profound impact on the data and analytics architecture of many organizations in recent years. More than half of participants (59%) in Ventana Research's Analytics and Data Benchmark research are deploying data and analytics workloads in the cloud, and a further 30% plan to do so. Customer demand for cloud-based consumption models has also had a significant impact on the products and services available from data and analytics vendors. Data platform providers, both operational and analytic, have had to adapt to changing customer demand. The initial response — making existing products available for deployment on cloud infrastructure — only scratched the surface in terms of responding to emerging expectations. We now see the next generation of products, designed specifically to deliver innovation by taking advantage of cloud-native architecture, being brought to market by both emerging startups and established vendors, including InterSystems.
Earlier this year, I wrote about the increasing importance of data observability, an emerging product category that takes advantage of machine learning (ML) and data operations (DataOps) to automate the monitoring of data used for analytics projects, ensuring its quality and lineage. Monitoring the quality and lineage of data is nothing new. Manual tools exist to ensure that data is complete, valid and consistent, as well as relevant and free from duplication. Data observability vendors, including Monte Carlo Data, have emerged in recent years with the goal of increasing the productivity of data teams and improving organizations' trust in data through automation and artificial intelligence (AI) and ML.
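To make the categories of checks mentioned above concrete, here is a minimal Python sketch of automated data quality validation — checking completeness, validity and duplication over a set of records. This is an illustration only, not any vendor's implementation; the `check_quality` function, the `amount` field and the valid range are assumptions made for the example.

```python
def check_quality(records, required_fields, valid_range):
    """Return a dict of data quality findings for a list of row dicts.

    Checks three of the properties a data observability tool monitors:
    completeness (no missing values), validity (values within an
    expected range) and freedom from duplication.
    """
    findings = {"incomplete": [], "invalid": [], "duplicates": []}
    seen = set()
    for i, row in enumerate(records):
        # Completeness: every required field must be present and non-null
        if any(row.get(f) is None for f in required_fields):
            findings["incomplete"].append(i)
        # Validity: the numeric field must fall within the expected range
        value = row.get("amount")
        if value is not None and not (valid_range[0] <= value <= valid_range[1]):
            findings["invalid"].append(i)
        # Duplication: flag rows identical to one already seen
        key = tuple(sorted(row.items()))
        if key in seen:
            findings["duplicates"].append(i)
        seen.add(key)
    return findings

rows = [
    {"id": 1, "amount": 50},
    {"id": 2, "amount": None},   # incomplete: missing amount
    {"id": 3, "amount": 9999},   # invalid: outside expected range
    {"id": 1, "amount": 50},     # duplicate of the first row
]
print(check_quality(rows, ["id", "amount"], (0, 1000)))
# → {'incomplete': [1], 'invalid': [2], 'duplicates': [3]}
```

In practice, the value of the observability category lies in running such checks continuously and automatically, learning expected ranges from historical data rather than hard-coding them as this sketch does.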
One of the most significant considerations when choosing an analytic data platform is performance. As organizations compete to benefit most from being data-driven, the lower the time to insight, the better. As data practitioners have learned over time, however, lowering time to insight is about more than just high-performance queries. There are opportunities to improve time to insight throughout the analytics life cycle, which begins with data ingestion and integration, continues through data preparation, data management, and data storage and processing, and ends with data visualization and analysis. Vendors focused on delivering the highest levels of analytic performance, such as SQream, understand that lowering time to insight relies on accelerating every aspect of that life cycle.
I have written before about the continued use of specialist operational and analytic data platforms. Most database products can be used for operational or analytic workloads, and the number of use cases for hybrid data processing is growing. However, a general-purpose database is unlikely to meet the most demanding operational or analytic data platform requirements. Factors including performance, reliability, security and scalability necessitate the use of specialist data platforms. I assert that through 2026, and despite increased demand for hybrid operational and analytic processing, more than three-quarters of data platform use cases will have functional requirements that encourage the use of specialized analytic or operational data platforms. It is for that reason that specialist database providers, including Ocient, continue to emerge with new and innovative approaches targeted at specific data-processing requirements.
The data catalog has become an integral component of organizational data strategies over the past decade, serving as a conduit for good data governance and facilitating self-service analytics initiatives. The data catalog has become so important, in fact, that it is easy to forget that just 10 years ago it did not exist as a standalone product category. Metadata-based data management functionality has had a role to play within products for data governance and business intelligence for much longer than that, of course, but the emergence of the data catalog as a product category provided a platform for metadata-based data inventory and discovery that could span an entire organization, serving multiple departments, use cases and initiatives.