The recent publication of our Value Index research highlights the impact of intelligent applications on the operational data platforms sector. While we continue to believe that, for most use cases, there is a clear functional requirement for either an analytic or an operational data platform, the growing development of intelligent applications infused with the results of analytic processes, such as personalization and artificial intelligence (AI)-driven recommendations, is increasingly shaping requirements for operational data platforms to support real-time analytics. Operational data platform vendors, including MongoDB, are responding to these evolving requirements with new functionality to support the development and deployment of intelligent applications.
Topics: Analytics, Business Intelligence, Cloud Computing, Data, Digital Technology, Analytics & Data, analytic data platforms, Operational Data Platforms
As engagement with customers, suppliers and partners is increasingly conducted through digital channels, ensuring that infrastructure and applications perform as expected is not just important but mission critical. My colleague David Menninger recently explained the increasing importance of observability in enabling organizations to ensure that their systems and applications operate efficiently. Observability has traditionally been the domain of the IT department, but it is increasingly important to business decision-makers as organizations combine machine-generated telemetry data with business event data to understand the impact of a system outage or application performance degradation on their ability to conduct digital business. Companies such as Mezmo are responding with observability platforms designed to facilitate the integration of machine and business data and to encourage collaboration between business and IT professionals.
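The idea of combining machine telemetry with business events can be illustrated with a minimal sketch. The data shapes, field names and figures below are hypothetical, not drawn from any vendor's platform: it simply joins outage windows from telemetry against failed business transactions to estimate revenue at risk.

```python
from datetime import datetime

# Hypothetical telemetry: machine-generated outage windows per service.
outages = [
    {"service": "checkout",
     "start": datetime(2023, 5, 1, 12, 0),
     "end": datetime(2023, 5, 1, 12, 30)},
]

# Hypothetical business events: orders with timestamps and values.
orders = [
    {"service": "checkout", "ts": datetime(2023, 5, 1, 12, 10),
     "value": 120.0, "failed": True},
    {"service": "checkout", "ts": datetime(2023, 5, 1, 13, 0),
     "value": 80.0, "failed": False},
]

def revenue_at_risk(outages, orders):
    """Sum the value of failed orders that fall inside an outage window."""
    total = 0.0
    for order in orders:
        if not order["failed"]:
            continue
        for window in outages:
            if (window["service"] == order["service"]
                    and window["start"] <= order["ts"] <= window["end"]):
                total += order["value"]
                break
    return total

print(revenue_at_risk(outages, orders))  # 120.0
```

The join key here (service name plus time window) is the simplest possible correlation; production observability platforms correlate on richer dimensions such as trace IDs, but the principle of linking IT and business data is the same.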
Topics: Data Management, Data, Digital Technology, Analytics & Data
I have written recently about the increasing importance of managing data in motion as well as at rest as the use of streaming data by enterprises becomes more mainstream. While batch-based processing of application data has been a core component of enterprise IT architecture for decades, streaming data and event processing have often been niche disciplines, typically reserved for organizations with the most demanding performance requirements. That has changed in recent years, driven by increased reliance on streaming data and events to identify trends and anomalies and to enable real-time responses through continuous processing and analysis of data generated by applications, systems and devices. Companies like Solace have been integral to the development of event-driven applications and the broader adoption of event-driven architecture (EDA).
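At its core, EDA decouples producers of events from the consumers that react to them via a broker. The sketch below is an illustrative in-process toy, not Solace's API or any production broker: it shows only the publish/subscribe pattern that underpins event-driven applications.

```python
from collections import defaultdict

class EventBroker:
    """Minimal in-process publish/subscribe broker illustrating EDA.

    Producers publish events to named topics without knowing who
    consumes them; consumers register handlers per topic.
    """
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler subscribed to the topic.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received = []
broker.subscribe("orders.created", lambda e: received.append(e))
broker.publish("orders.created", {"order_id": 42, "amount": 99.5})
print(received)  # [{'order_id': 42, 'amount': 99.5}]
```

A real event broker adds durability, ordering guarantees and delivery across process and network boundaries, but the producer/consumer decoupling shown here is the defining property of the architecture.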
Topics: Data, Streaming Data & Events
To execute more data-driven business strategies, organizations need linked and comprehensive data that is available in real time. By consistently managing data across siloed systems and ensuring that data definitions are agreed upon and current, organizations can overcome the challenges presented by data being distributed across an increasingly disparate range of applications and data-processing locations. Maintaining data quality is a perennial data management challenge, often preventing organizations from operating at the speed of business. Our Analytics and Data Benchmark Research shows that almost two-thirds of participants (64%) cited reviewing data for quality issues as one of the most time-consuming aspects of analytics initiatives, second only to preparing data for analysis. This is where master data management (MDM) becomes critical, ensuring that organizations have the clean, consistent data needed to operate efficiently and effectively. When organizations control master data, they gain visibility into their overall operations and can provide proper governance, while also having access to reliable, accurate and timely data about customers, products, assets and employees. Reltio offers MDM products designed to help customers improve trust in data by unifying and cleansing complex data from multiple sources in real time.
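A central MDM task is merging records for the same entity from multiple source systems into a single "golden record." The sketch below, with invented source names and a deliberately simple survivorship rule (prefer the most recently updated non-null value per attribute), illustrates the idea; it is not Reltio's method or API.

```python
# Hypothetical records for one customer from two source systems.
records = [
    {"source": "crm", "updated": "2023-04-01",
     "email": "j.smith@example.com", "phone": None},
    {"source": "billing", "updated": "2023-06-15",
     "email": None, "phone": "555-0100"},
]

def golden_record(records):
    """Merge per attribute, preferring the most recent non-null value."""
    # ISO-8601 date strings sort chronologically, newest first here.
    ordered = sorted(records, key=lambda r: r["updated"], reverse=True)
    merged = {}
    for attr in ("email", "phone"):
        merged[attr] = next(
            (r[attr] for r in ordered if r[attr] is not None), None)
    return merged

print(golden_record(records))
# {'email': 'j.smith@example.com', 'phone': '555-0100'}
```

Production MDM adds fuzzy matching to decide which records refer to the same entity and configurable, per-source survivorship rules, but attribute-level merging of the kind shown is the essence of producing a single trusted view.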
Topics: Data Management, Data, data operations
Data Operations (DataOps) has been part of the lexicon of the data market for almost a decade, describing products, practices and processes designed to support agile and continuous delivery of data analytics. DataOps takes inspiration from DevOps, which encompasses the tools, practices and philosophy used to support continuous delivery of software applications in the face of constant change. Similarly, DataOps encompasses the tools, practices and philosophy used to ensure the quality, flexibility and reliability of data and analytics initiatives, with an emphasis on continuous, measurable improvement as well as agility, collaboration and automation. Interest in products and services that support DataOps is growing. I assert that by 2025, one-half of organizations will have adopted a DataOps approach to their data engineering processes, enabling them to be more flexible and agile.
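One concrete DataOps practice, borrowed directly from DevOps-style continuous delivery, is gating each pipeline run on automated data quality checks so that bad data is caught before publication rather than discovered downstream. The sketch below is a generic illustration with invented field names, not any particular DataOps product.

```python
def quality_gate(rows, required_fields):
    """Return (passed, issues): flag nulls in required fields and duplicate ids.

    In a DataOps pipeline, a gate like this would run automatically on
    every batch, and a failure would block publication of the dataset.
    """
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                issues.append(f"row {i}: missing {field}")
        if row.get("id") in seen_ids:
            issues.append(f"row {i}: duplicate id {row.get('id')}")
        seen_ids.add(row.get("id"))
    return (not issues, issues)

# Hypothetical batch with one duplicate id and one missing value.
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": None},
]
passed, issues = quality_gate(rows, ["id", "amount"])
print(passed, issues)
# False ['row 1: missing amount', 'row 1: duplicate id 1']
```

Tracking the issue counts from each run over time also supports the continuous, measurable improvement that DataOps emphasizes.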
Topics: Data Governance, Data Management, Data, data operations