        Matt Aslett's Analyst Perspectives

        Harness Events to Process Data at the Speed of Business

The final of the men’s 100 meters at the Paris Olympics this summer was a reminder that success requires not just being fast but performing at the right time. Being fast is obviously a prerequisite for participating in an Olympic 100-meter final, and all the competitors finished the race in under 10 seconds, with just 0.12 seconds separating the first man from the last. While all the athletes were fast, what separated the winner of the gold medal—USA’s Noah Lyles—was execution. He was in last place at 50 meters, but as others slowed in the second half of the race, Lyles was accelerating towards the finish line. He was not even the first to reach the line, but because athletics measures the finish at the torso rather than the toes, his lean for the line secured him victory. Execution at speed was the difference between being an Olympic 100-meter finalist and the Olympic 100-meter champion.

There is a lesson here for enterprises as they strive to use analytics and data to outcompete their rivals. Just as all 100-meter runners are fast, all enterprises strive to be data-driven. Making decisions based on data is not enough to be successful, however. The winners are those that can process and act upon data at the speed of business. This is easier said than done, given the overwhelming historical reliance on batch data processing. As I have previously stated, the execution of business events has always occurred in real time. Batch data processing is an artificial construct, driven by the limitations of traditional data processing capabilities that require enterprises to process data minutes, hours or even days after an event. Processing and acting on data at the speed of business necessitates a change of approach by enterprises to make real-time data processing a central component of their analytics and data strategy, rather than an exception.

Streaming and events focuses on the uninterrupted management, processing and analysis of data generated by applications, systems and devices on a continuous basis. The processing of business events in real time has been adopted as a standard approach to data processing in industry segments with high-performance, real-time requirements, such as financial services and telecommunications. In many other industries, however, the reliance on batch data processing is so entrenched that streaming and events has primarily been seen as a niche requirement, separate from the primary focus on batch processing of data at rest. Less than one-quarter (22%) of enterprises participating in our Analytics and Data Benchmark Research are currently analyzing data in real time.

        Enterprises that fail to process and analyze data in real time run the risk of failing to operate at the pace of the real world. The pressure on enterprises to improve their ability to process and analyze data in real time is being exacerbated by increased demand for intelligent operational applications infused with the results of analytic processes, such as personalization and artificial intelligence (AI)-driven recommendations. AI-driven intelligent applications require a new approach to data processing that enables real-time performance of machine learning (ML) on operational data to deliver instant, relevant information for accelerated decision-making. As demand for real-time interactive applications becomes more pervasive, the processing of streaming and events is becoming a more mainstream pursuit, aided by the proliferation of streaming and event technologies, which have lowered the cost and technical barriers to developing new applications that take advantage of data in motion.

Fortunately for enterprises that have yet to embrace streaming and events, the core enabling technologies are mature and readily available. The fundamental enabler of an event-driven architecture is the event broker, which is responsible for handling the communication of messages between applications and systems. Messages can be published sequentially, in the form of message queues, or as a stream: a continuous flow of event messages. A network of event brokers, in combination with event-management software for discovering, cataloging, governing and securing events and event-driven applications, enables enterprises to act on event data as it is generated. This is achieved through streaming integration (including the aggregation, transformation, enrichment and ingestion of event streams) and streaming analytics, which uses compute engines to analyze streams of event data and deliver actionable insight in real time.
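To make the broker pattern concrete, here is a minimal sketch of publishing and consuming events, assuming Apache Kafka as the event broker and the kafka-python client; the broker address, topic name and payload fields are hypothetical, not prescriptive.

```python
# A minimal sketch of event-broker communication, assuming Apache Kafka
# and the kafka-python client; topic and payload are illustrative.
import json
from kafka import KafkaProducer, KafkaConsumer

# Publish an event to a stream (a Kafka topic) as it happens.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", value={"order_id": 42, "amount": 19.99})
producer.flush()

# A downstream application consumes the continuous flow of event
# messages and can aggregate, transform or enrich them as they arrive.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)
for message in consumer:
    event = message.value
    print(f"Processing order {event['order_id']} for {event['amount']}")
```

In this arrangement the producer and consumer are fully decoupled: the broker persists the stream, so additional downstream applications can subscribe to the same events without any change to the producer.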

Processing and analyzing event data in isolation is valuable. As I have previously stated, however, success with streaming data relies on the holistic management and governance of data in motion and data at rest. In recent years, we have seen the emergence of streaming databases designed to provide a single environment for continually processing streams of event data using SQL queries and real-time materialized views. At the same time, we have also seen streaming and event specialist providers improve their capabilities for the persistence of event data in a data warehouse, data lake or cloud storage for batch processing of historical event data. I assert that by 2026, more than three-quarters of enterprises’ standard information architectures will include streaming data and event processing, allowing enterprises to be more responsive and provide better customer experiences. As more enterprises adopt event-driven architecture, the addition of capabilities for the persistence and processing of historical event data increases the potential for streaming and event specialists to stake a claim to be considered an enterprise’s primary data platform provider, rather than being utilized for a limited set of use cases.
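As a sketch of how a streaming database changes the model, the example below assumes a system that speaks the PostgreSQL wire protocol (Materialize is one such system); the connection string, the orders source and the view name are illustrative assumptions.

```python
# A minimal sketch of a continuously maintained materialized view,
# assuming a streaming database that speaks the PostgreSQL wire protocol
# (for example, Materialize); connection details and names are
# illustrative, and an "orders" event source is assumed to exist.
import psycopg2

conn = psycopg2.connect("postgresql://materialize@localhost:6875/materialize")
conn.autocommit = True
cur = conn.cursor()

# Define a view over a stream of order events; the database keeps the
# aggregate up to date incrementally as each new event arrives, rather
# than recomputing it in a scheduled batch job.
cur.execute("""
    CREATE MATERIALIZED VIEW revenue_by_region AS
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
""")

# Reading the view returns current results reflecting all events so far.
cur.execute("SELECT region, revenue FROM revenue_by_region")
for region, revenue in cur.fetchall():
    print(region, revenue)
```

The key difference from a batch warehouse is that the view is maintained incrementally: each arriving event updates the aggregate, so queries return current results without waiting for a refresh cycle.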

        Accelerating this potential is the rise of AI-driven operational and analytic applications. Providing real-time interactivity is table stakes for operational applications. Enterprises can differentiate user experiences with real-time, AI-driven functionality. Doing so requires AI models to have access to up-to-date data via streams of events as they are generated in real time, as well as the ability to incorporate model inferencing into streaming analytics pipelines. Enterprises with an over-reliance on batch data processing will not be able to match those that are able to harness real-time data as it is generated. The ability to execute at the speed of business by processing and acting on events as they occur will be the difference between competing and winning with analytics and data. I recommend that enterprises evaluating their current and future data architecture requirements consider streaming and event technologies alongside more traditional batch-oriented data platforms to provide a holistic view of all data, in motion and at rest.
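To illustrate incorporating model inferencing into a streaming analytics pipeline, the sketch below scores each event as it arrives and publishes the result back onto a stream; the topics, feature names and model file are hypothetical, and any pre-trained model exposing a predict() method would serve.

```python
# A minimal sketch of model inference in a streaming pipeline: each
# event is scored as it is generated so the operational application can
# act in real time. Topics, features and the model file are hypothetical.
import json
import pickle
from kafka import KafkaConsumer, KafkaProducer

# Load a pre-trained model (hypothetical file; any predict()-style
# model, such as one trained with scikit-learn, would work here).
with open("recommendation_model.pkl", "rb") as f:
    model = pickle.load(f)

consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Score each event as it arrives and publish the recommendation back
# onto a stream for the operational application to consume.
for message in consumer:
    event = message.value
    features = [[event["recency"], event["frequency"], event["value"]]]
    score = model.predict(features)[0]
    producer.send(
        "recommendations",
        value={"user_id": event["user_id"], "score": float(score)},
    )
```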

        Regards,

Matt Aslett
Director of Research, Analytics and Data

        Matt Aslett leads the software research and advisory for Analytics and Data at ISG Software Research, covering software that improves the utilization and value of information. His focus areas of expertise and market coverage include analytics, data intelligence, data operations, data platforms, and streaming and events.
