Events are everywhere in the real world, but many industries are only now realizing the benefits of being event-driven. Slowly but surely, industries such as aviation, manufacturing, and retail are using event brokers to drive digital transformation by becoming event-driven. Learn how Solace’s PubSub+ is enabling companies on their event-driven journey.
Script:
Hello everyone, and welcome to the first draft of the Being Event-Driven course. This is an introductory course covering what it means to be event-driven, what events are, and the steps to become event-driven.
To give some context, this course is for new hires joining Solace and will be the first course they take as part of their learning path. The course is also intended for customers and partners who have an idea of what it means to be event-driven but want to understand the whole picture.
To start with the agenda: first I’m going to explain what events are and walk through a use case of an industry transforming to become event-driven. Then I’m going to talk about why existing legacy systems can’t handle real-time data, and the challenges enterprises face when trying to transform their systems. Then we’ll cover the three steps an enterprise needs to take to become event-driven, and finally the seven-step methodology for achieving a full event-driven solution.
Our lives are all based on events. An alarm clock ringing in the morning is an event that triggers another event: you waking up.
Another example would be a green traffic light, an event that triggers cars to move.
Or your heart pumping blood is an event too, triggering blood to flow to your brain.
All of that happens in real time, you don’t go and ask for those events.
So what is an event?
An event is a change of state that can be transmitted to and processed by other applications.
It can be an API call from an application requesting information from another application,
or a browser click to open a bank account,
or an app click triggering an email to be sent to a recipient,
or a sensor being triggered by movement,
or a cron job being triggered at a specific time.
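To make this concrete, here is a minimal sketch in Python of what an event looks like as a data structure. The topic names and payload fields are purely illustrative assumptions, not a Solace API or a real taxonomy.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class Event:
    """A minimal event: a recorded change of state that other
    applications can receive and process."""
    topic: str    # what changed, e.g. "sensor/motion/lobby" (hypothetical name)
    payload: Any  # details of the change
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Each real-world example above maps naturally onto this shape:
boarding = Event("airline/boardingpass/scanned", {"passenger": "P123", "flight": "AC101"})
sensor   = Event("sensor/motion/lobby", {"detected": True})
cron     = Event("scheduler/job/started", {"job": "nightly-report"})

print(boarding.topic, boarding.payload)
```

The key point of the shape: an event carries what happened (the topic), the details (the payload), and when it happened, so any interested application can react without asking.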
Let’s start by looking at an airline business use case where event-driven data can enhance customer experience and optimize airline operations.
A sales consultant who travels frequently scans their boarding pass, and today it usually takes 72 hours for the loyalty points to show up on their account. Instead, with event-driven processing, the points get updated instantly. What would that mean to a frequent traveler? They could earn enough points to be bumped from gold to platinum status, and by the time they reach the gate they could be greeted with a first-class upgrade.
Another scenario: as soon as they scan their boarding pass, the traveler can check the status of their baggage, and they get a notification as soon as their bag is loaded on the airplane.
Or, in the case where a flyer is late for their flight, as soon as they scan their boarding pass they get a notification that they might miss their flight and can immediately book other flight options.
Or an AI system could detect a late departure as soon as the boarding pass is scanned and notify ground control to change the takeoff slot and adjust the gate crew schedule.
So you can see how event-driven, real-time data can take the airline industry to the next level.
But can existing legacy systems deliver this kind of real-time experience? The answer is no.
Much of that technology was designed for static data that is processed and sent in batches at the end of the day or week, not for dynamic data. Events, by contrast, get triggered and sent right away when something changes in the system.
Legacy systems also typically use polling to ask for data, which takes a long time compared to getting data in real time.
So in legacy systems, if you have a new app that needs access to the data, it takes a lot of work to create and deploy the integration, and that costs a lot of money.
These legacy systems are synchronous and use request/reply, where each application polls some system of record for changes. As new apps start polling for changes, the performance of those systems suffers, so this approach does not scale well.
The answer is an asynchronous system in which updates are pushed right away to any interested application, without the need to poll for changes.
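To illustrate the difference, here is a toy in-memory sketch of the two styles. The SystemOfRecord class is a hypothetical stand-in, not the Solace API; the point is only the contrast between pull (poll) and push (subscribe).

```python
class SystemOfRecord:
    def __init__(self):
        self.version = 0
        self.subscribers = []

    # --- synchronous request/reply: every client must ask repeatedly ---
    def poll(self):
        return self.version

    # --- asynchronous publish/subscribe: interested apps register once ---
    def subscribe(self, callback):
        self.subscribers.append(callback)

    def update(self, new_version):
        self.version = new_version
        for cb in self.subscribers:   # push the change immediately
            cb(new_version)

sor = SystemOfRecord()
received = []
sor.subscribe(received.append)   # one registration, no polling loop

sor.update(1)
sor.update(2)
print(received)  # [1, 2]: each change delivered as it happened
```

With polling, a client that calls `poll()` between updates wastes a round trip and may miss intermediate states; with subscription, every change arrives exactly when it occurs, and adding another consumer puts no extra load on the system of record.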
Now we know that event-driven is the way to become real-time, but how can businesses transform legacy systems to be event-driven?
The answer is not simple: event distribution is complicated. Enterprises are becoming more distributed, with various legacy applications and integration stacks like an IBM mainframe, Oracle Fusion, TIBCO, and likely an entire SAP ecosystem.
But times have changed, and enterprises want the flexibility of building cloud-first applications in private or public clouds.
They want to take advantage of cloud-based services to get data or events out of those legacy systems and send them to cloud-native services in AWS, Azure, or GCP, where they can do things like create data lakes or perform real-time analytics.
Enterprises are also exploring private clouds built on platforms such as Kubernetes, PKS, or OpenShift, so they can spin up all their software in a cloud-native way and test out microservices strategies.
And to get things moving quickly, they could be developing their applications in tools like Spring or Node-RED.
You might also have business applications that are cloud native, like SAP on the cloud, Salesforce, Concur, or Jira, where you want events to flow to and from those applications. These applications are connected together through iPaaS connectors like Dell Boomi or MuleSoft.
Finally, there are physical locations like factories, retail locations, and sensors that need to be connected, often at scale. (Click) So if you’re trying to create an event-driven architecture in a scenario that involves more than one part of this diagram, you can stitch it together today, but you’re going to run into several issues.
First, you have to figure out a way to event-enable the variety of technologies and applications in your system, moving away from batch processing so you can access data in real time.
Second, how do you connect everything together in a uniform way without having to adopt and manage multiple systems? In other words, how can you use the same technology everywhere?
Third, how do you manage the event lifecycle? How do you design, describe, discover, and deprecate events within your system so teams don’t have to set up every endpoint by hand and make sure it’s configured properly? In other words, how do you make it easy to use? And how do you govern and secure it so that only authorized applications publish and subscribe to sensitive business data, and only legitimate personnel have access?
Finally, how do you scale it and make it global, with enterprise-wide robustness and security?
The answer is an event mesh. An event mesh is a configurable and dynamic infrastructure layer composed of event brokers that allows events from one application to be dynamically routed to and received by any other application, no matter where those applications are deployed: no cloud, private cloud, public cloud, etc. The event producers and consumers don’t need to know about each other, which makes the event mesh a critical piece of event-driven architecture.
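As a rough illustration of the idea, here is a toy event mesh in Python: two hypothetical in-memory brokers linked together, forwarding a published event so that a subscriber on the other broker receives it. Real event brokers do far more (persistence, security, dynamic subscription propagation); this sketch only shows the decoupling of producer and consumer location.

```python
class Broker:
    """Toy event broker: local subscribers plus links to peer brokers."""
    def __init__(self, name):
        self.name = name
        self.subscribers = {}   # topic -> list of callbacks
        self.peers = []

    def link(self, other):
        # Bidirectional mesh link between two brokers.
        self.peers.append(other)
        other.peers.append(self)

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload, _seen=None):
        _seen = _seen or set()
        if self.name in _seen:
            return              # avoid re-forwarding loops in the mesh
        _seen.add(self.name)
        for cb in self.subscribers.get(topic, []):
            cb(payload)
        for peer in self.peers: # forward the event across the mesh
            peer.publish(topic, payload, _seen)

# An on-prem broker and a public-cloud broker joined into a mesh:
on_prem, cloud = Broker("on-prem"), Broker("cloud")
on_prem.link(cloud)

got = []
cloud.subscribe("baggage/loaded", got.append)   # consumer in the cloud
on_prem.publish("baggage/loaded", "bag BA-42")  # producer on-prem
print(got)  # ['bag BA-42']
```

The producer published to its local broker and never knew where the consumer runs; the mesh handled the routing. That is the decoupling the event mesh provides.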
The first step in becoming event-driven is to liberate your data
Your goal should be to liberate data by breaking up silos, which are systems that usually don’t share information with other lines of business.
These on-prem systems can be:
legacy SAP systems (often ERPs/ECC),
an IBM mainframe running many applications,
or legacy message buses such as TIBCO or Oracle. You would want to expose changes in those silos and publish them to an on-prem Solace event broker that forwards the changes to any interested application.
Let’s take a couple of use cases where we want to break up silos and expose updates.
In the airline industry, you would want to expose changes in boarding systems, as we discussed, since many other applications would be interested in those; or passenger information, where ground operations or customer service would want to know if a passenger is going to miss their flight and act accordingly; or baggage information, which would be very useful to passengers who want to know whether their bags are loaded on the plane.
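Exposing silo updates like these is usually done by publishing them on a hierarchical topic, so that each consumer subscribes only to the slice it cares about. Here is a small Python sketch of hierarchical topic matching. The topic names are hypothetical, and the wildcard rules (`*` for one level, `>` for all remaining levels) are a simplified take on the kind of wildcard subscriptions event brokers support, not an exact reproduction of any broker’s semantics.

```python
def topic_matches(subscription, topic):
    """Simplified hierarchical matching: '*' matches exactly one level,
    '>' matches all remaining levels."""
    sub, top = subscription.split("/"), topic.split("/")
    for i, part in enumerate(sub):
        if part == ">":
            return True
        if i >= len(top) or (part != "*" and part != top[i]):
            return False
    return len(sub) == len(top)

# Hypothetical airline topic hierarchy (illustrative names only):
events = [
    "airline/boarding/scanned/AC101/P123",
    "airline/baggage/loaded/AC101/BAG7",
    "airline/passenger/late/AC101/P456",
]

# Ground ops only cares about late passengers, on any flight:
print([e for e in events if topic_matches("airline/passenger/late/*/*", e)])

# An analytics service takes everything under airline/:
print([e for e in events if topic_matches("airline/>", e)])
```

The publisher emits one stream; each line of business filters it with its own subscription, which is why a well-designed topic hierarchy matters when you liberate data from a silo.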
In the retail industry, inventory changes are important to expose, since other lines of business can act on them, for example by ordering new inventory from vendors; the same goes for point-of-sale events, where finance and analytics systems would want to know about completed retail transactions.
And finally, credit cards: different lines of business would be interested in customer information, or in updating the bank app in real time when a client buys something.
The second step in becoming event-driven is to modernize your platform by streaming events across multiple simultaneously active runtime environments
Enterprises often face challenges extending events from those liberated systems to cloud-based applications.
You may want to take advantage of:
- building more agile microservices,
- linking IoT devices and extending to locations such as stores, restaurants, and manufacturing sites,
- more dynamic cloud infrastructure, whether public, private, or often both,
- integrating with SaaS applications that you don’t need to manage yourself,
- and innovative platforms in public clouds that support machine learning, analytics, and other functions.
How can you connect and route all those events? The answer is through an event mesh: an architectural layer that dynamically routes events from one application or connected device to any other no matter where they are deployed. A system of interconnected event brokers, usually running in different environments, which distributes events produced in one part of your enterprise, such as your non-cloud systems of record, to applications running anywhere else. An event mesh routes real-time events in a lossless, secure, robust and performant manner, and does it dynamically without configuration – like a corporate Internet of events.
The second challenge that enterprises face is:
how do app developers discover events from systems of record, IoT devices, or your field locations?
How do they govern and track which apps have access to which events?
And how do you evolve your event definitions?
The answer is the EVENT PORTAL: a single pane of glass used to design, create, catalog, visualize, discover, share, and manage all the events within your ecosystem.
Let’s start by looking at the different roles responsible for architecting, developing, and maintaining the events flowing from one application to another.
Architects are first in line: they define, discuss, and review events and the relationships between applications.
When they are architecting a new solution, the EVENT DESIGNER is used, where application teams can collaborate on and design all aspects of an event-driven architecture.
They start by defining the application that will be developed,
then define the relationships between applications through the events each one produces and consumes.
Then they define the payload associated with each event, or use the REST API to import schemas from an existing schema registry, and
finally they visualize the interaction of events between applications.
In the case where we have an existing architecture, EVENT DISCOVERY is used: it captures and analyzes events flowing in real time across a PubSub+ broker running on on-prem systems, and architects can then explore the event topic hierarchy being used and link it to existing applications. Over the next few months, we will have what we call “runtime discovery”, which will allow you to discover not only events but also publishers, subscribers, and in some cases schemas from your running event mesh, and import them into the portal. So, more than just event discovery.
The second piece of the puzzle is the developers, who discover, understand, and reuse events across applications, lines of business, and external organizations.
Once the architects have defined the events and the relationships between applications, developers can build their business microservices.
They use the AsyncAPI code generator, which produces broker API code: essentially skeleton code showing how to connect to the Solace event broker and publish and subscribe to events. Developers can then develop their application logic on top of it.
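To give a feel for the shape of such skeleton code, here is a hedged Python sketch. The FakeConnection class, the topic name, and the handler are all illustrative stand-ins so the example runs without a broker; they are not the actual output of the AsyncAPI generator and not the Solace API.

```python
class FakeConnection:
    """In-memory stand-in for a broker connection, used here only so
    the skeleton runs without a real broker."""
    def __init__(self):
        self.handlers = {}

    def subscribe(self, topic, handler):
        self.handlers[topic] = handler

    def publish(self, topic, payload):
        # Deliver directly to the matching local handler.
        if topic in self.handlers:
            self.handlers[topic](payload)

# --- generated-style skeleton: connect, subscribe, publish ---
def connect():
    return FakeConnection()

received = []

def on_points_updated(payload):
    # >>> this is where the developer adds business logic <<<
    received.append(payload)

conn = connect()
conn.subscribe("loyalty/points/updated", on_points_updated)  # hypothetical topic
conn.publish("loyalty/points/updated", {"passenger": "P123", "points": 50000})
print(received)
```

The generator’s value is that the connect/subscribe/publish plumbing is produced from the event definitions the architects created, leaving only the body of the handler for the developer to write.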
The last piece of the puzzle is the data scientist, who is responsible for understanding event-driven data and discovering new insights by combining events. For event insights, data scientists use the EVENT CATALOG, a storefront for all the applications, events, and schemas created in Event Portal. It provides an extensible, searchable interface through which you can access all existing events, schemas, and applications. The Event Catalog is also used by architects and developers to see which events, applications, and schemas are available when developing a new application.
Now that your architecture is real-time and event-driven, and you’ve liberated data from closed systems and silos, you can interact with partners, suppliers and customers in an event-driven manner by alerting them about events of interest, and ideally receiving events from them.
By eliminating the need for batch-based exchanges, and the load and lag of constant API polling by your partners, event-driven interaction extends real-time responsiveness beyond your own IT systems.
An example would be:
Manufacturers that interact in real-time with suppliers to maintain a more accurate and up to date view of their supply chain.
To summarize, if your organization wants to become an event-driven enterprise, you need to:
1. First, event-enable your existing systems by breaking down silos and liberating data: let your applications publish events as they happen, and listen for and act on them. This enables the systems that contain the data many other applications need, or that your people can innovate with. The end goal is to get events streaming in real time between those legacy systems and your modern microservices.
2. The second step is modernizing your application environment so you can take advantage of more dynamic cloud infrastructure, whether it’s public, private, or often both. You would also want to integrate with SaaS applications that you don’t need to manage yourself, and take advantage of innovative platforms in public clouds that support machine learning and analytics.
This step addresses two issues. The first is the connectivity and distribution of event brokers running in different environments, and that is handled by the EVENT MESH, which routes real-time events in a lossless, secure, robust, and performant manner, and does it dynamically without configuration.
The second issue is the management of events: what events are being produced, what information they carry, how to access them, understanding application interactions, choreography, and information flow, as well as knowing which systems produce and consume which information. That is handled by the EVENT PORTAL, which provides a single source of truth for understanding and evolving everything about your events: the schemas of the data they carry, and their relationships with applications across your entire enterprise.
With an event mesh seamlessly distributing events between applications across cloud, on-prem, and IoT environments, and an event portal making it easy to collaboratively manage all of those events and event-driven applications, it’s time for step 3.
3. Alert and inform: interact with partners, suppliers, and customers in an event-driven manner by alerting them about events of interest, and ideally receiving events from them.
To give you a sneak peek into how the Solace PubSub+ platform was designed to help enterprises unlock the full potential of event-driven architecture, let me walk you through the different components of the platform.
(Click) At the runtime data movement level,
The platform is built on PubSub+ event brokers, which can be deployed across all cloud, on-premises, and IoT environments.
The event brokers are available in three flavours: a hardware appliance capable of millions of messages per second with ultra-low latency, a software broker that can be deployed in any container environment, and a SaaS offering available on all the public clouds. All these brokers, regardless of their location, can easily be connected into an event mesh so you can stream events between applications and connected devices throughout your enterprise.
(Click) And how do you achieve that? Through the management and monitoring level:
the PubSub+ platform includes solutions that make it easy to deploy event brokers, either stand-alone or as part of an event mesh, and to monitor the health and optimize the performance of your event-driven system.
(Click) At the integration level,
PubSub+ event brokers provide built-in support for a variety of open standard protocols and APIs, so you can create and connect apps with whatever API or protocol you choose, without having to worry about translation.
(Click) At the application Level,
The Event Portal gives developers and architects the tools they need to design, describe, discover, and govern the events within their system, and to see the relationships between applications and events. It also lets them visualize application choreography and generate code, making event-driven applications and microservices easier to design, deploy, consume, and evolve.
(Click) At the security level,
PubSub+ Platform enables messaging architectures that deliver consistent multi-protocol client authentication and authorization security across the enterprise and is deeply integrated with enterprise authentication services using a minimal set of components.
(Click) All of the above features and capabilities can be accessed through a single Cloud Console with a single log-in, making it easy for architects, developers, and other users to work, collaborate, and drive the enterprise’s EDA mission forward.
So really, the PubSub+ platform provides a complete event streaming and management platform for the real-time enterprise.