“21st Century Architecture to Solve 21st Century Problems”
Insights from Mighael Botha, CTO of Software AG,
in his presentation “The Vision of the Digital Enterprise,”
presented at the Technology Executives Club Technology Innovation & Leadership Summit, August 2015.
By Alex Jarett
At a recent Technology Innovation & Leadership Summit (August 2015), Mighael Botha, CTO of Software AG, kicked off the event with a presentation called “The Vision of the Digital Enterprise.” It was a terrific presentation that tackled a critical issue: matching your architecture and technology to your challenges.
Mighael’s presentation is unique in that, as a world-class CTO, he is able to tie an understanding of the new customer experience to the back-end architecture and technology required to make that experience a reality.
In the first two posts, I summarized what Mighael indicated were the three drivers of Digital Disruption that required new technology solutions and then summarized the sample use cases from different companies.
Mighael makes a great case that creating a better experience at these larger companies creates technology and architectural challenges.
In the third and key part of his presentation, Mighael presents what he calls a “21st Century Architecture to Solve 21st Century Problems.” In today’s post, I’ll share the net-net: the key components of his proposed 21st Century Architecture.
Disclaimer: I am not an architect – but after writing this summary (and a few others) I realize, more than ever, how important this process is to our midcap and enterprise readers, and I felt I had to share this information and highlight Mighael’s work. If I got a detail wrong, forgive me!
First – companies shifting to a better customer experience and a digital enterprise soon realize they have to move from making decisions days down the road to acting instantly on events as they occur within the organization. As an example, Mighael points to Greyhound, which used to take sensor data off the bus at the terminals and then download it to headquarters for analysis over the following few days. That doesn’t help if the radiator hose breaks on the next trip while the data is still being analyzed. They wanted to look at the data sooner so they could have a more immediate impact on the bus.
Let’s start with the data.
Mighael suggests that most large enterprise companies are already using big data databases such as Hadoop or SAP HANA to solve some of the data problems with their core company data. The first new requirement is integrating website data and, now, IoT data. You can create a totally different customer experience by bringing website data into the integration platform. Mighael goes on to suggest that most companies can already do integration today, but the use cases change when you talk about scalability, speed, and website integration.
Scalability and Website Integration through Event-Driven Architecture
Here’s how to do scalability and website integration, according to Mighael.
Most customers run ESBs and a messaging system as their integration platform. Mighael believes in a strong ESB approach combined with APIs for microservices, for low latency.
The secret sauce Mighael has seen is the shift to an event-driven architecture. This means you take events that happen in your systems and publish them via a messaging system. Every event can be published. For example, Apple uses Expeditors for shipping, and Expeditors uses Redis for messaging. They publish EVERYTHING on the messaging platform.
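The publish-everything pattern can be sketched with a tiny in-process event bus – a stand-in for a real broker, with the topic name and event fields invented for illustration:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process publish/subscribe bus (a stand-in for a real broker)."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Every event is published; any number of consumers may tap the stream.
        for callback in self._subscribers[topic]:
            callback(event)

bus = EventBus()
received = []
bus.subscribe("shipment.scanned", received.append)
bus.publish("shipment.scanned", {"package_id": "PKG-1", "location": "ORD"})
```

The key property is that publishers don’t know who is listening – new consumers can tap the stream later without touching the systems that emit the events.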
The Power of Publishing Everything on Messaging
Once you move to a messaging platform that publishes all the events, you can tap into these event streams with agile-built apps. That lets your agile innovation teams solve the problems of shifting to a digital experience. It also allows you to create a more dynamic business process.
Mighael said you can also publish the events back into your data platforms, like Hadoop, which lets you speed up your predictive analytics.
Use In-Memory Computing
As you can imagine, making all of this work requires fast computing. The shift to an event-driven architecture creates the need to process transactions faster.
The good news? Memory is inexpensive. At the time of the presentation, Mighael pointed to a Samsung memory module of 16 terabytes!
Mighael suggests that the solution is to publish the data directly in memory.
As an example, he notes that PayPal generates 7 to 8 terabytes of data each day, and they are a big user of in-memory computing. They could keep about two days of data in a single module like that if they wanted to.
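The arithmetic behind that claim is simple, using the upper end of the 7–8 terabytes-per-day figure:

```python
daily_tb = 8            # upper end of PayPal's 7-8 TB/day figure
module_tb = 16          # capacity of the memory module Mighael cited
days_held = module_tb / daily_tb
print(days_held)        # 2.0 -> about two days of data in a single module
```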
In another example, Macy’s MDM (master data management) strategy is to put it all in memory – because of Black Friday. They’ll get over 2 million people in the stores on Black Friday, and they wanted to access their loyalty customer data in real time at the point of sale, so the sales assistant could say, “You’ve got enough points to get your tie – want to do it?” On a normal day, fetching the data from disk would be fine, but on Black Friday, with millions of customers shopping at once, they need a faster solution. That’s why they put it in memory.
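As a sketch of why this is fast: with the loyalty data preloaded into memory, the point-of-sale check is a hash-table lookup rather than a disk read. The customer records and point threshold below are invented for illustration:

```python
# Hypothetical loyalty table preloaded into memory before the rush.
loyalty_cache = {
    "C1001": {"name": "Pat", "points": 5200},
    "C1002": {"name": "Sam", "points": 300},
}

TIE_COST_POINTS = 5000  # illustrative redemption threshold

def redemption_offer(customer_id):
    """Return an offer string if the customer can redeem, else None."""
    customer = loyalty_cache.get(customer_id)  # memory-speed lookup
    if customer and customer["points"] >= TIE_COST_POINTS:
        return f"{customer['name']}, you've got enough points to get your tie!"
    return None

print(redemption_offer("C1001"))
```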
Gain the Ability to do Complex Event Processing and Continuous Analytics
One of the additional benefits of this approach, according to Mighael, is the ability to do complex event processing and continuous analytics.
Mighael first notes that location-based analytics is very common; the harder thing is real-time integration. In the case of Greyhound, they track 28 event streams at a time. With complex event processing, they can correlate the events within a time window and then decide whether to take action.
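A toy version of that window-based correlation might look like the following – the stream names and the 60-second window are invented for illustration, not taken from the presentation:

```python
from collections import deque

WINDOW_SECONDS = 60

def correlate(events, window=WINDOW_SECONDS):
    """Flag timestamps where a temp spike and a pressure drop occur
    within the same time window (a toy complex-event-processing rule)."""
    recent = deque()   # (timestamp, stream) events still inside the window
    alerts = []
    for ts, stream in sorted(events):
        # Drop events that have aged out of the window.
        while recent and ts - recent[0][0] > window:
            recent.popleft()
        # Correlate: a temp spike plus a pressure drop close together -> act now.
        for _, prev_stream in recent:
            if {stream, prev_stream} == {"temp_spike", "pressure_drop"}:
                alerts.append(ts)
        recent.append((ts, stream))
    return alerts

events = [(10, "temp_spike"), (45, "pressure_drop"), (500, "temp_spike")]
print(correlate(events))  # [45] -> correlated pair; the lone spike at t=500 is not
```

Either event alone is noise; it is the combination inside the window that triggers action – which is the point of correlating streams rather than alerting on each one.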
He indicates you can also feed these events back into your BI, along with your Hadoop, Cassandra, or other databases. As a result, you’ll find new and better models. Mighael says you can go very deep, building a platform with a neural network that learns from the past and can predict what will happen sooner – so you can take action earlier.
For Mighael, looking at the digital enterprise, it is this event-driven architecture that lets him build on a digital platform.
He can also bring together things like cloud and social. For example, if you are big into BPM, Six Sigma, etc., there is an opportunity to execute those processes better and to put the information on a more effective dashboard.
The last technology Mighael covered is data lakes. Data lakes allow many companies to process unstructured data very quickly in memory. You can cut expensive MIPS usage by front-ending the mainframes with real-time, in-memory data – they actually put the data in memory instead of on the mainframe. Mainframe offloading is very big for some companies.
As an example, look at Healthcare.gov. When it rolled out, it was too slow. There was far more transaction volume and load on the system than anyone could have predicted. They had actually built a solid three-tier architecture, but it was too slow at that volume.
Here’s how they solved it. While an insurance application is being created and processed, it stays in memory. Once the application is completed, it goes to disk. This one change made a huge difference, taking them from a 10–12 second response to a microsecond response.
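A minimal sketch of that pattern – hold in-flight applications in a memory table and write to disk only on completion – with all names invented for illustration:

```python
import json
import os
import tempfile

in_flight = {}  # application_id -> form data, held in memory while being edited

def update_application(app_id, field, value):
    """Memory-speed write while the applicant is still filling out the form."""
    in_flight.setdefault(app_id, {})[field] = value

def complete_application(app_id, directory):
    """On completion, flush the finished application to disk and evict it."""
    data = in_flight.pop(app_id)
    path = os.path.join(directory, f"{app_id}.json")
    with open(path, "w") as f:
        json.dump(data, f)
    return path

update_application("A1", "name", "Pat")
update_application("A1", "plan", "silver")
path = complete_application("A1", tempfile.gettempdir())
```

Every intermediate edit is a dictionary write; the slow disk I/O happens exactly once, when the application is done.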
As you can see, Mighael did a great job of showing how the use cases lead to the need for integration and faster processing, which in turn leads to a different type of architectural solution.
He described some very complex architecture in an accessible fashion. I hope this summary gave you some great ideas!
What do you think? Make your comments on this post here:
See Part one of this post here.
See Part two here.
Want to see Mighael’s entire presentation? Go here: http://www.technologyinnovationinstitute.com/institutetraining/training/digitalenterprise.php