
Of application understanding and machine learning or where I am coming from


For the past 25 years, I have been working in the area of automation. But this is not where I started my career. During my master’s program at USC, I was first introduced to artificial intelligence, a field that immediately fascinated me. At that time, AI-related jobs were few and far between, but my first job was indeed relevant: working with rule-based decision systems. Since then, the field of artificial intelligence has progressed a great deal, for the most part without grabbing headlines. Only in recent years has it captured wide attention once more, particularly in the area of machine learning. It reminds me of the hype that the web and the “dot com” era introduced in the 1990s. Bubble, anyone? Only this time around, the advances in the field have provided a solid foundation to sustain the expansion of machine learning. My second job introduced me to the world of application and language understanding, and this is where I have been active for many years. A lot of fascinating things happened during that time, and as you will see, things are now headed back to where I started. The cycle is closing.

Understanding applications and their code

In the beginning, there was the mainframe: beasts of power that could handle a tremendous number of transactions, but which came at a cost. Then the “network was the computer,” as Sun Microsystems heralded to the world, and companies finally had alternatives to the mainframe monoliths.

There was a problem, however. Successful companies had invested a lot of money in creating the mainframe processes that gave them their edge, along with reams of custom code to go with them. What to do with all that custom code now? The will was strong, but the code was weak: proprietary and incompatible. And we should not forget that relational databases were not yet taken for granted. Hierarchical and network database management systems were still strong and, if you ask me, based on elegant concepts and very good at what they did. But history is (re)written by the winners, so servers and relational databases it became.

The role of automation

The word of the day was migration: migration away from the mainframe monoliths to client-server. But that required splitting applications across layers in a way that fit the client-server model: front-end, business, and database layers. This was a tall order, since the challenge was multi-level:

  • Front-end: Going from green-screen terminals to fancy graphical terminals with WYSIWYG capabilities (yes, WYSIWYG was a feature back then)
  • Network: Statefulness was the norm, so the correct handling of state was paramount
  • Database: Consistent transaction handling, while remaining compatible with existing transaction managers (not everybody wanted, or was willing, to switch at the same time)
  • New languages: COBOL was prevalent for business logic. However, the new server world supported additional languages, such as C, C++, and Java.
  • Operating systems: Enter Unix and its numerous flavors, plus Windows and X for front-end interaction.

Clearly, a disciplined approach was mandatory in order to succeed. And this is where application understanding and automation came into the picture. Application understanding may mean different things to different people, but here we will define it as a way of extracting the key attributes and features of an application. Extraction is done programmatically, in a repeatable manner, and the results are stored in a repository as a meta-model of the application and its components. The attributes and features need to be carefully selected so that they enable smart decision making when it is time to “rewrite” the application. “Rewriting” actually means both refactoring and rearchitecting: refactoring applies to the code, rearchitecting to the application. In our case, rearchitecting meant redistributing the application components to best match the client-server paradigm. In most cases, breaking up monoliths required both.
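To make this more concrete, here is a minimal sketch in Java of what a single repository entry in such a meta-model might look like. To be clear, the names and attributes below are hypothetical illustrations for this blog, not our actual schema:

    import java.util.List;
    import java.util.Map;

    // One repository entry: a legacy component captured with the
    // attributes needed for rewrite-time decisions (names hypothetical).
    public class ComponentMetaModel {

        public enum Layer { FRONT_END, BUSINESS, DATABASE }

        private final String name;                    // e.g. a COBOL program or screen definition
        private final List<String> calls;             // other components this one invokes
        private final Map<String, Boolean> features;  // extracted features, e.g. "usesTransactions" -> true

        public ComponentMetaModel(String name, List<String> calls,
                                  Map<String, Boolean> features) {
            this.name = name;
            this.calls = calls;
            this.features = features;
        }

        // A simple rule of the kind that drives "smart decision making" at
        // rewrite time: a component with screen I/O goes to the front end,
        // one that only handles transactions goes to the database layer.
        public Layer suggestedLayer() {
            if (features.getOrDefault("hasScreenIo", false)) {
                return Layer.FRONT_END;
            }
            if (features.getOrDefault("usesTransactions", false)) {
                return Layer.DATABASE;
            }
            return Layer.BUSINESS;
        }
    }

The important point is that rewrite decisions are made against attributes extracted once into the repository, not by re-reading the raw source every time a question comes up.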

The object-oriented paradigm

Most of the code we had to deal with in monolithic systems was written in procedural or reporting languages such as COBOL. Also common was the use of generators, mostly for code, screen definitions and database artifacts. Early refactoring was performed using C, which at the time was the lingua franca of Unix for the business and database layers. For front-end coding, C++ was to some degree the best option, as it could take advantage of Windows GUI capabilities.

At the dawn of the millennium, Java gained a strong foothold among developers, and by now it has also become the “favorite” in enterprise development. Java introduced a wider developer audience to some key aspects of programming and acted as a facilitator for them: object-oriented programming, separation of concerns, virtual machines, software packaging, and interoperability.

Our use of a meta-model and a repository allowed us to build infrastructure and tools that were flexible and platform neutral. This was all due to our intelligent automation approach, which allowed pluggable support of source and target languages and the ability to assimilate, and make the best use of, numerous platforms.
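As an illustration of what “pluggable” means here, consider the following sketch, which reuses the ComponentMetaModel type from the earlier example (again, the names are hypothetical, not our actual tooling): parsers for source languages and emitters for target languages implement small interfaces, so a new language pair requires new plugins rather than changes to the core pipeline.

    // Pluggable language support around the repository (names hypothetical).
    // A parser populates the meta-model from source code; an emitter
    // generates code for a target platform from the meta-model.
    interface SourceParser {
        ComponentMetaModel parse(String sourceCode);  // e.g. a COBOL or ABAP parser
    }

    interface TargetEmitter {
        String emit(ComponentMetaModel model);        // e.g. a Java or C++ emitter
    }

    // The same pipeline serves any source/target pair for which plugins exist.
    class MigrationPipeline {
        private final SourceParser parser;
        private final TargetEmitter emitter;

        MigrationPipeline(SourceParser parser, TargetEmitter emitter) {
            this.parser = parser;
            this.emitter = emitter;
        }

        String migrate(String sourceCode) {
            return emitter.emit(parser.parse(sourceCode));
        }
    }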

ERP and SAP®

Another important trend that gained steam in the mid-1990s and has kept growing strong ever since is ERP software and platforms. For us as a company, one particularly important piece of ERP software was, and still is, SAP. Our intelligent automation approach has enabled us to follow its progress over the last 10-15 years, and we will continue to do so in the future. Early applications of our automated solution addressed mostly code refactoring, with a strong focus on ABAP, the programming language of SAP ERP. We were, and still are, there to support SAP customers with our intelligent automation, particularly since SAP started its journey to digitalization with the support of in-memory databases, the launch of S/4 and the move to the cloud.

Closing words

The beginning of a new year is always a good opportunity to take stock, reflect and plan ahead. As you probably know by now, history repeats itself. Intelligent automation means staying alert and keeping an eye out for new opportunities. As I hinted earlier, I see the cycle closing after all these years, especially from my perspective. I am happy to say that we are now adopting machine learning into our automation repository as a further tool to drive application understanding and open new possibilities for our tools and customers. In that sense, I would like to wish all of you a successful start to the New Year and to whatever new chapter you are about to open. From our side, our new year’s resolution is to publish more blogs about us, our intelligent automation approach and the technologies we are using.

Niko Faradouris, Senior Technical Architect, smartShift Technologies, Mannheim

 
