Software and industry evolution
Computer scientists have always devised techniques to extend the lifespan and scope of software. Programming standards, reusable libraries, and object-oriented programming are all examples of means to this goal. Cross-platform languages and frameworks have been created to generate applications for different operating systems from the same codebase.
This has ignited a kind of “never-ending rush” to build applications on top of the latest technology or framework, with the risk of getting on the wrong train or investing in something the market may leave aside after a couple of years. Not to mention the growing number of Open-Source Software (OSS) solutions adopted by companies to cope with ever-faster software iterations: these don’t eliminate complexity, and they often introduce UI/UX friction or limit interoperability.
Legacy systems and new paradigms
The world is increasingly interconnected and cloud-based: mobile apps and ever more appealing UIs clearly show this. Still, a significant backbone of software that guarantees many basic functionalities is written in dinosaur languages like Cobol: designed in 1959, first standardized in 1968, and revised many times since.
Many legacy systems supply essential functionalities and handle operations worth $3 billion daily, almost all in banking systems (ATMs and transactions). They were written many years ago and still work without issues today (applying the golden rule that “if it isn’t broken, don’t fix it”). The specialists who implemented those systems have since retired, and companies struggle to find people able to support them.
Even leaving Cobol aside, many applications still in use were created with other languages (Java, .NET, C/C++) and built as “monolithic” applications, an architecture still quite common in the early 2000s; these applications have often hit the limitations of that approach. More recent solutions use “XaaS platforms”, which have undeniable advantages over the “fat-client” approach but tend to be quite complex. They are often surrounded by custom ancillary applications, with the undesirable side effect of a poor graphical user experience.
The ION approach to Desktop Software Integration
When we write software at ION, we try to think long term: this is generally easier when you write for the back end, and a bit more complicated when dealing with user interfaces. The Cobol case above is a good example: it is not uncommon for a server-side application to be used for years without significant changes. Likewise, in a system developed as a microservices platform, it is easier to introduce new services or replace old ones if the public API has been correctly designed: the ION Platform is a good example of this kind of system.
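The idea that a well-designed public API makes services replaceable can be sketched in a few lines. This is an illustrative example only, not part of the ION Platform API; all names (`PriceService`, `quote`, and the placeholder pricing logic) are hypothetical:

```typescript
// Hypothetical contract: callers depend only on this interface,
// so the implementation behind it can be replaced at any time.
interface PriceService {
  getPrice(symbol: string): number;
}

// An "old" service implementing the contract (placeholder logic).
class LegacyPriceService implements PriceService {
  getPrice(symbol: string): number {
    return symbol.length * 10; // stand-in for real pricing
  }
}

// A replacement service honouring the same contract, here adding a cache.
class CachedPriceService implements PriceService {
  private cache = new Map<string, number>();
  constructor(private inner: PriceService) {}
  getPrice(symbol: string): number {
    if (!this.cache.has(symbol)) {
      this.cache.set(symbol, this.inner.getPrice(symbol));
    }
    return this.cache.get(symbol)!;
  }
}

// Client code is written once, against the interface.
function quote(svc: PriceService, symbol: string): string {
  return `${symbol}: ${svc.getPrice(symbol)}`;
}
```

Because `quote` never names a concrete class, swapping `LegacyPriceService` for `CachedPriceService` requires no change to client code; the same principle, applied at network scale, is what lets a microservices platform evolve piecewise.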
Front ends are different, since they are written for users who interact with them daily. The user interface and user experience are the main concerns when designing the application; the openness of the solution is generally not the first aspect you try to implement. ION’s user interface has been around for a couple of decades: it has always been flexible in terms of internal evolution, but also open to plug-ins developed by our clients. ION knows that our clients’ desktops have become crowded with windows and tools, and that new software keeps being added to the mix of daily applications.
The challenge is to integrate them all and keep them “in sync”, so that the data context is consistent across all the applications and the relevant information reaches the final user seamlessly, minimizing the number of operations a human must perform to complete a task. And we wanted to do that without tying the implementation to a specific technology.
Book a demo or speak to our team to find out more about our ION Desktop interoperability framework.