Where RPA makes sense to use - and where not
Do robotic process automation bots pay off?
Many financial service providers are relying on robotic process automation (RPA) in their digitalization campaigns - primarily to counter competitive pressure and shortages of skilled workers. But do process automation and associated bots always deliver what they promise in terms of increased efficiency and process improvement? The short answer is: it depends.
No tool for everything: when RPA is the recommended choice
"Theoretically, you can bring up a bot for any user interface application. Theoretically."
A bot mills through database entries and relieves specialist departments when, for example, their workload reaches a seasonal peak. The bot then takes over "simple" manual data entry. It triggers check processes in insurers' year-end business. Or it clicks through user interfaces according to its algorithm. In this way, data can be read from tables or invoices and entered into the desired data fields. Bots also support departments in testing new software components or releases. And process automation robots provide invaluable services during data migration, even though they are only used once.
Marcus Bringe is Principal Consultant for Robotic Process Automation at IKOR, a technology consultancy focused on the insurance industry. Dominica Bengs is Practice Lead for Process Optimization at its sister company ADWEKO, a technology consultancy in the banking and finance sector. Both companies operate under the umbrella of the X1F Group.
The initial situation decides: to bot or not to bot?
Time savings, cost savings, and error reduction are the most compelling reasons for using RPA. Typical process automation bots perform repetitive tasks faster and more accurately than humans. By relieving employees of time-consuming manual tasks, bots free up time and resources for more value-added work. In addition, the human errors typical of repetitive manual tasks become rarer: unlike human clerks, bots work accurately and consistently regardless of the situation. Process accuracy and efficiency improve, and the risk of error decreases.
However, RPA also has its limitations: someone has to program the bot using no-code or low-code applications such as UiPath, Pega, Blue Prism, SAP's iRPA (now SAP Build Process Automation) or Automation Anywhere. Depending on the tool and the automation project, users should have a certain level of programming knowledge. Above all, they should have a thorough understanding of the technical processes in the insurance business - such as claims recognition, assessment and settlement. A basic understanding alone is not enough.
Sample plan based on an upload process
- The bot searches a directory for files (according to its algorithm).
- The files are filtered based on their names.
- The files are stored in an intermediate folder.
- A log check confirms whether the upload was successful: the name of the file in the intermediate folder is matched against the file actually uploaded.
- If the website crashes, a so-called "try-catch" block ensures that the upload is started again.
This process is time-consuming but easy to implement; the benefits exceed the effort in a very short time.
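The sample plan above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the `upload` and `read_log` callables stand in for the actual website interaction and log access, which would be tool-specific.

```python
# Sketch of the sample upload process: search a directory, filter by name,
# stage files in an intermediate folder, upload with a try/except ("try-catch")
# retry on a "website crash", and verify success against a log.
# upload() and read_log() are hypothetical stand-ins for the real system.
import shutil
from pathlib import Path

MAX_RETRIES = 3

def find_files(directory: str, pattern: str) -> list[Path]:
    """Search the directory for files and filter them by name."""
    return sorted(Path(directory).glob(pattern))

def upload_with_retry(file: Path, staging_dir: Path, upload, read_log) -> bool:
    """Stage the file in the intermediate folder, then upload it;
    on a crash, the try/except block starts the upload again."""
    staging_dir.mkdir(parents=True, exist_ok=True)
    staged = Path(shutil.copy(file, staging_dir))  # intermediate copy
    for _attempt in range(MAX_RETRIES):
        try:
            upload(staged)
            # Log check: the staged file name must match an uploaded file.
            return staged.name in read_log()
        except ConnectionError:
            continue  # "website crashed" -> start the upload again
    return False
```

The control flow mirrors the bullet list: the try/except block is the "try-catch" loop, and the final name comparison is the log check.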
When bots act as process interim managers
Weighing costs against benefits depends on whether the bot is used in the short, medium or long term, and whether it serves testing or comprehensive business process automation. The reason: RPA is best at recurring, standardized tasks. In some cases, however, the software robot is only an interim solution - for example, until the targeted interface is programmed.
An interface, as the means of choice, provides stable data structures, ensuring that data is processed consistently and safeguarded by control mechanisms. A bot is no longer necessary after that. Especially with large amounts of data, interfaces are more efficient than individual bots, because the robots need time to handle data sequentially - that is, to extract and insert each record - multiplied by X data sets.
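The cost argument can be made concrete with a toy model. The functions and the per-record timing constants below are illustrative assumptions, not measurements: a bot pays one UI round-trip per record, while an interface ingests the whole batch in roughly one call.

```python
# Toy cost model: a bot extracts and inserts record by record via the GUI,
# so its runtime grows "times X data sets"; a programmed interface takes
# the batch in one call. The constants are illustrative assumptions.
UI_ROUND_TRIP_SECONDS = 2.0   # extract + insert one record via the GUI
BATCH_CALL_SECONDS = 5.0      # one bulk call over a programmed interface

def bot_duration(num_records: int) -> float:
    """Sequential bot processing: linear in the number of records."""
    return num_records * UI_ROUND_TRIP_SECONDS

def interface_duration(num_records: int) -> float:
    """Batch interface: roughly constant, regardless of record count."""
    return BATCH_CALL_SECONDS
```

Under these assumptions the bot wins only for tiny batches; for thousands of records the interface is clearly more efficient, which is the article's point.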
When the business decision for a particular process falls in favor of a bot, the scalability of RPA matters: companies can gradually expand their automation with platforms such as UiPath and adapt it to growing requirements. Existing automations can be extended without massive conversions or investments. In addition, business logic can be integrated more flexibly than with an interface. The platforms also offer a wide range of functions, including process recording, workflow designers, data extraction, analysis tools, as well as classic code and artificial intelligence (AI) at the machine learning (ML) level.
Since RPA tools can be seamlessly integrated into existing systems and applications, companies can smoothly connect their existing software solutions to the automation platforms and use the full power of the discipline. Companies also don't need to fundamentally change their existing IT infrastructure to do so.
A framework documents the process diagram, including the business applications involved, and specifies the in-code documentation. On top of this, a knowledgeable third party can review and further develop the process and code later on. The prerequisites for this are clear: a strict approach to memory paths, clear variable naming, code documentation, orchestration, transparent responsibilities and effective error management have long proven to be factors critical to the success of RPA.
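One lightweight way to meet these prerequisites is to keep the process documentation as data next to the code, so a knowledgeable third party can take over. Everything below - the process name, applications, owner and path - is an invented example of such a framework record, not a prescribed schema.

```python
# Illustrative framework record: process diagram steps, the business
# applications touched, a fixed memory path and a responsible owner are
# documented next to the bot code. All names and paths are hypothetical.
PROCESS_DOCUMENTATION = {
    "process_name": "claims_document_upload",
    "business_applications": ["claims system", "document portal"],
    "steps": ["find_files", "filter_by_name", "upload", "verify_log"],
    "process_owner": "claims-operations@example.com",
    "input_path": "/bots/claims_upload/input",   # strict memory path
}

def validate_documentation(doc: dict) -> list[str]:
    """Effective error management starts with completeness:
    return the documentation keys that are still missing."""
    required = {"process_name", "business_applications", "steps",
                "process_owner", "input_path"}
    return sorted(required - doc.keys())
```

Running such a completeness check before deployment enforces the "transparent responsibilities" the text calls for, because no bot goes live without a named process owner.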
RPA requires a stable infrastructure
However, no matter how capable bots are, they depend on a stable source system in their respective work environments. They have to adapt to regular changes, because the user interfaces or frontends on which bots work are subject to regular adjustments. Changes, in turn, affect robot reliability: if the user interface can no longer be reached, fields can no longer be clicked, or information can no longer be retrieved via data scraping, a classic bot terminates or produces an error message. If a robot also has to authenticate itself in order to access certain functions, such as uploading or downloading documents (see the info box for an example), it needs the appropriate access rights.
In addition, the robot must support the file formats that arise in its work process. Compatibility issues when interacting with different format versions complicate its work. In the worst case, if positions, elements, IDs or classes have changed, the bot has difficulty finding the information it needs. Compatibility issues often even prevent a bot from working altogether. Accordingly, the responsible administrators must regularly "iron out" such issues in the bot.
Typical error sources beyond dynamic content on websites are predominantly network problems or timeouts. A bot recognizes such errors and reacts according to its programming. Ideally, it continues the process or notifies the process owners.
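The reaction pattern just described can be sketched generically: classify the failure, retry transient network problems or timeouts, and escalate everything else to the process owner. The error classes and the `notify_owner` callback are illustrative assumptions, not part of any specific RPA tool.

```python
# Sketch of the described error handling: the bot recognises transient
# errors (timeouts, network problems) and retries; any other failure is
# reported to the process owners. Error classes are hypothetical.
class UiTimeoutError(Exception):
    """The user interface did not respond in time."""

class NetworkError(Exception):
    """A network problem interrupted the step."""

def execute_step(step, notify_owner, max_retries: int = 2) -> bool:
    """Run one process step; retry transient errors, escalate the rest."""
    for _ in range(max_retries + 1):
        try:
            step()
            return True                 # bot continues the process
        except (UiTimeoutError, NetworkError):
            continue                    # transient: try again
        except Exception as exc:
            notify_owner(exc)           # unexpected: inform process owner
            return False
    notify_owner(TimeoutError("retries exhausted"))
    return False
```

Keeping the transient/permanent distinction in one place makes the bot's reaction predictable, which is exactly what "reacts according to its programming" requires.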
People remain important in RPA
Despite RPA (with an emphasis on automation), people remain essential: the specialist department needs access so it can clarify new issues arising from the robot's activity at any time. Responsibilities for processes should be clearly defined, for faster project progress and the simplest possible error handling: fast decision-making, supported by regular monitoring and optimization, keeps the process moving. This also includes maintenance. For example, insurers must keep an eye on user interfaces that are graphically redesigned or contain dynamic content and structures.
The article appeared in "Versicherungsforen Themendossier 17/2023" in September 2023 (PDF in German).