Custom solutions for human-machine interfaces (HMI), including adaptations, extensions, and redesigns, are part of my offering.
- Corrected error-prone HMI implementations
- Integrated missing components into existing HMIs
- Developed and commissioned complete HMIs for special-purpose machines
- Added component monitoring for the control devices
- Resolved numerous incidents and system failures
In the aerospace industry, I developed a robot program that works safely on expensive workpieces despite highly flexible movement patterns that the user can change.
I also programmed robots to pick up, set down, position, measure, paint, and coat parts, and to try alternatives when an action failed. For example, if a measurement returned unrealistic values because of reflections, the measuring position was shifted minimally on the second attempt. I had, of course, warned the machine planners about those reflections in advance, but they knew how to ignore good advice; after all, they had far more experience, which, unlike talent, now pales.
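The retry behavior described above can be sketched as follows. The `measure` callback, the plausibility check, and the offset values are hypothetical placeholders for illustration, not the original robot code:

```python
def measure_with_retry(measure, position, plausible,
                       offset=(0.5, 0.5, 0.0), max_attempts=2):
    """Try a measurement; if the result is implausible (e.g. distorted
    by reflections), nudge the measuring position slightly and retry."""
    x, y, z = position
    for _ in range(max_attempts):
        value = measure((x, y, z))
        if plausible(value):
            return value
        # Shift the pose minimally before the next attempt.
        dx, dy, dz = offset
        x, y, z = x + dx, y + dy, z + dz
    raise RuntimeError("no plausible measurement after retries")
```

A reflection that spoils only the first reading is then absorbed transparently: the second attempt runs from the slightly shifted pose and the cycle continues.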
I have also resolved collision conflicts on site at the customer's premises by temporarily taking over the control programming. Putting theory into practice comes easily to me. Shall I make the impossible possible for you too?
SAP, cloud systems, and domains are being used more and more frequently. More complex camera systems support significantly more interfaces and can be managed via domains, which centralizes user management and thus reduces costs. Common HMI systems such as WinCC are compatible with this as well. Even so, I often had to program interfaces or hardware components on the controller myself, using various bus systems and reproducing the protocols byte by byte.
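Reproducing a protocol byte by byte typically means building and validating frames by hand. The frame layout below (start byte, length, payload, XOR checksum) is an invented example, not one of the actual bus protocols in question:

```python
START = 0x02  # hypothetical start-of-frame marker

def build_frame(payload: bytes) -> bytes:
    """Assemble a frame: start byte, length byte, payload, XOR checksum."""
    frame = bytes([START, len(payload)]) + payload
    checksum = 0
    for b in frame:
        checksum ^= b
    return frame + bytes([checksum])

def parse_frame(raw: bytes) -> bytes:
    """Validate the framing byte by byte and return the payload."""
    if len(raw) < 3 or raw[0] != START:
        raise ValueError("bad start byte")
    if len(raw) != raw[1] + 3:
        raise ValueError("bad length")
    checksum = 0
    for b in raw[:-1]:
        checksum ^= b
    if checksum != raw[-1]:
        raise ValueError("checksum mismatch")
    return raw[2:-1]
```

Real bus protocols add addressing, escaping, and timing rules, but the pattern of checking every field against the specification is the same.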
- Monitored hard-drive wear with purpose-built software integrated into the purpose-built HMI
- Linked applications running on different computer architectures
- Controlled robots from HMI software via a robot API
- Connected a simulation's virtual robot API to a real controller and later recorded videos of how the robot actually behaved
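Linking a simulation's virtual robot API with a real controller usually comes down to a shared interface that the HMI talks to. The class and method names below are illustrative stand-ins, not ABB's actual API:

```python
from abc import ABC, abstractmethod

class RobotApi(ABC):
    """Minimal interface the HMI uses, regardless of the backend."""
    @abstractmethod
    def move_to(self, x: float, y: float, z: float) -> None: ...
    @abstractmethod
    def current_position(self) -> tuple: ...

class SimulatedRobot(RobotApi):
    """Backend for the virtual robot of a simulation."""
    def __init__(self):
        self._pos = (0.0, 0.0, 0.0)
    def move_to(self, x, y, z):
        self._pos = (x, y, z)
    def current_position(self):
        return self._pos

# A RealRobot class would implement the same interface and forward the
# calls to the physical controller over its network protocol.

def jog(robot: RobotApi, dx: float, dy: float, dz: float) -> None:
    """HMI jog command that works against either backend."""
    x, y, z = robot.current_position()
    robot.move_to(x + dx, y + dy, z + dz)
```

Because the HMI only sees `RobotApi`, the same screens drive the simulation during testing and the real controller during commissioning.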
When several machines are combined, each must always be handled independently of the others, because problems can arise without plannable response times.
Many events running in parallel must also be processed in parallel in the control unit; the results are then synchronized and coordinated with one another. The need for this is most obvious with multi-camera systems, but it benefits all other devices as well. With very short processing cycles in particular, there is no way around it, and the task moves toward real-time systems.
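Parallel processing followed by a synchronization point can be sketched with worker threads and shared queues; a real control unit would use the PLC's task model instead, and the squaring step is only a placeholder workload:

```python
import queue
import threading

def process_events(events, worker_count=4):
    """Process events in parallel, then collect the results at a
    synchronization point once all workers have finished."""
    inbox = queue.Queue()
    results = queue.Queue()
    for e in events:
        inbox.put(e)

    def worker():
        while True:
            try:
                event = inbox.get_nowait()
            except queue.Empty:
                return  # no more events for this worker
            results.put(event * event)  # placeholder processing step

    threads = [threading.Thread(target=worker) for _ in range(worker_count)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # synchronization point: wait for all workers
    return sorted(results.get() for _ in range(results.qsize()))
```

The `join` calls are the coordination step from the text: no downstream logic runs until every parallel event has been handled.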
ABB RobotStudio, which I used for this at the time, was designed for simple simulations with just a few components. However, it already supported a virtual-reality environment in which you could move through the scene and manipulate elements using 3D glasses and hand controllers. It was also advertised that you could program in it, which I initially thought little of. The opposite proved true: I even calculated a couple of robot targets in my head, wrote them directly into the source code, and they were approached exactly as planned.
In one simulation, I had a robot stack buckets on pallets in different patterns and at different heights. Buckets arrived on a conveyor belt; pallets were moved in on transport rollers and lined up for pickup. Hundreds of buckets were created and moved through the simulation, so that even a computer purchased specifically for the task could not keep up with the calculations, and the simulation either ran incorrectly or had to be slowed down considerably.
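Place positions for such stacking patterns can be generated from the bucket footprint and the pallet grid. The dimensions and the interlocking rule below are made up for illustration, not the original project data:

```python
def pallet_positions(rows, cols, layers,
                     dx=300.0, dy=300.0, dz=250.0,
                     rotate_odd_layers=True):
    """Return (x, y, z) place positions in millimeters for a simple
    grid stacking pattern. Odd layers can have their grid transposed
    to interlock the stack and improve stability."""
    positions = []
    for layer in range(layers):
        z = layer * dz
        if rotate_odd_layers and layer % 2:
            r, c = cols, rows  # transposed grid on odd layers
        else:
            r, c = rows, cols
        for i in range(r):
            for j in range(c):
                positions.append((i * dx, j * dy, z))
    return positions
```

The robot program then only needs to iterate over this list, which makes new patterns and heights a matter of changing parameters rather than reteaching targets.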
On several occasions, a module I developed was used to connect the robot simulation in ABB RobotStudio to a PLC, so that the function of all components could be tested before commissioning and commissioning times could be shortened.
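Coupling a simulation with a PLC for this kind of virtual commissioning essentially means exchanging I/O signals every cycle. The signal names and the in-memory exchange below are simplified stand-ins for the real bus connection:

```python
def exchange_cycle(plc_outputs: dict, simulation) -> dict:
    """One virtual-commissioning cycle: write the PLC outputs into the
    simulation, advance it one step, and read the simulated sensors
    back as the PLC inputs for the next cycle."""
    simulation.apply(plc_outputs)
    simulation.step()
    return simulation.read_sensors()

class DummyConveyor:
    """Stand-in simulation model: a conveyor that advances while enabled."""
    def __init__(self):
        self.position = 0
        self.enabled = False
    def apply(self, outputs):
        self.enabled = outputs.get("conveyor_on", False)
    def step(self):
        if self.enabled:
            self.position += 1
    def read_sensors(self):
        return {"part_at_stop": self.position >= 3}
```

Running the PLC logic against such a model before the hardware exists is what lets faults surface in the office instead of on the shop floor.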