Episode 24: Burkhard on Qt Embedded Systems
Welcome to Episode 24 of my newsletter on Qt Embedded Systems!
I have developed user interfaces (UIs) for machines, harvesters, cars and e-bikes for the last 15 years. The machines are engineering marvels. The UIs not so much. They are a lot better than their predecessors and my customers are super happy. Nevertheless, I think they lack something special. They don't make new things possible. They don't turn average users into expert users by simplifying and automating the operation of the machines.
This must change! Awesome machines deserve awesome UIs that make users more productive and less stressed. I am on a mission to free smart machines from dumb user interfaces. As a special treat for the 2nd anniversary of my newsletter, you are the first to read my mission statement. It's a first version, which will go through a couple of revisions over the next few weeks. So, hit the reply button and tell me where I am wrong.
Enjoy reading and take care - Burkhard 💜
My Posts and Talks
Developing a QML Keypad with TDD
I walk you through developing a QML keypad with TDD step by step. The new QML modules from Qt 6.2 make the test setup super easy. A bit of trial and error filled in the gaps in the documentation of QtQuickTest. It was fun from then on. So there are no excuses any more: We can test-drive QML code.
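For readers who haven't used Qt Quick Test yet: the C++ side is just a tiny harness; the actual test functions live in QML TestCase files (tst_*.qml) next to it. The snippet below is my own minimal sketch, not code from the post, and the test name is made up.

```cpp
// Minimal Qt Quick Test harness (a sketch, not the setup from the post).
// QUICK_TEST_MAIN generates a main() that discovers and runs the
// tst_*.qml files containing TestCase items for this test target.
#include <QtQuickTest/quicktest.h>

QUICK_TEST_MAIN(keypad)
```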
Hexagonal Architecture: The Standard for Qt Embedded Applications
In my talk at Meeting Embedded 2021, I argue that the Hexagonal Architecture should be the standard architecture for UI applications on embedded devices. As software and system architects, we should never have to justify why we use the Hexagonal Architecture. On the contrary, people who don't want to use the Hexagonal Architecture should justify their choice. They will almost always be wrong. The linked post provides the video and the slides of my talk.
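To make the idea concrete, here is a minimal sketch of a port and an adapter. The names are invented for this example and do not come from the talk.

```cpp
// Port: an interface owned by the application core. The core and the QML UI
// only ever see this abstraction, never the concrete bus or library behind it.
class IMachinePort
{
public:
    virtual ~IMachinePort() = default;
    virtual void sendBendCommand(double angleDegrees) = 0;
    virtual double currentAngle() const = 0;
};

// Adapter: implements the port on top of a concrete technology (here a
// fictitious CAN backend). Swapping CAN for Ethernet or for a simulator in
// tests only touches the adapter, never the core or the UI.
class CanMachineAdapter : public IMachinePort
{
public:
    void sendBendCommand(double angleDegrees) override
    {
        // write the command to the CAN bus ...
    }
    double currentAngle() const override
    {
        // return the latest angle read from the CAN bus ...
        return 0.0;
    }
};
```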
My Perspective: Free Smart Machines from Dumb User Interfaces
Situational Appraisal
The user interfaces of most machines are dumb. Users start with a clear idea of the desired result. Then, they spend too much time coaxing the machine towards that result. They may even end up with a deficient result or no result at all. The problem: Human-machine interfaces focus too much on the machine and too little on the human. It doesn't have to be this way!
Tinsmith workshops buy metal-sheet bending machines for €70,000 or more, roughly the price of a BMW 7 Series. In the hands of experts, these machines can create metal origami like decorative wall caps or garden art. Experts also have no problem figuring out the machine steps to bend flat metal sheets into roof ridge caps, brackets for rain pipes or covers for window sills.
While experts draw a 2D profile on a touch display or on a piece of paper, they already have a clear idea of which steps they must perform with the machine to produce the profile. They effortlessly translate what to do into how to do it. They program the machine and then produce the desired result - in no time.
95% of machine users are not experts. They struggle with this translation process. They may spend an inordinate amount of time figuring out the bending steps. They may even fail.
Many manufacturers complain about a shortage of skilled users for their machines. Actually, the real problem is a shortage of smart user interfaces (UIs). Smart human-machine interfaces (HMIs) translate between the human world and the machine world. They do what an expert user does with ease.
Step 1: Understanding Human-Machine Interaction
The first step towards a smart user interface is a deep understanding of how users interact with the machine.
Where do users of different expertise levels struggle or fail?
How do expert users solve problems where regular users struggle or fail?
How often do users perform certain interactions?
We can answer these questions, for example, by watching users during their daily work, by doing an apprenticeship, and by performing usability tests. The answers tell us where users struggle and why. This brings us to the next question: What information is needed to ease or eliminate the users' struggles?
Step 2: Uncover Missing Information
The second step towards a smart user interface is to uncover missing information. These questions guide the discovery.
What information is missing in the UI to simplify or automate user interactions?
Where do we find the missing information?
How do we get missing information to the UI application?
If no information is missing, we have found a low-hanging fruit. We can implement the improvement solely in the UI application, release it promptly, and improve the lives of the users a little bit. Otherwise, the missing information may already be available on the machine, but isn't passed to the UI application yet.
Missing information may be available on other machines. Workers in the same or other tinsmith workshops may have successfully bent the same or similar profiles on machines of the same or different manufacturers. By making these hundreds or maybe thousands of profiles available on each machine, manufacturers would make the users of their machines much more productive.
Manufacturers may be afraid to exchange profiles in a standardised format with their competitors. They shouldn't be. The competitive advantage does not come from owning a big pile of data but from exploiting this data to simplify and automate machine operation.
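To make the idea of exchanging profiles more tangible, here is a sketch of what a standardised profile record could contain. It is purely illustrative; no such format is defined in this newsletter or, to my knowledge, standardised in the industry.

```cpp
#include <string>
#include <vector>

// One straight segment of the 2D profile drawn by the tinsmith.
struct Segment
{
    double lengthMm = 0.0;         // length of the flat segment in millimetres
    double angleDegrees = 0.0;     // bend angle towards the next segment
};

// One step of a known-good bending sequence.
struct BendStep
{
    int segmentIndex = 0;          // which bend to perform
    double toolAngleDegrees = 0.0; // machine setting for this step
};

// A shareable profile: the drawing plus the sequence that produced it.
struct Profile
{
    std::string name;               // e.g. "roof ridge cap"
    std::vector<Segment> segments;  // the 2D profile
    std::vector<BendStep> sequence; // the bending sequence that worked
};
```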
Step 3: Simplify and Automate Human-Machine Interaction
The third step towards a smart user interface is to simplify or even automate the problematic usage scenarios from Step 1 based on the information uncovered in Step 2. Let us see how this extra information - the comprehensive library of fully-sequenced profiles - helps tinsmiths with their work.
Tinsmiths are very likely to find a profile in the library that is similar to the desired profile. They adapt the segment lengths and angles, reuse the existing bending sequence and are done. In a few cases, they compose several profiles into a new profile and specify the bending order for the composed profiles. In rare cases, tinsmiths create a profile from scratch guided by their knowledge and intuition. They simulate the bending sequences in the UI to figure out whether the machine can bend the profile. Rinse and repeat if not.
As tinsmiths can reuse existing profiles with little tweaks most of the time, interaction with the machine is considerably simpler. But we can do a lot better! We can automate each step the user must do manually.
Guided by heuristics, the UI software generates several candidate bending sequences from the profile library, checks them for feasibility by simulating them, and selects the best sequence. The user takes the machine through this sequence step by step to produce the given profile.
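The following sketch shows this generate-simulate-select loop in code. All types and helper functions are assumptions made up for this illustration; they are not an existing API.

```cpp
#include <optional>
#include <vector>

struct BendStep { int segmentIndex = 0; double toolAngleDegrees = 0.0; };
struct Profile { /* segments (lengths and angles) of the desired 2D profile */ };
using BendingSequence = std::vector<BendStep>;

// Heuristically derive candidates, e.g. from similar profiles in the shared library.
std::vector<BendingSequence> generateCandidates(const Profile &profile);

// Simulate a candidate: can the machine bend the profile without collisions?
bool isFeasible(const BendingSequence &candidate, const Profile &profile);

// Rank feasible candidates, e.g. fewer steps and fewer re-clampings score higher.
double score(const BendingSequence &candidate);

std::optional<BendingSequence> selectBestSequence(const Profile &profile)
{
    std::optional<BendingSequence> best;
    double bestScore = 0.0;
    for (const auto &candidate : generateCandidates(profile)) {
        if (!isFeasible(candidate, profile))
            continue;                      // the simulation rejected this candidate
        const double candidateScore = score(candidate);
        if (!best || candidateScore > bestScore) {
            best = candidate;
            bestScore = candidateScore;
        }
    }
    return best;                           // empty if no feasible sequence was found
}
```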
The heuristics codify the knowledge of expert users. This will often involve machine learning or even deep learning. Forage harvesters, for example, use image recognition to decide based on the leaf colour whether to cut the maize shorter or longer.
Machines make their newly learned knowledge available to all other machines. Every user benefits. Automation and knowledge sharing make UIs smarter and users more productive - in a virtuous cycle. Such improvements neither come for free nor right away. They require powerful and future-proof hardware and software platforms.
Step 4: Invest in the Future Now
The fourth step towards a smart user interface is to design a hardware and software platform that manufacturers can use for the lifetime of their machines. Manufacturers must invest today in hardware that will be powerful enough to handle the software demands in 10-15 years. They must create a software platform that is easy to extend in many small steps. They must be able to deploy improvements to all machines frequently. Many small improvements quickly add up to big improvements.
This move is too bold for many manufacturers. They shy away from the higher costs for premium hardware and software now and stifle innovation later. They typically end up rewriting large parts of the software and replacing the hardware every 5 years - on new machines. Old machines rarely get software updates and run on weak hardware right from the beginning. The situation becomes worse because manufacturers often depend on a single supplier. These suppliers do everything to become irreplaceable. By investing little in the beginning, manufacturers gain little to nothing over the next 10-15 years or even lose money.
Tesla were bold enough. They were the first to put a supercomputer in their cars. This supercomputer is ideally suited to control both semi-autonomous driving and the user interface, as it concentrates all the necessary information from the car. Tesla were the first to update the car's software automatically and regularly. This enabled them to improve their cars at breakneck speed. Tesla's bet paid off. They went from nearly bust to one of the most valuable companies in the world.
A clever investment in the right software, hardware and people at the right time makes all the difference for the success of a machine. And - smart user interfaces play a crucial role. They make users more productive and let them do new things with the machines. More productive users lead to more machines sold. In short:
Smart user interfaces let manufacturers sell more machines at higher prices.
Reading
Peter Schneider: Can Cross-Platform Development Prepare You for the Next Chip Shortage?
Unsurprisingly, Peter's and The Qt Company's answer to the chip shortage is cross-platform development. Together with the operating system, Qt does an excellent job of hiding the SoC specifics behind clean interfaces. If one SoC is in short supply, we can move our Qt applications within hours to another SoC still in ample supply.
Qt mitigates the effects of the chip shortage, but it doesn't remedy its cause. It doesn't reduce the number of chips used. In the post What’s Wrong with Multiple Display Computers in Driver Cabins, I suggest replacing the 2-4 display computers (terminals) with one computer driving 2-4 monitors. Besides being cheaper, this replaces, say, four iMX6 SoCs with one iMX8 SoC.
Tesla take this approach a lot further (see Tesla teardown finds electronics 6 years ahead of Toyota and VW). They reduced the number of ECUs from more than 60 in normal cars to fewer than 12. Every ECU has multiple chips. That's a lot of chips saved!
And that's most likely why Tesla "were able to substitute alternative chips and then write the firmware in a matter of weeks." They simply had to move a lot less software from one SoC to another than other car manufacturers. This is an excellent example of a successful system architecture.
Michael Nygard: Documenting Architecture Decisions
Architecture Decision Records (ADRs) are a very lightweight way to document architectural decisions. Agile development teams use them to document the architecture as it evolves. Developers write ADRs on demand during the lifetime of a project. They serve as a project memory.
Similar to Architecturally Significant Requirements (ASRs), ADRs can describe constraints, quality attributes, influential functional requirements and other influencers. They have the following standardised format; a complete example follows the list.
Title. The title is a short telling phrase with an identifier like ADR-7: Ethernet connection between telematics and terminal.
Context. This section describes the forces shaping the decision in a factual tone. Forces can be technological (e.g., CAN doesn't have enough bandwidth for OTA updates), political, social or local to the project.
Decision. This section describes the decision taken based on the context. It best starts with "We will ...".
Status. The status is one of proposed, accepted, deprecated or superseded.
Consequences. This section describes the new context after implementing the decision. We list all consequences, whether they are positive, negative or neutral.
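Put together, an ADR is only a few lines long. Here is a made-up example, loosely based on the title above; all details are invented for illustration.

ADR-7: Ethernet connection between telematics and terminal
Context: CAN doesn't have enough bandwidth for over-the-air (OTA) software updates. The telematics unit must transfer update images of several hundred megabytes to the terminal.
Decision: We will connect the telematics unit and the terminal over an Ethernet link and transfer the update images over this link.
Status: Proposed.
Consequences: OTA updates become possible. The wiring harness needs an additional Ethernet cable between the two ECUs. Both ECUs need SoCs with an Ethernet interface.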