The changing face of embedded software development

The software development sector has changed almost beyond recognition as it has adjusted to the rapidly changing environment it operates in. Software packages are no longer self-contained islands with few links to the outside world. Application software is no longer confined to the data centre. Devices containing embedded software are no longer stand-alone. Against this backdrop, traditional approaches to developing software have had to change, challenging the application development community.
The first wall to go was inside the IT system. Service-oriented architectures achieved the critical mass needed for some of the decades-old visions of ‘object-oriented’ thinking to move from the fringe to the mainstream of application development. The resulting software offers at least the potential for more and better re-use of functionality, and opportunities to make previously separate modules and applications work together.
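As a hedged illustration of that re-use (the article names no specific technology, and every name below is invented for the sketch), one shared service consumed by two previously separate applications might look like this:

    # A minimal sketch of service-oriented re-use: one shared service,
    # two separate applications calling it. All names are hypothetical.

    class CustomerLookupService:
        """A single, shared source of customer data."""
        def __init__(self):
            self._records = {"C001": {"name": "Acme Ltd", "credit_ok": True}}

        def get_customer(self, customer_id):
            return self._records.get(customer_id)

    # Previously separate applications now re-use the same service.
    def billing_app(service, customer_id):
        customer = service.get_customer(customer_id)
        return f"Invoice issued to {customer['name']}"

    def shipping_app(service, customer_id):
        customer = service.get_customer(customer_id)
        return "Ship" if customer["credit_ok"] else "Hold"

    service = CustomerLookupService()
    print(billing_app(service, "C001"))   # Invoice issued to Acme Ltd
    print(shipping_app(service, "C001"))  # Ship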
Next, better network connectivity broke down the wall between embedded systems and the rest of the technology world. Today, an in-flight jet engine can report sensor readings to a service centre. A medical imaging device can connect to an image processing support system. A car accessory can report real-time vehicle location data to an insurance application.
A good example of the new thinking that software designers, architects and developers need to apply is an experimental business application that will only validate a car journey as a business trip once the driver has allowed it to switch the in-car electronics from ‘sport’ to ‘eco’ mode.
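A minimal sketch of that rule, with hypothetical field names, might look like this:

    # Hypothetical sketch of the eco-mode rule described above: a
    # journey is only validated as a business trip if the driver has
    # allowed the application to select 'eco' mode.

    def validate_business_trip(journey, car):
        if not journey.get("driver_consents_to_eco"):
            return False           # driver withheld consent: not validated
        car["drive_mode"] = "eco"  # application sets the in-car electronics
        return True

    car = {"drive_mode": "sport"}
    journey = {"driver_consents_to_eco": True}
    print(validate_business_trip(journey, car), car["drive_mode"])  # True eco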
This new environment for software means that some of the traditional assumptions that have shaped our way of thinking about software have to change. The signs are already there in travel-related applications, where pricing is no longer tied to a static price list but varies in real time, in response to customer demand.
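The article specifies no pricing formula, but a purely illustrative sketch of demand-responsive pricing conveys the shift away from the static price list:

    # Illustrative only: the price rises with current demand relative
    # to capacity, instead of coming from a static price list.

    def dynamic_price(base_price, seats_sold, capacity):
        load_factor = seats_sold / capacity
        return round(base_price * (1.0 + load_factor), 2)

    print(dynamic_price(100.0, 20, 200))   # light demand  -> 110.0
    print(dynamic_price(100.0, 180, 200))  # heavy demand  -> 190.0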
The fundamental trend here is software behaving more and more like a control application, responding to actual, sensed conditions rather than to an abstract model of the outside world held in its database.
For example, asset management applications traditionally focused on scheduling inspection and maintenance. As such, they would enforce appropriate workflows for making changes, ensure documentation was updated, and apply business rules for asset valuations. All these steps were based on a person interacting with the software to review and update the underlying database. Today, maintenance engineers simply connect to an asset and read its status page. Maintenance can be scheduled according to actual, not expected, usage.
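To make the contrast concrete, here is a minimal sketch, with invented thresholds and field names, of scheduling against actual rather than expected usage:

    # Calendar-based vs condition-based maintenance, sketched with
    # hypothetical thresholds. In the new model the asset reports its
    # own usage instead of a database holding an expected figure.

    SERVICE_INTERVAL_DAYS = 180      # old model: fixed schedule
    SERVICE_INTERVAL_HOURS = 1000    # new model: actual running hours

    def maintenance_due_calendar(days_since_service):
        return days_since_service >= SERVICE_INTERVAL_DAYS

    def maintenance_due_condition(asset_status):
        # asset_status is read from the asset itself
        return asset_status["running_hours"] >= SERVICE_INTERVAL_HOURS

    status = {"running_hours": 350}              # lightly used asset
    print(maintenance_due_calendar(200))         # True: serviced anyway
    print(maintenance_due_condition(status))     # False: service deferred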
Wherever you look, you will find this sort of transition.
The value of the smart-metering revolution in electricity, gas and water depends on the capability of the meters to report more detailed, virtually real-time readings. This will enable utility companies to identify peaks and troughs in supply and demand as they happen, and to devise new ways of handling them. For instance, they might offer customers a lower price for the electricity used to charge their cars, provided they agree to let the utility company use the car battery as a temporary electricity store while it is plugged in.
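A hedged sketch of that tariff logic; the discount level and the storage rule below are invented for illustration:

    # Hypothetical vehicle-to-grid tariff: a discounted charging price
    # in exchange for letting the utility draw on the car battery
    # while it is plugged in.

    STANDARD_RATE = 0.30   # currency units per kWh (illustrative)
    V2G_DISCOUNT = 0.25    # 25% off for opted-in customers

    def charging_rate(opted_in_to_storage):
        if opted_in_to_storage:
            return round(STANDARD_RATE * (1 - V2G_DISCOUNT), 3)
        return STANDARD_RATE

    def may_draw_from_battery(opted_in_to_storage, plugged_in):
        return opted_in_to_storage and plugged_in

    print(charging_rate(True))                # 0.225 (discounted)
    print(may_draw_from_battery(True, True))  # True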
And of course, the use of web interfaces means we all expect software to work from any device, anywhere, with session status retained and offline updates available as needed.
For software development teams, all of the above means that their code will have to cope with many more unexpected conditions, situations and uses than it ever had to in the past. This will increase pressure on development teams, not least because there is a dark side to be dealt with. Every network connection, just like every data input point, represents an infiltration opportunity for malware. Software development techniques will have to build in suitable firewalls to fend off such threats. This is not an easy task when you are extending functionality across a multitude of boundaries and have to gauge the myriad potential opportunities you might be creating for malware to attack.
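The article names no specific defence, but the kind of hardening it implies begins with treating every input point as untrusted. A minimal sketch, with invented field rules:

    # Every network connection and data input point is an attack
    # surface. This sketches defensive validation at one such
    # boundary; the expected shape of the report is hypothetical.

    def parse_sensor_report(raw):
        """Reject anything that does not match the expected shape."""
        if not isinstance(raw, dict):
            raise ValueError("report must be a mapping")
        sensor_id = raw.get("sensor_id")
        if not isinstance(sensor_id, str) or not sensor_id.isalnum():
            raise ValueError("bad sensor_id")
        reading = raw.get("reading")
        if not isinstance(reading, (int, float)) or not -50 <= reading <= 150:
            raise ValueError("reading out of plausible range")
        return {"sensor_id": sensor_id, "reading": float(reading)}

    print(parse_sensor_report({"sensor_id": "T100", "reading": 21.5}))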
This is the world of the systems engineer. Established in automotive, aerospace and defence, systems engineering has a history of being able to consider the ‘big picture’ of interacting yet independent subsystems and components. The discipline has the tools and thought processes in place to scope problem domains, define and track requirements, and manage multiple views of a solution space.
Complex environments like this highlight the importance of being able to define ‘atomic’ subsystems and components. These units must function appropriately even when the software configuration around them changes, so they can be used as building blocks for larger systems. Skills and techniques such as the use of normal forms in database design, and also abstraction and encapsulation in object-oriented software development, are intended to help deliver this ‘building block’ capability.
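A small sketch of the encapsulation point: the unit exposes a narrow interface and hides its internals, so the software around it can change without breaking it. All names here are illustrative:

    # An 'atomic' building block: callers use a narrow interface and
    # never touch the internals, so the component keeps working even
    # as the configuration around it changes.

    class TemperatureSensor:
        def __init__(self, raw_offset=0.5):
            self._raw_offset = raw_offset   # internal detail, hidden

        def celsius(self):
            """The only contract callers rely on."""
            return self._read_raw() - self._raw_offset

        def _read_raw(self):
            return 22.0  # stand-in for real hardware access

    # A larger system composes the block through its interface alone.
    sensor = TemperatureSensor()
    print(sensor.celsius())  # 21.5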
Consequently, software development teams are entering new territory in a number of ways. First, their software no longer stands on its own; it is just one piece in a much larger jigsaw, all parts of which need to be considered in the development process. Second, the requirements – even for that single piece of the jigsaw – are becoming more complex. This is due to the growing use of real-world information and the expectation that, rather than just batch processing, software needs to behave more like a control system. Third, the technologies and tools that developers use will have to handle the connections between the data centre and the real world. With this, any leeway that used to exist for anything less than absolute rigour in specification, development, test and release processes will soon be squeezed out, because customers, industry networks and products increasingly depend on the consistent availability of every software component.
As a result, development skills and tools have to progress. The role of development tools and methods is going to become even more central for software development teams. The balance will shift from a patchwork of specialist tools to integrated toolkits. The unique development environments created to suit the needs of individual teams will give way to toolsets designed to optimise the performance of the whole development organisation. The main drivers for this standardisation will be financial as well as operational efficiencies, for example greater flexibility in deploying developers and re-assigning development tasks across the extended team.
There will also be more and more development environments where the software release cycle has to be integrated with that of a manufactured product. After all, when you plug your phone into your PC and give the manufacturer’s support website access to solve a problem, you don’t expect a ‘device-not-recognised’ message.
For many software teams, this means new prospective suppliers of tools are appearing on the horizon. As embedded software has grown inside products, the tools used by design engineers have grown to look more like software engineering tools – source code management, configuration management, requirements management and so on. And design engineers using engineering applications for mechanical and electrical design are discovering that the traditional suppliers of software development tools are offering capabilities such as configuration management relevant to all of their work, as well as the tools they need for embedded software development.
There will still be room for the software genius. But they can’t play every instrument in the orchestra.
Peter Thorne, Managing Director, Cambashi, is responsible for consulting projects related to the new product introduction process, e-business, and other industrial applications of information and communication technologies. He has applied information technology to engineering and manufacturing enterprises for more than 20 years, holding development, marketing and management positions with both user and vendor organisations. He holds a Master of Arts degree in Natural Sciences and Computer Science from Cambridge University, is a Chartered Engineer, and a member of the British Computer Society.