Besides lacking a guarantee, traditional testing has also proven to be a very expensive method for validating critical software like that of a TEE, where the code needs to be flawless.
In the aerospace and nuclear energy industries in the late 20th century, when traditional testing was the only available software validation option, software engineers spent much of their time reviewing their code by hand. They were keenly aware that an insidious bug might cost lives.
Since there were no tools that could guarantee their code would behave correctly, they combed through it line by line, looking for any irregularity that might result in undefined behavior. The work was tedious, time-consuming, and extremely costly. Engineers grew stressed and burned out, and turnover among them was high, which added further to the cost.
A study by the National Research Council in 2001 found that “The shortage of software engineers is an acute problem in the commercial and defense aerospace industry… High-potential young software engineers often leave for jobs in non-defense industries where pay scales are higher and perceived opportunities are more exciting.”1 Burnout from code review may have been a contributing factor.
By the end of the 1990s, it had become clear in safety-critical industries that software required to be flawless needed new verification tools capable of guaranteeing determinism and safety. Twenty years later, with the mass proliferation of both web-connected devices and cybercrime, similar tools are clearly needed in the development of trusted execution environments, as well as other applications tasked with protecting connected devices from cyberattacks.