Every embedded designer knows the world revolves around microprocessors, right? I mean, without CPUs and MCUs, there'd be no "Internet of Things," no Google Glass, and no Internet-enabled espresso machines – and how horrible would that be? It doesn't bear thinking about.
Still, there are some philistines who think it's the software, not the silicon, that makes embedded systems hum. Chips, the thinking goes, are just a means to an end.
Then there's a third approach: that it's the development tools – the compiler, debugger, IDE, and so on – that are really the most important part of a development project. So who's telling the truth?
To check this out, I surveyed a whole bunch of engineers around the world (where "whole bunch" is defined as a few thousand) and asked them about the relative importance of their chosen CPU, their favorite operating system, and their preferred development tools. The question went something like this: "If you were stranded on a desert island and could choose only one, would you stick with (a) your current microprocessor, (b) your current operating system, or (c) your current development tools?"
The right answer, of course, is A: microprocessor. The answer the survey-takers gave, however, was C: development tools. Wait… what?
I was so sure they'd misunderstood the question that we corralled a bunch of them into a room, fed them sandwiches, and asked the question again, this time more slowly. And again they claimed their loyalties lay with their development tools. What's wrong with these people?
After some more discussion and a few more sandwiches, I finally came around to their way of thinking. Engineers are like carpenters. You can change a carpenter's project: a bookshelf one day, a chair the next, a roll-top desk after that, and so on. You can change the wood he uses: oak, pine, mahogany, etc. But don't try to change his tools. The tools are what he touches and uses every day, and they're what make him productive, happy, and confident in his abilities. Take the tools from his hands and you've deprived him of his most valuable skills and experience. (There's some backup data on this effect here: https://www-ssl.intel.com/content/www/us/en/intelligent-systems/industrial-applications/thriving-dev... Development Ecosystem Supports Intel® Microprocessors: Paper.)
The same is true of programmers and engineers. Change their tools and you've abandoned the hard-fought experience that makes them so valuable. Good managers know this and don't try to force different development toolchains on their staff. Let the developers use what they want to use, and they'll be happier, more productive, and make fewer mistakes. Spend the time creating new code, not climbing the learning curve.
Turns out, development tools also play a big part in choosing the processor for a new project. As often as not, engineering teams will choose their development tools first, and then choose a processor family that supports those tools – not the other way around. If a chip (or chip family) doesn't support the team's favorite compiler or debugger or operating system, it never makes the short list.
All of this is good news for the more well-established chip families. Over time, they accumulate the largest assortment of development tools, operating systems, compilers, middleware, debuggers, and so on. That's something a relatively new CPU architecture can't do. You can't grow a big oak tree in a weekend, no matter how much you water it. Garnering third-party tool support is the best indication of a successful chip family.
Development tools are kind of like trophies. The more of them you've got on your wall, the more successful you are as an embedded processor.
I'm still not ready to give up on the idea of microprocessors as the center of the embedded world. But I guess I'm willing to tolerate some reasoned dissent. Just so long as I get my Internet-enabled espresso.