Do you always need the most advanced chips for the military, or could you use an older technology generation like in industrial machines? For example, the F-35 uses older chip generations.
It's mixed. But even if you stick to older-generation chips, someone has to supply them, which means someone has to make them. Foreign manufacturing and sourcing is a political and security headache even if you jump through the hoops to demonstrate a secure supply chain and address all their concerns. I had "cybersecurity" people refuse to approve use of an open source project because someone in France (an ally) had a commit. Don't let them look at the Linux kernel commits... I wouldn't want to deal with that for hardware.
Also, the F-35 is not something anyone should aspire to. That system, its software in particular, was a project management disaster. LM went to multiple shifts to try to make up for being late completing the software. They literally thought they could double or triple their staff to catch up; apparently those idiots had never read The Mythical Man Month.
It depends on the application. Things like terminal guidance systems for hypersonic intercept run on CPUs like an ancient MIPS R3000/4000. They don't need anything more. If you are trying to do real-time processing and fusion of the F-35 sensor suite or an AEGIS system, it is extremely compute intensive, so those systems live much closer to the bleeding edge with regular upgrades, because there is an almost unlimited appetite for more processing power to support new capabilities.
You often see a mix of really old and really new. They only use the latest and greatest, ASICs, or similar when there is an absolute advantage to be gained by doing so. The old platforms are proven and reliable, so there's no reason not to use them if they do the job, and they often do.
Performance is a key metric of course, but reliability would seem to be an even bigger consideration. I imagine it's similar to the challenge faced by designers of equipment destined for space.
The elephant in this particular room is probably autonomous or semi-autonomous weapons that need edge AI acceleration. Using older chips won't be an option for next-generation weapons, I suspect.
Yes, the advanced requirements are clear, but I don't think that covers all military needs (e.g. tanks). The bottom line of my question is whether this is a great business opportunity for Intel, because you can cover much of the demand with legacy chips, which are much less expensive to produce than advanced ones. Basically, higher ROI.