"we believe binary artifacts should instead be treated as the result of a computational process; it is that process that needs to be fully captured to support independent verification of the source/binary correspondence. "
So binary artifacts should verifiably come from source code, and that can be accomplished by signing them with a mechanism specified in the source code?
Since this is coming from the GNU folks, they naturally have an inclination towards open-source software, but I'd argue (and they probably would too) that reproducibility is a much stronger invariant than code signing alone.
Bootstrapping everything from a tiny first-stage compiler and getting bit-identical compiled outputs gives a much higher level of confidence than PKI offers, since keys can be cracked, stolen, or made to sign things they shouldn't. Even if a signature is legit, it doesn't help you against insider risk (e.g. internally added backdoors) in closed-source software.
These are all things governments (should probably) care about.
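To make "bit-identical" concrete: a reproducibility check boils down to building the same source twice in clean environments and comparing a hash over the resulting output trees. Below is a minimal Python sketch of that idea; the ./build.sh entry point, its --output flag, and the hello-2.12 source tree are hypothetical stand-ins for whatever project is actually being checked.

```python
import hashlib
import subprocess
import tempfile
from pathlib import Path

def hash_tree(root: Path) -> str:
    """Fold the relative path and contents of every file under root into one digest."""
    digest = hashlib.sha256()
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest.update(str(path.relative_to(root)).encode())
            digest.update(path.read_bytes())
    return digest.hexdigest()

def build_once(source: Path) -> str:
    """Build the source into a throwaway directory and return the output digest."""
    with tempfile.TemporaryDirectory() as out:
        # ./build.sh and its --output flag are hypothetical; substitute the
        # real build entry point of whatever project is being checked.
        subprocess.run(["./build.sh", f"--output={out}"], cwd=source, check=True)
        return hash_tree(Path(out))

source = Path("hello-2.12")  # hypothetical source tree
if build_once(source) == build_once(source):
    print("bit-identical: the build is reproducible")
else:
    print("outputs differ: something non-deterministic leaked into the build")
```

Guix bakes this check into the tool itself: guix build --check rebuilds a package locally and reports whether the result matches what is already in the store, and guix challenge compares local store items against what substitute servers publish.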
Well, Guix sidesteps that problem by (rightly) pointing out that Intel microcode updates are non-free software and thus aren't included in the system. If you want those updates, you have to add them yourself, usually via a third-party channel that provides non-free software, which means the user makes a conscious choice to use non-free stuff instead of having it handed down from on high.
It might not be a satisfying answer, but oh well. One can complain at Intel about it.