>> [99% of all code is cold]
> in my experience it's about 90-9-1
I don't think such numbers can be meaningfully presented as representative data. They are probably true for certain specific application domains such as crypto or signal processing, if you squint at the data just the right way (i.e., by benchmarking a crypto library with a tiny driver program, not as part of a real application).
I did some profiling on the SPEC CPU 2000 benchmarks some years ago. I don't have the exact numbers lying around, but I recall large variation among the different benchmarks: anything from 90% of execution time in the top function, to 60% of execution time spread evenly (i.e., 20% each) over the top three functions. So some things certainly follow the patterns above, and others... don't.
I've also, on several occasions, optimized programs until their profile was so flat that you could say 100% of the code was lukewarm.
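For anyone who wants to check their own programs, here's a minimal sketch of that kind of measurement using Python's built-in `cProfile`/`pstats` (the `hot`/`cold` functions are made-up stand-ins, and I'm relying on the `Stats.stats` internals, which are widely used but not a formally stable API):

```python
import cProfile
import pstats

def hot(n):
    # Deliberately heavy pure-Python loop: should dominate the profile.
    total = 0
    for i in range(n):
        total += i * i
    return total

def cold(n):
    # Cheap C-level work: should barely register.
    return sum(range(n))

def main():
    hot(2_000_000)
    cold(10_000)

profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

stats = pstats.Stats(profiler)
total_time = stats.total_tt  # total self time across all functions

# Stats.stats maps (file, line, name) -> (call count, primitive calls,
# self time, cumulative time, callers); sort by self time, descending.
entries = sorted(stats.stats.items(), key=lambda kv: kv[1][2], reverse=True)
func, (cc, nc, tt, ct, callers) = entries[0]
print(f"top function {func[2]!r} takes {tt / total_time:.0%} of self time")
```

Whether that top-function share comes out at 90%, 20%, or "flat" is exactly the variation I saw across benchmarks; the point is that you have to measure the real application, not guess.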