
Expecting Apple clients to pay attention to industry best practices has never ended well. But good luck.


Hardly an Apple specific issue.

People have been writing non-portable assembly for pretty much every architecture for as long as assembly has been a thing.

Heck, people write non-portable C and C++ all the time too, using compiler-specific extensions or intrinsics that only work on very specific stacks.
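For instance, a hypothetical sketch of that pattern: code leaning on a GCC/Clang builtin only stays portable if the extension is guarded behind a feature test with a plain-C fallback.

```c
#include <stdint.h>

/* __builtin_popcount is a GCC/Clang extension, not standard C -- code that
   calls it unconditionally won't build with other compilers. A feature-test
   guard keeps a portable fallback available. */
#if defined(__GNUC__) || defined(__clang__)
static inline int popcount32(uint32_t x) { return __builtin_popcount(x); }
#else
static inline int popcount32(uint32_t x) {
    int n = 0;
    while (x) { x &= x - 1; n++; }  /* clear the lowest set bit each pass */
    return n;
}
#endif
```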

Or on the web, where people target browser-specific extensions and behaviour.

Developers will target and test the bare minimum of what they need to get the results they want.


To be fair, writing truly portable C that does useful things is not that simple. When was the last time you made sure your code supports CHAR_BIT != 8, for example?
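A sketch of what taking that seriously can look like (function name is illustrative): either reject the assumption at compile time, or spell out byte widths explicitly instead of baking them into shifts by accident.

```c
#include <limits.h>
#include <stdint.h>

/* Truly portable C can't assume 8-bit bytes. If the code does assume it,
   fail loudly at compile time instead of miscomputing on, say, a DSP. */
#if CHAR_BIT != 8
#error "this code assumes CHAR_BIT == 8"
#endif

/* Little-endian serialization that is explicit about the byte width. */
void put_u32_le(unsigned char out[4], uint32_t v) {
    for (int i = 0; i < 4; i++)
        out[i] = (unsigned char)((v >> (8 * i)) & 0xFF);
}
```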


Yeah for sure, I just didn’t want to introduce multi-architecture portability since the main issue at hand was intra-arch portability.

But yeah, when you get into multiple architectures, it gets gnarly. Take even something as simple as memory ordering when porting from x86_64 to arm64.
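A minimal sketch of that porting hazard using C11 atomics (function names here are illustrative): on x86_64's strong (TSO) model, even a plain store pair happens to appear in program order, so broken code often "works"; arm64's weaker model can reorder the stores, so the explicit release/acquire pairing is what makes this correct.

```c
#include <stdatomic.h>

/* Classic message-passing: the writer publishes data, then raises a flag.
   Without release/acquire, arm64 may let the reader observe ready == 1
   before data == 42; x86_64 usually hides the bug. */
static int data;
static atomic_int ready;

void writer(void) {
    data = 42;
    atomic_store_explicit(&ready, 1, memory_order_release);
}

int reader(void) {
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;  /* spin until the flag is published */
    return data;  /* the acquire load guarantees this sees 42 */
}
```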


I thought that because the C standard requires sizeof(char) to be 1, CHAR_BIT cannot be anything other than 8, unless the architecture is redefining what 'byte' means.


That's right, char in C doesn't necessarily mean an 8-bit byte. There are modern architectures where there is no 8-bit type and CHAR_BIT is 16.


The possibility for a byte to be something other than 8 bits is the reason why some people refer to 8 bits as an octet in network protocols etc.

I think I remember somebody pointing to such a machine still in use in some special area (telco?) in a previous discussion here... but if it still exists, it's very, very rare and mostly historical.


Several DSPs don’t use 8 bit chars

https://stackoverflow.com/a/2098444


I agree.

Not to mention the planned obsolescence of C and C++ syntax via ISO, on a medium cycle (every 5-10 years).

With C++ it is even worse: its syntax complexity is grotesque and absurd, requiring compilers of beyond-sane complexity, which limits the ones that actually "work" to a very few. C++ and the like should be avoided like the plague.

But C already has a waaaaay too rich and complex syntax: integer promotion, enum, switch, and typedef should go away (and probably more). Keep only sized primitive types (u8/16/32/64, s8/16/32/64, f32/64, or ub/uw/ud/uq...); only explicit casts (runtime/compile-time), with implicit casts allowed only from literals and maybe pointers decaying to void*; only one loop keyword (loop{}); macro functions with variable arguments finally fixed for good; explicit compile-time constants (don't rely on compiler optimisation); no anonymous code blocks; etc.
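Integer promotion is a good illustration of the kind of surprise being complained about here; a minimal sketch (function names are illustrative):

```c
#include <stdint.h>

/* Integer promotion: arithmetic on types narrower than int is done in int.
   Applying ~ to an unsigned char value therefore yields a negative int,
   not a small unsigned value. */
int complement_is_zero(void) {
    unsigned char c = 0xFF;
    return (~c) == 0x00;    /* false: ~c is the int -256, not 0 */
}

uint8_t complement_u8(uint8_t c) {
    return (uint8_t)~c;     /* an explicit cast recovers 8-bit behavior */
}
```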

Sometimes I ask myself if I should code in SPIR-V instead of C...

In view of all this, I now write assembly with a very simple and cheap SDK; currently x86_64. I may still write some plain and simple C, but I compile and run it with something other than gcc or clang (usually very small alternative C compilers: tinycc, cproc, etc.), and I am careful to stick to C89 with "benign" C99/C11.

As for arm64 assembly, I am not there yet. Additionally, RISC-V is happening: I will favor a worldwide royalty-free standard by default (which neither arm64 nor x86_64 is), and may still use a C reference implementation for arm64 targets.


Why would removing all those features be helpful? Most of those are important and very useful


Or the result their boss wants and pays for.



