Unix time has discrepancies whenever leap seconds occur (I've seen this several times in my career acquiring geophysical data).
If you're measuring | controlling objects in the physical world (cars, rockets, etc.) then you should not use unix time - those glitches will happen and instantaneous computations will go kooky.
But so does resolving to a date. I don't see how resolving to a date which cares about leap days fixes any of that.
You should use a monotonic clock with an arbitrary starting point anyway, unless you need some kind of synchronization between devices, but you probably wouldn't use unixtime there anyway.
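A minimal sketch of the difference, in Python: `time.time()` follows the system (Unix-time) clock, which can be stepped by NTP or leap-second handling, while `time.monotonic()` starts from an arbitrary point but never goes backwards, so it's the safe choice for measuring elapsed intervals. The helper name here is illustrative, not from any particular library.

```python
import time

def elapsed_monotonic(work):
    """Time a callable against the monotonic clock.

    Unlike time.time(), time.monotonic() is immune to system-clock
    steps, so the computed interval can never be negative.
    """
    start = time.monotonic()
    work()
    return time.monotonic() - start

dt = elapsed_monotonic(lambda: time.sleep(0.05))
assert dt >= 0.0  # guaranteed by the monotonic clock
print(f"elapsed: {dt:.3f} s")
```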
> But so does resolving to a date. I don't see how resolving to a date which cares about leap days fixes any of that.
So why bring it up then?
> You should use a monotonic clock with an arbitrary starting point anyway
Sure. We started doing that more than 50 years ago now, when broad-area geophysical surveying started off.
> unless you need some kind of synchronization between devices,
Can't see the problem - there are ways of syncing base station records against aircraft | boat | vehicle records in post processing. All the stations, fixed or mobile, use a monotonic epoch-based record structure that holds channel data and any sync marks broadcast by whatever means - raw GPS time serves well enough for a grain of 1.5 seconds; other marks can be used as required.
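A toy sketch of that post-processing alignment, with a hypothetical record layout (the field names and sample values are assumptions for illustration): each station logs samples against its own monotonic epoch, and a shared broadcast sync mark gives the offset needed to put one station's timeline onto another's.

```python
# Hypothetical layout: each station logs (monotonic_time, value) samples,
# plus the monotonic time at which it saw a shared broadcast sync mark
# (e.g. a GPS time mark). The mark relates the two arbitrary epochs.

def clock_offset(base_mark, mobile_mark):
    """Offset to add to mobile monotonic times so they land on the
    base station's monotonic timeline."""
    return base_mark - mobile_mark

def align(mobile_samples, base_mark, mobile_mark):
    """Re-express mobile samples on the base station's timeline."""
    off = clock_offset(base_mark, mobile_mark)
    return [(t + off, v) for t, v in mobile_samples]

# Base station saw the sync mark at t=100.0 on its clock, the mobile
# station at t=40.0 on its own clock:
aligned = align([(40.0, 1.2), (41.0, 1.3)], 100.0, 40.0)
# aligned -> [(100.0, 1.2), (101.0, 1.3)]
```

The same arithmetic works regardless of where each monotonic epoch started, which is the point: no station ever needs a civil (Unix) timestamp during acquisition.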
> If you're measuring | controlling objects in the physical world (cars, rockets, etc) then you should not use unix time - those glitches will happen and instantaneous computations will go kooky.
https://en.wikipedia.org/wiki/Unix_time#Leap_seconds