Hacker News

Sort of, kind of, but not shot at the same time, and not at the same location.

I would object slightly less if they built a model (3D or AI) that captures the whole visible side of the Moon in high detail, and used it, combined with the precise location and date/time, to resolve the blob in the camera input into a high-resolution rendering *that matches, with high accuracy and precision, what the camera would actually see if it had better optics and a better sensor*. It still feels like faking things, but at least the goal would be to match reality as closely as possible.
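A first step in a pipeline like that would be predicting what the Moon should look like at the given date/time before rendering anything. A minimal sketch of that prediction step, using a mean-phase approximation (the reference new-moon epoch and the omission of libration and observer location are simplifying assumptions for illustration, not part of any real camera pipeline):

```python
import math
from datetime import datetime, timezone

SYNODIC_MONTH = 29.530588853  # mean length of a lunation, in days
# A known new moon used as the reference epoch (2000-01-06 18:14 UTC).
REFERENCE_NEW_MOON = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

def moon_phase_fraction(when: datetime) -> float:
    """Phase as a fraction of the synodic cycle (0 = new, 0.5 = full)."""
    days = (when - REFERENCE_NEW_MOON).total_seconds() / 86400.0
    return (days / SYNODIC_MONTH) % 1.0

def illuminated_fraction(when: datetime) -> float:
    """Approximate illuminated fraction of the lunar disc (0..1)."""
    phase = moon_phase_fraction(when)
    return (1.0 - math.cos(2.0 * math.pi * phase)) / 2.0
```

A real implementation would also need the Moon's libration and the observer's position to pick the correct face and orientation of the detailed model; an ephemeris library would replace this mean-cycle arithmetic.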


