
The "average color" (or whatever it is) could have been pre-computed server-side rather than tiring out the poor innocent client CPUs.


But then Google would be responsible for that one-time computation instead of making the clients do it billions of times.


They could do it on a few clients and then ship the data back to the server. If they're resourceful, those clients don't even need to be watching the video! (They could send it down and have it computed in the background of another stream.)


But that's a distributed-systems problem now, and those eat up valuable developer time, which we all know is the most important resource in the world...


Yeah, but if Google solved distributed processing, just imagine the cost savings elsewhere (involuntarily crowdsourcing video encoding alone would save them millions).


The effect is created by scaling and blurring the storyboard images that are also used for the seek preview. The background image is refreshed every ten seconds, with a CSS opacity animation fading from the old image to the new one over a couple of seconds.

This sounds like it should be relatively cheap if compositing is properly accelerated.
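A rough sketch of that mechanism: two stacked full-viewport divs behind the player, each styled with something like "filter: blur(40px); transform: scale(1.5); transition: opacity 2s", which browsers typically run on the GPU compositor. The element ids, URL scheme, and helper below are invented (the real storyboard is a sprite sheet, so you'd crop a tile rather than load a per-timestamp image):

  const video = document.querySelector('video') as HTMLVideoElement;
  const layers = [
    document.getElementById('ambient-a') as HTMLDivElement,
    document.getElementById('ambient-b') as HTMLDivElement,
  ];
  let front = 0;

  // Invented URL scheme, for illustration only.
  function storyboardUrlFor(seconds: number): string {
    return `/storyboard/${Math.floor(seconds / 10)}.jpg`;
  }

  function swapBackground(url: string): void {
    const back = 1 - front;
    // A real version would preload the image first to avoid a flash.
    layers[back].style.backgroundImage = `url(${url})`;
    layers[back].style.opacity = '1';  // the CSS transition does the fade
    layers[front].style.opacity = '0';
    front = back;
  }

  // Refresh every ten seconds, as described above.
  setInterval(() => {
    swapBackground(storyboardUrlFor(video.currentTime));
  }, 10_000);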


Couldn't this be done cheaply on the GPU?


The browser isn’t the ideal place to do things “on the GPU” unless the site is designed around it.


It could probably be somewhat cheaply extracted during decoding.
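Even without touching the decoder there's a cheap client-side approximation: draw the current frame into a 1x1 canvas and let the downscale do the averaging. Roughly (names illustrative; the video must be CORS-clean or getImageData will throw):

  const video = document.querySelector('video') as HTMLVideoElement;
  const canvas = document.createElement('canvas');
  canvas.width = canvas.height = 1;
  const ctx = canvas.getContext('2d', { willReadFrequently: true })!;

  function currentAverageColor(): [number, number, number] {
    // The scaler's downsample approximates the mean of the frame.
    ctx.drawImage(video, 0, 0, 1, 1);
    const [r, g, b] = ctx.getImageData(0, 0, 1, 1).data;
    return [r, g, b];
  }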



