
The blur happens on the GPU. HEVC encode also happens on the GPU (or at least a GPU-adjacent device; it's rarely a full-shader affair). If you were to use HEVC software encode with GPU blur, you'd need to send the camera data to the GPU, pull it back to the CPU, and then software encode. Performant GPU readback is often cumbersome enough that developers won't bother.
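
To make the readback step concrete, here is a rough sketch in OpenGL of pulling the blurred frame back to the CPU for a software encoder. It assumes a GL context is already created, a loader (GLEW/glad) is initialized, and the blurred frame is rendered into the currently bound framebuffer; swEncodeHevcFrame is a made-up stand-in for whatever software encoder API you'd actually hand the pixels to:

    // Assumes: GL context created, glewInit() already called, and the
    // blurred frame rendered into the currently bound framebuffer.
    #include <GL/glew.h>
    #include <cstdint>

    // Hypothetical stand-in for a software HEVC encoder entry point.
    void swEncodeHevcFrame(const uint8_t* rgba, int width, int height);

    void readbackAndEncode(int width, int height) {
        // Read into a pixel-pack buffer so the copy can be scheduled by
        // the driver instead of stalling like a plain glReadPixels into
        // client memory would.
        GLuint pbo = 0;
        glGenBuffers(1, &pbo);
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
        glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4, nullptr,
                     GL_STREAM_READ);

        // With a PBO bound, the data argument is an offset into the buffer.
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

        // Mapping here still synchronizes; a real pipeline double-buffers
        // PBOs so the map happens a frame later. This is part of why
        // "performant readback" gets cumbersome.
        auto* pixels = static_cast<uint8_t*>(
            glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY));
        if (pixels) {
            swEncodeHevcFrame(pixels, width, height);
            glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
        }

        glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
        glDeleteBuffers(1, &pbo);
    }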


But it is still generally more performant to do so. There's more high-quality image processing happening than just background removal nowadays, such as lighting correction and sometimes upscaling, and you wouldn't want to do all of that on the CPU.

But also, hardware encoding of some codecs isn't always great quality and doesn't always support the advanced features RTC needs, so the CPU encoding code path is sometimes even forced! That doesn't necessarily apply to HEVC, since software HEVC would require a license (and almost all apps rely on the system having one), but it happens fairly often with VP9 or AV1.
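
A rough sketch of the kind of fallback decision an RTC stack ends up making; every capability-check name and the codec enum here are hypothetical, standing in for whatever queries the platform actually exposes, not a real API:

    // Illustrative only: the parameters model platform capability checks.
    enum class Codec { H264, HEVC, VP9, AV1 };

    struct EncoderChoice {
        Codec codec;
        bool useHardware;
    };

    EncoderChoice pickEncoder(Codec preferred,
                              bool hwAvailable,
                              bool hwSupportsTemporalLayers,
                              bool hwQualityAcceptable) {
        // RTC needs features like temporal scalability and tight rate
        // control; if the hardware encoder can't provide them, fall back
        // to the CPU path even though it costs more power.
        if (hwAvailable && hwSupportsTemporalLayers && hwQualityAcceptable) {
            return {preferred, true};
        }
        // Software HEVC is usually off the table without a license, so
        // the CPU fallback tends to be VP9/AV1 (or H.264) instead.
        if (preferred == Codec::HEVC) {
            return {Codec::VP9, false};
        }
        return {preferred, false};
    }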



