And does the background blurring part of their pipeline somehow consume the raw H.265 bitstream directly? Wouldn't they be blurring based on the raw pixel buffer, before any encoding takes place?
Someone elsewhere in the thread replied that it comes down to the blurring happening on the GPU, combined with bandwidth issues when reading that data back, followed by software encoding of the video.
If I understand all that correctly, blurring is cheap when the raw video data is already on the GPU for encoding, but reading it back introduces too much latency when combined with software encoding.
A software blur should be possible, but that path hasn't been implemented and wouldn't be nearly as cheap as it is on the GPU.
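Just to put rough numbers on the readback cost (a back-of-envelope sketch; the resolution, framerate, and pixel format are my assumptions, not from the thread):

```python
# Hypothetical estimate of the GPU->CPU readback needed if blurred frames
# had to be copied back for software encoding. All figures are assumptions.
width, height = 1920, 1080     # assumed 1080p capture
fps = 30                       # assumed framerate
bytes_per_pixel = 1.5          # NV12: full-res luma plane + half-res chroma

frame_bytes = width * height * bytes_per_pixel
per_second = frame_bytes * fps

print(f"Per frame: {frame_bytes / 1e6:.1f} MB")
print(f"Sustained: {per_second / 1e6:.1f} MB/s read back from the GPU")
# The raw bandwidth isn't huge, but each synchronous readback can stall the
# pipeline, which is presumably where the added latency comes from.
```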
If so, time for customers to complain to Microsoft.