Hacker News

I don't understand: you want a high-bitrate, accurate display, to the point of actually considering the XDR, but simultaneously you want LOSSY compression? The not-so-secret secret of Display Stream Compression is that it degrades picture quality.


It was news to me, but the XDR _also_ uses DSC.

I want high refresh rates, which help even for operations work and web browsing, not just photography, and which reduce eye fatigue. And I want HDR.

Frankly, the only reason I'd consider the XDR is, as mentioned, aesthetics. I've got the 2019 Mac Pro, and I recently bought a house and set up my home desk in the middle of my office rather than against a wall, so, entirely superficially, the back of the XDR display would look nice.

The only thing the XDR has going for it is color accuracy (which is orthogonal to refresh rate, though I'm sure it's impacted by lossy compression) and resolution (though I still prefer my two ultra-thin-bezel 4K screens to one 6K screen). The XDR's refresh rate is only 60Hz.

Oh, and the XDR isn't bugged/broken/crippled by Apple in a way that keeps it from running at full capability.
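For what it's worth, the reason DSC shows up at all here is simple arithmetic. A back-of-the-envelope sketch (assuming the XDR's 6016x3384 panel, 10-bit-per-channel HDR, and a DisplayPort 1.4 HBR3 link with 8b/10b coding; real links also spend bandwidth on blanking intervals, which only makes it worse):

```python
# Back-of-the-envelope: why a 6K HDR stream doesn't fit DisplayPort 1.4 uncompressed.
W, H = 6016, 3384   # Pro Display XDR panel resolution
HZ = 60             # refresh rate
BPP = 30            # 10 bits per channel x 3 channels (HDR)

raw_gbps = W * H * HZ * BPP / 1e9    # active pixel data only, ignoring blanking
dp14_payload_gbps = 32.4 * 8 / 10    # HBR3: 32.4 Gbit/s raw, 8b/10b line coding

print(f"uncompressed stream: {raw_gbps:.1f} Gbit/s")
print(f"DP 1.4 payload:      {dp14_payload_gbps:.1f} Gbit/s")
# The stream exceeds the link even before blanking overhead, so something has
# to give: DSC, chroma subsampling, lower bit depth, or lower refresh rate.
```

That is roughly 36.6 Gbit/s of pixels against about 25.9 Gbit/s of link, which is why Apple reaches for DSC rather than dropping bit depth.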


Lossy compression methods are usually smarter about choosing where to degrade picture quality. For example, reducing JPEG quality usually produces much better pictures at the same file size than reducing image resolution does.
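The intuition can be shown with a toy sketch (this is a 1-D analogy with a synthetic signal, not actual JPEG coding): given the same bit budget, quantizing every sample more coarsely ("lower quality") tends to lose less than throwing away every other sample ("lower resolution") and repeating neighbors.

```python
import math

# Synthetic 1-D "image": a smooth wave plus a faster-varying component.
signal = [math.sin(i / 5) + 0.5 * math.sin(i / 2) for i in range(256)]
lo, hi = min(signal), max(signal)

def quantize(x, bits):
    """Uniform quantization of x to 2**bits levels over [lo, hi]."""
    levels = 2 ** bits - 1
    q = round((x - lo) / (hi - lo) * levels)
    return lo + q / levels * (hi - lo)

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Same total budget (4 bits/sample on average) spent two ways:
quality_cut = [quantize(x, 4) for x in signal]                  # 256 samples x 4 bits
res_cut = [quantize(signal[i - i % 2], 8) for i in range(256)]  # 128 samples x 8 bits, held

print(f"lower quality RMSE:    {rmse(signal, quality_cut):.3f}")
print(f"lower resolution RMSE: {rmse(signal, res_cut):.3f}")
```

On this signal the coarser quantization comes out well ahead, because resolution loss throws away all high-frequency detail while quantization spreads a small error everywhere. Real codecs like JPEG push this much further by quantizing in the frequency domain, spending bits where the eye notices them.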



