
I have bad eyes but apparently not as bad as yours. Let's go ask a blind man if they prefer 1080p or 4K, that'll settle it.

If you don't see the difference, it doesn't mean there's none.



Not OP, but if you don't see the difference, why pay for it? Rendering at 4K will also be more taxing for the graphics card or CPU, I believe.


Huh!? No one said 'the difference is not there'.

I said the difference is not meaningful enough (for me and many other people, judging by the comments), and that stating that 1080p 'is simply a non-starter' and 'cannot' render text 'correctly' is not true. It can render it more correctly, yes.

To be really annoying: even 8K won't render text 100% correctly. If a glyph has a circular section, e.g. 'p' or 'b', no pixel-based device can give a 'true rendition' of the symbol, since you'd need essentially infinite precision to correctly display a circle-like shape on a grid-based scheme (i.e. a pixel-based display).

Sure, 4K or 8K will approximate it better, but (and this is really my point) 1080p is sufficient for a good subset of people.
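The pixel-grid point above can be illustrated with a quick sketch (my own toy model, not anything from a real rasterizer): sample points on an ideal circle, snap each one to the nearest pixel center at a given grid density, and measure the worst-case distance between the true curve and the grid. The error shrinks as density grows but never hits zero.

```python
import math

def max_snap_error(radius, pixels_per_unit, samples=10000):
    """Worst-case distance from a point on the true circle to the
    nearest pixel center, for a grid of spacing 1/pixels_per_unit."""
    spacing = 1.0 / pixels_per_unit
    worst = 0.0
    for i in range(samples):
        theta = 2 * math.pi * i / samples
        x = radius * math.cos(theta)
        y = radius * math.sin(theta)
        # snap the ideal point to the nearest pixel center
        px = round(x / spacing) * spacing
        py = round(y / spacing) * spacing
        worst = max(worst, math.hypot(px - x, py - y))
    return worst

lo = max_snap_error(1.0, 1080)   # "1080p-like" grid density
hi = max_snap_error(1.0, 4320)   # "8K-like" grid density
print(lo, hi)  # denser grid -> smaller error, but never exactly zero
```

This ignores anti-aliasing and subpixel rendering, which hide the error from the eye, but the underlying approximation never becomes exact.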

Not really sure how the blind man helps either side of this discussion?


The whole argument is silly. People coded on 21" CRT and LCD monitors, even at 640×480, and yes ... they still live!

While higher resolutions look nicer, they provide no real value for a console. I've had 4x xterms on my desktop for 30+ years.

There is zero "extra space" with higher dpi. None.

One sad downside of things looking nicer is the battery cost.

UHD vs. HD means 4x the pixels, and a GPU rendering all of those pixels. If you want a 14-hour battery and light weight, you won't be getting it with a UHD screen in 2022.
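The 4x figure checks out, assuming the standard dimensions for HD (1920×1080) and UHD (3840×2160):

```python
# Pixel counts for the two common resolutions.
hd = 1920 * 1080    # 1080p ("HD"): 2,073,600 pixels
uhd = 3840 * 2160   # 2160p ("UHD"/4K): 8,294,400 pixels
print(uhd / hd)  # → 4.0
```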


Pointing at the GPU is an excuse. That thing doesn't need to go over 1% load to make good text happen.


Yup!



