/thread
*instantly unsubs from discussion as well*
If you see both colors combined, it's because the monitor is already sharper than your own visual acuity. A 25" monitor at normal desktop distance will be blended; adding more pixels won't change what you perceive in the image. Go larger than 25" and the pixel density drops, so you may begin to see differences and have to move to a 1440p monitor, which has the same pixel density at 32" as a 1080p screen does at 24". That is the scenario where you notice a difference.
above 25", 4K make look sharper than 1080p, but it won't look sharper than 1440p until you get up to 38" at desktop distances. Meaning about 25-30" from the screen. Gamers aren't gaming on a monitor this size usually. Unless it's an ultrawide in which that doesn't change pixel density.
Now, here's the other side of this. If you play a game at 4K, it typically takes about 4x the hardware to render it at the same frames per second. That means in a modern game you may struggle to maintain 30fps on high settings without leaning on gimmicks like DLSS and dynamic resolution, neither of which is really 4K; they're lower resolutions with a filter. And if you're running a game at 30fps, it's going to look blurrier when the camera moves than a lower resolution running at a higher frame rate. The motion blur will outweigh resolution differences you can't even see.
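The "4x" figure follows from the raw pixel counts; a quick illustration (pixel ratios only, since real GPU cost doesn't scale perfectly linearly with pixel count):

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    # Compare each resolution's pixel count against 1080p.
    print(f"{name}: {w * h:,} pixels, {w * h / base:.2f}x the pixels of 1080p")
# 1080p: 2,073,600 pixels, 1.00x
# 1440p: 3,686,400 pixels, 1.78x
# 4K: 8,294,400 pixels, 4.00x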
This is why gamers, at least any of them with two neurons to rub together, do not game at 4K.
I figure someone has 8 smartphone screens linked up right now, going, "It works fine."
I mean... you could watch it, but it would look atrocious. VHS on a CRT TV would look better.
Anything above that and you really need a logic course.
Mostly the same for me. The difference, though, is when it comes to doing things most people don't do in games. I play a lot of older stuff, including SNES, PS1, and later. Those usually scale better into 1440p: 240p, 480p, and 720p all fit into it perfectly. More interesting to me at this point, though, are filters and shaders. A proper CRT shader, where each pixel of game content is broken down into vertical channels of red, green, and blue to simulate the mask on a CRT screen, does look and scale better with more real estate. Playing games with those filters is absolutely amazing.
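A quick check of why those retro resolutions fit 1440p cleanly but not 1080p (just dividing the vertical line counts; the shader point above is separate):

retro = [240, 480, 720]
for target in (1080, 1440):
    for src in retro:
        scale = target / src
        note = "integer scale" if scale.is_integer() else "non-integer, needs uneven scaling"
        print(f"{src}p -> {target}p: {scale:g}x ({note})")
# 1080p gives 4.5x, 2.25x, and 1.5x; 1440p gives clean 6x, 3x, and 2x.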
You would never really tell a difference at normal viewing distances with today's games, as they are all blurred from the factory.
The only monitor that makes a huge difference for me is my 16" 1600p monitor. But it sucks outside of its crazy PPI.
The biggest reason they don't make games like that is that even an AAA game will have fewer than 1,000 users actually running such an expensive setup.
https://steamhost.cn/hwsurvey/
Another reason is consoles. I don't think there's even one that runs past 4K. (Correct me if I'm wrong.)
Devs can't be bothered to update the UI when they port from PC to console. They certainly don't care about resolution.
Even remakes of old games are lucky to get a 1080p boost. (Looking at you, Mass Effect.)