Monitor resolution
My monitor's native resolution is 1080p at 144 Hz, but in display settings it goes up to 1440p at 75 Hz. All my games also list resolutions beyond 1080p, up to 1440p, and even Warhammer 40,000: Darktide shows 1440p as my native resolution. Why?

Now when I set the resolution to 1440p in-game or in Windows display settings, my monitor shows a warning that I'm not on the optimal resolution, and it's clearly noticeable that it doesn't actually support 1440p.

So what's going wrong, and what should I do?

link:
https://amzn.in/d/hrUN2Io
_I_ 29 Jul @ 12:13am 
Disable DSR/VSR in the GPU control panel.

It forces the GPU to render at a higher resolution and then scale the image back down.
Originally posted by _I_:
Disable DSR/VSR in the GPU control panel.

It forces the GPU to render at a higher resolution and then scale the image back down.
I tried to find it, but it's not in AMD's GPU control panel; I think it's off by default. The only thing I turned on in the GPU control panel is FSR 4 support over 3.1. GPU: RX 9060 XT 16 GB.
Last edited by † ROGUE †; 29 Jul @ 12:18am
Where is the monitor connected to?
The GPU or the motherboard?
DVI, HDMI, or a display adapter?
Originally posted by Jaunitta 🌸:
Where is the monitor connected to?
The GPU or the motherboard?
DVI, HDMI, or a display adapter?
The GPU, via DisplayPort.
Last edited by † ROGUE †; 29 Jul @ 12:23am
_I_ 29 Jul @ 12:31am 
AMD calls it VSR, NVIDIA calls it DSR.
They do the same thing; look up where it is in the control panel.

https://www.amd.com/en/resources/support-articles/faqs/DH3-010.html
Disable it.
Last edited by _I_; 29 Jul @ 12:32am
Yes, if you appear to have any display scaling issues, double-check the screen resolution as well as the OS DPI scaling (set it to 100% to get the correct size at the native resolution), and disable DSR/VSR.
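If you'd rather verify both of those from a script than dig through menus, here's a minimal sketch (not from this thread, just for checking) that asks Windows for the current resolution and DPI scaling, using Python's ctypes against standard Win32 calls; it assumes Windows 10 version 1607 or newer for GetDpiForSystem:

import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()          # report real pixels instead of DPI-virtualized ones

width  = user32.GetSystemMetrics(0)  # SM_CXSCREEN: current width of the primary display
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN: current height of the primary display
dpi    = user32.GetDpiForSystem()    # 96 DPI corresponds to 100% scaling

print(f"Current mode: {width}x{height}, OS scaling: {dpi * 100 // 96}%")

On a panel like the one in this thread you'd want it to print 1920x1080 at 100%; if it comes back as 2560x1440, something upstream (VSR/DSR or the mode list itself) is still offering the higher mode.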
I don't know for sure, but I once had a monitor that accepted resolutions higher than the native resolution of the panel.
Windows used to get confused because it simply assumed that the highest resolution the monitor reported was the panel's resolution, and it wasn't.
The solution for me was to download an editor that lets you edit the data file that Windows reads from the monitor (the EDID) and delete the resolutions you don't want.
I'm sorry, I can't remember the name of the application; you'll have to search the internet for it. It's not easy to use either, it's quite confusing, but it did work.
Last edited by Pocahawtness; 29 Jul @ 12:42am
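For reference, the "data file" being described here is the EDID block the monitor sends over the cable, and Windows caches a copy of it in the registry. The sketch below (mine, only for inspection, not the editing tool mentioned above) reads those cached copies and decodes the preferred/native mode from the first detailed timing descriptor; it assumes Python on Windows and doesn't change anything:

import winreg

def preferred_mode(edid: bytes):
    # The first detailed timing descriptor starts at byte 54 of the 128-byte
    # base EDID block and carries the panel's preferred (native) mode.
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active

root = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, root) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                try:
                    with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # some entries have no cached EDID
                w, h = preferred_mode(edid)
                print(f"{model} ({instance}): preferred mode {w}x{h}")

The registry can also hold entries for monitors connected in the past, so you may see more than one line. If the preferred mode printed here is 1920x1080 but Windows and games still offer 1440p, the extra modes are coming from elsewhere in the EDID (or from the driver), which is exactly what the editor described above is used to trim.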
Originally posted by _I_:
AMD calls it VSR, NVIDIA calls it DSR.
They do the same thing; look up where it is in the control panel.

https://www.amd.com/en/resources/support-articles/faqs/DH3-010.html
Disable it.


Originally posted by Bad 💀 Motha:
Yes, if you appear to have any display scaling issues, double-check the screen resolution as well as the OS DPI scaling (set it to 100% to get the correct size at the native resolution), and disable DSR/VSR.
VSR is disabled; I just checked the GPU control panel.
Yes, I can't really give you a 100% answer as to why.

Sometimes it's the way the GPU detects the display, or something like that.

On my GTX/RTX GPUs I've never had an issue like that. However, when I set up a Lenovo (AMD Radeon iGPU) and an HP (Intel iGPU) customer desktop PC on my 1440p/165Hz display, at first I thought everything was fine, but then YouTube was really sluggish. I checked around and saw it was pushing the display at 2160p/60Hz for some reason, and Windows had the DPI scaling turned up, which is why it "looked" like 1440p.
Last edited by Bad 💀 Motha; 29 Jul @ 12:44am
Originally posted by Bad 💀 Motha:
Yes, I can't really give you a 100% answer as to why.

Sometimes it's the way the GPU detects the display, or something like that.

On my GTX/RTX GPUs I've never had an issue like that. However, when I set up a Lenovo (AMD Radeon iGPU) and an HP (Intel iGPU) customer desktop PC on my 1440p/165Hz display, at first I thought everything was fine, but then YouTube was really sluggish. I checked around and saw it was pushing the display at 2160p/60Hz for some reason, and Windows had the DPI scaling turned up, which is why it "looked" like 1440p.
In my Windows display settings it shows resolutions up to 1440p, but 1080p is marked as recommended.
So to conclude: is this a monitor firmware bug, since it's giving wrong info to the other software on the system? Is that it?
Originally posted by † ROGUE †:
So to conclude: is this a monitor firmware bug, since it's giving wrong info to the other software on the system? Is that it?

No, read my reply. It's not a bug; it's deliberate, so the monitor can accept higher resolutions. The Windows issue is that it assumes the native resolution is the highest resolution the monitor reports. But you can change this, as I said before.

The file only needs to be edited once; Windows only reads it when it detects the monitor for the first time.
Last edited by Pocahawtness; 29 Jul @ 1:08am
Originally posted by Pocahawtness:
Originally posted by † ROGUE †:
So to conclude: is this a monitor firmware bug, since it's giving wrong info to the other software on the system? Is that it?

No, read my reply. It's not a bug; it's deliberate, so the monitor can accept higher resolutions. The Windows issue is that it assumes the native resolution is the highest resolution the monitor reports. But you can change this, as I said before.

The file only needs to be edited once; Windows only reads it when it detects the monitor for the first time.
I already read it. I mean, what's the point of advertising higher resolutions if they aren't actually supported? I feel like it's clearly the monitor manufacturer's fault, and the faulty information gets passed down to the other software on the system, leading to completely misleading behaviour. But as long as I set the resolution to 1080p it's all good. At least that's how I'm thinking about it.
OK, have you tried the DDU method? It also has an option to wipe your displays, so that after the GPU driver wipe and a reboot your OS can freshly detect the displays again.
Last edited by Bad 💀 Motha; 29 Jul @ 1:53am
Originally posted by † ROGUE †:
Originally posted by Pocahawtness:

No, read my reply. It's not a bug; it's deliberate, so the monitor can accept higher resolutions. The Windows issue is that it assumes the native resolution is the highest resolution the monitor reports. But you can change this, as I said before.

The file only needs to be edited once; Windows only reads it when it detects the monitor for the first time.
I already read it. I mean, what's the point of advertising higher resolutions if they aren't actually supported? I feel like it's clearly the monitor manufacturer's fault, and the faulty information gets passed down to the other software on the system, leading to completely misleading behaviour. But as long as I set the resolution to 1080p it's all good. At least that's how I'm thinking about it.

They are supported; the monitor will scale the input down to its native resolution. It's something some monitors have in their software to allow greater flexibility.
The real problem lies with Windows or the games, because they assume that the highest reported resolution is the native resolution, which it isn't.
You can quickly check this because the data reported by the monitor is displayed in the NVIDIA Control Panel under Change Resolution. If it shows resolutions higher than native, that's the problem, and that's why the file needs to be edited. Just remove all the options that are higher than the native resolution and you're done (I say "just", but it is quite difficult).
Last edited by Pocahawtness; 29 Jul @ 1:57am
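To make that same check driver-agnostic (the OP is on an AMD card, so there is no NVIDIA Control Panel), the mode list can also be pulled straight from Windows. A rough sketch, again Python with ctypes; the DEVMODEW layout is copied from the Win32 headers, and passing None as the device name means the primary display:

import ctypes

class DEVMODEW(ctypes.Structure):
    # Win32 DEVMODEW structure (display arm of the unions).
    _fields_ = [
        ("dmDeviceName",         ctypes.c_wchar * 32),
        ("dmSpecVersion",        ctypes.c_ushort),
        ("dmDriverVersion",      ctypes.c_ushort),
        ("dmSize",               ctypes.c_ushort),
        ("dmDriverExtra",        ctypes.c_ushort),
        ("dmFields",             ctypes.c_ulong),
        ("dmPositionX",          ctypes.c_long),
        ("dmPositionY",          ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           ctypes.c_wchar * 32),
        ("dmLogPixels",          ctypes.c_ushort),
        ("dmBitsPerPel",         ctypes.c_ulong),
        ("dmPelsWidth",          ctypes.c_ulong),
        ("dmPelsHeight",         ctypes.c_ulong),
        ("dmDisplayFlags",       ctypes.c_ulong),
        ("dmDisplayFrequency",   ctypes.c_ulong),
        ("dmICMMethod",          ctypes.c_ulong),
        ("dmICMIntent",          ctypes.c_ulong),
        ("dmMediaType",          ctypes.c_ulong),
        ("dmDitherType",         ctypes.c_ulong),
        ("dmReserved1",          ctypes.c_ulong),
        ("dmReserved2",          ctypes.c_ulong),
        ("dmPanningWidth",       ctypes.c_ulong),
        ("dmPanningHeight",      ctypes.c_ulong),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Walk every mode the display driver exposes for the primary monitor.
modes = set()
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    modes.add((mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes, reverse=True):
    print(f"{w}x{h} @ {hz}Hz")

Anything listed above the panel's native 1920x1080 is what the games and the resolution pages are picking up; if trimming the EDID (or disabling VSR) works, those entries should disappear after the display is re-detected.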