Blackhawk wrote: ↑Tue Nov 21, 2023 2:45 pm
Grifman wrote: ↑Tue Nov 21, 2023 12:26 pm
So back in the day, for the most part a monitor was a monitor, but time has passed me by. Now you have all sorts of resolutions, different frequencies, HDR, various types of “syncs”, etc. I’d like to get a new monitor to take advantage of my new RTX 4080 graphics card. I’ve read that 1440p is good because you can get high refresh rates that you can’t get with 4K monitors. I’m looking for either a 27 or 32 inch, not sure whether it should be flat or curved (not sure what the latter brings to the table). Any recommendations from the hive mind? Thanks.
You didn't specify, so I'm assuming that gaming is the goal.
1080p is fine, but on larger monitors you can see the pixels. 4K works better on TVs than on monitors, although there is some benefit above 27". 1440p is a good sweet spot. I do recommend that you get at least a 144Hz refresh rate, but look here before you decide exactly how high you need to go.
'Sync' is complicated. What it essentially does is synchronize the monitor's refresh rate with the GPU's frame output. It reduces tearing and ghosting (afterimage), and generally makes things smoother.
Short version, grossly oversimplified: Imagine you have a 10Hz monitor. It shows 10 pictures per second. Imagine you have a video card that puts out 10 FPS. They line up perfectly. Now imagine your GPU is putting out 5 FPS. You're getting the same frame twice in a row (it stutters), but everything still lines up: every two monitor refreshes (10/sec) get one new frame (5/sec). Now imagine that you're getting 7 FPS. It doesn't match up. The GPU draws one frame, the monitor shows it, the monitor shows part of it again, then the GPU draws another frame, and the monitor shows you half of the first frame and half of the second. That's called frame tearing.
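If it helps to see why 7 FPS on a 10Hz display can't line up, here's a toy simulation of that example (my own illustration, and heavily simplified - a real display scans out continuously, so tearing behaves a bit differently in practice):

```python
# Toy model: 10 Hz monitor, GPU finishing a frame every 1/7 second, no sync.
# A refresh counts as "torn" here if a new GPU frame lands partway through it.
REFRESH = 1 / 10   # seconds per monitor refresh
FRAME = 1 / 7      # seconds per GPU frame

for r in range(10):  # the first second's worth of refreshes
    start, end = r * REFRESH, (r + 1) * REFRESH
    torn = any(start < k * FRAME < end for k in range(1, 8))
    print(f"refresh {r}: {'TEAR' if torn else 'clean'}")
# 6 of the 10 refreshes get a frame swap partway through.
```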
The classic Vsync setting addresses this by forcing the GPU's output to match the monitor's refresh rate (or a number that divides into it, so no half frames are created). In our example of 7 FPS on a 10Hz monitor, it would lower your FPS to 5 (in real-world gameplay this isn't nearly as severe a reduction). What Gsync does is the opposite: it adjusts your monitor's refresh rate on the fly to match the framerate (or a multiple of it). In our 7/10 example, it would lower the monitor's refresh rate to 7Hz to match the GPU output. If you had 7 FPS on a 20Hz monitor, it would lower the refresh to 14Hz and show each frame twice.
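Here's a minimal sketch of that arithmetic, in case it's easier to read as code (these are made-up helper functions for illustration, not anything from the Nvidia driver; real Vsync and Gsync implementations are more involved):

```python
import math

def vsync_fps(refresh_hz, gpu_fps):
    """Effective framerate under classic double-buffered Vsync:
    a frame that misses a refresh waits for the next one."""
    refreshes_per_frame = math.ceil(refresh_hz / gpu_fps)
    return refresh_hz / refreshes_per_frame

def gsync_hz(gpu_fps, panel_min_hz, panel_max_hz):
    """Refresh rate a variable-refresh (Gsync/FreeSync) panel would run at.
    Below the panel's minimum, each frame is shown multiple times
    to bring the refresh rate back into range."""
    hz = gpu_fps
    while hz < panel_min_hz:
        hz *= 2  # repeat each frame
    return min(hz, panel_max_hz)

print(vsync_fps(10, 7))     # 5.0 -> Vsync drops 7 FPS to 5 on a 10Hz panel
print(gsync_hz(7, 5, 10))   # 7   -> Gsync runs the panel at 7Hz
print(gsync_hz(7, 10, 20))  # 14  -> 7 FPS doubled to 14Hz on a 20Hz panel
```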
Ghosting is when one image isn't fully cleared from the display before the next one is drawn, leaving a faint trailing afterimage.
Gsync helps reduce this, although it can only do so much to compensate for a low-quality monitor.
For Gsync: You've got an Nvidia card, so you want either a Gsync monitor or a Gsync compatible one (but there is a caveat - see below). A Gsync compatible monitor is actually a FreeSync monitor (AMD's equivalent) that works correctly with Gsync. That gets a little complicated, though: Nvidia only certifies a few monitors, while most gaming monitors that are actually compatible are not certified, and therefore are not listed as such. That means a bit of research is in order. Pick a few monitors that suit you, then search for them along with 'gsync compatible' and do some reading. Someone will have tested the monitor you're looking at.
Here's the caveat: Some Gsync and Gsync compatible monitors - especially those that use the older tech - cannot handle both Gsync and HDR simultaneously. Some can. If that matters to you, it's something you should check into.
Next up: Response time. Response time is how long it takes a pixel to change color. It's something that doesn't matter much for office use (you don't notice how long it takes for the pixels to change from white to black when you type the letter 'M'). For gaming, though, it adds to input lag. What you want depends on what kind of games you play. If all you play are turn-based games, you'll be OK with anything under 4-5ms. If you play anything at all that involves timing, reflexes, or quick responses, you want to look for 1-2ms.
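For a sense of scale (my own back-of-the-envelope numbers, not measurements from any particular panel), compare response time to how long a single frame is even on screen:

```python
# Frame duration at common refresh rates vs. typical pixel response times.
for hz in (60, 144, 240):
    frame_ms = 1000 / hz
    print(f"{hz:>3}Hz -> {frame_ms:.1f}ms per frame")
# 60Hz -> 16.7ms, 144Hz -> 6.9ms, 240Hz -> 4.2ms.
# A 5ms pixel spends most of a 144Hz frame still mid-transition,
# which is why slow panels smear more as refresh rates climb.
```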
As for curved vs flat... that's more subjective.
Flat:
~Take up more horizontal space
~Tend to have better response times (see above), which is important for games like shooters
~Smaller monitors aren't wide enough to really benefit from the curve. 27" is usually considered to be right at the point that it could go either way.
Curved:
~Take up more front-to-back space
~Can give you a better sense of being 'in the game' as they give you more peripheral vision
~This one might be a deciding factor: If you're using it for PC gaming at a desk, a 32 inch is going to be really, really big. The problem with gaming on huge monitors up close is that the interface (or other visuals) on the edges and the corners can be hard to see. Your health and your status effects might be close to three feet away from each other, far enough that you may need to move your whole head instead of just using your peripheral vision. A curved monitor brings them a little closer, as measured by the distance your eye needs to move.
One factor you didn't bring up: normal aspect ratio (2560x1440) vs ultrawide (3440x1440). Ultrawides do have some advantages: they give you better immersion, and they give you more screen real estate. They do come with some trade-offs, though. They take up more space. They come with a performance impact, although much less than 4K (1440p = ~3.7 million pixels to render, 1440 ultrawide = ~5 million, 4K = ~8.3 million). They also have compatibility issues with some games (most modern games support ultrawide, but many older games - including most 'vintage' games from before the switch away from CRTs - do not). If you do go ultrawide, the PC Gaming Wiki has compatibility info, including workarounds and solutions for incompatible games, for almost every title in the last gazillion years.
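If you want to check the pixel math yourself, here's the quick version (raw pixel counts only; actual game performance doesn't scale perfectly linearly with pixel count):

```python
# Raw pixel counts behind the render-cost comparison above.
resolutions = {
    "1440p (2560x1440)":     2560 * 1440,
    "ultrawide (3440x1440)": 3440 * 1440,
    "4K (3840x2160)":        3840 * 2160,
}
base = resolutions["1440p (2560x1440)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f}M pixels ({pixels / base:.2f}x the work)")
# 1440p: 3.7M (1.00x), ultrawide: 5.0M (1.34x), 4K: 8.3M (2.25x)
```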