Two ways to avoid defective or outright dead pixels: either buy the monitor from a shop that offers a pixel guarantee, or buy it remotely (online), since you are then covered by the much more generous return policy. Of course, you run the risk of being sent the same two monitors with dead pixels alternately, again and again, until you give up or the return window runs out.

I once bought a monitor with a known dead pixel but was then offered a substantial discount. That is the right way to do things. A pity that shop has gone out of business :-(
r8V9b1X3u9VcA12p: 1. If I'm not playing action/FPS games, what is the maximum acceptable response time? Should I aim for 1 ms as you said, or is 5 ms max OK?
Ditto on AB2012. Under 5 ms is fine for a casual user. For eSports, always go for 1 ms for precision.

2. You said 75 Hz is OK. I thought 144 Hz was the norm for a 32": is 75 Hz enough for games, or would 144 Hz be better to spare my eyes (even in a working environment), or is there no noticeable difference?
It is the norm for people spending that much anyway, since the incremental cost is worth it to them. A 32" 75 Hz monitor is probably aimed at older buyers who want high image quality for business use.
- For a non-gamer, 144 Hz probably isn't worth it unless you like seeing the fluidity of the mouse path when you shake your mouse violently.
- For a non-eSports gamer, I think it's worth it. But it's something you have to see in-store and judge for yourself, because it is all completely subjective (see the quick frame-time sketch below).
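
A quick, back-of-envelope way to judge the 75 Hz vs 144 Hz question is to look at frame times. The small Python sketch below is just that arithmetic; nothing in it is monitor-specific or taken from a spec sheet:

```python
# Convert refresh rates into time-per-frame to see how big each jump really is.
refresh_rates_hz = [60, 75, 144, 240]

for hz in refresh_rates_hz:
    frame_time_ms = 1000.0 / hz  # one refresh cycle, in milliseconds
    print(f"{hz:>3} Hz -> {frame_time_ms:5.2f} ms per frame")

# Output:
#  60 Hz -> 16.67 ms per frame
#  75 Hz -> 13.33 ms per frame
# 144 Hz ->  6.94 ms per frame
# 240 Hz ->  4.17 ms per frame
```

Going from 60 to 75 Hz saves about 3.3 ms per frame, while 75 to 144 Hz saves roughly another 6.4 ms, which is why the difference is obvious when flicking the mouse around but easy to miss in office work.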

3. I have never used DisplayPort, but I do have DP connectors on the back of my PC. Which is the better technology, HDMI or DisplayPort (some monitors seem to work with only one of the two)?
You have to compare the HDMI and DP versions and specifications to determine which one has the higher bandwidth and therefore gives you the better-specced picture. Comparing same-generation tech, DP has the higher bandwidth. Again, since you own a GTX 1070 Ti, you're pretty much forced to use DP to make use of Freesync / G-Sync on your monitor. And with your budget and criteria, I believe your only realistic option is Freesync (G-Sync Compatible).

If you don't encounter the * issue I mentioned above, or it's not a big deal to you, then DP is your only choice, since your Nvidia GPU doesn't support Freesync over HDMI. If you want to keep the option of Freesync / G-Sync Compatible over HDMI, make sure the monitor you pick has an HDMI 2.1 port for later, when you get a more modern GPU (RTX 30-series or RX 6000-series and up). A rough bandwidth sanity check is sketched below.
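
To make the "compare the bandwidth" advice concrete, here is a rough Python sketch. The effective link rates and the 10% blanking allowance are approximate figures assumed for illustration (real requirements depend on exact timings, bit depth, chroma subsampling and DSC), so treat the results as ballpark only:

```python
# Rough check of which link can carry which video mode, uncompressed 8-bit RGB.

# Approximate effective data rates (Gbit/s) after encoding overhead -- assumed
# ballpark values, not figures quoted anywhere in this thread.
LINKS = {
    "HDMI 1.4": 8.2,
    "HDMI 2.0": 14.4,
    "HDMI 2.1": 42.7,
    "DP 1.2": 17.3,
    "DP 1.4": 25.9,
}

def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.1):
    """Very rough bandwidth estimate for one mode, with ~10% blanking allowance."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

for w, h, hz in [(2560, 1440, 144), (3440, 1440, 144), (3840, 2160, 60)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINKS.items() if cap >= need]
    print(f"{w}x{h}@{hz} Hz needs ~{need:.1f} Gbit/s -> {', '.join(fits)}")
```

On those rough numbers, 3440x1440 at 144 Hz is the mode that pushes you past HDMI 2.0 and onto DP 1.4 (or HDMI 2.1), which fits the advice above to use DP on a card like the GTX 1070 Ti.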

4. In your experience, do curved monitors make sense at 32", or are they just a gadget? BTW, I'm not sure I want to feel more immersed in my work applications :-p
Ditto on AB2012. Flat monitors are the way to go if work is more important to you. Ultimately, monitor choice is subjective; you have to go into a store and demo it yourself.

I am quite biased towards IPS panels, and I think IPS plus adaptive sync (Freesync / G-Sync) are must-haves. IPS gives consistent image quality at different viewing angles, which is important for content creators (e.g., graphic design, CAD, photography, video editing, web design). VA gives great contrast ratios, but needs calibration and a centred viewing position to match IPS colour accuracy across a tolerable range of viewing angles. I still believe the two 2560x1440 and two 3440x1440 monitors in this filtered selection are great picks:

https://fr.pcpartpicker.com/products/monitor/#r=344001440,256001440&P=2&A=2,3,6,5&sort=price&F=800100000,863600000&X=0,60385

EDIT: if you really want to be anal about RMA and dead pixels, make sure you're familiar with the manufacturers' RMA and warranty policies and compare them, including shipping locations. I'm quite familiar with North American RMA, but EU RMA is a different ballgame.
Post edited June 19, 2021 by Canuck_Cat
nightcraw1er.488: 3) To use resolutions above HD, you will need to use the DP ports. Your card will have them. Try to get the latest port versions. Explanation on the wiki:
https://en.wikipedia.org/wiki/DisplayPort
teceem: I think you're confusing HDMI with DVI. Just look up the specifications of HDMI (especially 2.0 and 2.1). They go way above HD (I assume you mean Full HD, because "just" HD = 720P).
nightcraw1er.488: Finally, as I said above, you have an Nvidia card, so you should not look at Freesync; use G-Sync. They are made by the same company to work together for maximum benefit, whereas Freesync is designed to work with anything.
Also, it's always a toss-up: money versus what you can get. Me, I would maximise what I can for the money, even if I go over a bit.
Oh, and go and see them in action at Fnac (I think that's France, but it could be Belgium only). Having a few lined up showing the same image can really help in terms of what you want from colour, brightness, etc., which may not always be the "best" but might suit you more.
teceem: Yeah, GSync is the "better" technology. Downsides: needs Nvidia, adds a lot to the price, needs active cooling (fan(s), more noise).

Indeed, Fnac is a well-known electronics (and more) chain in Belgium.
I may be. I simply remember that when I moved to a UWHD monitor, the HDMI cable no longer cut it; I had to switch to DisplayPort to get the maximum resolution.
He already has an Nvidia 1070 card, hence G-Sync is the superior choice.
I know Fnac is big in Belgium, I just don't know if it is in France as well. Perhaps MediaMarkt is a better example.
Post edited June 19, 2021 by nightcraw1er.488
tfishell: ngl, based on the username I thought this was spam :P
r8V9b1X3u9VcA12p: It was supposed to be my password but I mixed it up when I created my account :-)
...
Just FYI you can contact Support and they should be able to change your username.
nightcraw1er.488: I may be. I simply remember that when I moved to a UWHD monitor, the HDMI cable no longer cut it; I had to switch to DisplayPort to get the maximum resolution.
He already has an Nvidia 1070 card, hence G-Sync is the superior choice.
I know Fnac is big in Belgium, I just don't know if it is in France as well. Perhaps MediaMarkt is a better example.
Ultra Wide is a different matter: I think you need a graphics card that supports HDMI 2.0+ for the higher UW resolutions.

Fnac originated in France, Mediamarkt in Germany. Both are widespread in Belgium. (Although Fnac stores are mostly found in the centres of the bigger cities here)
I think Fnac is going the way of the dodo, like many other smaller electronics stores... For me, it's not a big loss - I always hated visiting the busy shopping streets in the past.
nightcraw1er.488: I may be. I simply remember that when I moved to a UWHD monitor, the HDMI cable no longer cut it; I had to switch to DisplayPort to get the maximum resolution.
He already has an Nvidia 1070 card, hence G-Sync is the superior choice.
I know Fnac is big in Belgium, I just don't know if it is in France as well. Perhaps MediaMarkt is a better example.
teceem: Ultra Wide is a different matter: I think you need a graphics card that supports HDMI 2.0+ for the higher UW resolutions.

Fnac originated in France, Mediamarkt in Germany. Both are widespread in Belgium. (Although Fnac stores are mostly found in the centres of the bigger cities here)
I think Fnac is going the way of the dodo, like many other smaller electronics stores... For me, it's not a big loss - I always hated visiting the busy shopping streets in the past.
Yeah, you miss the point: the guy is in France, so I was trying to think of a store he might know there. I'm not sure they have Fnac, but I'm pretty sure they have MediaMarkt.
You will be happy with the future then: everything sold and controlled only by Amazon, as a subscription package. No shops.
Zimerius: Yes, I technically know that game performance and DisplayPort should not be related according to Nvidia's site; that said, DisplayPort is the way to go for Nvidia and... I think I've read too much user feedback indicating that there are games out there that apparently NEED DisplayPort to function...
I agree that given the option, you probably should use displayport, but HDMI and displayport are much of a muchness. HOWEVER, just to clarify on your comment above as it's a little unclear, for the avoidance of doubt, it's incorrect to say that there are games that need displayport to function. The display unit is completely independent of the computer itself - you could connect a screen using a classic VGA adaptor and any game would still work. The only limiting factor would be the refresh rate of the screen (or carrying cable).
nightcraw1er.488: You will be happy with the future then: everything sold and controlled only by Amazon, as a subscription package. No shops.
I think you misunderstood me (or I didn't communicate it well enough). I've lived in a (big) city centre for more than 2 decades, and in that time 'practical' stores have been disappearing or huddling together in a small area.
To me, the evolution from physical to online shopping often seems like a self-fulfilling prophecy.
I like a bit of balance: not Twin Peaks or the suburbs, but not the sardine life of a big Asian city either.

pds41: you could connect a screen using a classic VGA adaptor and any game would still work. The only limiting factor would be the refresh rate of the screen (or carrying cable).
Resolution is definitely a limiting factor of D-SUB (VGA). It can't display 4K, and I'm not even sure about 1440P.
Post edited June 20, 2021 by teceem
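
A rough way to see why VGA tops out where it does is to compare the pixel clock a mode needs against what a typical analog output could drive. The sketch below is only a clock-budget check; the 400 MHz ceiling is an assumed, typical late-era RAMDAC figure (not something from this thread), and analog signal quality usually gives out well before the theoretical limit:

```python
# Compare the pixel clock each mode needs against an assumed analog ceiling.
ASSUMED_VGA_PIXEL_CLOCK_MHZ = 400.0  # typical late-era RAMDAC spec (assumption)

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.1):
    """Approximate pixel clock for a mode, with ~10% blanking allowance."""
    return width * height * refresh_hz * blanking / 1e6

for w, h, hz in [(1920, 1080, 60), (2560, 1440, 60), (3840, 2160, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "within reach" if clk <= ASSUMED_VGA_PIXEL_CLOCK_MHZ else "out of reach"
    print(f"{w}x{h}@{hz} Hz ~{clk:.0f} MHz -> {verdict} over VGA")
```

On paper 1440p lands inside the clock budget, which is why it's a "maybe" in practice, while 4K at 60 Hz is well past it.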
r8V9b1X3u9VcA12p: It was supposed to be my password but I mixed it up when I created my account :-)
...
tfishell: Just FYI you can contact Support and they should be able to change your username.
Users/members have (for a while now) been able to change their user names on here themselves.

https://www.gog.com/account/settings/personal
Zimerius: Yes, I technically know that game performance and DisplayPort should not be related according to Nvidia's site; that said, DisplayPort is the way to go for Nvidia and... I think I've read too much user feedback indicating that there are games out there that apparently NEED DisplayPort to function...
pds41: I agree that given the option, you probably should use displayport, but HDMI and displayport are much of a muchness. HOWEVER, just to clarify on your comment above as it's a little unclear, for the avoidance of doubt, it's incorrect to say that there are games that need displayport to function. The display unit is completely independent of the computer itself - you could connect a screen using a classic VGA adaptor and any game would still work. The only limiting factor would be the refresh rate of the screen (or carrying cable).
I agree though... maybe, given the incredible variety in builds and maintenance levels, DisplayPort might also be the safest option out there? At least as a theory.
nightcraw1er.488: You will be happy with the future then: everything sold and controlled only by Amazon, as a subscription package. No shops.
teceem: I think you misunderstood me (or I didn't communicate it well enough). I've lived in a (big) city centre for more than 2 decades, and in that time 'practical' stores have been disappearing or huddling together in a small area.
To me, the evolution from physical to online shopping often seems like a self-fulfilling prophecy.
I like a bit of balance: not Twin Peaks or the suburbs, but not the sardine life of a big Asian city either.

pds41: you could connect a screen using a classic VGA adaptor and any game would still work. The only limiting factor would be the refresh rate of the screen (or carrying cable).
teceem: Resolution is definitely a limiting factor of D-SUB (VGA). It can't display 4K, and I'm not even sure about 1440P.
Also agreed. I missed that bit when phrasing my response. All I was trying to get across was a clarification that displayport isn't required, but I'd still recommend using it.
Post edited June 20, 2021 by pds41
gog2002x: Whatever monitor you end up with, make sure it has no defects. :)
teceem: That's impossible to "make sure of". Even if you buy a display model in-store, some defects might only become apparent once it is in use.
Possibly the worst defect (still, arguably) is dead pixels. It's not as common as it once was, but you can only avoid it 100% by buying a display model in-store. Of course, display models are not "brand new, unused"; they're sort-of second-hand (but without the bigger discount you might expect for that).
Maybe I should have clarified it a bit.

Perhaps I should have said: whatever monitor he ends up buying, make sure to try it out to see if there are any defects. As for defects, they may not be common any more, but it's best to make sure you're not the unlucky customer.

The original one I bought in 2016 had such a problem. Thank goodness it was within the return / exchange policy.

As a person who's worked in the service / light industrial industry for a while now, I can't tell you how often things get caught by QC. And we only do sampling, not 100% inspection. Though with better training and supervision, things have gotten better. But it's far from perfect.

Every customer has a different experience with their hardware or software. Many, I'm sure, are fortunate, and I count my blessings that for the most part, I'm in that group. :)
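
One low-effort way to do that "try it out" check is to flood the screen with a few solid colours, so dead pixels (which show up on white) and stuck pixels (which show up on black or primary colours) are easy to spot. The sketch below uses plain Python/Tkinter; it's a generic test idea, not something a poster here provided:

```python
# Full-screen solid-colour cycler for spotting dead or stuck pixels.
import tkinter as tk

COLOURS = ["black", "white", "red", "green", "blue"]

def main():
    root = tk.Tk()
    root.attributes("-fullscreen", True)   # cover the whole panel
    root.configure(background=COLOURS[0])
    state = {"i": 0}

    def next_colour(event=None):
        state["i"] = (state["i"] + 1) % len(COLOURS)
        root.configure(background=COLOURS[state["i"]])

    root.bind("<Key>", next_colour)                  # any key: next colour
    root.bind("<Escape>", lambda e: root.destroy())  # Escape: quit
    root.mainloop()

if __name__ == "__main__":
    main()
```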
Post edited June 21, 2021 by gog2002x
Test it with your fist, so hopefully it won't break when you rage after losing to a bunch of cheating kids.
skeletonbow: If you're looking for more of a budget-conscious model, I have two of these and they're great, and they're also on sale; however, they were about $200 cheaper when I got them:
https://www.dell.com/en-ca/shop/dell-ultrasharp-30-monitor-with-premiercolor-up3017/apd/210-AJGT/monitors-monitor-accessories
teceem: I'm sure that 8K has its purposes... I highly doubt that playing games at desk distance is one of them. Definitely not now, and I can't imagine it in the future.
...
Similar attitudes towards video hardware capabilities have existed in the PC gaming space the entire time I've been a PC gamer since 1994 onward. With every new bump up to any higher resolution, frame rate, color depth, physical monitor size or aspect ratio, or other improvements there are people who look forward to the improvements with glee, as well as people who think it is unnecessary and brings no benefit at all for the increased cost, as well as people who claim that they can't even see any difference whatsoever and think it is just a ripoff.

Then a few years go by, and what was once a feature exclusive to higher-end, more expensive models starts to trickle into the mainstream, where it is less of an exclusive high-end feature and becomes a more mid-level "serious" feature, and then eventually it makes it into all hardware, even the bargain budget space. Once it hits the point where it is just absolutely everywhere (as is the case with 1080p 16:9 widescreen displays currently), it becomes everyone's minimum expectation of what is "good enough", and those upgrading reluctantly from whatever was "good enough" 5-10 years ago suddenly experience excitement over the new technology that they couldn't see any benefit of whatsoever before.

It's almost like the mere price tag of something can cause people to be technically unable to see the benefit of said technology, but when the price comes down to the point where it is a baseline technology, they are magically cured and can suddenly see the benefits of 1440p/4k/HDR/whatever. Like the much lower price was the cure for their strange blindness. LOL

The same thing happened going from 640x480 as the standard to 800x600, 1024x768, and 1600x1200, and with widescreen from 720p to 1080p and now to 1440p and 4k (and later 8k), as well as from 14" monitors being standard to 15", then 17", 19", 21", and 24" wide, with 27" now becoming the new norm. Perhaps the feature people adapted to most quickly was the shift from 8-bit color to 15/16-bit and then to 24-bit color, once we had bus speeds fast enough to accommodate it along with linear framebuffer access.
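
To put rough numbers on that progression, the sketch below simply tallies raw pixel counts for the steps named above and compares them to 1080p (the resolution list is chosen here to match those examples, not taken from the post):

```python
# Raw pixel counts for common resolution steps, relative to 1080p.
RESOLUTIONS = {
    "640x480": (640, 480),
    "800x600": (800, 600),
    "1024x768": (1024, 768),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k": (3840, 2160),
    "8k": (7680, 4320),
}

BASE = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>8}: {pixels:>10,} px ({pixels / BASE:5.2f}x 1080p)")
```

4k is four times the pixels of 1080p and 8k is sixteen times, so each step up costs far more GPU power and bandwidth than the last, which is a big part of why adoption takes so long.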

Going from the initial release of a new higher-resolution display to mass adoption of it as the new standard takes 15 or more years, but eventually it happens. 4k, for example, came out about 14 years ago or so and is still only a small fraction of the computer display market, while 1440p adoption is about four times that and growing faster. Within about 4-5 years we're likely to finally see 1440p become more widespread than 1080p is today, and within a few years of that we might see 4k replace it. 8k is pretty niche at the moment, granted, but no more so than 4k was in 2011 or so, and it now makes up around 2.44% of the gaming market. We likely won't see 8k become the popular standard on computer displays for many years to come, probably not until the 2030s, but that day is definitely coming, just like it did for 4k and 1440p.

Can anyone tell the difference? Not in terms of human bias over cost, that's for sure. When price isn't a factor, people will be complaining about how blocky 8k displays look and wish they could afford 16k. And yeah, 16k will be a thing too; it already exists for high-end cameras (as does 32k), and it'll eventually trickle down to the consumer space decades from now as well. Above that, however, the DPI becomes too high for additional improvements to have much bearing, so we probably won't see anything beyond that... if we're even alive then. :)

All hail our 8k display manufacturer overlords, who have the foresight to know it is going to be the hot gaming item in 2030!


Update: We should all bookmark my post to refer back to it in 2030 to see just how correct I was. Chances are that GOG's forum software will still be the same, not have had any improvements made to it over time, nor have been replaced by something new, so the links will probably still work.
Post edited June 21, 2021 by skeletonbow
skeletonbow: …snip
look and wish they could afford 16k. And yeah, 16k will be a thing too; it already exists for high-end cameras (as does 32k), and it'll eventually trickle down to the consumer space decades from now as well. Above that, however, the DPI becomes too high for additional improvements to have much bearing, so we probably won't see anything beyond that... if we're even alive then. :)
…snip
Surely there is a limit to DPI? I mean, 8k must already be more pixels per inch than reality itself.
More of an issue is the amount of storage, RAM, processing power, etc. needed to push such vast quantities of information through. It's OK for a still image, but rendered in real time? I would need a Bitcoin mining farm to render anything above 5 fps at 32k.
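
For a sense of scale on that last point, here is a back-of-envelope sketch of the raw, uncompressed framebuffer traffic alone (ignoring the far larger cost of actually rendering the frames). The 16k and 32k dimensions are assumed doublings of 8k, since there is no settled consumer standard for them:

```python
# Uncompressed display bandwidth: width * height * bits-per-pixel * frame rate.
RESOLUTIONS = {
    "4k": (3840, 2160),
    "8k": (7680, 4320),
    "16k": (15360, 8640),   # assumed: 2x 8k in each dimension
    "32k": (30720, 17280),  # assumed: 4x 8k in each dimension
}
BITS_PER_PIXEL = 24  # plain 8-bit RGB, no HDR, no alpha

for name, (w, h) in RESOLUTIONS.items():
    for fps in (5, 60):
        gbit_s = w * h * BITS_PER_PIXEL * fps / 1e9
        print(f"{name:>3} @ {fps:>2} fps: ~{gbit_s:7.1f} Gbit/s uncompressed")
```

Even at 5 fps, raw 32k output works out to roughly 60+ Gbit/s, already more than an HDMI 2.1 link's 48 Gbit/s, and that is before any rendering work at all.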