Have we hit a wall in terms of graphics development, and perhaps video games overall?

Look at Half Life 2 from 2004 and compare it to Half Life from 1998. The difference is quite stark. Half Life 2 has better weapons, story, graphics, voice acting, AND HAS SUBTITLES. The only thing the first game does better is more open levels. And perhaps it also has some cool weapons they apparently didn't care to reintroduce in 2.

Compare that to modernity. Games from 2020 obviously look better than games from 2014, yes, but the difference is much less stark. Witcher 3 and GTA V still look fantastic on my low-end PC five years later.

So have we hit a wall? Like, we can't go any further in the development of graphics?

Same goes for games overall. I think that after Cyberpunk 2077, we will have hit a wall in how great and original a game can be. Afterwards, it will be just rehashing of old stuff.
Post edited November 15, 2020 by GeraltOfRivia_PL
That would be a good thing regarding graphics; ultra-realistic graphics which can't be distinguished from real life would be pretty creepy imo. And while graphics can be important, there should be more focus on gameplay imo, and arguably much of gaming has regressed in this regard compared to the "golden era" of the late 1990s/early 2000s.

And "after Cyberpunk 2077 it will just be rehashing old stuff" is a bit silly imo. Don't fall for the hype; it saves you from inevitable disappointment.
Post edited November 15, 2020 by morolf
GeraltOfRivia_PL: So have we hit a wall?
In terms of originality, maybe; there are always some bright exceptions here and there though. In terms of graphics, no, definitely not, and I doubt we'll hit any wall soon (if ever). Also, if you want HL1 with better graphics, you can try Black Mesa.
No.
No. Shit doesn't look like AVATAR, yet.
GeraltOfRivia_PL: Witcher 3 or GTA V still look fantastic on my low-end PC 5 years later.
Define "low-end PC", please. I recall you have an RTX 2060? In Poland?
Dark_art_: Define "low-end PC", please. I recall you have an RTX 2060? In Poland?
I was just thinking - Witcher 3 and GTA V on a low end PC?
It seems to me that, regarding 3D graphics, we have been into diminishing returns for a while. For me personally, modern game graphics are well past the point where they are good enough for me to enjoy a game. I don't need (or necessarily want) photo-realistic graphics, or anything close.

However, I don't agree that means that games development is going to plateau, because a good game is about much more than just graphics. The games I am more interested in are the ones where developers are focusing on other areas besides graphics, such as writing, character development, immersion, gameplay design, AI, atmosphere, player freedom, replayability.

For example, AI is an area where there is massive scope for further advancement, with more sophisticated AI and machine learning. Imagine playing against AI enemies that can learn and adapt to your strategies on the fly. Or conversing with an NPC that is almost indistinguishable from a real person. No game has even come close to that sort of cool stuff yet, because all the devs are chasing after this 'photorealism' stuff.
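To make the "learn and adapt to your strategies" idea concrete, here is a minimal sketch of one way it could work: a hypothetical enemy that tracks which tactic the player uses most and picks a counter to it. All the names (moves, counters, class) are illustrative assumptions, not taken from any real game.

```python
from collections import Counter

# Hypothetical counter for each player tactic; purely illustrative.
COUNTERS = {"rush": "trap", "snipe": "flank", "stealth": "sweep"}

class AdaptiveEnemy:
    """Enemy that counters the player's most frequently observed tactic."""

    def __init__(self):
        self.seen = Counter()

    def observe(self, player_move: str) -> None:
        # Record each tactic the player uses.
        self.seen[player_move] += 1

    def respond(self) -> str:
        # Default behaviour before any data is gathered.
        if not self.seen:
            return "patrol"
        favorite, _ = self.seen.most_common(1)[0]
        return COUNTERS.get(favorite, "patrol")

enemy = AdaptiveEnemy()
for move in ["rush", "rush", "snipe"]:
    enemy.observe(move)
print(enemy.respond())  # the player rushes most often, so: trap
```

Real game AI would of course be far more sophisticated (and might use actual machine learning rather than frequency counting), but even this simple feedback loop is something most shipped enemies don't do.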

I also don't think we have exhausted the potential yet of 2D games. I don't subscribe to the view that 3D games are the only place where games innovation is happening.
no, there are still so many things to improve, but it's been mega slow because the rich get richer and hog all the cash; the world is developing slowly
Until we have full ray tracing technology at 120 FPS and 8K resolution with photo-realistic everything (including physics, hair physics, boob physics, etc.), there will be more to grow towards.


Is the technology pretty darn good? Sure. But outside of AAA studios and huge companies, the rest of us won't go anywhere near what they have now or will have in the future.

Meaning we'll probably see things ramp up on the high end, while the rest of us continue to have machines that are overpowered for things like pixel art games and indie games.
I think this is why backwards compatibility is getting a lot more press than usual with these new consoles. The difference between a 2010 game and a 2020 game is pretty minor, even with graphics. Which isn't to say there's no difference, but when you take a 2012 game and play it at a high resolution, does it really stand out as "old"? It doesn't for me.
No, desktop sized quantum computers are still on the way, though probably not in our lifetimes sadly :(
We've reached a point where adding finer details costs a lot of GPU power. Even in the examples you give, there are about 6 years of GPU development between the games, which in terms of technology and microchip manufacturing is a really long time.

Similarly, e.g., going from raster rendering to raytracing hits your hardware heavily to get, in some cases, almost indistinguishable output.
Eh?
Dark_art_: Define "low-end PC", please. I recall you have an RTX 2060? In Poland?
Sachys: I was just thinking - Witcher 3 and GTA V on a low end PC?
It depends what kind of resolution and FPS you are content with.

I've played both on my 8-year-old gaming laptop (ASUS G75VW) with
Intel i7-3610QM (2.30 GHz, 4 cores)
Nvidia GeForce GTX 670M

I guess that is very low-end by today's standards, especially compared to gaming desktops.
The Witcher 3 seemed to run at about 30 FPS or thereabouts at 1280x720 resolution and medium settings.
GTA V was... playable with similar settings, but in some scenes the FPS dipped too low for my liking, like 20 FPS or even under.

However, if the resolution must be a minimum of 1920x1080 (or 4K or 8K) and the framerate 60 or 120 FPS, then no dice.
Post edited November 16, 2020 by timppu