
:cough: Intel ARC B850 :cough:
That ad they put out has got to be one of the worst use cases for a video card I've seen.

Let's trust this hallucinating, internet-connected parrot to make small talk with you, because you're a failed streamer who can't even elicit a response from your nonexistent chat.
It's all possible with AI.
Wow... with competition like this, AMD doesn't even NEED to make new GPUs; everyone will keep buying their old ones. Seriously, 575 W?! I'll turn my PC on and my lights will flicker! I'll be paying almost as much as charging a Tesla costs these days!
I swear, nVidia announces a new card and people behave like Pavlov's dogs again. The hype is pathetic; I don't subscribe to hype.

But it has more DLSS, you say! So what, it's a visual cheat. But it has ray tracing! So what, not enough developers care to use it.

I'm looking forward to the nVidia crash, frankly. And AMD's too, if they keep going the way they are into "muh AI".

S3 and Matrox really should come back from the dead.

I'll keep enjoying my RX 6800s without FSR, thanks.
Post edited January 08, 2025 by u2jedi
sadlyrematch: waow! another new thing to spend hundreds to thousands of dollars on to play modern triple A slop at 30 fps!11!
I remember when £300 graphics cards could run some games as high as 500 FPS at 1080p; now £500-£2,000+ cards can't do 120 FPS at 4K. I get that it's a big jump from 2 million to over 8 million pixels, but having to cheat and use DLSS upscaling to get a decent frame rate at that price tag? Really?
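Just to put rough numbers on that pixel jump (a back-of-the-envelope sketch, nothing card-specific; the 500 and 120 FPS figures are the ones from the post above):

```python
# Back-of-the-envelope pixel math for the resolutions mentioned above.
px_1080p = 1920 * 1080   # 2,073,600 pixels (~2 million)
px_4k    = 3840 * 2160   # 8,294,400 pixels (~8.3 million)

print(f"4K / 1080p pixel ratio: {px_4k / px_1080p:.0f}x")    # 4x

# Pixels pushed per second at the quoted frame rates:
print(f"1080p @ 500 FPS: {px_1080p * 500 / 1e9:.2f} Gpx/s")  # ~1.04
print(f"4K    @ 120 FPS: {px_4k * 120 / 1e9:.2f} Gpx/s")     # ~1.00
```

Interestingly, the two targets are almost identical in raw pixels per second; the gap is less about resolution than about how much heavier per-pixel shading and effects have become.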
Post edited January 08, 2025 by TeleFan76
Again, the 5090 is not a card intended to be used or owned by the majority; perhaps one in 100 gamers can barely afford it.

In general I still would not run this card at full load all the time; instead I'd sync my FPS to the monitor, so it would be between 60 and 120 FPS in my case (above 120 makes little sense to me, I am not a competitive gamer). The sweet spot is a bit above 50% load, but no more than 80%, so in most cases it should draw 300-400 W, which is a lot of course, just not as much as most people have in mind. 4K with good performance is possible with it... in most cases even native + DLAA, which is the whole point of getting a flagship. The last flagships (3090 Ti, 4090, 7900 XTX) are still good for 1080p up to 1440p, but in general that is their limit for the most demanding games... if performance is to stay above 60 FPS.
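To sanity-check those wattage figures, here is a minimal sketch assuming the rumored 575 W board power and, naively, that draw scales linearly with load (real cards typically draw somewhat less than linear at partial load):

```python
# Naive partial-load power estimate for a 575 W card.
BOARD_POWER_W = 575  # rumored 5090 board power

for load in (0.50, 0.60, 0.70, 0.80):
    draw = BOARD_POWER_W * load  # assumes draw scales linearly with load
    print(f"{load:.0%} load -> ~{draw:.0f} W")
# 50% -> ~288 W, 60% -> ~345 W, 70% -> ~402 W, 80% -> ~460 W
```

By this estimate, the 300-400 W range quoted above corresponds to roughly 52-70% load, which fits the "a bit above 50%, no more than 80%" sweet spot.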

The more affordable cards are 1080p-to-1440p cards (comparable to the old flagships); they will only be able to provide consistently good 4K performance using DLSS or other frame-generation and/or upscaling technology.

Nvidia's and AMD's biggest step is not really higher efficiency, at least not in native mode... instead their main target is to push AI and frame-generation/upscaling technology much further, where the newest products surely exceed everything else. For people who enjoy this "generational magic" it surely is the best option; for everyone else an old flagship may be almost head to head, in many cases at a more affordable price.

Traditional rasterization nowadays improves only marginally, so anyone who enjoys the "honest old-school way" will have a hard time getting more "honest performance". I guess the only clearly better card with honest performance is the one with the very bold price... yes, the 5090. The other cards are simply good at using "magic", including the best RT to date.
Post edited January 08, 2025 by Xeshra
Xeshra: Again, the 5090 is not a card intended to be used or owned by the majority [...]
Hey, and did you notice...

the 5070 Ti at 31 TF needs to be able to hold its own against the 40-freaking-90, of course with DLSS 4.0 and all the supporting new game titles
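For a crude sense of scale (taking the 31 TF figure quoted above at face value, against the ~82.6 TF FP32 commonly cited for the 4090; raw TFLOPS is a blunt proxy that ignores architecture, memory bandwidth and everything else):

```python
# Crude raw-compute comparison; TFLOPS is a blunt proxy at best.
tf_5070_ti = 31.0   # figure quoted in the post above
tf_4090    = 82.6   # commonly cited FP32 spec for the 4090

print(f"Raw FP32 ratio: {tf_5070_ti / tf_4090:.2f}")    # ~0.38
print(f"Uplift needed:  ~{tf_4090 / tf_5070_ti:.1f}x")  # ~2.7x
```

In other words, any claimed parity would have to come almost entirely from DLSS 4 frame generation, not from raw compute.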
Still a lot of speculation... we need third-party reviews, otherwise going into details is useless.

Making some very dodgy estimates and guesses: I think the 5070 Ti will be more performant than the 4090 when it comes to DLSS without DLAA (native), but probably less performant when it comes to DLAA or any other native mode.

The 5080 could be head to head, at least in DLAA and other native modes, and surely way better at DLSS, with a little less power consumption.

However... it only offers 16 GB VRAM, which is currently sufficient but not really future-proof (as soon as the PS6 is released, VRAM demand may increase everywhere). Future-proof is 24 to 32 GB to me... so the 4090 (along with the 3090 Ti and 7900 XTX) still has a slight edge here.

No 4K, no VRAM demand? Nope... that is only the traditional wisdom. Nowadays, with RT and/or frame generation/upscaling, VRAM demand may increase even without native 4K resolution... because all that "magic" needs fast VRAM.
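A quick illustration of why resolution alone is not the VRAM story (a simple framebuffer sketch; the bytes-per-pixel and buffer count are illustrative assumptions, and real engines allocate far more for textures, RT acceleration structures and frame-generation history buffers):

```python
# Raw framebuffer cost: resolution alone explains very little VRAM use.
def framebuffer_mib(width: int, height: int,
                    bytes_per_pixel: int = 8,    # RGBA16F (HDR) assumption
                    buffers: int = 3) -> float:  # triple buffering
    return width * height * bytes_per_pixel * buffers / 1024**2

print(f"1080p: {framebuffer_mib(1920, 1080):.0f} MiB")  # ~47 MiB
print(f"4K:    {framebuffer_mib(3840, 2160):.0f} MiB")  # ~190 MiB
# Even at 4K that is a tiny fraction of 12-16 GB; textures, RT
# structures and frame-gen history buffers are what fill VRAM.
```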

If a card offers 12 GB, it simply is not meant to go above 1080p, and usually not with any high-grade RT either (which is impossible on those cards anyway). 8 GB is dead now... this age is over. It has basically been killed by Intel's Battlemage setting the new "lowest standard".

Another issue: because of excessive hardware shortages almost everywhere when it comes to processors (even HDDs... crazy times), the "official price" is rather a myth, I predict. I guess the real price for the 5080 will be more toward 1500 coins, but it surely should stay below 4090 levels, at least.

Regarding the competition, what we already knew is pretty much solidified: the 9070 XT is simply better at AI and RT, but in rasterization it may not beat the old flagships.

Some sources believe "Nvidia is suffocating AMD". I do not think so, because the entry-level to upper-mid-range offers are in such a stingy condition that AMD can offer a non-supreme product and still deliver better value. The only bad thing is that gamers will still buy Nvidia cards... just because it's Nvidia, even with only 8 GB VRAM attached. Then again, even Nvidia is not bold enough to provide only 8 GB anymore. It may even work for them because of the "Apple privilege" of supposedly getting by with half the RAM for the same programs... or so I heard, and indeed, people seem to believe it.

The buyers are suffocating AMD, not Nvidia... by my logic.
Post edited January 09, 2025 by Xeshra
I'm already happy with the fact that the 3000 series of cards also received some sort of DLSS improvement: higher clarity, or less blur, or something. I didn't quite understand whether this comes as a driver update for the GPU or through each game's DLSS implementation, but... it made me happy ;)
drxenija: 5090 has 32GB VRAM
Whoa! I don't have that much memory even in my head!

The future is here, and the past is tomorrow! A vicious cycle!
Xeshra: 8 GB is dead now... this age is over.
Reality Check : Out of the 2,000 games I own I think 4-5 need 6GB VRAM, 1,995 work fine on 4GB VRAM, and well over 1,500 would play fine on 2GB GPU's / APU's... Comment isn't aimed at you personally, but the "PC Master Race" crowd get way too carried away with their own inflated sense of self-importance by popping up on Reddit on every new hardware release with the worst 12x cherry-picked unoptimised bad ports they can find, then declare everything else outside of that "Real Gamer (tm)" bubble (ie, 99,900 games out of 100,000 PC games made) to be instantly "unplayable" or "dead" as self-justification for blowing $2k on a new toy... I don't think GOG even sells one single game (inc Cyberpunk 2077 & Baldur's Gate 3) that has even "8GB VRAM" set as the minimum requirement, let alone 12-32GB...
Post edited January 08, 2025 by BrianSim
I still remember a time when a top-tier GPU cost no more than $600 (translated to €700 for Europoors).
Now people are relieved that the 5090 "only" costs $2,000 and not $2,600. Thanks, Obensen.
drxenija: 5090 has 32GB VRAM
timppu: Whoa! I don't have that much memory even in my head! [...]
Your conscious memory is very small, but there is an unconscious memory as well; the only issue is, you have barely any access to it. I do not think humans in general are biologically much different from each other, but the ones with a truly good memory are able to tap into that unconscious memory, with sometimes amazing results. Sure, if someone barely uses their brain, the body may decrease or "unload" those cells, and in the end, just from lack of training, there might even be a huge biological difference at some point. Without training it will not work, but those truly interested will learn fast and well. Some humans may even see it as a burden if too much stuff is "stressing" them, but for me it is hard to "beat me down", for different reasons, so I have some mental resilience.

Xeshra: 8 GB is dead now... this age is over.
BrianSim: Reality check: out of the 2,000 games I own, I think 4-5 need 6 GB VRAM [...]
In general I cannot say you're entirely wrong, as the truth is that probably around 90% of my installed PC games (270 of 300) run pretty OK with 8 GB or less VRAM and with a GPU no stronger than a standard PS5's. Nonetheless, I do not focus on the mostly classic games (5 or more years old) my PC/GPU can run well, but on the games that may not run so well on any hardware weaker than a PS5, or at settings below epic + native resolution @ 60+ FPS.

It depends on the settings and the required performance, of course... even the most demanding games can somehow run at very low settings with weak performance, but that is not an option for everyone, especially not for those who have a big screen and enjoy great graphics.

However, "minimum settings" mean little. They usually mean "it can run the game at 30 FPS, low settings and 720p", that's it... which is not an option for way too many gamers. So what we should look for are the recommended settings, not the minimum ones. The recommended settings for many big games are more in 12+ GB territory, with a GPU at least on par with a standard PS5's.

Regarding my PS5 Pro and what I found out: I have over 50 games, and I can clearly say... I am glad I have sufficient performance, because the majority of those 50 games can barely keep up 60 FPS at the rather high settings involved.

Of course I "only" have a bit over 50 games on PlayStation and "only" a few hundred on my PC... not necessarily because I lack greed, but rather because there simply are not many more "outstanding" games available, especially in "AAA" territory, and those games automatically have far higher demands than the 2,000 indies, classics or whatever games installed on some other PCs.
Post edited January 08, 2025 by Xeshra
Xeshra: Again, the 5090 is not a card intended to be used or owned by the majority; perhaps one in 100 gamers can barely afford it.
The 5090, just like the 4090, is a prosthetic for men and women with disadvantaged physical features...