5070 has 12GB VRAM
5080 has 16GB VRAM
5090 has 32GB VRAM

They kept the lower tiers at exactly the same capacities as last generation; only the top card got more.
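For context, a quick side-by-side with the 40-series equivalents (the 40-series capacities are from memory, not from the announcement, so treat them as an assumption):

# VRAM per tier in GB, 40-series vs 50-series.
# 40-series figures are from memory (4070 12GB, 4080 16GB, 4090 24GB) -- verify before quoting.
vram = {
    "x070": (12, 12),
    "x080": (16, 16),
    "x090": (24, 32),
}
for tier, (prev, new) in vram.items():
    change = "unchanged" if prev == new else f"+{new - prev} GB"
    print(f"{tier}: {prev} GB -> {new} GB ({change})")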
Neat, how many gigawatts are required to throw at that workstation card poorly disguised as a consumer product?
high rated
250W, 360W, 575W (up from 200W, 320W, 460W from the 4000 series). Absolutely ridiculous how energy-inefficient these things are.

For comparison the GTX 970/980 took 145W/165W, and the RX 470/480 took 120W/150W. I wish we could return to numbers like that, but in the decade since then, these cards have been getting more and more wasteful. :/
Post edited January 07, 2025 by gogtrial34987
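To put the power figures in perspective, here's a quick sketch of the generation-over-generation jump, using the board-power numbers quoted above (announced TDPs, not measured draw):

# Generation-over-generation TDP increase, 40-series -> 50-series, in watts.
tdp = {
    "x070": (200, 250),
    "x080": (320, 360),
    "x090": (460, 575),
}
for tier, (old, new) in tdp.items():
    pct = (new - old) / old * 100
    print(f"{tier}: {old}W -> {new}W (+{pct:.1f}%)")
# Prints +25.0%, +12.5%, +25.0% -- and the 5090's 575W is roughly 4x a GTX 970's 145W.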
Yeah, and the prices are also ridiculous. I'm hoping that AMD's new generation will be more reasonable in terms of price and power consumption.
AMD 2025 Product Update (CES)

https://www.youtube.com/watch?v=mNP-qhGdWvM
Scalpers rejoice!
Oh, nice.
Another series to ignore like it never existed in the first place.
It's time for every dev to "optimize" their games for 32GB VRAM. (I.e. not working properly anymore with anything less than that.)
g2222: It's time for every dev to "optimize" their games for 32GB VRAM. (I.e. not working properly anymore with anything less than that.)
I really hate this. Red Dead 2, one of the most detailed open-world games out there, can run with just 3GB of VRAM, and even if we push it with a dense city like Cyberpunk's, that still only requires 6GB.

But from now on, watch some linear corridor shooter or action game require 16GB. Gotta sell those graphics cards!
high rated
Looking at the mindless "We invented DLSS and $1000 250W GPUs so game devs don't have to optimise, and because of that they optimised less, so we invented DLSS 2 and $1500 300W GPUs, and because of that they optimised even less, so we invented DLSS 3 and $1600 400W GPUs and because of that they now don't optimise at all, so we invented DLSS 4 and $2000 575W GPUs..." downward spiral, stepping out of the modern AAA rat race and focusing on indies / older games that play on a sub-£200, sub-120W GPU is one of the healthiest and sanest things I did.
AB2012:
I'm looking to upgrade too in a comparably modest way. Totally agree there, the only way to go when trying to stay sane.
AB2012: Looking at the mindless "We invented DLSS and $1000 250W GPUs so game devs don't have to optimise, and because of that they optimised less, so we invented DLSS 2 and $1500 300W GPUs, and because of that they optimised even less, so we invented DLSS 3 and $1600 400W GPUs and because of that they now don't optimise at all, so we invented DLSS 4 and $2000 575W GPUs..." downward spiral, stepping out of the modern AAA rat race and focusing on indies / older games that play on a sub-£200, sub-120W GPU is one of the healthiest and sanest things I did.
Why do you prefer this take over "we are obligated to reach the speed of light"?
drxenija: 5070 has 12GB VRAM
The way they announced this specific card felt like an attempt to kill-shot AMD out of the GPU space. This is the only card where Jensen felt the need to boast about "4090 performance", and at $549 it would force AMD to take a loss on their 9070 line-up if they want to sell anything.

Of course, the "4090 performance" claim is sneaky language, as the 5070 can only match it with the help of DLSS 4 frame generation, but if they can convince the average consumer that they're getting a "4090", then AMD is pretty much screwed.
Post edited January 07, 2025 by botan9386
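To put rough numbers on that "sneaky language" point: DLSS 4 multi frame generation can insert up to three AI-generated frames per rendered frame, so a headline FPS figure can be a multiple of what the GPU actually renders. The 30 fps below is purely an illustrative assumption, not a benchmark:

# Illustrative arithmetic only -- rendered_fps is a made-up example, not a measurement.
rendered_fps = 30            # frames the GPU actually renders per second
generated_per_rendered = 3   # DLSS 4 multi frame generation: up to 3 extra frames per rendered frame

displayed_fps = rendered_fps * (1 + generated_per_rendered)
print(f"{rendered_fps} rendered fps -> {displayed_fps} displayed fps")
# 30 rendered fps -> 120 displayed fps, while input latency still tracks the 30.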
high rated
waow! another new thing to spend hundreds to thousands of dollars on to play modern triple A slop at 30 fps!11!
No idea what Nvidia is doing with the VRAM numbers since the 40 series. You had to go for at least a 4080 to get a "comfortable" amount of video memory. Probably intended to push people towards the higher-end cards? And the 5080 has exactly the same amount of memory. Just wow...

I went from a 1080 Ti to a 4080, so I'm set until at least the 70 series, if not longer. Especially since I'm going to be staying at high-FPS 1440p for the foreseeable future. I don't give a damn about any upscaling, AI, DLSS or whatever performance, just classic rendering. And until there's at least a 100%+ upgrade in that area, my undervolted 4080 will do just fine.

Though I suspect there will be a suspiciously big bump in sales of the eventual 6090, no matter the price. Just because :) I can already see the marketing: "The RTX 6090 is just NICE".
Post edited January 07, 2025 by idbeholdME