avatar
Matruchus: I guess it all depends on what hardware a person can afford.
As well as the games and the time between upgrades - a very cheap system now (even with integrated graphics) could quite easily outperform a mid-range PC from 2008 - 2009.

avatar
Matruchus: The strange thing is that in the first link you gave me, with the scene from one game, the speed, or the feeling of speed, is greater at 30 fps than at 60 fps.
Might be caused by the jittery motion at 30fps making it harder for your eyes to track objects, or maybe because the 30fps video is just the 60fps video dropping every second frame (not 100% sure that's what it's doing though).
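(Rough sketch of what I mean - assuming the 30fps clip really is just the 60fps clip with every second frame thrown away; I don't know what tool they actually used, this is only an illustration:)

# Hypothetical illustration: making a 30 fps sequence out of a 60 fps one
# by keeping only every second frame. Frame numbers stand in for actual images.
frames_60fps = list(range(12))     # 12 frames = 0.2 s of 60 fps footage
frames_30fps = frames_60fps[::2]   # keep every second frame -> 6 frames, same 0.2 s
print(frames_60fps)                # [0, 1, 2, ..., 11]
print(frames_30fps)                # [0, 2, 4, 6, 8, 10]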
Post edited April 09, 2014 by DreadMoth
The background forest looks almost exactly the same whether it's 30 or 60 FPS. The only place you notice a small change is in the detail of quickly moving stuff (like the ball moving across the screen).

So yeah, it can be noticed, but to say it's unplayable at 30 is just as idiotic as saying it can't be noticed at all.
Post edited April 09, 2014 by OldFatGuy
avatar
Matruchus: I guess it all depends on what hardware a person can afford.
avatar
DreadMoth: As well as the games and the time between upgrades - a very cheap system now (even with integrated graphics) could quite easily outperform a mid-range PC from 2008 - 2009.
Yeah, agreed on that. My PC is one of those from 2008.
These are my specs at the moment:
Intel Core 2 Duo E8400 3.0 GHz (64-bit processor)
4 GB RAM, 1333 MHz
GeForce GTS 450, 1 GB DDR3


avatar
Matruchus: I guess it all depends on what hardware a person can afford.
avatar
DreadMoth: As well as the games and the time between upgrades - a very cheap system now (even with integrated graphics) could quite easily outperform a mid-range PC from 2008 - 2009.

avatar
Matruchus: The strange thing is that in the first link you gave me, with the scene from one game, the speed, or the feeling of speed, is greater at 30 fps than at 60 fps.
avatar
DreadMoth: Might be caused by the jittery motion at 30fps making it harder for your eyes to track objects, or maybe because the 30fps video is just the 60fps video dropping every second frame (not 100% sure that's what it's doing though).
Yeah, I think it probably feels faster because it has fewer frames to show.
Post edited April 09, 2014 by Matruchus
avatar
OldFatGuy: The background forest looks almost exactly the same whether it's 30 or 60 FPS. The only place you notice a small change is in the detail of quickly moving stuff (like the ball moving across the screen).

So yeah, it can be noticed, but to say it's unplayable at 30 is just as idiotic as saying it can't be noticed at all.
Well obviously, the smaller the increments between individual frames are (i.e. slower-moving stuff) and the more uniform the pictures are (mostly sharing a single color), the less noticeable the effect is going to be. Still, if you speed the 'forest' up, the effect can still be observed.

At any rate, different people have different reactions and eyesight - while some don't even see a difference, there are others who get sick at 30 FPS. Also, fast games with high competitive play do essentially become unplayable at 30 FPS, as you're at a clear disadvantage against people who play at 60 or 120.

Regardless, being a singleplayer guy, I will mostly try to crank games up visually as much as possible, even if it means playing at 30 FPS.
avatar
Fenixp: Also, fast games with high competitive play do essentially become unplayable at 30 FPS, as you're at a clear disadvantage against people who play at 60 or 120.
Doh, that is a MOST EXCELLENT point, and one I should've considered and qualified my statement by saying SINGLE PLAYER. Sorry.

Yeah, that is most definitely true: if you're playing multiplayer, 30 FPS can be lethal, virtually speaking. My bad.
avatar
OldFatGuy: Yeah, that is most definitely true: if you're playing multiplayer, 30 FPS can be lethal, virtually speaking.
Yeah.

That said, in some multiplayer games I would quite happily play with 30fps if I could get ping/latency equal to or lower than my frametime...
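For reference, frame time is just 1000 divided by fps (in milliseconds), so at 30fps that would mean wanting a ping of roughly 33ms or less. A quick sketch with made-up ping numbers, just to show the comparison:

# Frame time in milliseconds for a given frame rate: 1000 / fps.
def frame_time_ms(fps):
    return 1000.0 / fps

# Hypothetical ping figures, purely for illustration.
for fps, ping_ms in [(30, 33), (30, 60), (60, 16)]:
    ft = frame_time_ms(fps)
    verdict = "ok" if ping_ms <= ft else "ping exceeds frame time"
    print(f"{fps} fps = {ft:.1f} ms/frame, ping {ping_ms} ms -> {verdict}")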
After reading the last posts, I must also state that I was thinking about fps in singleplayer, since I don't play multiplayer games. The only exception to that is SC2, but fps isn't that important there since it's not an action/shooter type game.
avatar
Crosmando: I don't see AMD as having too bright a future in the PC desktop scene. They'll probably do more business with contracts for putting processors in laptops, and of course supplying hardware for game consoles, but in terms of the desktop PC I think their days are numbered. Intel Ivy Bridge processors are just in a different universe altogether than anything AMD can make, and the "Bulldozer" model turned out to be a big fat nothing. I can't see them ever being able to get close to the speed of Intel CPUs without bankrupting themselves. I also don't see AMD graphics cards being able to compete with Nvidia for too long in the future; the difference isn't as steep as it is with Intel on processors, but they're still well behind, and the new Titan GPU is well beyond anything AMD can produce even at the top end of the market. The only advantage AMD has is price, but of course the only reason they can sell hardware cheaper is that the actual quality of the hardware is lower and it's slower than Intel/Nvidia hardware.

tl;dr AMD is doomed
avatar
Matruchus: You do know that the new PlayStation 4 and Xbox One have only AMD hardware inside, so I don't think AMD has such a bleak future as you describe, since the console market is several times bigger than the PC market.
You might find this interesting:
http://www.gamespot.com/articles/ps4-not-worth-the-cost-says-nvidia/1100-6405300/
avatar
DreadMoth: Minimal to no improvement on this system, with these games. Probably due to the GPU being the bottleneck...
I haven't studied it closely, but I think Mantle really shines where a low- to mid-range CPU is involved. These drivers from Nvidia probably make better use of a low-end CPU too.

I'll look closely at GPU benchmarks when I get ready to upgrade. ;)
avatar
Matruchus: You do know that the new PlayStation 4 and Xbox One have only AMD hardware inside, so I don't think AMD has such a bleak future as you describe, since the console market is several times bigger than the PC market.
avatar
JohnnyDollar: You might find this interesting:
http://www.gamespot.com/articles/ps4-not-worth-the-cost-says-nvidia/1100-6405300/
Thanks for the article. We'll see what happens with the console business. At the moment there are 3 million Xbox Ones sold and 7 million PS4s as of the end of February. Hopefully this helps AMD.
Post edited April 09, 2014 by Matruchus
avatar
Matruchus: Thanks for the article. We'll see what happens with the console business. At the moment there are 3 million Xbox Ones sold and 7 million PS4s as of the end of February. Hopefully this helps AMD.
They may be able to make something out of these console contracts. Time will tell.

Nvidia made some impressive gains in efficiency judging by their early Maxwell GPUs, the GTX 750/Ti. I'm curious to see what their mid to high end cards will look like. If the GTX 750/Ti is a good indication, then I think AMD are going to need to make some decent gains in efficiency too with their next gen in order to keep up.
avatar
Matruchus: Thanks for the article. We'll see what happens with the console business. At the moment there are 3 million Xbox Ones sold and 7 million PS4s as of the end of February. Hopefully this helps AMD.
avatar
JohnnyDollar: They may be able to make something out of these console contracts. Time will tell.

Nvidia made some impressive gains in efficiency judging by their early Maxwell GPUs, the GTX 750/Ti. I'm curious to see what their mid to high end cards will look like. If the GTX 750/Ti is a good indication, then I think AMD are going to need to make some decent gains in efficiency too with their next gen in order to keep up.
Yeah, the Titan (Black edition) is a lot better than the R9 280, whereas the GTX 750 is a lot slower than the R9 280. And of course AMD cards have a lot higher power consumption than Nvidia's. The only funny thing is that Nvidia and AMD, being shareholding companies, have the same owners :)
Post edited April 10, 2014 by Matruchus
This thread really has opened my eyes to the whole picture of the API wars.

For what it's worth, I'm just really hoping DX12 takes off and is fully implemented on Win 7. It's the only API I can see benefiting all of PC gaming, since Nvidia is showing zero support for Mantle, whereas I don't see ATI, when push comes to shove, ignoring DX12 if it's fully implemented on a userbase as large as Win 7 and Xbox One.

I think MS isn't as fanatical about trying to sell their operating systems with exclusive APIs, since Win 8 has failed, and because developers, after so many generations, are tired of slim pickings when it comes to optimizing for the largest audience possible and are obviously pushing for Win 7 to get DX12.
Post edited April 10, 2014 by Gorion
It's funny, because a few weeks earlier Nvidia paid the guy who secured the console contracts for AMD to move from AMD to Nvidia and steal documents with details about the negotiations. AMD sued him for this.

So yeah, "we are not interested in console contracts, but let's make sure that we will have a chance to secure contracts for next-next-gen"

http://www.crn.com/news/components-peripherals/240146409/amd-sues-four-former-employees-who-left-for-nvidia.htm
Post edited April 19, 2014 by Aver
Interesting article, though my take on it is a little different.
Post edited April 19, 2014 by JohnnyDollar