|
11-07-2012, 09:58 AM
(This post was last modified: 11-07-2012, 09:58 AM by CuBe0wL.)
Source:
http://nvidianews.nvidia.com/Releases/NV...g-8ac.aspx
Has anyone tried to benchmark this in Xonotic yet? It'd be interesting to see whether the 310 beta driver makes any difference.
"One should strive to achieve; not sit in bitter regret."
|
|
Yes! I'll benchmark it soon; I have a factory-overclocked GTX 650 arriving very soon! Lol, it's great that Steam is coming to GNU/Linux, not because I love Steam, but because then they'll upgrade the drivers!
|
|
Quote:because then they'll upgrade the drivers!
And if ATI update their drivers too and proper support comes in, then hell yeah, I'm moving to Linux.
|
|
Installed the driver; we'll see whether it makes a difference, though I haven't noticed any big boost yet...
|
|
"One should strive to achieve; not sit in bitter regret."
|
|
Oh wait, I see I'm using version 304. Lucky...
|
|
(11-07-2012, 04:30 PM)C.Brutail Wrote: LOL?
https://picasaweb.google.com/lh/photo/lC...directlink
MASSIVE.
Look how you can play with high settings without any FPS drop compared to low settings. That's AMAZING.
|
|
Why is the 310 capped at 75 fps? Vsync? :-P
My contributions to Xonotic: talking in the forum, talking some more, talking a bit in the irc, talking in the forum again, XSkie
|
|
11-08-2012, 01:14 AM
(This post was last modified: 11-08-2012, 03:07 AM by CuBe0wL.)
Oh my... Cyber Killer, you're right! Somehow, the new driver turned on vsync by default.
I gotta retract that and measure again.
Ok, I've redone the measurements. Here are the results:
https://picasaweb.google.com/lh/photo/cm...directlink
This, however, could also mean that I hit a CPU limit (I really think I did), so I'll redo the measurements later with mod_alias_force_animated 0.
Stay tuned!
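For anyone who wants to reproduce the retest above, here's a minimal sketch. The cvar names come from this thread, and __GL_SYNC_TO_VBLANK is the NVIDIA Linux driver's standard vsync override; the launcher invocation and demo path in the comment are assumptions, so adjust them to your install:

```shell
# Force vsync off at the NVIDIA driver level for this shell session,
# so the benchmark isn't capped at the monitor's refresh rate.
export __GL_SYNC_TO_VBLANK=0
echo "__GL_SYNC_TO_VBLANK=$__GL_SYNC_TO_VBLANK"

# Then launch the benchmark with the engine's own vsync off and the
# CPU-heavy animation path disabled (launcher and demo path assumed):
#   ./all run sdl +set vid_vsync 0 +set mod_alias_force_animated 0 \
#       -benchmark demos/the-big-keybench
```

Running both driver versions with identical settings like this should make the comparison apples-to-apples.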
"One should strive to achieve; not sit in bitter regret."
|
|
Yeah, the current results don't seem to be too different from each other.
|
|
11-08-2012, 06:53 AM
(This post was last modified: 11-08-2012, 06:53 AM by Shadowman84.)
New headline
"NVIDIA exaggerated, a lot!"
|
Professional noob... of... Paradox Space...
|
|
|
Update: the new driver seems to have misconfigured my kernel.
|
|
Phoronix has published some more benchmarks of the new drivers (including Xonotic). It looks like they are indeed faster, but only a little.
|
|
11-08-2012, 05:51 PM
(This post was last modified: 11-08-2012, 05:51 PM by rocknroll237.)
(11-08-2012, 10:26 AM)Cyber Killer Wrote: Phoronix has published some more benchmarks of those new drivers (in Xonotic too). It looks like they indeed are faster, but only a little bit.
Only a little bit? The GTX 680 got a 15 fps increase in the Unigine benchmark at 1080p! That's pretty big, imo. There was also a 45 fps improvement in Xonotic.
|
|
But it's not the hyped "2x better performance" they claimed; it's only a couple of percent (which is good nonetheless, but the marketing people at NVIDIA need to learn some f**** math!).
|
|
You always have to keep in mind that a lot of people have problems with the driver because it just doesn't work (God knows why...). NVIDIA should rather work on stability than on squeezing out a few more frames per second. Who cares about +20 fps if you aren't even able to use the driver?
|
|
Nvidia uses L4D2 as an example. That game is based on a more modern engine than, say, Xonotic, so maybe this driver uses some tech that's important in Source but absent in DarkPlaces?
|
|
Which OpenGL is DarkPlaces running on? 4? Then it's the same as (closed)Source, because OpenGL 5 isn't here yet.
(08-10-2012, 02:37 AM)Mr. Bougo Wrote: Cloud is the new Web 2.0. It makes no damn sense to me.
|
|
I wonder if any of this is also linked to the recent dropping of support for GeForce 6 and 7 hardware? Is it possible that supporting older hardware was holding back performance?
I will run some Xonotic benchmarks myself on this to get some more data.
I'm at least a reasonably tolerable person to be around - Narcopic
|
|
I've built my computer now; today I'm gonna install Arch Linux and try this out! GTX 650 OC here.
|
|
11-13-2012, 10:13 AM
(This post was last modified: 11-13-2012, 10:32 AM by Lee_Stricklin.)
(11-11-2012, 10:55 AM)Minkovsky Wrote: Which OpenGL is DarkPlaces running on? 4? Then it's the same as (closed)Source, because OpenGL 5 isn't here yet.
It uses OpenGL 2, if I'm correct.
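You can check which OpenGL version your driver actually exposes with glxinfo (from mesa-utils). The fallback echo below is only there so the snippet still runs on a headless box or without glxinfo installed; the "2.1" in it is example output, not a claim about your system:

```shell
# Print the driver's advertised OpenGL version string; if glxinfo is
# missing or there is no display, print an example line instead.
glxinfo 2>/dev/null | grep -i "opengl version" \
  || echo "OpenGL version string: 2.1 (example output; no glxinfo/display)"
```

Note that the driver advertising GL 4 doesn't change what the engine asks for: an engine built around the GL 2.x feature set will use that path either way.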
(11-09-2012, 06:14 AM)Maddin Wrote: You always have to keep in mind that a lot of people have problems using the driver because it just doesn´t work (god knows why...). NVIDIA better should work on stability than on trying to get some more frames per second. Who cares about +20fps if you aren´t even able to use it?
I knew a while back that Nvidia was gonna go straight to hell when I read on Phoronix, about four or five years ago, that they were "releasing early, releasing often" with their drivers. Unfortunately, at the time I built my rig, ATI still didn't have their crap together, and they had pissed me off about a year before that with an AGP card (HD 2600, 512 MB, 128-bit) in my previous rig, whose faulty drivers caused constant display crashes. That sucked, because it would've been a good card with better drivers: it ran UT3 maxed out at 1280x1024 (the max resolution my CRT supported at the time) with a good frame rate, along with other games that were recent back then. Now Nvidia seems to be making the mistakes that will send them down ATI's old road, while AMD turns what used to be ATI around. Hell, they've even released dangerous drivers that damaged cards; one such driver found its way onto my system, and I didn't know it until I started seeing artifacts one day.
ECKZBAWKZ HUGE LIST OF ACHIEVEMENTS GOES HERE....
Oh wait.
|
|