I'm also benchmarking at the moment on a P3-550MHz and intend to bench an Athlon XP 2400+ with a whole series of different AGP cards to gauge the performance of hardware of that age. I will update with progress.
I'm at least a reasonably tolerable person to be around - Narcopic
... and then a few seconds into 'low' it crashed. From dmesg this was down to an issue in the driver/mesa/kernel, not Xonotic.
Interesting nonetheless. It did slow down at a few points where the system was paging, but I would think 512Mb of RAM would solve that. This system is definitely below the minimum requirements on both CPU and RAM.
I don't think the driver setup is ideal and it might be that some improvement can be had. I'm willing to put some time into experimenting as it will be a similar setup for anyone else using the free driver on older Radeon hardware.
There is a real cliff for this system at the high settings, and just from the 'min' numbers you can see that the slowdowns on those settings are massive. Clearly hardware of this age just doesn't cope well with realtime lighting. Normal remains very playable. This system is also CPU limited, which suggests an interesting thought: if an XP 2400+ can do around 67fps on low settings and 30fps is the minimum playable, then a 1GHz CPU with a similar graphics card would still be playable. RAM-wise, no slowdowns from paging were seen even though a full KDE desktop was running in the background, so I would suggest that 1Gb is a perfectly usable amount of RAM.
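The estimate above is simple arithmetic; it assumes the framerate scales roughly linearly with CPU clock on a CPU-limited system, which is only a first approximation (memory and bus speeds matter too):

```shell
# Rough CPU-limited scaling estimate (assumes fps scales linearly with
# clock speed, which is only a first approximation):
scaled_fps() {
    awk -v fps="$1" -v clk="$2" -v new_clk="$3" \
        'BEGIN { printf "%.1f", fps * new_clk / clk }'
}

# XP 2400+ (2.0GHz) doing 67fps on low -> estimate for a 1GHz CPU:
scaled_fps 67 2.0 1.0   # ~33.5fps, still above a 30fps playability floor
```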
02-21-2012, 06:15 AM (This post was last modified: 02-21-2012, 08:15 PM by edh.)
(02-20-2012, 07:29 PM)tZork Wrote: I would also suspect that this system sees such a slowdown at high+ due to the GPU memory needed for those higher presets.
I've looked at video memory usage using memstats from the end of the-big-keybench. Each time a new effects level was tried the texture cache was flushed with a vid_restart:
These numbers don't necessarily represent peak usage, but they are a pretty good indicator. They all fit with the different texture quality levels shown in the effects menu.
These don't suggest that video memory is behind the speed drop when going to effects-high.cfg; if it were, a massive drop would already be expected from low to medium, since the card only has 128Mb of memory.
From watching the demos on this system, it really does look like the higher lighting effects cause the slowdowns. I can experiment some more with the effects menu on that system to pinpoint what causes the slowdown on high settings.
Using the same system, I have also done benchmarks with a GeForce FX5600, which I can upload some results for later. I still have a GeForce 4 MX440 which I could try in the same system as well.
Higher settings were skipped due to the time taken.
I'm quite surprised in a way by just how poorly this card performs, even on low.cfg. I was expecting it to be substantially worse than the 6600GT, maybe 3 times worse at higher settings, but the really poor performance on low just makes it unusable. Based on this I really can't recommend playing Xonotic on the FX5600, as all that is playable is OMG.cfg.
This is with an old driver on an old install (not updated in over 4 years), but I doubt a new distro with newer drivers would manage a staggering improvement. Something else to note is that the whole GeForce FX series is out of support, so new drivers do not cover it. Based on the performance levels seen, would it be sensible not to list such cards in the minimum requirements? Make the minimum requirements GeForce 6, Radeon HD2xxx (oldest fully supported by AMD), Intel i965 (first with OpenGL 2.0 in Mesa), etc.?
I'm not sure I'm even going to bother trying the MX440 now as it needs an even older 96 series driver just to work.
(02-20-2012, 07:29 PM)tZork Wrote: I would also suspect that this system sees such a slowdown at high+ due to the GPU memory needed for those higher presets.
Coming back again to the performance drop on high settings with the 6600GT 128Mb I have done some testing with different settings. Switching the texture quality from 'good' to 'normal' more than tripled the framerate in the testing I did. So it could still be down to amount of video memory available for textures. Is there anything other than texture resolution which changes with the texture quality slider?
If only texture resolution differs, then we can easily see that a 128Mb card is not appropriate for the higher texture quality, so the recommended amount has to be 256Mb. If anyone has access to some different 128Mb and 256Mb cards to test, this would be interesting to see.
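The memory argument is easy to sketch with rough numbers: halving each texture dimension (one quality step down, assuming that really is all the slider changes) quarters the memory per texture. The sizes below are purely illustrative, uncompressed RGBA, ignoring mipmaps and compression:

```shell
# Back-of-envelope texture memory arithmetic: an uncompressed RGBA
# texture takes width * height * 4 bytes (illustrative sizes only).
tex_mb() {
    awk -v s="$1" 'BEGIN { printf "%d", s * s * 4 / (1024 * 1024) }'
}

echo "2048x2048: $(tex_mb 2048) Mb"   # 16
echo "1024x1024: $(tex_mb 1024) Mb"   # 4
```

So one quality step down frees three quarters of the per-texture memory, which is consistent with a 128Mb card choking where a 256Mb card would not.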
I have also tested my main system (E8200, 9600GT) again after doing an update. Framerates have crept up a little. Not sure if this is down to the latest rsync autobuild, the NVIDIA 295 series drivers or other updates, but it's good nonetheless.
I've only tried Warsow, Doom 3, Quake 4, QuakeLive, NFS World and Xonotic on this PC; I'm not a fan of anything else. No noticeable problems have been found. Win7 x64's drivers are OK, but there are some problems on WinXP x86, so I don't use it now.
About Linux, I have Ubuntu 11.10 x64 installed. I can run some tests if needed.
(02-22-2012, 02:07 PM)adem4ik Wrote: About Linux, I have Ubuntu 11.10 x64 installed. I can run some tests if needed.
This would be interesting to see! Windows 7 vs Linux on the same hardware. Ubuntu is also a common enough distro that any issues found and possible resolution may help others. Even if you don't get good performance straight off, any work you have to do to improve it will be very useful to know.
02-22-2012, 03:57 PM (This post was last modified: 02-22-2012, 03:57 PM by rocknroll237.)
As stated in an earlier post, it's slightly strange that my Ares is outdone by cards that are nowhere near as powerful in games like Crysis or Battlefield 3. I think it's down to the fact that NVIDIA has superior performance when OpenGL is used.
I will see if there are any noticeable performance gains when I update the drivers (from 11.11 to 12.1) and CAP profiles and overclock my CPU and RAM a little.
02-22-2012, 05:25 PM (This post was last modified: 02-22-2012, 05:26 PM by edh.)
I think there are a few things going on with the Ares. For a start, it's two GPUs. Regardless of whether they are on one card, this means there is a Crossfire setup involved, which is never 100% efficient. I would not be too surprised if it does not work at all for Xonotic. If the Crossfire setup does not work, then you will actually get a negative performance boost due to the driver overhead, so you might be better off with just one 5870 than with the Ares.
You could test this hypothesis by monitoring the temperatures while Xonotic is using the card. Maybe SpeedFan (Google and download it) can detect the temps from each core and log them. If one core heats up substantially more than the other, then the Crossfire setup is not working. Why might it not work? You'd have to look at how Crossfire behaves in different games and under OpenGL/DirectX.
In terms of ATI and OpenGL, performance is a cause for concern. Look for any Doom 3 benchmarks (also OpenGL) and you'll see a similar tale vs. NVIDIA.
Oh, and for information on your earlier comment: 2Gb of RAM is way more than enough to run Xonotic. Go on, take some of your RAM out and benchmark again, see if there's much difference. ;-)
02-22-2012, 06:09 PM (This post was last modified: 02-22-2012, 06:10 PM by rafallus.)
Oddly enough, my C2D desktop system has a 4850 and, as you can see in the results, it outperforms systems with similar NVIDIA cards (GTS 250) on ultimate settings. That confused me, because I expected it to be neck and neck all the way, or below (given the allegedly worse OpenGL performance on ATI; my card is also the 512MB GDDR3 version).
Still, even with one 5870, you should have more than twice my fps.
02-25-2012, 08:46 PM (This post was last modified: 02-25-2012, 08:48 PM by edh.)
Some ideas to improve the benchmark script a bit:
- Switch the in-game FPS counter on for the benchmarks
- We know that certain options and configurations can make some effects levels invalid. What about disabling the effects levels that are not valid from the run? For example, if +vid_soft 1 is enabled, don't do ultra and ultimate, and if OpenGL < 2.0 then don't do normal or higher
- Add a timer option to stop the process after a user-specified time (e.g. 1 hour)
- Add an option to end the process if a user-specified FPS limit is broken, e.g. if the FPS falls below 20fps on a run, the process stops at the end of that run
- Add a shutdown option (sudo needed) so that if a user appended '-shutdown', for example, it would shut down at the end of the benchmark
- Print the MED FPS to the console at the end of each effects-level run
Thoughts? I can work on this now and come up with a patch to submit.
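As a sketch of the FPS-limit idea: the one-second minimum can be pulled out of the summary line the engine already prints at the end of a run (the line format below is copied from results in this thread; the surrounding script logic is hypothetical, not the actual benchmark script):

```shell
#!/bin/sh
# Sketch: abort the remaining effects-level runs if the one-second
# minimum fps from the last run drops below a user-specified limit.
FPS_LIMIT=20

min_fps() {
    # Pull the first number after "min/avg/max:" out of a summary line.
    printf '%s\n' "$1" | sed 's/.*min\/avg\/max: *//' | awk '{ print $1 }'
}

# Example summary line as printed at the end of a benchmark run:
line='13.3231038 fps, one-second fps min/avg/max: 5 15 28 (336 seconds)'

min=$(min_fps "$line")
if [ "$min" -lt "$FPS_LIMIT" ]; then
    echo "min fps $min below limit $FPS_LIMIT - skipping higher presets"
fi
```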
You're right! One of my cores is idling (42 degrees), while the other is a lot hotter (70 degrees). This means I'm actually running Xonotic with a 5870, instead of a 5970... Hmm, maybe if Xonotic gets a bigger following, ATI can actually do something about this.
OMG: 20.0796540 fps, one-second fps min/avg/max: 9 21 40 (336 seconds)
Low: 13.3231038 fps, one-second fps min/avg/max: 5 15 28 (336 seconds)
Higher settings not run due to time.
I've been interested in seeing how Nouveau performs for some time. Right now the legacy NVIDIA drivers for Geforce FX do not work with the latest Xorg (NVIDIA need to update their legacy drivers). This means now is a good time to try Nouveau on any NVIDIA TNT through to Geforce FX.
I would not expect it to perform as well as the proprietary driver at this stage, but it does work at least. Some graphical corruption is evident, with console text being garbled, the odd texture problem and a common occurrence of polygons stretching from the origin to a point on the screen. It's not so bad that you couldn't live with it, but it's not ideal. I did play a bot match on some tweaked low settings and things were OK. The GeForce FX implementation is known to cause these problems and is being rewritten: http://nouveau.freedesktop.org/wiki/MesaDrivers
I'll keep working on this, as it may be that something better can be achieved. Once NVIDIA gets their legacy driver updated I will do a direct comparison on this hardware.
(02-26-2012, 09:15 AM)rocknroll237 Wrote: You're right! One of my cores is idling (42 degrees), while the other is a lot hotter (70 degrees). This means I'm actually running Xonotic with a 5870, instead of a 5970... Hmm, maybe if Xonotic gets a bigger following, ATI can actually do something about this.
The next thing to check is whether you have the same situation in every other game. It could be that something is wrong with Catalyst. Here's a similar topic just in case it happens in every game: http://www.overclockers.com/forums/showt...p?t=681855
02-29-2012, 12:13 PM (This post was last modified: 02-29-2012, 05:03 PM by edh.)
Craptop:
- Via C3-2 'Nehemiah' 1.2GHz
- 256Mb PC2100 (shared)
- Via CLE266 onboard, set to use 16Mb shared memory
- Arch Linux
- x86
- OpenChrome 0.2.904, Mesa 7.11.2
- OpenGL 1.2
Realistically this system isn't going to play Xonotic. The CLE266 chipset just isn't made for gaming.
Low DID run, but at around 0.1fps, and having seen within a minute or two of this slideshow that the lava texture was being rendered black, I aborted; this just isn't going to work with Xonotic.
This was tested with the OpenChrome driver, which is the most advanced open source driver for Via graphics processors and certainly the one I'd recommend to anyone else unlucky enough to have a Chrome-based graphics core. I get around 25% better glxgears results with it than with the older Unichrome driver that OpenChrome forked from. Just in case this is considered a poorly performing driver, I have benchmarked Q3 (playable on low settings) on this system, and it runs over twice as fast under Linux with the OpenChrome driver as on Windows with Via's own drivers! So don't even think about running Xonotic on a Via graphics core in Windows!
The OpenChrome driver has just had a new release, KMS is on its way at some point and Mesa keeps moving, so Via graphics performance under Linux may improve, but at least for the CLE266 it'll never be playable on non-OMG settings. I did purposefully test with only the minimum 16Mb of RAM for video use, which makes this result an obvious worst-case scenario. I will try to retest with 32Mb and 64Mb to compare. It does still show that the game theoretically runs on some appalling hardware!
Now who can beat this for poor performance? You've got to keep to proper, non-broken, drivers and no extreme underclocking!
Edit: I've now benched on 32Mb and 64Mb video RAM as well.
32Mb OMG: 10510 frames 1113.0167483 seconds 9.4428049 fps, one-second fps min/avg/max: 5 10 16 (336 seconds)
64Mb OMG: 10510 frames 1125.0383082 seconds 9.3419041 fps, one-second fps min/avg/max: 5 10 16 (336 seconds)
32Mb is faster than 16Mb due to the extra video RAM, but 64Mb gets slower: there is no graphical advantage and less system RAM is available, so more swapping occurs. The black lava issue remains when starting on low. I just can't recommend anything remotely close to this chipset right now.
03-07-2012, 06:42 AM (This post was last modified: 03-07-2012, 07:38 AM by edh.)
I've now been experimenting with Nouveau again on the P3 2x1GHz 'LianLi' system listed previously. Previously an alarming error message was always printed because S3TC texture compression was not found. This can however be fixed with an environment variable:
Code:
export force_s3tc_enable=true
Testing however showed it actually made things WORSE:
OMG: 15
Low: 10
This compares with 20 and 13 with it disabled. S3TC is not enabled by default in Mesa/Gallium due to patent issues in certain countries. This may be resolved soon, but without more work on how it is implemented, it won't give any benefit.
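To confirm whether the driver actually ends up advertising S3TC, you can grep the GL extension string. Below the check runs against a captured sample line so it is self-contained; on a live system you would pipe `glxinfo` (from the Mesa demos/utils package, name varies by distro) through the same grep:

```shell
# Check for the S3TC extension in a captured extension list; on a live
# system use: glxinfo | grep -o GL_EXT_texture_compression_s3tc
sample='GL_EXT_texture_env_add, GL_EXT_texture_compression_s3tc, GL_EXT_texture_filter_anisotropic'
if printf '%s\n' "$sample" | grep -q GL_EXT_texture_compression_s3tc; then
    echo "S3TC advertised"
fi
```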
As an aside, what does Darkplaces do with S3TC? How important is it that it is present?
I've also now done some more work on the 'SN45G' system:
- 1x Athlon XP 2400+ (@stock 2.0GHz)
- 2x512Mb dual channel RAM
- Geforce 6600GT
The difference now is that I have done a fresh install of Arch Linux. This also includes moving up to the NVIDIA 295 series driver.
- Arch Linux x86
- GL_RENDERER: GeForce 6600 GT/AGP/SSE/3DNOW!
- GL_VERSION: 2.1.2 NVIDIA 295.20
KDE 4 was used as opposed to the previous KDE 3 but compositing was switched off to balance things out.
The improvement over previous results is 20% on OMG, 22% on low, 20% on medium, 24% on normal, 184% on high, 152% on ultra and 94% on ultimate!
No hardware changes or overclocking were done to achieve this. The only configuration changes made were to increase the AGP aperture size from 64Mb to 256Mb and to set the CPU interface timing to aggressive. I can also confirm from the log files that this is the same build of Xonotic.
Increasing the AGP aperture size may have a substantial effect at higher settings but goes nowhere near explaining the whole performance increase. I would say that the changes in the base system and the drivers make up most of the difference, as the previous setup was a 5 year old Sabayon install which was pretty broken. As for the timings, this computer was already set up pretty aggressively, so most things were already turned up, and in previous experience only a fraction of a percent can be gained from such settings.
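For clarity on how the improvement figures above read: they are plain before/after percentages, so '184% on high' means the new framerate is almost 2.9 times the old one. A tiny sketch (the fps values here are made up purely to illustrate):

```shell
# Percentage improvement as quoted above: (new - old) / old * 100.
pct_gain() {
    awk -v a="$1" -v b="$2" 'BEGIN { printf "%.0f", (b - a) * 100 / a }'
}

pct_gain 10 28.4   # prints 184, i.e. 2.84x the original framerate
```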
I would suggest these results replace the previous ones for the SN45G and serve as an example of how the system setup should be looked at before resorting to 'r_draweverythingcrap 1' and 'r_1980s 3'.
Not bad for an 8 and a half year old system!