EXPERIMENT: Does it work for others?

I want people with newer graphics cards to try this. Please don't knock the idea until you've tried it. I did some research and found that newer-generation graphics cards can actually have a harder time running low-quality settings (reduced texture detail and the like). My system specs for reference:

- Windows XP Pro
- 2.0 GHz Pentium 4
- 3GB of RAM
- 1 GB NVIDIA graphics card

So, as an experiment, I did the opposite of every setting I normally use, both in my graphics card's control panel and in-game: I turned EVERYTHING up to high quality. It turns out I literally run 20-30 FPS higher in-game than I did on my old "low-quality, high-performance" settings. I know it sounds COMPLETELY backwards, and I'm still wondering how that works myself, but I want some people to try it and post their findings.

My findings:
- Low-quality settings in and out of game ("high performance"): 5v5 = 40-50 FPS max, with a lot of FPS drops/spikes.
- High-quality settings in and out of game: 5v5 = 80-90 FPS, steady, dropping to maybe 75 at the lowest.
I'd like to see if anyone else gets similar results. I did not tweak individual graphics card functions; I simply used the automated quality slider and set it to high quality.
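If anyone wants to post numbers that are comparable rather than eyeballed, here is a minimal Python sketch for turning a frametime log into average and worst-frame FPS. It assumes a two-column CSV of frame index and cumulative milliseconds (roughly what a frametime logger like FRAPS writes); the file names are hypothetical, so point it at whatever your logger produces.

```python
# Minimal sketch: compare two FPS captures from frametime logs.
# Assumes a two-column CSV (frame index, cumulative time in ms) with a
# header row; file names below are hypothetical placeholders.
import csv

def load_frame_times_ms(path):
    """Read cumulative timestamps (ms) and return per-frame durations (ms)."""
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            stamps.append(float(row[1]))
    # Difference of consecutive timestamps = time spent rendering each frame.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def summarize(durations_ms):
    """Average FPS plus worst-single-frame FPS, to expose drops/spikes."""
    avg_fps = 1000.0 * len(durations_ms) / sum(durations_ms)
    min_fps = 1000.0 / max(durations_ms)  # slowest single frame
    return avg_fps, min_fps

for label, path in [("low quality", "low_quality.csv"),
                    ("high quality", "high_quality.csv")]:
    avg_fps, min_fps = summarize(load_frame_times_ms(path))
    print(f"{label}: avg {avg_fps:.1f} FPS, worst frame {min_fps:.1f} FPS")
```

The worst-frame number is the useful one here: it shows the drops/spikes that a plain FPS average hides.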
