Author Topic: GTX580 3Gb VS GTX680 4Gb: and the winner is...  (Read 1397 times)

gonzohot

    Reputation: 36
GTX580 3Gb VS GTX680 4Gb: and the winner is...
« on: January 05, 2013, 03:35:39 pm »
January 05, 2013, 03:35:39 pm
Hi Lumion Team and everybody !

As I have been working hard on a very heavy virtual model (10 million polys) and wanted it displayed the best way it could be, I recently swapped my GTX580 3GB for a brand new GTX680 4GB...

And then, to my surprise, it was running faster (more fps) on the older video card!  |:(
I know that the Fermi and Kepler architectures are different, but I was not expecting this! I'm also not claiming it happens in all situations, but with that specific model the GTX580 3GB wins...
Has anybody experienced the same?

« Last Edit: January 05, 2013, 03:37:52 pm by gonzohot »

Gilson Antunes

    Reputation: 70
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #1 on: January 05, 2013, 05:57:19 pm »
January 05, 2013, 05:57:19 pm
Hello Gonzohot
Thanks for the info... I use a 580 3GB and am thinking of changing.

Nice job. Congrats.

Michael Betke

    Reputation: 35
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #2 on: January 05, 2013, 06:31:36 pm »
January 05, 2013, 06:31:36 pm
Maybe Nvidia no longer optimizes the GTX 680 drivers for DX9 applications like Lumion and concentrates more on the latest DX11 games like Battlefield 3 and so on.

The 580 is nearly two years old now, and I guess it's more suitable for DX9. Lumion looks really great, but it's based on an ancient version of DirectX.

Did you change driver versions when switching from the 580 to the 680?

ToFi

    Reputation: 11
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #3 on: January 05, 2013, 06:36:52 pm »
January 05, 2013, 06:36:52 pm
Hello Gonzohot,
could you please post the exact render times for the two graphics cards for the same frames/stills?
Thanks in advance.
TF

gonzohot

    Reputation: 36
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #4 on: January 05, 2013, 11:12:25 pm »
January 05, 2013, 11:12:25 pm
Hi guys !
Wow! Gilson and Michael: two important figures here on the Lumion forum, so thank you for your comments!!!
Just to give more information: the latest drivers were used, and the difference is about 5 fps from one card to the other (around 15 fps with the GTX680 and 20 fps with the GTX580). I'm talking about the real-time framerate; when rendering the scene, render times are almost the same: between 15 and 17 seconds per frame...
And I can't do any more tests now because I've sold my GTX580 3GB  :'(
« Last Edit: January 07, 2013, 01:56:26 pm by gonzohot »

Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #5 on: January 06, 2013, 01:19:17 am »
January 06, 2013, 01:19:17 am
It's a bit like the difference between the older 480 and any of the 5 series up to the 580. Although the 480 was basically lower spec, with fewer cores, it was a powerhouse and outperformed most of the 5 series. Why? Not sure. As I couldn't get supply from my regular hardware seller I opted for a 560, but should have gone with the 480 and sourced it elsewhere. Anyway.

Some things that might be different for your cards:
1. Are they the same make? There is definitely a lot of difference between manufacturers, both in chip quality and overall speeds (mainly bus).
2. Are they set to factory defaults, or was the 580 overclocked while the 680 currently is not?
3. Even though the 680 has three times the CUDA cores of the 580 and better clock speeds, the memory bandwidth is better on the 580, and this could explain the real-time FPS difference and why renders are much the same (the same due to Lumion, other PC specs and DX9).
4. Texture fill-rate is quite critical, and the 680 is definitely much better there, so it's likely a limitation elsewhere.
5. Are you running the screen at the same resolution?
6. Your PC specs may mean it's not able to make any further use of the 680 over the 580 anyway.
7. When you get down to a few FPS difference in real-time, it's not really significant and has more to do (IMHO) with Lumion and general PC specs etc. My 560 is OK with small to somewhat larger scenes, but it chokes on anything too heavy and the real-time FPS dies, yet the scenes still seem to render in roughly the same time (+/-).

It's render time that is, of course, the real time cost to you. If you find you need more FPS in Build mode, just lower your Editor quality and even the resolution % to help during the build phase.
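To illustrate why render time, not Build-mode FPS, is the real cost: here is a rough back-of-the-envelope sketch. The 25 fps playback rate and the 60-second clip length are hypothetical; the 16 s/frame figure is taken from the render times reported above.

```python
# Rough estimate of the total render time for an animation clip.
# Assumed numbers: a hypothetical 25 fps playback rate and 60 s clip;
# ~16 seconds of render time per frame as reported in this thread.

def total_render_hours(clip_seconds, playback_fps, seconds_per_frame):
    """Return the total render time in hours for one clip."""
    frames = clip_seconds * playback_fps
    return frames * seconds_per_frame / 3600.0

# 60 s * 25 fps = 1500 frames; 1500 frames * 16 s/frame = 24000 s
hours = total_render_hours(60, 25, 16)
print(round(hours, 1))  # 6.7
```

At that rate a one-minute clip is an overnight job, which is why a few fps either way in the editor matters far less than seconds per rendered frame.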

The advantage you have with the 680 is the extra 1GB of VRAM, which helps with loading such a heavy scene, and the card's ability to handle higher display resolutions.

It would definitely be nice to see a good render improvement with the 680, especially with 3x the cores, but as Michael mentioned it's likely the GPU drivers for DX9 (although some things in DX9 are still just as fast as in DX11), and the real-time technology behind Lumion compared to something like CryEngine or Unreal.

You could test your GPU with one of the benchmark tools such as FurMark, found at geeks3d.com (see also the links mentioned on the forum), and see how your card compares.

Appreciate your post. It's interesting to hear your story, as I've been struggling with whether or not to upgrade my own 560 card.

Derekw

    Reputation: 27
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #6 on: January 06, 2013, 10:47:02 pm »
January 06, 2013, 10:47:02 pm
I upgraded my GTX 570 1GB card to a GTX 680 4GB card and got a 30% improvement in render times.
My largest Sketchup project is 300 MB (a Collada file of 473 MB), which is easily imported into Lumion. I'm very happy with the results!

http://www.gainward.com/main/vgapro.php?id=868

tug

    Reputation: 27
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #7 on: January 06, 2013, 11:36:00 pm »
January 06, 2013, 11:36:00 pm
Hi gonzohot and all you guys,

I have several of the graphics cards mentioned here, and others, all tested with Lumion, and the results are quite different depending on which chipset the card is based on.

On a mobile workstation:
Dell Quadro 3700M 1GB - not good
Dell Quadro 5000M 2GB - nice

On a desktop workstation:
PNY Quadro 4000 2GB - not good
Gainward GTX 580 Phantom 3GB - quite good
Gainward GTX 680 Phantom 4GB - really quite good

Make sure your computer's power supply can provide the required power connectors: 8-pin & 6-pin.

I think the Gainward Phantom cards are a good choice.

rgds

Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #8 on: January 07, 2013, 12:19:31 am »
January 07, 2013, 12:19:31 am
ps: be good to mother earth and the environment; recycle all replaced video cards better than a 560 (e.g. 580 or 660 series and above) to me  :-D  :-D  :-9

Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #9 on: January 07, 2013, 10:34:18 am »
January 07, 2013, 10:34:18 am
Hi gonzohot, it's a pity you haven't got the 580 card anymore - I must admit I find it hard to believe that you lose 5 fps with the 680 (15 fps) compared to the 580 (20 fps).

If you still had the 580 card I would urge you to double-check that you tested the scene with the exact same settings, i.e.:

  • Identical version of Lumion
  • Identical quality settings (star quality, resolution, F7, F9, low memory on or off)
  • Identical spotlight shadows (on or off)
  • Identical planar reflection settings (on or off)
  • Identical Global Illumination settings (on or off)

In the short run, you could ask another customer with a GTX 580 card to test your scene (export it as an LS3 file). Remko also has a 580 card, but for that to work we would need screenshots of the settings as well as the resolution that you're running Lumion in.

In the long run, you could wait for Ferry to make a benchmark application for Lumion.
« Last Edit: January 07, 2013, 07:13:50 pm by Morten »

Michael Betke

    Reputation: 35
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #10 on: January 07, 2013, 10:36:28 am »
January 07, 2013, 10:36:28 am
Can't somebody just make a scene which we can all import and run as a benchmark for rendering?

It's not that complicated. I've also played with the idea, but I have no idea for a scene...

Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #11 on: January 07, 2013, 10:45:34 am »
January 07, 2013, 10:45:34 am
I think Gonzohot is mainly talking about the framerate in Build mode, not so much the render times?

As you can see in my previous post, it is not as straightforward to measure Build mode performance as you need to ensure that all parameters are identical.

In addition, in order to be useful, a benchmark application should ideally measure average performance with a range of effects, scene types, complexity and model/material types.

For example, if we used a massive city with many spotlights and movie effects as a benchmark scene, the information gained from this scene would probably not help you much if you mainly made small-scale product visualisations for your customers.
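The point about averaging over scene types can be sketched in a few lines. The scene names and fps numbers below are entirely made up for illustration; the idea is just that a single mean hides the spread between scene categories, so a useful benchmark should report both.

```python
# Hypothetical sketch of how a Lumion-style benchmark could aggregate
# results across different scene types. All names and numbers are
# invented for illustration, not real measurements.

measured_fps = {
    "small_interior":   48.0,
    "product_viz":      55.0,
    "large_city":       12.0,
    "night_spotlights":  9.0,
}

# The arithmetic mean alone hides the spread between scene types,
# so report the worst-case scene alongside it.
mean_fps = sum(measured_fps.values()) / len(measured_fps)
worst_scene = min(measured_fps, key=measured_fps.get)

print(round(mean_fps, 1))  # 31.0
print(worst_scene)         # night_spotlights
```

A user who mainly does small product visualisations would care about the first two numbers, not the city scene dragging the mean down, which is exactly the caveat above.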
« Last Edit: January 07, 2013, 10:48:48 am by Morten »

tug

    Reputation: 27
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #12 on: January 07, 2013, 11:27:33 am »
January 07, 2013, 11:27:33 am
Can't somebody just make a scene which we can all import and run as a benchmark for rendering?

It's not that complicated. I've also played with the idea, but I have no idea for a scene...

Hi Michael, which kind of scene are you thinking about?

gonzohot

    Reputation: 36
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #13 on: January 07, 2013, 02:09:25 pm »
January 07, 2013, 02:09:25 pm
Hi gonzohot, it's a pity you haven't got the 580 card anymore - I must admit I find it hard to believe that you lose 5 fps with the 680 (15 fps) compared to the 580 (20 fps).

If you still had the 580 card I would urge you to double-check that you tested the scene with the exact same settings, i.e.:

  • Identical version of Lumion
  • Identical quality settings (star quality, resolution, F7, F9, low memory on or off)
  • Identical spotlight shadows (on or off)
  • Identical planar reflection settings (on or off)
  • Identical Global Illumination settings (on or off)

In the short run, you could ask another customer with a GTX 580 card to test your scene (export it as an LS3 file). Remko also has a 580 card, but for that to work we would need screenshots of the settings as well as the resolution that you're running Lumion in.

In the long run, you could wait for Ferry to make a benchmark application for Lumion.

Hi Morten and everybody: thank you all for your comments and participation in this thread...

As I said, my GTX580 3GB is sold (it was an EVGA model), but I still have an MSI GTX580 1.5GB. The problem is that my scene makes heavy use of textures, and only cards with 2GB of RAM or more can handle it properly. So I can't do the comparisons anymore!

But just so you know: I used exactly the same computer and scene (shown in the first post). I just switched the video card and loaded the scene in Lumion 2.5 (not yet tried in Lumion 3.0)...

« Last Edit: January 07, 2013, 07:13:26 pm by Morten »

Francan

    Reputation: 24
Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
« Reply #14 on: January 07, 2013, 02:44:07 pm »
January 07, 2013, 02:44:07 pm
Hi all

I agree with you, Michael, about requesting a file to make comparisons.

The selected file is not important, as long as everyone uses the same one.

Then we can compare our results. So please give us the famous file.

Or we could simply start Lumion on the Sunny Day scene and, before doing anything, report the number of vertices and the FPS shown at the top right.

This would already give an overview of our PCs' display power.

For me: 2097 k vertices - 44 FPS

With the following information:

OS: Win7 64
CPU: dual Xeon X5460 3.16 GHz
RAM: 32 GB ECC
GPU: GTX 670 4GB
Screen: 32" 2560x1600

This would provide the beginning of an answer to our questions about the best configuration and the best graphics card for Lumion relative to the investment. Especially for a laptop.

For information: when I finished my swimming pool project OASIS, I was at 36791 k vertices and 7 FPS.
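A crude way to compare those two data points is vertices drawn per second of wall time (vertices times fps). The numbers come from the post above; the "throughput" metric itself is only an illustration, since real scaling is not linear.

```python
# Crude comparison of the two data points above: vertices pushed per
# second, i.e. vertex count * fps, in millions. Illustrative only --
# real GPU scaling is not linear in vertex count.

def vertex_throughput_millions(kilo_vertices, fps):
    """Vertices drawn per second, in millions."""
    return kilo_vertices * 1000 * fps / 1e6

empty_scene = vertex_throughput_millions(2097, 44)   # Sunny Day at startup
oasis = vertex_throughput_millions(36791, 7)         # finished OASIS project

print(round(empty_scene))  # 92
print(round(oasis))        # 258
```

Interestingly, by this crude measure the heavy scene pushes more vertices per second than the near-empty one, which suggests per-frame overhead, not raw vertex count, dominates the startup figure.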
« Last Edit: January 07, 2013, 03:52:47 pm by Francan »