Lumion Support Center

Support for unlicensed users => Post here if you can't find your License Key => Topic started by: Member3545 on July 07, 2011, 10:54:12 pm

Title: Need help
Post by: Member3545 on July 07, 2011, 10:54:12 pm
I'm undecided about which graphics card to buy. I use AutoCAD and 3ds Max, and I know Quadro cards are good for those applications, but I don't know whether they are also good for Lumion. I don't want to pay twice for expensive hardware, so I'd like to know which card works well for all three applications:

Acad + 3Dmax + Lumion = which graphics card is most appropriate at low cost?

Does an Nvidia Quadro 600 with 1 GB work?
Title: Re: Need help
Post by: Member3234 on July 08, 2011, 10:14:32 am
Quote
I'm undecided about which graphics card to buy. I use AutoCAD and 3ds Max, and I know Quadro cards are good for those applications, but I don't know whether they are also good for Lumion. I don't want to pay twice for expensive hardware, so I'd like to know which card works well for all three applications:

Acad + 3Dmax + Lumion = which graphics card is most appropriate at low cost?

Does an Nvidia Quadro 600 with 1 GB work?
Hey, I have a Quadro 600, and it works well for rendering video (with Lumion 64-bit). If you don't need to show your project live at the best quality, it's OK. But don't forget that the GeForce GTX 580 seems to be the best, and is the card chosen by a lot of Lumion users for big projects and live events. Good luck!
Title: Re: Need help
Post by: Member3545 on July 08, 2011, 03:51:18 pm
Thanks man!
Title: Re: Need help
Post by: peterm on July 12, 2011, 12:00:46 am
I'm a little late on this topic, but there's also the GTX 570 with 2.5 GB VRAM starting to enter the market. Its price/performance seems good compared to the 580, plus it has 1 GB more VRAM, and for Lumion that's all good for rendering and scenes. There's a topic elsewhere on the forum; the only company producing it so far is EVGA.

edit: whoops, I was thinking about the 590 and wrote 3 GB, but of course it's 2.5.
Title: Re: Need help
Post by: Member2853 on July 12, 2011, 10:34:43 am
http://lumion3d.com/forum/index.php?topic=1406.0

GTX 570 2.5 GB from EVGA and POV

have a nice day
Title: Re: Need help
Post by: Member1739 on July 12, 2011, 02:32:08 pm
Can anyone answer this?
I think it has been asked before, but I don't know that it ever got answered.
I could probably research it; it's more fun to ask here, however.

What does more memory on the graphics card do for Lumion?

If you have a video card with 1 GB of memory on board but your computer has, say, 30 GB of RAM, will Lumion run differently than on a computer with a 3 GB video card and 28 GB of RAM?

What is the difference between memory on a graphics card and a lot of extra RAM in the computer, as it relates to how Lumion operates?

Does more memory on a graphics card just increase the speed at which Lumion operates?
Title: Re: Need help
Post by: Remko on July 13, 2011, 02:39:13 pm
Quote
What does more memory on the graphics card do for Lumion?

If you have a video card with 1 GB of memory on board but your computer has, say, 30 GB of RAM, will Lumion run differently than on a computer with a 3 GB video card and 28 GB of RAM?

What is the difference between memory on a graphics card and a lot of extra RAM in the computer, as it relates to how Lumion operates?

Does more memory on a graphics card just increase the speed at which Lumion operates?

When you hit the memory limit on the 3D card, you get graphics errors, because Lumion won't be able to initialize memory buffers or textures. More memory on the 3D card means you can use more and bigger textures.

More memory in your system only matters if you have very little. If you have 4 GB or something like that, you are probably OK. The bigger the models you load, the more memory you need. Memory has no influence on speed.
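Remko's point about hitting the VRAM limit can be made concrete with a rough back-of-envelope estimate. This is an illustrative sketch, not Lumion's actual allocator: it assumes uncompressed RGBA textures at 4 bytes per pixel, with a full mip chain adding roughly one third on top of the base level.

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM footprint of one uncompressed RGBA texture."""
    base = width * height * bytes_per_pixel
    # A full mip chain adds ~1/3 on top of the base level.
    return base * 4 // 3 if mipmaps else base

# Fifty 2048x2048 textures already overflow a 1 GB card:
total = 50 * texture_vram_bytes(2048, 2048)
print(total / 2**30)  # ~1.04 GiB
```

With numbers like these it is easy to see why a scene with many large, unique textures exhausts a 1 GB Quadro 600 long before it exhausts 30 GB of system RAM.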
Title: Re: Need help
Post by: Member1739 on July 13, 2011, 02:45:14 pm
Thanks for the reply Remko -  :)

So, simply put:

memory in one's system affects model sizes: more memory, bigger models (more polygons);

memory on one's graphics card affects materials: more memory, more materials (bigger textures).
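The "more system memory, more polygons" half of that summary can also be sketched numerically. This is a rough illustration under made-up but typical assumptions: an indexed triangle mesh with about half a unique vertex per triangle, 32 bytes per vertex (position, normal, UV), and 4-byte indices.

```python
def mesh_ram_bytes(triangles, bytes_per_vertex=32):
    """Rough system-RAM footprint of an indexed triangle mesh.

    Assumes ~0.5 unique vertices per triangle (typical for a
    well-indexed mesh) and three 4-byte indices per triangle.
    """
    vertices = triangles // 2
    return vertices * bytes_per_vertex + triangles * 3 * 4

# A 10-million-triangle scene: geometry alone is modest...
print(mesh_ram_bytes(10_000_000) / 2**20)  # ~267 MiB
```

So under these assumptions even a heavy 10-million-triangle scene fits comfortably in a few hundred megabytes of system RAM, which matches Remko's remark that 4 GB is usually plenty.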
Title: Re: Need help
Post by: peterm on July 16, 2011, 03:11:49 am
Would this sort of technique help with GPU texture issues? Maybe it's already being done, or can't be done for Lumion; I've only read their web site, not tested anything, but it looks like it could effectively manage streaming all textures to the GPU.

(It's from Unity, but hey, it's the technique that might be worth considering.)
Quote
Amplify is a Virtual Texturing extension for Unity Pro that allows scene/level designers to use a large amount of textures, quite literally up to hundreds of gigabytes, without worrying about streaming or GPU memory limits.

Web site here (http://insidious.pt/#amplify).
Title: Re: Need help
Post by: Remko on July 28, 2011, 11:57:35 am
Quote
Would this sort of technique help with GPU texture issues?
Web site here (http://insidious.pt/#amplify).

Think of it like Google Earth's way of displaying the map: as you zoom in, higher-resolution textures are streamed from the hard disk. A 1 x 1 m resolution map of the entire world would never fit in your GPU RAM. It's interesting technology, but it does have its limitations. The primary use for it is on low-end systems, to precalculate light maps; try Rage on the iPad or iPhone and you'll see what I mean. In general, precalculating the lighting is a bit outdated, but it's awesome for low-end systems. If you have tiling textures, like most materials in Lumion, this technique has limited usefulness, since it relies on the fact that only a small portion of a huge texture needs high resolution.
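The Google Earth analogy can be sketched as a tiny tile-selection function: a virtual-texture system keeps a mip pyramid of tiles and streams only the tile covering the current view at the current zoom level. This is an illustrative sketch of the idea, not the Amplify implementation; the tile size and world size are made-up parameters.

```python
import math

def visible_tile(view_x, view_y, zoom, tile_size=256, world_size=2**20):
    """Pick the mip level and tile index a virtual-texture system
    would stream for a view centred at (view_x, view_y) in [0, 1).

    Higher zoom selects a finer level, exactly as Google Earth
    streams sharper map tiles the further you zoom in.
    """
    # Level 0 is the whole texture in one tile; each level doubles detail,
    # capped at the level where one tile equals tile_size real texels.
    max_level = int(math.log2(world_size // tile_size))
    level = min(zoom, max_level)
    tiles_per_side = 2 ** level
    tx = int(view_x * tiles_per_side)
    ty = int(view_y * tiles_per_side)
    return level, tx, ty

# Zooming in on the same point requests progressively finer tiles:
print(visible_tile(0.3, 0.7, 2))  # (2, 1, 2)
print(visible_tile(0.3, 0.7, 8))  # (8, 76, 179)
```

Only the few tiles actually on screen ever occupy GPU memory, which is why the scheme breaks down for tiled/repeating materials: every part of the texture is "on screen" at once, so there is nothing to leave out.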

One cool thing I can think of is that you could use this technology to store precomputed radiance transfer for your entire world and have dynamic radiosity for the whole scene. For games this is really cool; for Lumion, not so much, because it would take days to render the radiosity solution.