Author Topic: when multi GPU ?  (Read 7502 times)

lucaradio

    Reputation: 0
when multi GPU ?
« on: September 05, 2011, 07:28:55 pm »
Dear LUMION STAFF

Are you thinking about Lumion with multi-GPU support (like the GeForce GTX 590)? Because it is too slow.
Sorry, I don't speak English well.

RAD

    Reputation: 32
Re: when multi GPU ?
« Reply #1 on: September 05, 2011, 08:31:14 pm »
It is too slow? LOL
What is slow?
What is faster?

lucaradio

    Reputation: 0
Re: when multi GPU ?
« Reply #2 on: September 06, 2011, 12:18:45 am »
Quote from: RAD
It is too slow? LOL
What is slow?
What is faster?
Lumion works with only one GPU, and when the scene is too big it runs slowly.
The second GPU is not working...

RAD

    Reputation: 32
Re: when multi GPU ?
« Reply #3 on: September 06, 2011, 02:28:12 am »
Lumion currently only works with one GPU, even on a dual-GPU card like the GTX 590.

There is a lot of discussion about this on these forums if you are willing to hunt the threads down.

 :)
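
For readers who want the technical background, here is a minimal sketch (illustrative only, not Lumion's actual code) of why one GPU can sit idle: a Direct3D application creates its device on a single DXGI adapter, and with SLI disabled a dual-GPU card like the GTX 590 is exposed as two separate adapters. Everything the engine renders then runs on the one adapter it picked.

Code: [Select]
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    // Create a DXGI factory and walk the adapter list. Each adapter is a
    // GPU (or GPU chip) that the OS exposes to Direct3D applications.
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // An engine that creates its D3D device on adapter 0 renders
        // everything there; any other adapters listed here stay idle.
        wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}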

artmaknev

    Reputation: 1
Re: when multi GPU ?
« Reply #4 on: September 06, 2011, 08:41:55 pm »
If I have 2 GPU cards set up in SLI mode, will Lumion be able to access just 1 card?

RAD

    Reputation: 32
Re: when multi GPU ?
« Reply #5 on: September 07, 2011, 12:25:28 am »
That is my understanding. No SLI configuration YET.

Re: when multi GPU ?
« Reply #6 on: September 07, 2011, 02:16:54 pm »
Lumion does support multi-GPU, but the performance gain is not that big. We'll see what happens with the new 2.0 render engine.

Goal043

    Reputation: 0
Re: when multi GPU ?
« Reply #7 on: January 17, 2012, 05:26:45 pm »
Quote
Lumion does support multi-GPU, but the performance gain is not that big. We'll see what happens with the new 2.0 render engine.

So, what happened?

Re: when multi GPU ?
« Reply #8 on: January 18, 2012, 09:35:25 am »
Quote from: Goal043
So, what happened?

We have not looked at it yet. On the 25th we'll get a special multi-GPU computer so we can start doing some tests. Of course we looked at this before, but I could never find out why it does not work. It looks like we're doing all the things needed to make SLI work.
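
For anyone curious what "doing all the things needed to make SLI work" means in practice: in the driver's alternate-frame rendering (AFR) mode, GPU 0 renders frame N and GPU 1 renders frame N+1. Any texture that one frame reads from the previous frame (reflections, motion-blur history, and so on) forces the driver to copy data between the GPUs, which can erase the speed-up entirely. A common workaround is sketched below with purely illustrative names (this is not Lumion's code): keep one copy of every such history resource per GPU and rotate through them.

Code: [Select]
#include <cstddef>
#include <cstdint>
#include <vector>

// Stand-in for a GPU render target (e.g. a reflection or motion-blur
// texture carried from one frame to the next). Purely illustrative.
struct RenderTarget {
    int id = 0;
};

// Under AFR, frame N runs on GPU (N % gpuCount). If frame N+1 reads a
// texture written in frame N, the driver must copy it across GPUs and
// the SLI scaling is lost. The usual fix: keep one copy of each such
// "history" resource per GPU, so a frame only ever reads what its own
// GPU wrote gpuCount frames earlier.
class HistoryBuffer {
public:
    explicit HistoryBuffer(std::size_t gpuCount)
        : targets_(gpuCount) {}

    RenderTarget& forFrame(std::uint64_t frameIndex) {
        return targets_[frameIndex % targets_.size()];
    }

private:
    std::vector<RenderTarget> targets_;
};

int main() {
    HistoryBuffer history(2);              // two GPUs in AFR
    for (std::uint64_t frame = 0; frame < 4; ++frame) {
        RenderTarget& rt = history.forFrame(frame);
        rt.id = static_cast<int>(frame);   // "render" into this GPU's copy
        // Frames 0 and 2 share one copy (GPU 0), frames 1 and 3 share the
        // other (GPU 1) -- no cross-GPU reads are ever needed.
    }
    return 0;
}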

Michael

    Reputation: 10
Re: when multi GPU ?
« Reply #9 on: January 18, 2012, 07:29:24 pm »
Quote
Lumion does support multi-GPU, but the performance gain is not that big. We'll see what happens with the new 2.0 render engine.
Is this new render engine going to be part of a service pack for Lumion 2? If so: :-9

stucki

    Reputation: 32
Re: when multi GPU ?
« Reply #10 on: January 19, 2012, 12:51:57 am »
It is already inside Lumion 2.0, if I understood right.
Otherwise indoor lighting wouldn't be possible...

Re: when multi GPU ?
« Reply #11 on: January 19, 2012, 10:52:08 am »
Quote from: stucki
It is already inside Lumion 2.0, if I understood right.
Otherwise indoor lighting wouldn't be possible...


Indoor lighting has nothing to do with multi-GPU. I'm waiting for the new computer to arrive so we can take a look at it. We looked at it before, but apparently we need to do some additional tests, because it seems like multiple GPUs do not have any positive effect on performance.

stucki

    Reputation: 32
Re: when multi GPU ?
« Reply #12 on: January 19, 2012, 10:55:50 am »
But doesn't Lumion 2.0 have deferred rendering, which was used for indoor lighting?

Re: when multi GPU ?
« Reply #13 on: January 19, 2012, 02:38:23 pm »
Yes, the 2.0 deferred render engine that was mentioned back in September is the same one that Ferry and Remko implemented in Lumion 2.
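
For readers unfamiliar with the term: in deferred rendering the scene geometry is rasterized once into a "G-buffer" (per-pixel position, normal, and material data), and lights are then applied in a separate pass over those pixels. Lighting cost becomes roughly proportional to pixels × lights instead of geometry × lights, which is what makes scenes with many indoor lights affordable. Below is a small CPU-side mock of the lighting pass, purely illustrative and not Lumion's code:

Code: [Select]
#include <cmath>
#include <cstdio>
#include <vector>

// 3D vector with just the operations the sketch needs.
struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One G-buffer texel: what the single geometry pass leaves behind.
struct GBufferTexel {
    Vec3 position;  // world-space position
    Vec3 normal;    // unit surface normal
    Vec3 albedo;    // diffuse color
};

struct PointLight { Vec3 position; Vec3 color; float radius; };

// Lighting pass: per texel, accumulate every light in range. Note that
// scene geometry never appears here -- only pixels and lights do.
Vec3 shade(const GBufferTexel& g, const std::vector<PointLight>& lights) {
    Vec3 out{0.0f, 0.0f, 0.0f};
    for (const PointLight& l : lights) {
        Vec3 toLight = sub(l.position, g.position);
        float dist = std::sqrt(dot(toLight, toLight));
        if (dist >= l.radius) continue;  // light cannot reach this texel
        Vec3 dir{toLight.x / dist, toLight.y / dist, toLight.z / dist};
        float ndotl = std::fmax(0.0f, dot(g.normal, dir));
        float atten = 1.0f - dist / l.radius;  // simple linear falloff
        out.x += g.albedo.x * l.color.x * ndotl * atten;
        out.y += g.albedo.y * l.color.y * ndotl * atten;
        out.z += g.albedo.z * l.color.z * ndotl * atten;
    }
    return out;
}

int main() {
    GBufferTexel texel{{0, 0, 0}, {0, 1, 0}, {0.8f, 0.8f, 0.8f}};
    std::vector<PointLight> lights = {
        {{0, 2, 0}, {1, 1, 1}, 5.0f},        // white light overhead
        {{3, 1, 0}, {1, 0.5f, 0.2f}, 4.0f},  // warm light to the side
    };
    Vec3 c = shade(texel, lights);
    std::printf("shaded color: %.3f %.3f %.3f\n", c.x, c.y, c.z);
    return 0;
}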

Ming Architect

    Reputation: 3
Re: when multi GPU ?
« Reply #14 on: January 19, 2012, 05:30:46 pm »
Things are moving fast. I just stumbled upon this: http://stadia3d.com/

Stadia enables architects to generate interactive renderings straight from Revit model data.
It works like this: the architect sends the Revit models and gets back an .exe file with baked textures within minutes or hours. I think they found ways to automate it with Unity or Quest.

It seems the cloud is the big thing to come on the CPU rendering side. Multi-GPU is monetizable. People could start Lumion render farms, or how about a Lumion cloud? Geohow and many users would be very happy and could do some otherwise impossible animations in no time.

My high-quality V-Ray renderings take almost an hour per image. Imagine getting a 10-minute HD animation in the same timeframe once the Lumion setup is done, without going bankrupt. That's a real breakthrough for the whole industry!!! In an artist's hands, 10 minutes of Lumion animation would not make less money than one V-Ray image, and could possibly make a lot more ;)

Thanks,
Ming