
View Full Version : Maya 6.5 - Satellite Rendering


Koogle
02-01-2005, 09:46 PM
a great new feature ... finally

but I've noticed it isn't using much of the CPU power on my other networked PC. In fact, from the progress messages in the output window, it only does a few jobs (maybe 10-20), and the CPU percentage on the other PC barely goes up, while the PC I'm using does most of the work :(

my other PC is an XP2200 with 512MB, and the one I'm using is an XP2400 with 1GB RAM. The network is 100Mbit, with both PCs connected to a router...

are there any settings I can change? has anyone had better performance, or found solutions?

Thanks

TomD
02-02-2005, 12:36 PM
Is your router a hub or a switched router?
In many cases of network rendering, the network is the bottleneck.
In short, network hubs are less efficient than switches, especially when your scene stays on your host computer and is read over the network while rendering, with the frame also output over the network, creating a lot of traffic. I don't know exactly how Satellite works, but your question suggests there is no local caching on the satellite.

There are several ways to improve network speed:
1) replace hubs with switches - speed is still 100Mbit, but slightly more efficient.
2) if you only need to connect 2 (maybe 3) computers and you use WinXP, you can use FireWire, aka a 1394 connection, at least if one side is an unpowered FireWire port (the small one). Speed can be 400 or 800Mbit.
3) gigabit switches.

Oh, and consider a dedicated hard disk for the Maya scene and another hard disk for Windows' pagefile.

Suggestion number 2 should be the cheapest speed improvement if you already have FireWire installed; if not, go for suggestion number 3.
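To put rough numbers on the link options above, here is a back-of-the-envelope sketch. The 80% efficiency factor is an assumption (real throughput is always below the nominal line rate and varies by hardware); the point is just the relative difference between the links.

```python
def transfer_seconds(size_mb: float, link_mbit: float, efficiency: float = 0.8) -> float:
    """Rough seconds to move `size_mb` megabytes over a `link_mbit` Mbit/s link."""
    return (size_mb * 8) / (link_mbit * efficiency)

# Compare moving a 30 MB scene over the three options:
for name, mbit in [("100Mbit switch", 100), ("FireWire 400", 400), ("Gigabit", 1000)]:
    print(f"{name:>14}: {transfer_seconds(30, mbit):.2f} s")
```

At these scene sizes the initial transfer is seconds, not minutes - so for a one-off send, the link speed matters less than the constant tile traffic during rendering.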

Greetz

TomD

Koogle
02-02-2005, 02:22 PM
thanks for reply...

well, I haven't got a FireWire cable to try that out, and my router is switch-based.

one other thing: I've been talking to a friend who has the same network setup as me, but he uses 3dsmax and finalRender to do the same thing as satellite rendering. He says both his PCs' CPUs go to 100% until the scene has finished rendering, so I'm a bit confused as to why I wouldn't get the same results. I'm pretty sure there isn't a network bottleneck, as data is being sent/received between Maya on this machine and the satellite one. The other PC just doesn't do much to help render anything.

A selection from the output window, on a scene with about 47000 edges (192.168.0.8 is my other machine):

JOB 0.4 progr: 0.3% rendered on xp2400.4
JOB 0.3 progr: 0.6% rendered on xp2400.3
JOB 0.5 progr: 1.0% rendered on xp2400.5
JOB 0.4 progr: 1.3% rendered on xp2400.4
JOB 0.3 progr: 1.6% rendered on xp2400.3
JOB 0.4 progr: 2.0% rendered on xp2400.4
JOB 0.3 progr: 2.3% rendered on xp2400.3
JOB 0.4 progr: 2.6% rendered on xp2400.4
JOB 0.3 progr: 3.0% rendered on xp2400.3
JOB 0.n progr: 3.6% rendered on 192.168.0.8.0
JOB 0.4 progr: 3.3% rendered on xp2400.4
JOB 0.n progr: 4.6% rendered on 192.168.0.8.0
JOB 0.3 progr: 4.0% rendered on xp2400.3
JOB 0.4 progr: 4.3% rendered on xp2400.4
JOB 0.3 progr: 5.0% rendered on xp2400.3
JOB 0.n progr: 5.3% rendered on 192.168.0.8.0
JOB 0.4 progr: 5.6% rendered on xp2400.4
JOB 0.4 progr: 6.0% rendered on xp2400.4
JOB 0.n progr: 6.3% rendered on 192.168.0.8.0
JOB 0.4 progr: 6.6% rendered on xp2400.4
JOB 0.3 progr: 7.0% rendered on xp2400.3
JOB 0.n progr: 7.3% rendered on 192.168.0.8.0
JOB 0.4 progr: 7.6% rendered on xp2400.4
JOB 0.3 progr: 8.0% rendered on xp2400.3
JOB 0.4 progr: 8.3% rendered on xp2400.4
JOB 0.6 progr: 8.6% rendered on xp2400.6
JOB 0.4 progr: 9.0% rendered on xp2400.4
JOB 0.4 progr: 9.3% rendered on xp2400.4
JOB 0.6 progr: 9.6% rendered on xp2400.6
JOB 0.4 progr: 10.0% rendered on xp2400.4
JOB 0.6 progr: 10.3% rendered on xp2400.6
JOB 0.4 progr: 10.6% rendered on xp2400.4
JOB 0.6 progr: 11.0% rendered on xp2400.6
JOB 0.4 progr: 11.3% rendered on xp2400.4
JOB 0.6 progr: 11.6% rendered on xp2400.6
JOB 0.4 progr: 12.0% rendered on xp2400.4
JOB 0.6 progr: 12.3% rendered on xp2400.6
JOB 0.4 progr: 12.6% rendered on xp2400.4
JOB 0.4 progr: 13.0% rendered on xp2400.4
JOB 0.4 progr: 13.3% rendered on xp2400.4
JOB 0.6 progr: 13.6% rendered on xp2400.6
JOB 0.6 progr: 14.0% rendered on xp2400.6
JOB 0.4 progr: 14.3% rendered on xp2400.4
JOB 0.6 progr: 14.6% rendered on xp2400.6
JOB 0.6 progr: 15.0% rendered on xp2400.6
JOB 0.4 progr: 15.3% rendered on xp2400.4
JOB 0.6 progr: 15.6% rendered on xp2400.6
JOB 0.4 progr: 16.0% rendered on xp2400.4
JOB 0.6 progr: 16.3% rendered on xp2400.6
JOB 0.4 progr: 16.6% rendered on xp2400.4
JOB 0.6 progr: 17.0% rendered on xp2400.6
JOB 0.6 progr: 17.3% rendered on xp2400.6
JOB 0.4 progr: 17.6% rendered on xp2400.4
JOB 0.6 progr: 18.0% rendered on xp2400.6
JOB 0.4 progr: 18.3% rendered on xp2400.4
JOB 0.4 progr: 18.6% rendered on xp2400.4
JOB 0.6 progr: 19.0% rendered on xp2400.6
JOB 0.4 progr: 19.3% rendered on xp2400.4
JOB 0.6 progr: 19.6% rendered on xp2400.6
JOB 0.6 progr: 20.0% rendered on xp2400.6
JOB 0.4 progr: 20.3% rendered on xp2400.4
JOB 0.6 progr: 20.6% rendered on xp2400.6
JOB 0.4 progr: 21.0% rendered on xp2400.4
JOB 0.6 progr: 21.3% rendered on xp2400.6
JOB 0.4 progr: 21.6% rendered on xp2400.4
JOB 0.6 progr: 22.0% rendered on xp2400.6
JOB 0.4 progr: 22.3% rendered on xp2400.4
JOB 0.4 progr: 22.6% rendered on xp2400.4
JOB 0.6 progr: 23.0% rendered on xp2400.6
JOB 0.4 progr: 23.3% rendered on xp2400.4
JOB 0.6 progr: 23.6% rendered on xp2400.6
JOB 0.4 progr: 24.0% rendered on xp2400.4
JOB 0.6 progr: 24.3% rendered on xp2400.6
JOB 0.6 progr: 24.6% rendered on xp2400.6
JOB 0.4 progr: 25.0% rendered on xp2400.4
JOB 0.4 progr: 25.3% rendered on xp2400.4
JOB 0.6 progr: 25.6% rendered on xp2400.6
JOB 0.4 progr: 26.0% rendered on xp2400.4
JOB 0.6 progr: 26.3% rendered on xp2400.6
JOB 0.4 progr: 26.6% rendered on xp2400.4
JOB 0.6 progr: 27.0% rendered on xp2400.6
JOB 0.4 progr: 27.3% rendered on xp2400.4
JOB 0.6 progr: 27.6% rendered on xp2400.6
JOB 0.4 progr: 28.0% rendered on xp2400.4
JOB 0.6 progr: 28.3% rendered on xp2400.6
JOB 0.4 progr: 28.6% rendered on xp2400.4
IMG 0.4 progr: opening texture C:\tmp\Sky\diffuse.jpg, for reading
JOB 0.6 progr: 29.0% rendered on xp2400.6
JOB 0.3 progr: 29.3% rendered on xp2400.3
JOB 0.3 progr: 29.6% rendered on xp2400.3
JOB 0.3 progr: 30.0% rendered on xp2400.3
JOB 0.3 progr: 30.3% rendered on xp2400.3
JOB 0.3 progr: 30.6% rendered on xp2400.3
JOB 0.6 progr: 31.0% rendered on xp2400.6
JOB 0.3 progr: 31.3% rendered on xp2400.3
JOB 0.3 progr: 31.6% rendered on xp2400.3
JOB 0.3 progr: 32.0% rendered on xp2400.3
JOB 0.3 progr: 32.3% rendered on xp2400.3
JOB 0.4 progr: 32.6% rendered on xp2400.4
JOB 0.3 progr: 33.0% rendered on xp2400.3
JOB 0.4 progr: 33.3% rendered on xp2400.4
JOB 0.3 progr: 33.6% rendered on xp2400.3
JOB 0.4 progr: 34.0% rendered on xp2400.4
JOB 0.3 progr: 34.3% rendered on xp2400.3
JOB 0.3 progr: 34.6% rendered on xp2400.3
JOB 0.4 progr: 35.0% rendered on xp2400.4
JOB 0.3 progr: 35.3% rendered on xp2400.3
JOB 0.4 progr: 35.6% rendered on xp2400.4
JOB 0.3 progr: 36.0% rendered on xp2400.3
JOB 0.3 progr: 36.3% rendered on xp2400.3
JOB 0.4 progr: 36.6% rendered on xp2400.4
JOB 0.4 progr: 37.0% rendered on xp2400.4
JOB 0.3 progr: 37.3% rendered on xp2400.3
JOB 0.4 progr: 37.6% rendered on xp2400.4
JOB 0.3 progr: 38.0% rendered on xp2400.3
JOB 0.4 progr: 38.3% rendered on xp2400.4
JOB 0.3 progr: 38.6% rendered on xp2400.3
JOB 0.4 progr: 39.0% rendered on xp2400.4
JOB 0.3 progr: 39.3% rendered on xp2400.3
JOB 0.4 progr: 39.6% rendered on xp2400.4
JOB 0.3 progr: 40.0% rendered on xp2400.3
JOB 0.4 progr: 40.3% rendered on xp2400.4
JOB 0.3 progr: 40.6% rendered on xp2400.3
JOB 0.4 progr: 41.0% rendered on xp2400.4
JOB 0.3 progr: 41.3% rendered on xp2400.3
JOB 0.4 progr: 41.6% rendered on xp2400.4
JOB 0.4 progr: 42.0% rendered on xp2400.4
JOB 0.3 progr: 42.3% rendered on xp2400.3
JOB 0.3 progr: 42.6% rendered on xp2400.3
JOB 0.4 progr: 43.0% rendered on xp2400.4
JOB 0.3 progr: 43.3% rendered on xp2400.3
JOB 0.4 progr: 43.6% rendered on xp2400.4
JOB 0.3 progr: 44.0% rendered on xp2400.3
JOB 0.4 progr: 44.3% rendered on xp2400.4
JOB 0.3 progr: 44.6% rendered on xp2400.3
JOB 0.3 progr: 45.0% rendered on xp2400.3
JOB 0.4 progr: 45.3% rendered on xp2400.4
JOB 0.3 progr: 45.6% rendered on xp2400.3
JOB 0.4 progr: 46.0% rendered on xp2400.4
JOB 0.3 progr: 46.3% rendered on xp2400.3
JOB 0.4 progr: 46.6% rendered on xp2400.4
JOB 0.3 progr: 47.0% rendered on xp2400.3
JOB 0.4 progr: 47.3% rendered on xp2400.4
JOB 0.3 progr: 47.6% rendered on xp2400.3
JOB 0.4 progr: 48.0% rendered on xp2400.4
JOB 0.4 progr: 48.3% rendered on xp2400.4
JOB 0.3 progr: 48.6% rendered on xp2400.3
JOB 0.3 progr: 49.0% rendered on xp2400.3
JOB 0.4 progr: 49.3% rendered on xp2400.4
JOB 0.3 progr: 49.6% rendered on xp2400.3
JOB 0.4 progr: 50.0% rendered on xp2400.4
JOB 0.3 progr: 50.3% rendered on xp2400.3
JOB 0.4 progr: 50.6% rendered on xp2400.4
JOB 0.3 progr: 51.0% rendered on xp2400.3
JOB 0.4 progr: 51.3% rendered on xp2400.4
JOB 0.3 progr: 52.0% rendered on xp2400.3
JOB 0.4 progr: 51.6% rendered on xp2400.4
JOB 0.4 progr: 52.3% rendered on xp2400.4
JOB 0.3 progr: 52.6% rendered on xp2400.3
JOB 0.4 progr: 53.0% rendered on xp2400.4
JOB 0.3 progr: 53.3% rendered on xp2400.3
JOB 0.3 progr: 53.6% rendered on xp2400.3
JOB 0.4 progr: 54.0% rendered on xp2400.4
IMG 0.3 progr: opening texture C:\tmp\Sky\Hair_diffus.jpg, for reading
JOB 0.4 progr: 54.3% rendered on xp2400.4
JOB 0.4 progr: 54.6% rendered on xp2400.4
JOB 0.4 progr: 55.0% rendered on xp2400.4
JOB 0.3 progr: 55.3% rendered on xp2400.3
JOB 0.4 progr: 55.6% rendered on xp2400.4
JOB 0.3 progr: 56.0% rendered on xp2400.3
JOB 0.4 progr: 56.3% rendered on xp2400.4
JOB 0.3 progr: 56.6% rendered on xp2400.3
JOB 0.4 progr: 57.0% rendered on xp2400.4
JOB 0.3 progr: 57.3% rendered on xp2400.3
JOB 0.4 progr: 57.6% rendered on xp2400.4
JOB 0.3 progr: 58.0% rendered on xp2400.3
JOB 0.4 progr: 58.3% rendered on xp2400.4
JOB 0.3 progr: 58.6% rendered on xp2400.3
JOB 0.4 progr: 59.0% rendered on xp2400.4
JOB 0.3 progr: 59.3% rendered on xp2400.3
JOB 0.4 progr: 59.6% rendered on xp2400.4
JOB 0.3 progr: 60.0% rendered on xp2400.3
JOB 0.3 progr: 60.3% rendered on xp2400.3
JOB 0.4 progr: 60.6% rendered on xp2400.4
JOB 0.3 progr: 61.0% rendered on xp2400.3
JOB 0.4 progr: 61.3% rendered on xp2400.4
JOB 0.3 progr: 61.6% rendered on xp2400.3
JOB 0.4 progr: 62.0% rendered on xp2400.4
JOB 0.3 progr: 62.3% rendered on xp2400.3
JOB 0.4 progr: 62.6% rendered on xp2400.4
JOB 0.2 progr: 63.0% rendered on xp2400.2
JOB 0.4 progr: 63.3% rendered on xp2400.4
JOB 0.2 progr: 63.6% rendered on xp2400.2
JOB 0.4 progr: 64.0% rendered on xp2400.4
JOB 0.2 progr: 64.3% rendered on xp2400.2
JOB 0.4 progr: 64.6% rendered on xp2400.4
JOB 0.2 progr: 65.0% rendered on xp2400.2
JOB 0.2 progr: 65.3% rendered on xp2400.2
JOB 0.4 progr: 65.6% rendered on xp2400.4
JOB 0.2 progr: 66.0% rendered on xp2400.2
JOB 0.2 progr: 66.3% rendered on xp2400.2
JOB 0.4 progr: 66.6% rendered on xp2400.4
JOB 0.2 progr: 67.0% rendered on xp2400.2
JOB 0.4 progr: 67.3% rendered on xp2400.4
JOB 0.2 progr: 67.6% rendered on xp2400.2
JOB 0.4 progr: 68.0% rendered on xp2400.4
JOB 0.2 progr: 68.3% rendered on xp2400.2
JOB 0.4 progr: 68.6% rendered on xp2400.4
JOB 0.2 progr: 69.0% rendered on xp2400.2
JOB 0.4 progr: 69.3% rendered on xp2400.4
JOB 0.2 progr: 69.6% rendered on xp2400.2
JOB 0.4 progr: 70.0% rendered on xp2400.4
JOB 0.4 progr: 70.3% rendered on xp2400.4
JOB 0.2 progr: 70.6% rendered on xp2400.2
JOB 0.2 progr: 71.0% rendered on xp2400.2
JOB 0.4 progr: 71.3% rendered on xp2400.4
JOB 0.2 progr: 71.6% rendered on xp2400.2
JOB 0.4 progr: 72.0% rendered on xp2400.4
JOB 0.2 progr: 72.3% rendered on xp2400.2
JOB 0.4 progr: 72.6% rendered on xp2400.4
JOB 0.2 progr: 73.0% rendered on xp2400.2
JOB 0.4 progr: 73.3% rendered on xp2400.4
JOB 0.2 progr: 73.6% rendered on xp2400.2
JOB 0.4 progr: 74.0% rendered on xp2400.4
JOB 0.6 progr: 74.3% rendered on xp2400.6
JOB 0.4 progr: 74.6% rendered on xp2400.4
JOB 0.6 progr: 75.0% rendered on xp2400.6
JOB 0.4 progr: 75.3% rendered on xp2400.4
JOB 0.6 progr: 75.6% rendered on xp2400.6
JOB 0.6 progr: 76.0% rendered on xp2400.6
JOB 0.4 progr: 76.3% rendered on xp2400.4
JOB 0.6 progr: 76.6% rendered on xp2400.6
JOB 0.4 progr: 77.0% rendered on xp2400.4
JOB 0.6 progr: 77.3% rendered on xp2400.6
JOB 0.4 progr: 77.6% rendered on xp2400.4
JOB 0.6 progr: 78.0% rendered on xp2400.6
JOB 0.4 progr: 78.3% rendered on xp2400.4
JOB 0.6 progr: 78.6% rendered on xp2400.6
JOB 0.4 progr: 79.0% rendered on xp2400.4
JOB 0.6 progr: 79.3% rendered on xp2400.6
JOB 0.4 progr: 79.6% rendered on xp2400.4
JOB 0.6 progr: 80.0% rendered on xp2400.6
JOB 0.6 progr: 80.3% rendered on xp2400.6
JOB 0.4 progr: 80.6% rendered on xp2400.4
JOB 0.6 progr: 81.0% rendered on xp2400.6
JOB 0.4 progr: 81.3% rendered on xp2400.4
JOB 0.6 progr: 81.6% rendered on xp2400.6
JOB 0.4 progr: 82.0% rendered on xp2400.4
JOB 0.6 progr: 82.3% rendered on xp2400.6
JOB 0.6 progr: 82.6% rendered on xp2400.6
JOB 0.4 progr: 83.0% rendered on xp2400.4
JOB 0.6 progr: 83.3% rendered on xp2400.6
JOB 0.4 progr: 83.6% rendered on xp2400.4
JOB 0.6 progr: 84.0% rendered on xp2400.6
JOB 0.4 progr: 84.3% rendered on xp2400.4
JOB 0.6 progr: 84.6% rendered on xp2400.6
JOB 0.4 progr: 85.0% rendered on xp2400.4
JOB 0.6 progr: 85.3% rendered on xp2400.6
JOB 0.4 progr: 85.6% rendered on xp2400.4
JOB 0.6 progr: 86.0% rendered on xp2400.6
JOB 0.4 progr: 86.3% rendered on xp2400.4
JOB 0.6 progr: 86.6% rendered on xp2400.6
JOB 0.4 progr: 87.0% rendered on xp2400.4
JOB 0.4 progr: 87.3% rendered on xp2400.4
JOB 0.6 progr: 87.6% rendered on xp2400.6
JOB 0.4 progr: 88.0% rendered on xp2400.4
JOB 0.6 progr: 88.3% rendered on xp2400.6
JOB 0.4 progr: 88.6% rendered on xp2400.4
JOB 0.6 progr: 89.0% rendered on xp2400.6
JOB 0.4 progr: 89.3% rendered on xp2400.4
JOB 0.6 progr: 89.6% rendered on xp2400.6
JOB 0.4 progr: 90.0% rendered on xp2400.4
JOB 0.6 progr: 90.3% rendered on xp2400.6
JOB 0.4 progr: 90.6% rendered on xp2400.4
JOB 0.2 progr: 91.0% rendered on xp2400.2
JOB 0.4 progr: 91.3% rendered on xp2400.4
JOB 0.2 progr: 91.6% rendered on xp2400.2
JOB 0.4 progr: 92.0% rendered on xp2400.4
JOB 0.4 progr: 92.3% rendered on xp2400.4
JOB 0.2 progr: 92.6% rendered on xp2400.2
JOB 0.4 progr: 93.0% rendered on xp2400.4
JOB 0.2 progr: 93.3% rendered on xp2400.2
JOB 0.4 progr: 93.6% rendered on xp2400.4
JOB 0.4 progr: 94.0% rendered on xp2400.4
JOB 0.2 progr: 94.3% rendered on xp2400.2
JOB 0.4 progr: 94.6% rendered on xp2400.4
JOB 0.2 progr: 95.0% rendered on xp2400.2
JOB 0.2 progr: 95.3% rendered on xp2400.2
JOB 0.4 progr: 95.6% rendered on xp2400.4
JOB 0.2 progr: 96.0% rendered on xp2400.2
JOB 0.4 progr: 96.3% rendered on xp2400.4
JOB 0.2 progr: 96.6% rendered on xp2400.2
JOB 0.4 progr: 97.0% rendered on xp2400.4
JOB 0.2 progr: 97.3% rendered on xp2400.2
JOB 0.4 progr: 97.6% rendered on xp2400.4
JOB 0.2 progr: 98.0% rendered on xp2400.2
JOB 0.4 progr: 98.3% rendered on xp2400.4
JOB 0.4 progr: 98.6% rendered on xp2400.4
JOB 0.2 progr: 99.0% rendered on xp2400.2
JOB 0.2 progr: 99.3% rendered on xp2400.2
JOB 0.4 progr: 99.6% rendered on xp2400.4
JOB 0.n progr: 100.0% rendered on 192.168.0.8.0

well, I'm still waiting to hear what other people have gotten out of Maya's new satellite rendering setup :/

Koogle
02-04-2005, 11:26 PM
Has anyone had any better performance yet?

Steve McRae
02-05-2005, 12:57 AM
I would love to see what would happen over a wireless network :D

floze
02-05-2005, 01:52 PM
To give you guys a little hint:
Increase the 'Task Size' in the 'Memory and Performance' tab of your Render Globals to at least 64, or even better 128. This gives you fewer render tiles for preview, but much better performance on network-distributed renderings.
And I would not recommend using wireless setups, because sending the scene over at the start of the render can take forever. Also, set the network adapter's 'Optimization' option to 'Throughput', not 'CPU'.
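A toy illustration of why the task size tip helps: mental ray splits the frame into square tiles of roughly that size, so a larger task size means far fewer tiles and far fewer network round-trips per frame. The tile-counting formula here is an assumption based on that description, not the renderer's exact scheduler.

```python
import math

def tile_count(width: int, height: int, task_size: int) -> int:
    """Approximate number of render tiles for a frame split into task_size squares."""
    return math.ceil(width / task_size) * math.ceil(height / task_size)

# A 640x480 frame at a small vs. the suggested task size:
print(tile_count(640, 480, 32))   # many small tiles, lots of network chatter
print(tile_count(640, 480, 128))  # a handful of big tiles
```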

Koogle
02-05-2005, 10:57 PM
thanks for the reply floze.. I got the task size bit sorted, but I can't figure out where the network adapter option is in Maya. I also checked the Windows network adapter settings for anything like that, but didn't find it.

anyway, performance with just the task size at 128 didn't make a difference; the other CPU only took a tiny few parts in comparison.

floze
02-05-2005, 11:22 PM
thanks for the reply floze.. I got the task size bit sorted, but I can't figure out where the network adapter option is in Maya. I also checked the Windows network adapter settings for anything like that, but didn't find it.

anyway, performance with just the task size at 128 didn't make a difference; the other CPU only took a tiny few parts in comparison.
Uhmm.. the network adapter setting is actually located in the network adapter's driver configuration, not in Maya.

You don't render a single sphere, do you? Rendering very simple stuff takes longer over the network. If things get heavy - high sampling values, lens shaders, FG, etc. - distributed rendering boosts your speed by up to the factor the other CPUs add compared to your master machine. For example, if the master machine has 2GHz and you have two other 2GHz slave machines attached, the rendering will be almost three times faster.
However, distributed final gathering doesn't get that kind of speedup (because it's calculated in very small tasks, I guess).

Koogle
02-05-2005, 11:53 PM
You don't render a single sphere, do you?

hahah.. ah, nope I don't :) .. I've used the same scene the whole time; it's the one from the log output in my previous reply. It's a fairly heavy scene - no final gathering either.

anyway, I still can't find that exact network configuration... to be honest, I don't even think it has anything to do with my network setup.

well, I really hope I can find a way to get Maya to split the workload evenly, because it's a really nice way of working on a single image.

At the moment it seems like a wasted feature, or just a quick thing Alias chucked in without optimizing it... I'll hold that view until someone can show evidence of it working better than mine, and then I'll want to know how :)

neutronic
02-06-2005, 01:18 AM
At the moment it seems like a wasted feature or just a quick thing Alias chucked in without optimizing it... I'll hold that view until someone can show evidence of it working better than mine, and then i'll want to know how

man, mental images literally invented distributed shared-database technology. Half the world has been working with it for ages. It's just the latest Alias-licensed feature, not really new tech.. and you're just the last to arrive.


and who said, for example, that what you see in the log is actually meaningful for understanding CPU usage? mental ray does its own task splitting... take care.
JOB 0.2 progr: 0.1% computing final gather points on masterone.2
JOB 0.3 progr: 0.2% computing final gather points on masterone.3
JOB 0.n progr: 0.3% computing final gather points on moody.0
JOB 0.n progr: 0.4% computing final gather points on moody.0
JOB 0.2 progr: 0.6% computing final gather points on masterone.2
JOB 0.n progr: 0.7% computing final gather points on moody.0
JOB 0.4 progr: 0.8% computing final gather points on masterone.4
JOB 0.3 progr: 0.9% computing final gather points on masterone.3
JOB 0.n progr: 1.0% computing final gather points on moody.0
JOB 0.4 progr: 1.2% computing final gather points on masterone.4
JOB 0.2 progr: 1.3% computing final gather points on masterone.2
JOB 0.n progr: 1.4% computing final gather points on moody.0
JOB 0.2 progr: 1.5% computing final gather points on masterone.2
JOB 0.3 progr: 1.7% computing final gather points on masterone.3
JOB 0.n progr: 1.8% computing final gather points on moody.0
JOB 0.4 progr: 1.9% computing final gather points on masterone.4
JOB 0.3 progr: 2.0% computing final gather points on masterone.3
JOB 0.2 progr: 2.1% computing final gather points on masterone.2
JOB 0.n progr: 2.3% computing final gather points on moody.0
JOB 0.2 progr: 2.4% computing final gather points on masterone.2
JOB 0.n progr: 2.5% computing final gather points on moody.0
JOB 0.n progr: 2.6% computing final gather points on moody.0
JOB 0.4 progr: 2.8% computing final gather points on masterone.4
JOB 0.3 progr: 2.9% computing final gather points on masterone.3
JOB 0.n progr: 3.0% computing final gather points on moody.0
JOB 0.2 progr: 3.1% computing final gather points on masterone.2
JOB 0.n progr: 3.2% computing final gather points on moody.0
JOB 0.4 progr: 3.4% computing final gather points on masterone.4
JOB 0.3 progr: 3.5% computing final gather points on masterone.3
JOB 0.2 progr: 3.6% computing final gather points on masterone.2
JOB 0.n progr: 3.7% computing final gather points on moody.0
JOB 0.3 progr: 3.9% computing final gather points on masterone.3
JOB 0.n progr: 4.0% computing final gather points on moody.0
JOB 0.2 progr: 4.1% computing final gather points on masterone.2
JOB 0.3 progr: 4.2% computing final gather points on masterone.3
JOB 0.n progr: 4.3% computing final gather points on moody.0
JOB 0.3 progr: 4.5% computing final gather points on masterone.3
JOB 0.4 progr: 4.6% computing final gather points on masterone.4
JOB 0.2 progr: 4.7% computing final gather points on masterone.2
JOB 0.3 progr: 4.8% computing final gather points on masterone.3
JOB 0.n progr: 5.0% computing final gather points on moody.0
JOB 0.n progr: 5.1% computing final gather points on moody.0
JOB 0.2 progr: 5.2% computing final gather points on masterone.2
JOB 0.2 progr: 5.3% computing final gather points on masterone.2
JOB 0.2 progr: 5.4% computing final gather points on masterone.2
JOB 0.5 progr: 5.6% computing final gather points on masterone.5
JOB 0.n progr: 5.7% computing final gather points on moody.0
JOB 0.2 progr: 5.8% computing final gather points on masterone.2
JOB 0.3 progr: 5.9% computing final gather points on masterone.3
JOB 0.n progr: 6.1% computing final gather points on moody.0
JOB 0.2 progr: 6.2% computing final gather points on masterone.2
JOB 0.2 progr: 6.3% computing final gather points on masterone.2
JOB 0.3 progr: 6.4% computing final gather points on masterone.3
JOB 0.2 progr: 6.5% computing final gather points on masterone.2
JOB 0.2 progr: 6.7% computing final gather points on masterone.2
JOB 0.2 progr: 6.8% computing final gather points on masterone.2
JOB 0.2 progr: 6.9% computing final gather points on masterone.2
JOB 0.n progr: 7.0% computing final gather points on moody.0
JOB 0.3 progr: 7.2% computing final gather points on masterone.3
JOB 0.2 progr: 7.3% computing final gather points on masterone.2
JOB 0.2 progr: 7.4% computing final gather points on masterone.2
JOB 0.2 progr: 7.5% computing final gather points on masterone.2
JOB 0.n progr: 7.6% computing final gather points on moody.0
JOB 0.2 progr: 7.8% computing final gather points on masterone.2
JOB 0.2 progr: 7.9% computing final gather points on masterone.2
JOB 0.4 progr: 8.0% computing final gather points on masterone.4
JOB 0.n progr: 8.1% computing final gather points on moody.0
JOB 0.2 progr: 8.3% computing final gather points on masterone.2
JOB 0.2 progr: 8.4% computing final gather points on masterone.2
JOB 0.3 progr: 8.5% computing final gather points on masterone.3
JOB 0.2 progr: 8.6% computing final gather points on masterone.2
JOB 0.n progr: 8.7% computing final gather points on moody.0
JOB 0.2 progr: 8.9% computing final gather points on masterone.2
JOB 0.4 progr: 9.0% computing final gather points on masterone.4
JOB 0.2 progr: 9.1% computing final gather points on masterone.2
JOB 0.n progr: 9.2% computing final gather points on moody.0
JOB 0.2 progr: 9.4% computing final gather points on masterone.2
JOB 0.2 progr: 9.5% computing final gather points on masterone.2
JOB 0.n progr: 9.6% computing final gather points on moody.0



Jozvex
02-08-2005, 06:47 AM
And I would not recommend using wireless setups, because sending the scene over at the start of the render can take forever.

Hehe, yes it's true. I'm on a wireless (802.11g) network, and at a max of 54Mbit it takes a few minutes to send a 30MB scene file and its textures across for rendering with Satellite. Once the scene files are sent over, though, my other computer soon reaches 90% CPU usage and stays at that level pretty solidly.

However, the other computer only seems to kick in once 60% of the scene has already been rendered by my main computer.

Koogle
02-08-2005, 10:43 AM
I just don't get it... how come both of you guys are getting much better results than me?

jooki
02-09-2005, 09:12 PM
Another thing worth trying is the "nomaster" flag.
If "nomaster" is on, mental ray "tries" to allocate as many render jobs as possible to the slaves.
You can set it through:
- for interactive rendering: menu -> Render -> Render Current Frame option box -> turn off "Render on this machine"
- for batch rendering: menu -> Render -> Batch Render option box -> turn off "Render on this machine"

It will increase slave usage, but the master will still do some work.
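A toy sketch of the allocation idea described above. The scheduler names and round-robin logic here are purely illustrative - mental ray's real job scheduler is more sophisticated, and even with "nomaster" the master still takes some work, which this simplified model doesn't capture.

```python
def assign_jobs(n_jobs: int, slaves: list, nomaster: bool = False) -> dict:
    """Round-robin render jobs across hosts; skip the master when nomaster is set."""
    hosts = slaves if nomaster else ["master"] + slaves
    counts = {h: 0 for h in ["master"] + slaves}
    for i in range(n_jobs):
        counts[hosts[i % len(hosts)]] += 1
    return counts

# With the master participating vs. handing everything to the slave:
print(assign_jobs(300, ["192.168.0.8"], nomaster=False))
print(assign_jobs(300, ["192.168.0.8"], nomaster=True))
```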

Steve McRae
02-09-2005, 09:51 PM
all very cool - I can't wait till my copy ships . . .

mverta
02-13-2005, 04:05 PM
You know what would be cool? If there were a little checkbox in the Render Globals that said "Satellite Render", so when you click Render in the Render View, it just goes out to the network and returns the results to the window.

Seems to me that's how it should work: an integration that feels like you've simply got "x" number of additional processors on board.

_Mike

francescaluce
02-13-2005, 07:21 PM
You know what would be cool? If there was a little checkbox in the Render Globals that said, "Satellite Render", so when you click Render in the Render View, it just goes out to the network, and returns the results to the window.

that's exactly how it is.. but instead of your little checkbox, you just have to type in the names of your network computers.

Jozvex
02-13-2005, 08:28 PM
You know what would be cool? If there was a little checkbox in the Render Globals that said, "Satellite Render", so when you click Render in the Render View, it just goes out to the network, and returns the results to the window.

Seems to me that's how it should work. An integration that feels like you've just got "x" number of additional processors on board.

_Mike

Yep, that's basically how it is! For any rendering you do, the other computers just automatically help.

ali-rahimi-shahmirzadi
09-14-2005, 08:07 PM
I have the same problem, but the important thing is that before Maya Satellite 7, I used Maya 6.5 + MR standalone 3.4 and everything was OK, but now... it's a problem. So there must be a problem with Satellite. Any ideas?

CGTalk Moderation
09-14-2005, 08:07 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.