This is actually my first post on this website and, believe it or not, it feels good. I've been a member for quite some time, but I never had a chance to post anything as I had no idea what to say anyway. To tell you the truth, I find you guys (CGSOCIETY) the best out there when it comes to CG stuff. I've seen a lot of crazy CG pictures of medals, objects, and many other weird things, and all I can say is "WOW!". It's just amazing to see so much talent. And that's not just from here in Canada or the US, but from all around the world. So many new ideas... I don't know, man, it's just awesome.
Now onto my question.
To clarify one thing: I am not into CG, nor do I really understand it, but I had an argument with my friend about CG rendering (for movies). He is the type of person who, if he does something once or just reads about it, becomes "Mr. Know-it-all". But that's not my problem anyway. All I want is to hear your opinion or get an answer (or explanation, if possible) on how this really works and how long it takes. I hope you guys can help me!
First of all, I am not stupid; I know it takes a long time, because the CG we see in movies is just ridiculously detailed and therefore needs a lot of power to render. But hear me out as I explain what I can make of it using my own understanding and logic (which may be limited by my knowledge of the subject, so please excuse me).
Let's take a look at the latest games, for instance. They look good, but no matter how good they look, they still need power to be rendered in real time, right? Now, if we were to compare my PC's specs with, say, my friend's PC's specs, his PC would outperform mine in no time. Thus, the same game (for instance, Half-Life 2) at the same resolution would run much better on his PC. Correct?
Now, if we were to take any crazy CG from a movie (any recent movie) and render those graphics on hardware/software appropriate for the job, it would render in maybe a few days or even weeks. Right? But what I am trying to point out is this: if a company chooses to render it on a lower-end model of those servers rather than the highest-end model, it would take much more time to get the job done. Right?
Now at this point my friend did not want to agree with me on this. Maybe I was not clear enough, but whatever.
The other thing I was pointing out is this: he claims that it takes up to 20 days to render one single second. Now that is crazy, if you ask me. But let me explain something. Suppose I were a Hollywood company making a movie. If money were no object (as it always is, though), I would go with the latest technology and utilize as much of it as I could. Let's say we have a whole cluster of servers in front of us, and that setup is worth hundreds of millions of dollars, or even a billion dollars. Now my question is: if I had that much power to play with and used it all at once on one single task (rendering that 1 second), shouldn't it take me less than 20 days to get the job done? I understand that if you spread the work around, it may take 20 days to render one second, but the question is: how many "seconds" are being rendered at the same time?
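The parallelism intuition here can be sketched with some back-of-the-envelope arithmetic. Frames of a film are independent of each other, so a render farm can hand different frames to different machines at once. The numbers below (20 hours per frame, 24 frames per second of film) are hypothetical, chosen only so that one machine works out to roughly the "20 days per second" figure under discussion:

```python
import math

FRAMES_PER_SECOND = 24   # standard film frame rate
HOURS_PER_FRAME = 20     # hypothetical: one frame takes 20 hours on one machine

def wall_clock_hours(num_machines: int) -> float:
    """Frames render independently, so machines work on different frames
    in parallel; wall-clock time is the number of 'waves' of frames the
    farm must process, times the per-frame render time."""
    waves = math.ceil(FRAMES_PER_SECOND / num_machines)
    return waves * HOURS_PER_FRAME

# One machine: 24 frames x 20 h = 480 h, i.e. 20 days for one second of film.
print(wall_clock_hours(1))    # 480
# 24 machines: every frame renders at once, so about 20 hours, not 20 days.
print(wall_clock_hours(24))   # 20
```

So both claims can be true at once: a *single* machine may well need ~20 days per second of footage, while a large farm cuts the wall-clock time roughly in proportion to its machine count.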
Do you see what I am trying to say/understand? Can someone just explain to me how this thing works? I am not asking for books and such, just your opinion or a piece of your knowledge that you might want to share with me/us, so that I (and some of us) can understand this better.