
How soon before home PCs will be able to render CGI-movie-quality games?

Theodore Kurita

What I mean is: how much longer will it be before home PCs or consoles can render, in real time, ultra-high-end graphics in games that are on par with movies like Final Fantasy Advent Children or Final Fantasy Spirits Within?

I have been mulling this over for some time, and I think the technology will be available within the next two decades.

There is some heated debate in the gaming community about this.

Most of it centers on the wonderful new console coming out in the next two years, known as the "PlayStation 3".

It integrates a new processor technology called "Cell", which basically makes your home console a parallel computer. EA, along with a large number of other gaming companies, seems to have really high expectations for what the system can produce.
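
As a rough illustration of what "a parallel computer in your console" buys you, here is a generic Python sketch of splitting per-row rendering work across cores; the per-pixel work is a made-up stand-in, and nothing here is Cell-specific.

```python
# Toy sketch: split a frame's rows across worker processes.
# The per-pixel "work" is a placeholder, not real shading and not Cell code.
from multiprocessing import Pool

def shade_row(y, width=640):
    # Placeholder per-pixel computation; a real renderer would do lighting here.
    return [(x * y) % 256 for x in range(width)]

if __name__ == "__main__":
    height = 480
    with Pool() as pool:                    # one worker per available CPU core
        frame = pool.map(shade_row, range(height))
    print(len(frame), "rows computed in parallel")
```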

This same "Cell" technology will be available and mass-produced within the next couple of years.

So the question boils down to, again: how long do you think it will be?
 
It will never happen: CGI movies will always look better than games, simply because they have more time to render and can thus spend more time on detail.

Perhaps what you meant to ask is: when will home PCs be able to render the quality of today's CGI movies?
 
thatguywhojuggles said:
It will never happen: CGI movies will always look better than games, simply because they have more time to render and can thus spend more time on detail.

Sorry, but this could happen one day: the human eye has limited resolution, so above a certain level of detail the eye will not notice any further improvement.

But I think it is still some time off before the human eye will not notice any difference between computed images and reality.

Carn
 
Carn said:
Sorry, but this could happen one day: the human eye has limited resolution, so above a certain level of detail the eye will not notice any further improvement.

But I think it is still some time off before the human eye will not notice any difference between computed images and reality.

Carn

I still believe that, given more time to render, a CGI movie animator would find ways to improve the movie's quality.
 
Some interesting data to consider.....

You are surely aware of Moore's law, which indicates that the power of PC CPUs will double every 18 months. We've been holding to that rule pretty well for quite a while now.

There is another correlation that has held equally long, but I've yet to see it mentioned in the media. Maybe we should call it Scott's Interval (law). Actually, I'm sure somebody must have noticed this and named it already. Anyway... there is about a 14-year gap between the performance of the fastest machine on the planet and what is available as a high-end desktop PC.

If that little correlation continues to hold, we should see Blue Gene type power on our desktops by 2020.
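
A rough back-of-the-envelope sketch of how those two rules combine, taking the 18-month doubling and the 14-year lag above at face value (my arithmetic, not hard data):

```python
# Back-of-the-envelope extrapolation using the two rules stated above.
doubling_period_years = 1.5   # Moore's law as stated: doubling every 18 months
lag_years = 14                # "Scott's Interval": supercomputer-to-desktop lag

speedup = 2 ** (lag_years / doubling_period_years)
print(f"In {lag_years} years a desktop should be ~{speedup:,.0f}x faster,")
print("i.e. roughly today's supercomputer class, around 2019-2020.")
```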

We are starting to see a few games today with really accurate physics engines driving the animation. (See Half-Life 2, the latest Unreal Tournament, and the latest Burnout for examples.)

Photorealistic rendering is a product of ray-tracing technology. That is too time-consuming to do in real time today. However, it is a problem that really can take advantage of massively parallel processing, and that is probably how the future of ever-increasing desktop performance will unfold. Intel's Hyper-Threading CPUs are a pseudo stab at that, and true dual-core CPUs are entering the market now. It is also conceivable that a future generation of 3D-accelerated video cards will have hardware ray-tracing support.
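
To give a feel for why ray tracing is so expensive and why it parallelizes so well, here is a bare-bones sketch: one primary ray per pixel, tested against a single sphere, with every ray independent of every other. The scene and numbers are purely illustrative.

```python
# Bare-bones ray-tracing sketch: one primary ray per pixel against one sphere.
# Real renderers add shadow, reflection and refraction rays on top, multiplying
# the per-pixel cost; since every ray is independent, the work splits cleanly
# across many processors.

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t; a non-negative
    # discriminant means the ray's line meets the sphere (hits behind the
    # camera are ignored for brevity).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4.0 * c >= 0.0

width, height = 640, 480
center, radius = (0.0, 0.0, -5.0), 1.0
hits = 0
for y in range(height):
    for x in range(width):
        direction = normalize((x / width - 0.5, y / height - 0.5, -1.0))
        if ray_hits_sphere((0.0, 0.0, 0.0), direction, center, radius):
            hits += 1
print(f"{width * height} primary rays cast, {hits} hit the sphere")
```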

Anyway, looking at NEC's current Earth Simulator machine and IBM's Blue Gene, they would appear to have the processing power to run a full-physics, ray-tracing game engine.

So, my GUESS: 14 years or less.
 
scotth said:
So, my GUESS: 14 years or less.

And it will fit on your keychain.

I have noticed that in the theater, computer graphics are okay. But put the same movie on a TV screen and they look like cartoons. Not impressive at all. For a good example, play the scene in Matrix 2 where Neo is fighting a couple hundred Agents simultaneously. It looks like a Tom and Jerry cartoon.

I notice this during commercials for new releases as well. Any movie trailer with computer-generated special effects looks like crap. Almost as bad as the plastic model ships in a bathtub in the old Godzilla movies.

This seems counterintuitive to me. It would seem that the weaknesses of animation would be more evident when blown up on a big screen. But maybe it is because you can't take in the whole picture with your eye in the theater, whereas you can on a TV screen.
 
scotth said:

You are surely aware of Moore's law, which indicates that the power of PC CPUs will double every 18 months. We've been holding to that rule pretty well for quite a while now.

There is another correlation that has held equally long, but I've yet to see it mentioned in the media. Maybe we should call it Scott's Interval (law). Actually, I'm sure somebody must have noticed this and named it already. Anyway... there is about a 14-year gap between the performance of the fastest machine on the planet and what is available as a high-end desktop PC.


Extrapolating is very uncertain; if we continued today's growth rate, we would have well over 100 billion humans in 2100. But that will not happen unless we terraform Mars, learn to make efficient use of the deep sea, or develop some kind of wormhole travel.
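
Just to illustrate how sensitive that kind of extrapolation is to the assumed rate, a quick sketch with rough numbers of my own:

```python
# Compound-growth sketch: the same formula gives wildly different 2100
# populations depending on the assumed annual growth rate (rough figures).
population_2005 = 6.5e9
for rate in (0.012, 0.02, 0.03):   # 1.2%, 2%, 3% annual growth
    pop_2100 = population_2005 * (1 + rate) ** 95
    print(f"{rate:.1%} growth -> {pop_2100 / 1e9:.0f} billion people in 2100")
```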

There are always limits which do not affect something until it reaches a certain scale. With human population growth, those limits are obviously the available living space and the amount of food the Earth can produce (even genetic engineering can only push this so far).

With computers, the limits are quantum mechanics and special relativity. For a computer to process something, information has to get from somewhere to the processor, and then the result of the calculation has to go somewhere again. The speed of light limits how large those distances can be for a desired computing power. The only solution is to reduce the distances and make everything smaller, but that means the currents carrying the information get smaller too. And when the number of electrons carrying the information is no longer very large, quantum effects suddenly become very important: there is always a chance of quantum mechanics giving you a slightly smaller or larger current, and if that happens too often your computer starts producing errors, as some 1s look like 0s.
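
For a quick sense of scale on the speed-of-light point (rough numbers of my own, assuming signals travel at light speed, which real chips don't reach):

```python
# How far a signal could possibly travel during one clock cycle.
c = 3.0e8                                     # speed of light in m/s
for clock_ghz in (3, 10, 100):
    cycle_time = 1.0 / (clock_ghz * 1e9)      # seconds per clock cycle
    max_distance_cm = c * cycle_time * 100
    print(f"{clock_ghz} GHz: at most ~{max_distance_cm:.1f} cm per cycle")
```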

To get around these problems a completely new technique is necessary; both optical computers and quantum computers have the potential to do better than current-based computers, but even they have limits.

Carn
 
Carn.... I'm quite aware of all those concerns.

That doesn't change the fact that there are machines on the planet today that are probably capable of doing this type of gameplay in real time. They got that fast by being massively parallel. That movement is coming to desktop PCs.

Again, we have machines on this planet today, using neither quantum nor optical computing techniques, that can solve this problem. And historically, there has been about a 14-year gap between the fastest machine on the planet and the top-of-the-line desktop machine.

Will this really pan out? I don't know.

It will not require an all-new processing paradigm to solve.
 
Well, to give you all an idea of what is expected of the PlayStation 3 graphically, look at the screenshot below.

EA expects to be able to produce these graphics on a PS3. I have to say that this image looks extremely realistic:

[screenshot: madden.jpg]
 
Theodore Kurita said:
I have to say that this image looks extremely realistic:

Indeed, but I'll wait to see what they are going to do with human motion capture and digitizing. IMO this is where the greatest problem will be. Sports games today are supposed to be top of the line as far as motion simulation is concerned, yet they still suck realism-wise.
 
El Greco said:
Indeed, but I'll wait to see what they are going to do with human motion capture and digitizing. IMO this is where the greatest problem will be. Sports games today are supposed to be top of the line as far as motion simulation is concerned, yet they still suck realism-wise.

True. In a sports game or a fighting game, you can stuff a huge number of polygons into a single model.

However, when it comes to first-person shooters, action-adventure games, and RPGs, there are generally lots of characters and lots of action. That in turn forces a lower polygon count, and the realism is taken away.

However, I expect game developers to start using engines like the Unreal 3.0 engine, which allows for incredibly detailed characters with low polygon counts. It is hard to distinguish between a 2-million-polygon model and a 7,000-polygon model in that engine. The engine is basically a meshing engine, so it relies on things like vertex and pixel shaders to create some of the effects.
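
For what it's worth, I believe the usual trick behind detailed characters at low polygon counts is normal mapping: per-pixel lighting uses a normal baked into a texture from the high-poly model instead of the geometry of the low-poly mesh. A very rough sketch of the idea in plain Python (not actual engine or shader code):

```python
# Rough normal-mapping sketch: shading uses a normal stored in a texture
# (baked from a high-poly model), not the flat normal of the low-poly face,
# so surface detail survives the polygon-count cut.

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def diffuse(normal, light_dir):
    # Lambertian term: brightness depends only on the normal used for shading.
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

light = (0.5, 0.5, 1.0)
flat_normal = (0.0, 0.0, 1.0)     # the single normal of a flat low-poly face
baked_normal = (0.3, 0.1, 0.95)   # per-pixel normal baked from the high-poly model

print(f"flat face:           {diffuse(flat_normal, light):.2f}")
print(f"normal-mapped pixel: {diffuse(baked_normal, light):.2f}")
```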

Also, the only difference you can tell between a 2-million-polygon model and a 7,000-polygon model is that you can see the mesh edges on the 7,000-polygon one. But in order to notice this, you would have to either:

A: have watched and played games long enough to notice fairly subtle things like that,

or

B: be paused in the game and staring at a character.
 
But I'm not talking about the quality or detail of the moving sprite; I'm talking about the quality of the motion itself. Even a terribly pixelated, low-quality video of a human looks natural, while I've yet to see an engine that simulates human movement with even a barely acceptable level of realism. It's not (only) a matter of computational power; it's that it would take an extremely advanced AI to mimic the details of human movement: a finger moving just a bit, an eye twitching, the coordinated movement of the whole body. I think we will have photorealistic sprites relatively soon, only they will keep moving like robots for a much longer time.
 
This seems counterintuitive to me. It would seem that the weaknesses of animation would be more evident when blown up on a big screen. But maybe it is because you can't take in the whole picture with your eye in the theater, whereas you can on a TV screen.

No, I think there is another reason. I once saw an interview with a very famous Danish director who had worked in TV as well as cinema. He said that you could do a lot of things in a cinema movie that you would never get away with on TV, because in the cinema you have the audience's full attention. In the cinema you are isolated, the lights are dimmed, and there are virtually no distractions (apart from some kissing in the back row and rattling of candy bags :D ), so it is much easier to "fool" people, whereas when people watch TV, reality is all around them. The dog is barking, the kids make noise, the coffee has to be made, and there's always a phone call from your mother at the most critical moment in the movie.

I have seen a lot of movies on TV that I first saw in the cinema, and I sometimes shake my head in disbelief at the special effects. "From Russia With Love", for instance, is still a good movie, but the effects are truly horrible... Yet back then, in the cinema, I never noticed. ;) You slip into "Fantasyland" much more easily when you are in a cinema.
 
