View Full Version : Real Time Rendering of Final Fantasy on a GF4?

20th June 2002, 05:00
I find this hard to believe:


20th June 2002, 06:13
Me too. This has popped up several times, and each time the rendering looks like shit.

Note that they refer to some mystical "demo" here, but provide no link for those of us with kickass systems to try it out.


- Gurm

20th June 2002, 06:58


Something smells fishy about this article.


20th June 2002, 09:09
I think that article tells the truth and a falsehood at the same time...

I'll explain this better:
1) It is a demo, not the original work.
So even if it looks the same, we can be sure it's full of optimizations: reflection maps instead of real reflections, for example.

2) Notably, the article never says that it renders the entire film, or even entire scenes. Rather, I understand that it renders a single character or environment. And that is rather different from rendering the real physics and so on...

3) It's not the GF4 that is rendering... or rather, it's not only the GF4!
As the article states, there is a powerful processor behind it: the card doesn't have to calculate physics, doesn't have to calculate AI, doesn't have to calculate collisions; it doesn't have to calculate anything but the graphics of the demo itself!!

So we can believe the demo is real and rendered in real time... but that doesn't mean it's rendering the actual film, or that it could do anything similar in a game!
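As a loose illustration of the kind of shortcut mentioned above (my own sketch, not anything from the article), a reflection map replaces a true ray-traced reflection with a single texture lookup: instead of tracing a new ray per pixel, you just compute the mirror direction and use it to index a pre-rendered environment texture.

```python
# Hypothetical sketch (not from the article): why a reflection map is cheap.
# A true reflection traces a whole new ray per pixel; an environment map
# only needs the mirror direction R = D - 2(D.N)N as a texture lookup key.
# That's one dot product and a handful of multiplies per pixel.

def reflect(d, n):
    """Mirror incident direction d about unit surface normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A ray looking straight down at a floor facing up bounces straight back up:
print(reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0)))  # (0.0, 0.0, 1.0)
```

The catch, of course, is that the environment texture is baked ahead of time, so the "reflection" never shows moving objects correctly; that is exactly the kind of cheat a demo can get away with and a film renderer can't.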

20th June 2002, 11:07
In other words, a marketing ploy by nVidia.


20th June 2002, 11:10
And it's been done before. Since the first time they made that claim (back with GF1's, IIRC) they have had demos out and about. I played with one of them. It sucked. Hardcore. *shrug*

I've seen some of the graphics from a GF3 (not too far from a GF4) rendering FF. It looks NOTHING like the final movie. NOTHING.

- Gurm

20th June 2002, 15:30

Real thing:


Close, but no cigar... :)

20th June 2002, 15:50
I wouldn't even give them "close." The GF picture looks like it was rendered on my home machine. Makes sense, since it pretty much was. The texturing, lighting, and polygon count are all way off.
Just look at her arm resting on (through?) that bar.

20th June 2002, 16:05
I wouldn't even call them "near". The GF4 one is obviously computer generated - it looks like a computer game straight off.

The real one could be 'real' - you have to look quite closely to see it's not real, and even then it's not obvious

Look at her hair in the first one for god's sake!

20th June 2002, 17:04
And remember - every one of her hairs is animated. They made a big deal about that over at Square... dynamic hair strand rendering.

Now, do that not only for her face, but for her whole body - with 10 other people onscreen, and render it at some obscene... oh, wait. You CAN'T.

- Gurm
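To give a rough sense of why per-strand animation is so expensive (this is a generic sketch of strand simulation, not Square's actual method), each strand is typically a chain of points held together by distance constraints that get relaxed several times per frame, so the cost scales with strands x segments x iterations, on top of everything else in the scene.

```python
# Minimal hypothetical sketch of one hair strand (not Square's actual
# technique): a chain of 2D points with distance constraints, relaxed
# iteratively so each segment settles back toward its rest length.

def relax_strand(points, rest_len, iterations=5):
    """Pull each pair of neighbouring points back toward rest_len apart."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            dx = pts[i + 1][0] - pts[i][0]
            dy = pts[i + 1][1] - pts[i][1]
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (dist - rest_len) / dist  # split correction 50/50
            pts[i][0] += dx * corr
            pts[i][1] += dy * corr
            pts[i + 1][0] -= dx * corr
            pts[i + 1][1] -= dy * corr
    return pts

# A two-segment strand stretched to double length settles back:
strand = relax_strand([(0.0, 0.0), (2.0, 0.0), (4.0, 0.0)], rest_len=1.0)
```

Now multiply that inner loop by tens of thousands of strands per character, plus collision against the head and body, and the "oh, wait. You CAN'T." above starts to look quantitative.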

20th June 2002, 17:14
It's really good for real-time computer graphics, and I'd really like to play a game with that level of graphical realism (and own a machine that can render it) - but saying this is movie-like is just lying.


20th June 2002, 17:21
I'm not too impressed by the GF4 rendering. IMO even ATI's Rachel looks more "alive".

20th June 2002, 19:06
Well, I can tell you this much.. the first one has movable arms and you can dress it and play with it and it's fun for any girl 4-14...

But the second one's givin me a semi.. :D :rolleyes: :p

20th June 2002, 19:50
Whoa, that second photo is downright scary looking... the freckles are a nice convincing touch!

21st June 2002, 07:27
Perhaps slightly off topic, but what movies could be rendered in real time?

No doubt Luxo Jr. (see http://www.pixar.com/shorts/index.html ) could be rendered in real time by most current systems, as could some other older movies, but where would it stop? Toy Story?


21st June 2002, 07:34
The original Tron?

21st June 2002, 08:15
Actually, I doubt that even Luxo Jr. could, properly, 100%, at 24fps.

Tron, on the other hand... also couldn't, mainly because it didn't use any conventional "3D Graphics" as we know them, so it would have to be completely custom coded.

- Gurm

21st June 2002, 09:17
Yes, the 100% is a good point... But suppose at 1024x768? It just seems strange that this couldn't be done in real time, although if you render stuff in 3DS Max, it is still clear that true rendering (esp. ray tracing) requires a lot of power.


21st June 2002, 12:46
Bingo! Rendering REAL curved surfaces takes for-friggin-ever. Your example of 3DS Max is perfect. To get everything right, even for a relatively simple scene like in Luxo... takes HOURS of processing time.

- Gurm
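A quick back-of-the-envelope calculation (my own numbers, not from the thread) makes the point concrete: at the 1024x768 resolution suggested above, just the primary rays at film frame rate already run to tens of millions per second, before a single reflection, shadow, or antialiasing ray is added.

```python
# Back-of-the-envelope (hypothetical figures, not from the thread):
# how many rays per second real-time ray tracing would need at the
# resolution discussed above, counting only one primary ray per pixel.

width, height, fps = 1024, 768, 24
primary_rays_per_sec = width * height * fps
print(primary_rays_per_sec)  # 18874368, i.e. ~18.9 million rays/sec

# Every secondary effect multiplies this: one reflection bounce plus one
# shadow ray per hit already triples the total.
print(primary_rays_per_sec * 3)  # 56623104
```

And each of those rays involves intersection tests against the scene, which is exactly the "HOURS of processing time" problem for curved surfaces.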

21st June 2002, 16:59
Nuno, how do you know those pictures were rendered on a GF4 and not a GF3? The directory of that image is:


There are a total of 10 images there. I have to admit the wireframe rendering views seem to be missing a TON of polys for that to be movie-class work. I just got done playing my Final Fantasy DVD while pausing at some of those scenes, and they are very, very close.

21st June 2002, 17:22
It doesn't matter much which of the two cards it was. nVidia made the exact same claims about both cards. We've seen the results of their claims about the GF3, and that pretty much discredits their claims about the GF4.

21st June 2002, 22:26
Now, how about we add some displacement mapping... :D


22nd June 2002, 15:53
GF4 = GF3 on steroids. It has another vertex shader unit, higher core/mem speeds, and some tweaks to support PS 1.3, but AFAIK there's nothing a GF4 can render that a GF3 can't. Speed is the issue, and you can't see that in a screenshot.

Now for the short answer: the links were shamefully taken from a similar thread on the B3D forums :o :o

22nd June 2002, 21:13
As I said, I fail to see how this nVidia rendering is supposed to be superior to e.g. ATI's Rachel demo.
The Rachel demo is indeed quite nice for a consumer-level gfx card, but nowhere near the quality of FF - and not even the marketing buttheads at ATI dared to call it "CG gfx rendered in realtime".
The same goes for the GF3/4 rendering: quite nice for consumer cards, and surely everyone would appreciate it if any real game had FX that good. But comparing it to real rendered film graphics is just stupid and shows a major loss of touch with reality on the part of the people making those claims.

(For reference you can see some screenshots of the RachelDemo here:

23rd June 2002, 04:04
Just to clarify, I linked to those pictures to show how far nVidia's claims are from reality, not the opposite.
Rachel and this FF demo are both good coding, but nowhere near prerendered quality, of course.