Title: cgi's Post by: 0 Kilz:M: on September 21, 2003, 02:40:44 am Here's my deal: in the in-game movies for most games, the CGI looks so damn sweet. Why can't they make the gameplay look the same?
Title: Re:cgi's Post by: Ace on September 21, 2003, 04:54:52 am Those cinematic sequences come pre-rendered, so they can be of much higher quality. Since the in-game graphics must be rendered in real time by your GPU, they have to be of much lower quality to maintain acceptable framerates.
Title: Re:cgi's Post by: Toxic::Joka on September 21, 2003, 10:10:44 am A kinda off-topic question, but close enough ;)
Why do the quality settings lower the fps? The way I get it, the quality settings just determine which textures are used: the low, medium, or high quality ones. I don't see how you get better fps with lower settings; it still loads textures over the same area... ??? Just something I've been wondering about.

Title: Re:cgi's Post by: BTs_Lee.Harvey on September 21, 2003, 11:58:53 am Quote: "Don't see how you get better fps with lower settings, it still loads textures with the same area." Because with higher quality your comp has to load more data, which takes longer, meaning lower fps. Does that simplify it for you?

Title: Re:cgi's Post by: Mr. Lothario on September 21, 2003, 12:22:50 pm The quality settings don't only control textures; textures are just one of the most noticeable effects. Quality settings in most games also control the number of polygons in models, the lighting effects used, and the number of nonessential models (puffs of smoke, animals, soda cans, flying brass, and the like).
Video cards are limited in how many pixels they can render and push. Rendering takes the bulk of the card's time: the game (or other program) tells the card what is currently in view, such as character models, walls, floors, cars, bridges, weapons, trees, and so on, ad infinitum. For each "thing" in view, the card must:

- calculate its current position on screen (transformation);
- determine which parts of it are visible (clipping);
- draw each of its polygons ("polygons" is actually a generalization; the only thing video cards draw nowadays is triangles, out of which any geometric figure can be constructed);
- texture the polygons, which requires scaling, rotating, and clipping each texture, and possibly more than one texture per polygon in the case of bump mapping, reflection mapping, transparency, etc.;
- light the polygons: calculate the effect of every light in the scene on each polygon, and on each texture on each polygon, for coloring, shading, reflections, specular highlights, and so on. And that isn't even covering shader programs, which are pixel-by-pixel rendering instructions that can be used by the latest generation of cards;
- on newer cards, apply anti-aliasing and filtering to the scene, both of which involve per-pixel calculations and adjustments to the rendering.

THEN FINALLY the completed frame (!) is "pushed" along the AGP bus to the display.

As for textures: higher-quality textures mean more video memory is consumed storing them, more effort is required of the GPU to manipulate those bigger (in terms of RAM) textures, and more resources are needed to shuffle them around the rendering pipeline.

Anyway, Joka, the point is that there's considerably more going on in the GPU than your mental model is taking into account. Give your card some respect: it's doing a shitload of work many times a second, even on low detail settings.
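[Editor's note: to put some numbers on the texture point, here is a rough sketch. The sizes and quality labels below are invented for illustration, not taken from any particular game. An uncompressed RGBA texture costs 4 bytes per pixel, and a full mipmap chain adds roughly one third on top of the base level, so each quality tier that doubles the texture's side length quadruples the memory it eats.]

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM used by one uncompressed texture.

    A full mipmap chain (half-size copies down to 1x1) adds about
    one third on top of the base image, hence the 4/3 factor.
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# Illustrative quality tiers (made-up numbers, not from a real game):
for label, size in [("low", 256), ("medium", 512), ("high", 1024)]:
    mb = texture_vram_bytes(size, size) / (1024 * 1024)
    print(f"{label:>6}: {size}x{size} -> {mb:.2f} MB")
```

Running this prints roughly 0.33 MB, 1.33 MB, and 5.33 MB per texture; multiply by the hundreds of textures in a scene and the jump from "low" to "high" is a serious chunk of a 2003-era card's memory.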
: ) Title: Re:cgi's Post by: Toxic::Joka on September 21, 2003, 10:22:23 pm Yea, ok. I was just thinking that the computer has to load the textures, and it doesn't know if they're "high quality" or "low quality", so it has to load the same amount either way. Yea... well, wrong I was :)
thx loth Title: Re:cgi's Post by: kami on September 22, 2003, 06:45:51 pm Thanks a loth. LOL!!!!11# :P
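[Editor's note: the "more data per frame means lower fps" point that Lee.Harvey and Mr. Lothario make above can be put in simple arithmetic. Fps is just the reciprocal of how long one frame takes to render, so every extra millisecond of per-frame GPU work comes straight off the frame rate. The millisecond figures below are invented purely for illustration.]

```python
def fps(frame_time_ms):
    """Frames per second for a given per-frame render time in milliseconds."""
    return 1000.0 / frame_time_ms

base = 10.0               # ms/frame at low detail (illustrative)
extra_high_detail = 15.0  # extra ms for bigger textures, more polygons, lighting

print(f"low  detail: {fps(base):.0f} fps")                      # 100 fps
print(f"high detail: {fps(base + extra_high_detail):.0f} fps")  # 40 fps
```

The relationship is nonlinear: shaving 15 ms off a 25 ms frame more than doubles the frame rate, which is why detail settings have such a dramatic effect on slower cards.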
Title: Re:cgi's Post by: Cobra on September 22, 2003, 09:12:14 pm Oh teh GOD LMFAO!!1
Title: Re:cgi's Post by: Nail.Im.Ixam on September 23, 2003, 03:37:44 am As soon as computers have the power to render CGI-like scenes in real time, they will. If you take a look at today's PS2 games, the graphics are similar to a PS1 game's CGI sequences. Look at the Gran Turismo series for an excellent example of this: GT3's graphics are probably better than GT1's CGI sequences.
Title: Re:cgi's Post by: Mr. Lothario on September 23, 2003, 05:58:03 am Current top-end consumer video cards can render photorealistic images in real time. The problem is, they can't do much else in those images. Nvidia showed off demos of effects possible using their shader language, one being a woman/fairy, another being a car. And that was all. Nothing else in the scenes. Just the car, just the fairy. So real-time photorealistic graphics are coming closer and closer, but they're not here yet. Probably another two or three generations of video cards.