What's the best vid card for my Duron system?
nikloas
Gold Coast, Queensland
113 posts
Heyas,

I've had a TNT2 for pretty much forever now and it's time to upgrade while I have some cash on me. Two options I've been looking at are the Gainward GF3ti200 ( http://www.pcrange.biz/prod_gwgf3.html ) and GF4ti4200 ( http://www.pcrange.biz/prod_gwgf4.html )

As you can see, the GF3 is $239 and the GF4 is $319. I'm trying to spend as little as possible as I don't have a job - and to get the best increase in performance/ability to use it in future games.

My system is a Duron 1GHz w/ 512MB RAM. Just wondering if I would notice any difference in FPS using a GF3 vs a GF4 when I only have a Duron 1GHz? I was pretty keen on the GF4, but I need to wait till it's in stock, and I found the GF3 cheaper.

Thanks in advance.
12:00pm 01/07/02 Permalink
Suhaib
Brisbane, Queensland
2833 posts
The bottleneck would be your CPU, so I'd say you will see *some* improvement, but you wouldn't be using the full capacity of your vid card. Go for the GF4.
12:03pm 01/07/02 Permalink
Cheetah
Adelaide, South Australia
973 posts
I say get whatever you can afford at the time.
12:06pm 01/07/02 Permalink
orbitor
Brisbane, Queensland
2034 posts
get whichever you can afford when it comes time to buy. I bought a Ti4200 from pcrange (a 128MB Golden Sample version) and I'm *very* happy with it.

Some games will show a large improvement when using a Ti4200 as opposed to a Ti200, and others won't show much (being more dependent on CPU/memory). So whichever you can afford :)
08:51pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5339 posts
One thing I'd like to point out is that you will see a rather big increase in performance going from a TNT2 to a GeForce-based card. This is due to the so-called "GPU" on board a GeForce. Basically, a GPU takes a HUGE load off the CPU (when dealing with relatively slow CPUs) by taking care of the T&L (hardware T&L) among other things. When you have a video card _without_ a GPU, the CPU has to perform the T&L (software T&L), which consumes a massive amount of CPU power.
08:56pm 01/07/02 Permalink
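
To make the T&L point above concrete, here's a rough sketch in plain C of the per-vertex work that software T&L means for the CPU - one matrix transform plus one diffuse lighting term for every vertex, every frame. Every name in it is made up for illustration; it's the kind of loop a GeForce's T&L unit takes over, not code from any real engine.

/*
 * Sketch of software T&L: the per-vertex math the CPU must do
 * when the video card (e.g. a TNT2) only rasterises.
 * All names here are illustrative, not from any real engine.
 */
typedef struct { float x, y, z; } Vec3;

/* Transform: multiply a vertex by a 4x4 matrix (column-major, w assumed 1). */
static Vec3 transform(const float m[16], Vec3 v)
{
    Vec3 r;
    r.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12];
    r.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13];
    r.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14];
    return r;
}

/* Lighting: simple diffuse term, N dot L clamped to zero. */
static float diffuse(Vec3 n, Vec3 light_dir)
{
    float d = n.x*light_dir.x + n.y*light_dir.y + n.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}

/* Under software T&L the CPU runs this for every vertex of every frame;
 * hardware T&L moves this whole loop onto the card. */
void software_tnl(const float m[16], const Vec3 *verts, const Vec3 *norms,
                  Vec3 *out_pos, float *out_lit, int count, Vec3 light_dir)
{
    for (int i = 0; i < count; i++) {
        out_pos[i] = transform(m, verts[i]);
        out_lit[i] = diffuse(norms[i], light_dir);
    }
}

At, say, 10,000 visible vertices and 60 frames per second, that loop body runs 600,000 times a second on the CPU - which is why hardware T&L is such a win for a slow CPU like a Duron 1GHz.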
eYe_kAnDy
Brisbane, Queensland
2402 posts
erm, isn't a TNT2 card standalone? I thought it was the first Nvidia card to come out with a GPU on the card. In other words, do you have to also run another 2D card with it?
09:20pm 01/07/02 Permalink
Tael
Brisbane, Queensland
1665 posts
eYe_kAnDy - You're a bit confused. All the TNT cards handle both 2D and 3D. Hunter's talking about T&L (Transform & Lighting), which are two steps in the 3D rendering pipeline. Traditionally the CPU would do these steps, because video cards didn't have the power to do it themselves, but now (after the release of cards like the GeForce) video cards do this sort of s*** as a standard feature, so there's less of a load on your CPU, which leaves it free to do other stuff.
09:40pm 01/07/02 Permalink
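
That "standard feature" point is exactly what fixed-function OpenGL looks like from the application's side: the program hands the driver a modelview matrix and a light, and whether the transform and lighting then run on the CPU (software T&L) or on a GeForce's T&L unit is invisible to it. A minimal sketch, assuming an OpenGL context has already been created (window and context setup omitted):

#include <GL/gl.h>

/* Draw one lit triangle with fixed-function OpenGL 1.x.
 * Assumes a GL context already exists; setup is omitted. */
void draw_lit_triangle(float angle)
{
    static const GLfloat light_dir[4] = { 1.0f, 1.0f, 1.0f, 0.0f }; /* w=0: directional light */

    glEnable(GL_LIGHTING);                       /* fixed-function lighting: the "L" in T&L */
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_dir);

    glMatrixMode(GL_MODELVIEW);                  /* the "T": our per-vertex transform */
    glLoadIdentity();
    glRotatef(angle, 0.0f, 1.0f, 0.0f);

    glBegin(GL_TRIANGLES);                       /* from here on, each vertex is transformed
                                                    and lit by the driver or the card */
    glNormal3f(0.0f, 0.0f, 1.0f);
    glVertex3f(-1.0f, -1.0f, -5.0f);
    glVertex3f( 1.0f, -1.0f, -5.0f);
    glVertex3f( 0.0f,  1.0f, -5.0f);
    glEnd();
}

The same code runs unchanged on a TNT2 and a GeForce; the only difference is where the per-vertex math between glBegin and glEnd actually executes.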
Hunter
Brisbane, Queensland
5341 posts
I'm not quite sure what you mean. A TNT and a TNT2 are both 2D/3D combo cards and neither has a GPU. GeForce cards were the first consumer cards to come out with a GPU (Geometry Processing Unit). Whilst a GPU doesn't draw the geometry (that's still done by the CPU), it does perform Transform and Lighting, which is/was done by the CPU if the video card doesn't have the ability to perform it.

A very, very simple description is this: up until the GeForce, a 3D accelerator card only "painted" the scene (i.e. filled in the polygons). 3D cards with a GPU will still "paint" a scene, but they also do some of the work that would ordinarily be done by the CPU.

I hope that made sense, I don't think it did though.

EDIT: I didn't realise Tael had posted when I hit post reply (I took ages to finish writing this post :))
09:41pm 01/07/02 Permalink
WarT
Brisbane, Queensland
8099 posts
*cough* taken from nvidia.com *cough*

Graphics Processing Unit (GPU)
A high-performance 3D processor that integrates the entire 3D pipeline (transformation, lighting, setup, and rendering). A GPU offloads all 3D calculations from the CPU, freeing the CPU for other functions such as physics and artificial intelligence.
so there is no geometry f00
09:44pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5343 posts

A GPU offloads all 3D calculations from the CPU,


What would that imply I wonder... *cough*. That aside, I've seen nvidia literature refer to it as a "Geometry Processing Unit", which technically it is - it _processes_ geometry, not creates it.
09:48pm 01/07/02 Permalink
WarT
Brisbane, Queensland
8102 posts
which is incorrect 'cause it's not called that
HELLO WAKE UP AND SMELL THAT YOU SAID THE WRONG WORD AND ADMIT IT FOR A CHANGE
09:50pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5344 posts
Also, a GPU does not offload ALL tasks... that's marketing bulls*** (if you don't believe me, visit a few hardware sites for proof). The CPU still has to DRAW the scene (which will require a decent FPU among other things).

I'm awaiting the arrival of 3Dlabs' amazing VPU technology myself.
09:52pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5345 posts
WarT, if I could be bothered I'd happily provide you with the slick brochure (I collect a lot of crap) which says it's a GEOMETRY PROCESSING UNIT. Either way, who gives a s*** what it's called - both names imply the same meaning.
09:54pm 01/07/02 Permalink
Rubba-Chikin
Brisbane, Queensland
1143 posts
As far as I know, GPU has always stood for Graphics Processing Unit.
09:56pm 01/07/02 Permalink
Tael
Brisbane, Queensland
1667 posts
http://www.rivastation.com/review/geforces/geforces_2_e.htm
The main new feature of the GeForce is the hardware Transformation and Lighting unit. NVIDIA calls it a GPU - a Geometry Processing Unit. That's also the reason why the chip is called GeForce: 'Ge' for Geometry.
09:57pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5346 posts
Thank you Tael :). Vindication is a sweet sweet feeling, right up there with revenge :P.
09:59pm 01/07/02 Permalink
WarT
Brisbane, Queensland
8105 posts
it's funny how NVIDIA says GRAPHICS. Considering they make the chip, I think I'd lean towards them.
09:59pm 01/07/02 Permalink
Rubba-Chikin
Brisbane, Queensland
1144 posts
Well, I think you'd have to agree with something off Nvidia's official site over some review place.

Maybe it does stand for both. Who cares, I don't.
10:00pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5347 posts

it's funny how NVIDIA says GRAPHICS. Considering they make the chip, I think I'd lean towards them.


Marketing is such a powerful tool.
10:01pm 01/07/02 Permalink
Rubba-Chikin
Brisbane, Queensland
1145 posts
I don't really understand how they would gain greater marketing by making GPU stand for Graphics Processing Unit and not Geometry Processing Unit...

that's just stupid
10:02pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5348 posts
No, my point is that they can say whatever they want regardless of how truthful it is and people will believe it. I mean, a lot of people now believe a GPU removes all 3D work from the CPU thanks to disinformation by nvidia's marketeers. This disinformation is then "passed down the chain" and becomes extremely difficult to overcome.
10:05pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5352 posts

Well, I think you'd have to agree with something off Nvidia's official site over some review place.

Maybe it does stand for both. Who cares, I don't.


You must have missed the part where I said I had an nvidia brochure with GEOMETRY PROCESSING UNIT on it so here it is:

What would that imply I wonder... *cough*. That aside, I've seen nvidia literature refer to it as a "Geometry Processing Unit", which technically it is - it _processes_ geometry, not creates it.
10:12pm 01/07/02 Permalink
Tael
Brisbane, Queensland
1669 posts
10:13pm 01/07/02 Permalink
mooby the golden calf
Brisbane, Queensland
108 posts
eYe_kAnDy was talking about the old Voodoo Rush style cards.

remember those... my dad told me about the Rush, lol
10:19pm 01/07/02 Permalink
SD Gundam
Brisbane, Queensland
1956 posts
Here's something from the FlightGear website on the topic of 3D rendering:

The Big Picture

Here is a bit of general background information on OpenGL and 3D hardware acceleration contributed by Steve Baker (sbaker@link.com)

Updated by Curt Olson (9/25/2000)

When you are drawing graphics in 3D, there is generally a hierarchy of things to do:

  1. Stuff you do per-frame (like reading the mouse, doing flight dynamics)
  2. Stuff you do per-object (like coarse culling, level-of-detail)
  3. Stuff you do per-polygon or per-vertex (like rotate/translate/clip/illuminate)
  4. Stuff you do per-pixel (shading, texturing, Z-buffering, alpha-blend)

On a $1M full-scale flight simulator visual system, you do step (1) in the main CPU, and the hardware takes care of (2), (3) and (4).

On a $100k SGI workstation, you do (1) and (2) and the hardware takes care of (3) and (4).

On a $100 PC 3D card, you (or your OpenGL library software - which runs on the main CPU) do (1), (2) and (3) and the hardware takes care of (4).

On a machine without 3D hardware, the main CPU has to do everything.

The amount of work to do each of these operations goes up by one or two orders of magnitude at each step. One eyepoint, perhaps a hundred objects, tens of polygons per object, hundreds of pixels per polygon.

Hence, putting step (4) into hardware is vital - you could easily need to draw a million pixels for each time you read the mouse. Putting step (3) into hardware is also very nice - cards like the nVidia GeForce are now doing this.

So, if you put a cheap accelerated 3D card onto a slow-ish PC, you still get all the benefits of not doing the per-pixel math in software - so your frame rate will speed up. But since that per-pixel stuff now goes very fast, you are probably limited by the speed that your old clunker can do step (3) - the per-vertex math.

A card like the GeForce which can do step (3) will bring you even greater performance benefits on a slow-ish PC.

It has been suggested that even a 400MHz Pentium II couldn't feed vertices to a Voodoo-2 card faster than it could process them. That was for a 'typical' scene though. If your polygons are very large (in terms of screen area) then the amount of per-pixel math will increase up to the point where even a slow PC would end up waiting for the 3D card to complete the rendering of one polygon before it can accept the next.

If your polygons are very small then the 3D card may not give you much benefit. There is a certain amount of overhead in sending each polygon to the hardware from the main CPU - and that cost could ultimately be more than the cost of drawing the pixels in software if the polygons only cover a few pixels each.

If your 3D card doesn't provide a speedup then make sure you have a proper OpenGL driver installed for that card - some cards only support Direct3D, which FGFS is unable to interface to. Some 3D cards can't function if you set the display resolution too large or the pixel depth to something that card won't support. Some OpenGL drivers silently drop back to software-only rendering under those circumstances.

In summary: Will a 3D card speed up FGFS on a standard PC? Yes - immensely.

10:19pm 01/07/02 Permalink
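
That four-level hierarchy maps almost directly onto nested loops. Here's a stripped-down sketch in C - every function and constant in it is an invented stub, just to show where the work multiplies and which levels a card can take over:

/* Skeleton of the per-frame / per-object / per-polygon / per-pixel
 * hierarchy from the quote above. All names are invented stubs. */
#define NUM_OBJECTS      100   /* "perhaps a hundred objects" */
#define POLYS_PER_OBJECT 50    /* "tens of polygons per object" */
#define PIXELS_PER_POLY  300   /* "hundreds of pixels per polygon" */

static void read_input(void)                       {}                          /* mouse, flight dynamics */
static int  visible(int obj)                       { (void)obj; return 1; }    /* coarse culling */
static void transform_and_light(int obj, int poly) { (void)obj; (void)poly; }
static void shade_pixel(int obj, int poly, int px) { (void)obj; (void)poly; (void)px; }

void render_frame(void)
{
    read_input();                                      /* (1) per-frame: always on the CPU */
    for (int o = 0; o < NUM_OBJECTS; o++) {            /* (2) per-object: culling, LOD */
        if (!visible(o))
            continue;
        for (int p = 0; p < POLYS_PER_OBJECT; p++) {   /* (3) per-vertex: T&L - hardware on a GeForce */
            transform_and_light(o, p);
            for (int px = 0; px < PIXELS_PER_POLY; px++)
                shade_pixel(o, p, px);                 /* (4) per-pixel: hardware on any 3D card */
        }
    }
}

With the quote's rough numbers, level (4) runs 100 x 50 x 300 = 1.5 million times per frame, level (3) five thousand times, and level (1) once - which is why moving (4) into hardware matters most, and moving (3) in (the GeForce's trick) is the next biggest win.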
Rubba-Chikin
Brisbane, Queensland
1147 posts
the eYe_kAnDy post was me on his 'puter

I used to have a Voodoo 2 and you needed to have a 2D card for it to plug into :)

10:24pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5355 posts
OK, that was a very obfuscated way of explaining it :). Sorta like spaghetti code (using goto in C should be punishable by some sort of pain!).
10:25pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5356 posts

I used to have a Voodoo 2 and u needed to have a 2d card for it to plug into :)


Yes, both the Voodoo 1 and 2 and some other one (can't recall the name) were 3D only. They (3dfx) released a 2D/3D combo card but it was a flop due to extremely poor performance, though it was advanced for its time in that it used AGP as opposed to PCI.
10:27pm 01/07/02 Permalink
Cam
Brisbane, Queensland
1321 posts
Leave it to Hunter to completely change the topic and f*** up a thread just by trying to be smart.

To answer your question, Nikloas, I think you should go for the Gainward GF3ti200 to save the cash. The performance difference wouldn't be worth spending the extra money on the GF4, because of your slightly older CPU.
When it comes time to upgrade your CPU (for Doom 3 of course =D), you'll want to get a new video card as well to keep up with it, and by that time there will be much better video cards around than the GF4.
Get the GF3 to save the cash because the GF4 isn't worth it.
Hope I've helped you make a decision.
11:32pm 01/07/02 Permalink
Cam
Brisbane, Queensland
1322 posts
Hunter, 3dfx released the Voodoo Banshee as a 2D/3D combo card/chipset, and from what I remember it did really well. It offered the best 2D performance on the mainstream video card market at the time (NVIDIA's TNT chipset was yet to be released) and the 3D performance rivaled that of the Voodoo 2. Brilliant little card for its time.
11:37pm 01/07/02 Permalink
SD Gundam
Brisbane, Queensland
1961 posts
OK, that was a very obfuscated way of explaining it :). Sorta like spaghetti code (using goto in C should be punishable by some sort of pain!).
WTF are you talking about.
11:37pm 01/07/02 Permalink
Cam
Brisbane, Queensland
1323 posts
God, just leave him alone guys. The best way to shut Hunter up is to ignore him. If you flame him for being a retard he just gets more fired up at the rest of the world and spams even more bulls*** than usual.
11:39pm 01/07/02 Permalink
Cam
Brisbane, Queensland
1324 posts
BACK ON TOPIC
To answer your question, Nikloas, I think you should go for the Gainward GF3ti200 to save the cash. The performance difference wouldn't be worth spending the extra money on the GF4, because of your slightly older CPU.
When it comes time to upgrade your CPU (for Doom 3 of course =D), you'll want to get a new video card as well to keep up with it, and by that time there will be much better video cards around than the GF4.
Get the GF3 to save the cash because the GF4 isn't worth it.
Hope I've helped you make a decision.

Now quit rambling and stay on topic everyone!

11:40pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5374 posts

WTF are you talking about.


The explanation you cut and pasted (I had read it before, though) was overly complicated, with 1s and 2s and 3s all over the place. I'm sure the author intended it to be a simple explanation for the layman, but there are easier ways.
11:42pm 01/07/02 Permalink
SD Gundam
Brisbane, Queensland
1962 posts
You're a retard, Hunter. As if that is hard to read - you just lack reading skills.
11:45pm 01/07/02 Permalink
Hunter
Brisbane, Queensland
5376 posts
I can read it, but it's not pretty to read...
09:20am 02/07/02 Permalink
This thread is archived and cannot be replied to.