Until yesterday, I would have voted for the GeForce3 Ti 200. I have one running at 250/530, and I compared it to a Radeon 8500 LE @ 300/315 back when Catalyst 2.2 first came out. While the 3DMark scores were equal, the GF3 consistently ran 10-20 fps faster in most games at 1280 x 960, and when FSAA was enabled the Radeon fell further behind.
Fast forward to Catalyst 2.5 and a retail Radeon 8500 128MB at 315/310. The Radeon is now consistently 10-15 fps faster in most games. In Unreal Tournament the GF3 gives me a maximum resolution of 1280 x 1024. At 32-bit color, max anisotropic filtering and 2X FSAA, the GF3 @ 250/530 averages 75 fps, while the Radeon 128MB @ 315/310 averages 90 fps! Funny thing though: when I enable vertical sync the GeForce still stays at 75 fps, but the Radeon drops to 70 fps even though my refresh rate is set at 85 Hz.

I like to use 2X FSAA at 1280 x 1024 to keep things really smooth, but with it turned off the Radeon averages 140 fps while the GF3 blazes along at 175 fps. Here's the real kicker: with the Radeon I get the 1600 x 1200 option, and without FSAA (I really don't need it at 1600 x 1200) the Radeon looks slightly better than the GeForce3 at 1280 x 1024 x 32 with 2X FSAA. While the GeForce3 chugs away at a 75 fps average, the Radeon at the higher resolution rocks out 115 fps. I can now play the Unreal Tournament 2003 demo smoothly at 1600 x 1200, where the GeForce3 chokes. If I turn off anisotropic filtering and set all the sliders to performance, the Radeon hits 10,100 in 3DMark, besting the GF3 at 9,000.
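Side note on that vsync oddity: my best guess (not something I measured, and the frame times below are made up purely for illustration) is plain double-buffered vsync, where any frame that misses a refresh tick has to wait for the next one, so frame times get rounded up to whole multiples of 1/85 s and the average lands well below the refresh rate even when the card could render faster. A quick sketch of that effect:

```python
import math

REFRESH_HZ = 85.0
refresh_interval = 1.0 / REFRESH_HZ  # ~11.8 ms per refresh tick

# Hypothetical per-frame render times for a card averaging ~90 fps uncapped;
# a few frames just barely miss one refresh interval.
render_times = [0.0100, 0.0105, 0.0130, 0.0122, 0.0098, 0.0115]

uncapped_fps = len(render_times) / sum(render_times)

# With double-buffered vsync, each frame is held until the next refresh tick,
# i.e. its time is rounded up to a whole number of refresh intervals.
vsynced_times = [math.ceil(t / refresh_interval) * refresh_interval
                 for t in render_times]
vsynced_fps = len(vsynced_times) / sum(vsynced_times)

print(f"uncapped:   {uncapped_fps:.0f} fps")   # ~90 fps
print(f"with vsync: {vsynced_fps:.0f} fps")    # drops well below 85 fps
```

Whether that's exactly what the Radeon drivers are doing here I can't say; it's just one mechanism that would produce a below-refresh average with vsync on.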
I would think a little research might turn up a Radeon 8500 LE for under $70 that has a good GPU and 3.3ns RAM. If it will reach 300/300, then it will be the best bang for the buck!