Originally posted by: gibster
You should be pissed. I went to pick mine up yesterday, and there were probably 20+ on the shelves in the store in Denver. Price was $59 on them, didn't think much of it, as mine was confirmed and paid for.
That was pretty low of Micro Center to cancel orders - they could have honored the ones that were put in before the price change...
Now it could be that they want to fulfill whatever they can with the ones they have in the stores - chances are nobody will buy those at $59 anyway, even with the $20 MIR.
Hearing about the price change, I'm sooo glad that I ran down to the B&M that afternoon to buy one. They didn't have any on the shelves at my store; apparently they were "out back". I was a little surprised that the sticker on the box said $59.99. Now I know why. :|
I feel bad for those that ordered, even got a confirmation e-mail, and then got cancelled. That's kinda sleazy of Micro Center, but if they didn't charge your CC, they are probably within their legal rights to do so.
Originally posted by: gibster
For me, this was exactly what I was looking for in my "console" gaming system. I will be playing at 800x600 (best resolution for TV-out), so performance is perfect for that. I even went down to 16-bit, as I doubt I will see too much of a difference between that and 24-bit on my TV. That should even get me playable Doom 3, with the 2.8GHz Celery. There is still the option of 640x480 for D3, so I figure I'm set.
Why 16-bit color? Especially with Doom 3, with all of its photo-realistic visuals? I thought that some/most ATI cards not only didn't gain much of a speed advantage from running in 16-bit color, but sometimes actually ran slower. Plus, 16-bit color sucks, too much banding. In fact, I think it is more pronounced when using the TV-out than on a VGA display, due to the reduced dynamic range of a television. I haven't actually popped the card in my machine yet, waiting for more stuff to download before I start swapping hardware.
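A quick sketch of why 16-bit color bands so badly: RGB565 keeps only 5 bits for red and blue and 6 for green, so a smooth 0-255 gradient collapses to a handful of steps. The helper names below are made up for illustration; only the bit-shift math matters.

```python
def to_rgb565(r, g, b):
    """Quantize an 8-bit-per-channel color down to 5-6-5 bits."""
    return (r >> 3, g >> 2, b >> 3)

def back_to_888(r5, g6, b5):
    """Expand back to 8 bits per channel (roughly what the RAMDAC does)."""
    return (r5 << 3, g6 << 2, b5 << 3)

# A smooth 256-step red gradient survives as only 32 distinct levels:
levels = sorted({back_to_888(*to_rgb565(v, 0, 0))[0] for v in range(256)})
print(len(levels))  # 32 -> each step is 8 counts wide, hence visible banding
```

32 levels instead of 256 means each band is 8 shades wide, which a TV's softer output can actually smear a little, but on flat gradients (fog, skies) it's still obvious.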
Btw, is there any sort of mfg warranty on these cards? I couldn't find any mention of it anywhere, either on the box or in the rather-general manual. If there is none, I'll be slightly disappointed, but as long as it's not somehow DOA, then I guess I'm ok.
Has anyone attempted to overclock their cards yet? I've heard something about the BIOSes being "locked" so that the cards won't run if OC'ed, and that you need the "Omega soft-mod drivers" to allow OCing with "ATITool".
I haven't managed to track down the original FIC BIOS (does FIC even make these, or just re-sell them?), but I managed to track down what I think might be a compatible BIOS image file here.
Search in the page for "Mosel". There is a link to a BIOS image file, for a Radeon 9200 64MB AGP, default clocks of 250/200, and with Mosel Vitelic 4.0ns memory. That matches the configuration of my board.
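For what it's worth, the memory's ns rating tells you its rated ceiling: a 4.0ns part is rated for 1 / 4.0ns = 250 MHz, so the 200 MHz default leaves some headroom on paper. A one-liner to do that conversion (my own throwaway helper, not from any tool):

```python
def rated_mhz(ns):
    """Rated memory clock in MHz from the chip's access-time rating in ns."""
    return 1000.0 / ns

print(rated_mhz(4.0))  # 250.0 -> 4.0ns Mosel Vitelic is rated for 250 MHz
```

No guarantee the board's signal routing or voltages let you reach the rated speed, but it's a sane upper bound to aim for when OCing the memory.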
My board is on a red PCB, with "A92 ver. 1.2" silk-screened along the top. I've also noticed a few other things - there is a dual-position surface-mount DIP-switch on the board. I've no idea what it's for, but I'm hoping that I might be able to flip a switch and either: a) turn it into another card, or b) up the voltages, for higher OCs.
I also noticed that I can't find the BIOS chip on the board. Is it under the GPU heatsink? Or do modern GPUs include some flash memory on the chip itself, storing the BIOS internally? (My Promise Ultra100 TX2 apparently does that, surprised the heck out of me.)
I think I'm going to stick a nice phat 5200-RPM fan on my stock heatsink and see how it does. Someone else on another board mentioned getting a 9200SE to 290/270 by adding a fan.
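To put that reported overclock in perspective against my card's 250/200 defaults (assuming the 9200SE started from similar clocks, which it may not have):

```python
# Percentage gain of the reported 290/270 OC over 250/200 defaults (MHz).
defaults = {"core": 250, "mem": 200}
reported = {"core": 290, "mem": 270}

for name in ("core", "mem"):
    d, r = defaults[name], reported[name]
    print(f"{name}: {d} -> {r} MHz (+{(r - d) / d:.0%})")
# core: +16%, mem: +35%
```

+35% on the memory would line up nicely with the 4.0ns chips being rated well above their default clock.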
The fact that this board has DIPs really intrigues me. Are there any other Radeon cards with PCBs that look just like this one? Any other brand-model-number cards also based on the RV280 GPU? Is the RV280 pinout-compatible with another GPU?