Nvidia's little MX secrets

audreymi

Member
Nov 5, 2000
66
0
0

The MX is not a good solution if:

  1. You are interested in running at 32-bit colour depth, where
     the memory becomes the limiting factor.
  2. You have invested in a high quality monitor that you intend
     to keep, running at resolutions of 1280x1024 with a high refresh
     rate, or at 1600x1200. This will be discussed at a later date.

1] Limited framerate at 32-bit colour depth:

Before we get to the MX, take a look at what you give up
relative to a real GeForce2 and the Ultra.

The Ultra uses a 128-bit wide bus running DDR (2 chunks of
data for each clock at ~210 MHz; see the "2" factor below) for a bit rate of
128 * "2" * 210 Mbits/sec

The GeForce2 numbers are
128 * "2" * 166 Mbits/sec, about 80% of the above number.

The only difference between the stock GeForce2 and the Ultra is
the 210 number vs. the 166.


Now for the MX. Nvidia has been confusing us with both
DDR and SDR designations. Guess what?
The SDR is actually faster than the DDR. Why?

MX DDR bit rate breaks down as follows:
64 * "2" * 166 Mbits/sec, about 50% of the GeForce2 number.

MX SDR bit rate breaks down as follows:
128 * "1" * 166 Mbits/sec, again about 50% of the GeForce2 number.

So the numbers and performance should be the same, but SDR
benchmarks are shown to be a little faster (???). I guess
DDR has some inefficiency. So if you buy an MX, buy the SDR version.
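
For anyone who wants to check these ratios, here is a quick back-of-the-envelope Python sketch (my own illustration, not from the thread; it assumes the clock speeds quoted in this post, some of which get corrected further down):

# Theoretical memory bandwidth: bus width (bits) x pumps (2 = DDR, 1 = SDR)
# x memory clock (MHz), converted to GB/s.
def bandwidth_gb_s(bus_bits, pumps, clock_mhz):
    return bus_bits * pumps * clock_mhz * 1e6 / 8 / 1e9

for name, spec in {
    "GeForce2 Ultra":  (128, 2, 210),
    "GeForce2 GTS":    (128, 2, 166),
    "GeForce2 MX DDR": ( 64, 2, 166),
    "GeForce2 MX SDR": (128, 1, 166),
}.items():
    print(name, round(bandwidth_gb_s(*spec), 2), "GB/s")

The MX variants both land at 2.66 GB/s, half the GTS figure, which is the whole point above.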

On the recommendation of an engineering friend, I followed up on
the Radeon and compared cards for about a month before I settled
on the 32 MB DDR Radeon. My son loves it.
128 * "2" * 183 Mbits/sec, about 220% more memory BW than the MX.

The December 2000 issue of Maximum PC compared them head to head
at 1024x768@32 in Quake III, where the frame rate difference
was 56.5 fps (Radeon) vs 30.3 fps (MX). Note that
at lower resolutions and colour depths, the difference would
be less, as memory bandwidth becomes less of a bottleneck.

Anandtech discusses this MX bus width issue, as does Maximum PC.

Bottom line: the MX is a solution that flies in the face of the
Ultra's chief advantage and offering by going to less memory
bandwidth, trading it off for less performance at high resolution
and high colour depth.

 

AndyHui

Administrator Emeritus, Elite Member, AT FAQ M
Oct 9, 1999
13,140
6
81
Uh...that's why it's cheaper. That's the main reason for its existence. The performance is crippled so you buy it at a cheaper price compared to the GeForce 2 GTS/Pro/Ultra brothers.

DDR SDRAM has a larger overhead and slightly longer latencies than SDR... so yes, in most cases, the 128-bit data path SDR models will outperform the 64-bit data path DDR models.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
yeah dude, we're well aware of this. Not a surprise; we're pretty well informed on this board.

 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< 128 * "2" * 183 Mbits/sec, about 220% more memory BW than the MX >>


In the words of someone else, "stats mean nothing". That would mean the Radeon 32 would perform almost 2x faster than the MX; it doesn't.
To back that up you say this:

<< was 56.5 fps (Radeon) vs 30.3 fps (MX). >>


Well, if you go to your own link here, you can see your numbers are way off for both cards. (The SDR is shown there, and the DDR is faster.)
Don't get me wrong, I think the Radeon 32 DDR is a great card, but it's not in the same class as the MX price-wise.
BTW: using Maximum PC for a reference is like using PC World...
 

Gatsby

Golden Member
Nov 6, 1999
1,588
0
0
I knew that going in when I bought my MX.

My main concern was that it was CHEAP!!.. CHEAP!!!

Lack of funds always drives you in a certain way.

Gatsby - 47
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
audreymi:

The MX is not a good solution if you are interested in running at 32-bit colour depth, where the memory becomes the limiting factor.

No offence, but have you been asleep for the last 6 months?
 

audreymi

Member
Nov 5, 2000
66
0
0

I guess you might say I've been asleep for the last 5 months,
except for the last month, when I began searching for
an upgraded video card that my family would all be happy with.

Yep, my math was off: the factor of 220% implied a theoretical
speed advantage of 2.2x, or equivalently, an "increase" of 120%,
and not 220% as I had stated.

Someone pointed out to me that the ELSA board is clocked near
175 MHz (not 166), so the advantage is 166/175*2 = 1.89x faster,
or an 89% increase. This is much more in line with the 86%
measured frame rate increase in Quake III. Sorry about my
mathematical shortcomings. Now where did I put tomorrow's math lesson?
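
Put as a quick Python check (a sketch of the same corrected arithmetic, reading the 166 as the Radeon's memory clock and the ~175 MHz as the ELSA MX SDR's):

# Corrected bandwidth ratio: 128-bit DDR at 166 MHz (Radeon)
# vs. 128-bit SDR at ~175 MHz (ELSA MX).
radeon = 128 * 2 * 166
elsa_mx_sdr = 128 * 1 * 175
print(radeon / elsa_mx_sdr)  # ~1.90x, i.e. roughly a 90% increase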
 

CrimsonWolf

Senior member
Oct 28, 2000
867
0
0
No offense, but did you just now have this epiphany?

Please post this in an AOL forum or something. Anandtech readers are quite up to date and knew this the very day the GeForce2 MX was introduced.
 

ahfung

Golden Member
Oct 20, 1999
1,418
0
0
Isn't it lame to compare a sub-$100 MX to an over-$150 Radeon DDR?

"The December 2000 issue of Maximum PC compared them head to head
at 1024x768@32 in Quake III, where the frame rate difference
was 56.5 fps (Radeon) vs 30.3 fps (MX)."


Gimme a break, any MX would do at least 55 fps at 1024x768x32 in Q3A.
 

TheCorm

Diamond Member
Nov 5, 2000
4,326
0
0
Calm down everyone, chilllll.

The MX is a value card, no denying that; it's essentially a scaled-down (or crippled, if you like) version of the GTS (just like P3 vs. Celeron).

I like the ELSA board: a minimal price increase over Creative for 128-bit SDR instead of 64-bit DDR, a nice heatsink, a better warranty, and an optional Video I/O interface.

For under £100 you can't expect screaming performance, but I am looking forward to a substantial performance boost over my PCI-based Riva TNT.

Corm

All manufacturers of the GeForce2 MX have widely identified it as a value board, but they have noted that it's not slow; it's the same philosophy as the Duron.
 

audreymi

Member
Nov 5, 2000
66
0
0
In reference to the previous post's comments stating



<<
Isn't it lame to compare a sub-$100 MX to an over-$150 Radeon DDR?

  • "The December 2000 issue of Maximum PC compared them head to head
    at 1024x768@32 in Quake III, where the frame rate difference
    was 56.5 fps (Radeon) vs 30.3 fps (MX)."
>>




At the $100 level, even retired math teachers like myself
think little of spending 10% of a $1000 budget on a video card.
Sorting all of this out ("isn't this 6-month-old news", as one reader put it) is not as straightforward as some would make it seem.


The Anandtech article of November 27, 2000 only just
recently clarified misconceptions
about DDR designations in relation to the MX video cards:


<<

  • "...got quite some attention when they announced that they would be producing a "DDR GeForce2 MX." For many users, the letters DDR equate to higher performance, as the DDR memory can transfer twice as much data per clock over SDR memory. Therefore, quite a good number of users out there were ecstatic to hear of a DDR GeForce2 MX card, thinking that it would bring the MX up to the levels of many GTS cards. The fact of the matter is that this would be true, if it were not for the fact that all DDR GeForce2 MX based cards use the narrower 64-bit memory bus, steering NVIDIA clear from any internal competition."
>>



My original post just tried to put numbers on Anandtech's point,
to further clarify the statement quoted above.

The last point my dealer made to me is that spending the extra
$50 for a 2x (O.K., 1.86x) improvement could not be matched by
upgrading any other system component, such as the CPU. This,
and my particular needs (80% of my time is spent in 2D apps
on a high quality Sony monitor), in
the end convinced me to purchase the 32 MB DDR product from ATI.

 

AndyHui

Administrator Emeritus, Elite Member, AT FAQ M
Oct 9, 1999
13,140
6
81
A 1.86x increase is theoretical, and not a real-world performance increase.

Secondly, it was clearly stated in the original GeForce 2 MX review, on June 28, that

<< According to NVIDIA, the GeForce2 MX supports 64-bit SDR/DDR SDRAM or 128-bit SDR SDRAM. This raised a pretty big flag for us as it made no sense that the chip could only support 128-bit SDR SDRAM and not DDR SDRAM since supporting DDR SDRAM does not change the pinout of the chip itself. >>




I also wish to echo ahfung's statement regarding the performance of the GeForce 2 MX, and like lsd, I question Maximum PC's credibility in their benchmarks.
 

ahfung

Golden Member
Oct 20, 1999
1,418
0
0
"At the $100 level, even retired math teachers like myself
think little of spending 10% of a $1000 budget on a video card.
Sorting all of this out ("isn't this 6-month-old news", as one reader put it) is not as straightforward as some would make it seem."


This is just your opinion. $50 CAN mean a lot to some ppl.

"The last point my dealer made to me is that spending the extra
$50 for a 2x (O.K., 1.86x) improvement could not be matched by
upgrading any other system component, such as the CPU"


I know u love your Radeon, but in reality it is only 1.23x as fast as a cheap-ass MX, or 0.76x as fast as a similarly priced GeForce2 GTS.

Oh, forgot to tell u: the Radeon suffers from the same ill fate as the GeForce when used on some Trinitron tubes. Hope this isn't too late.

 

audreymi

Member
Nov 5, 2000
66
0
0

Recent news at the www.rage3d.com forums now
reports that Trinitrons and Radeons get along famously.
It had something
to do with a new WHQL driver that altered the drive of the DAC
circuitry.

The new driver was tested across Sony's full line of monitors,
showing quality now the equal of the Matrox G400. I'll dig up the link
later this afternoon.
 

audreymi

Member
Nov 5, 2000
66
0
0
The results show 1.86x and the calculation shows 1.89x, which
sort of line up.

Recent news at the www.rage3d.com forums now
reports that Trinitrons and Radeons get along famously.
It had something
to do with altered drive in the DAC
circuitry added to the recent WHQL driver.
I tried it, and as impressed as I was with the Radeon
image at the store (an MX and a Matrox G400 were available for
side-by-side comparisons), the new driver
somehow seems to squeeze the last tinge of 2D sharpness from
this card???

I read that the new driver was tested by ATI in conjunction
with users who reported prior problems with their Trinitron-tube
based monitors (NEC, Diamondtron, Iiyama, ADS, and KDS),
and also field tested across Sony's full line of monitors,
showing quality that I would say is now
the equal of the Matrox G400 (my opinion).

I'll dig up the link later this afternoon.
 

Looney

Lifer
Jun 13, 2000
21,941
5
0
yeah, i would say it is equal to Matrox's too.. i had a couple of G400s for the longest time, and when i moved over to the Radeon... i didn't really notice a difference; anything i noticed was psychological, i think.
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0


<<
The Ultra uses a 128-bit wide bus running DDR (2 chunks of
data for each clock at ~210 MHz; see the "2" factor below) for a bit rate of
128 * "2" * 210 Mbits/sec



Your numbers are wrong. The memory of the Ultra is clocked at 230 MHz DDR, yielding a bandwidth of 7.36 GB/s. The Pro is clocked at 200 MHz DDR, yielding a bandwidth of 6.4 GB/s. The GeForce2 GTS has its memory clocked at 166 MHz DDR, or 5.3 GB/s of available bandwidth.



<< MX DDR bit rate breaks down as follows:
64 * "2" * 166 Mbits/sec, about 50% of the GeForce2 number.

MX SDR bit rate breaks down as follows:
128 * "1" * 166 Mbits/sec, again about 50% of the GeForce2 number.
>>



Your numbers are wrong. The Creative Labs DDR is NOT clocked at 166 MHz. It is clocked at 143 MHz DDR. That is why it is slower. The bandwidth is: 64-bit path * 2 (DDR) * 143 MHz = 2.29 GB/s.

So let's look at the memory bandwidth numbers:

Ultra : 7.36 GB/s = 100%
Pro : 6.40 GB/s = 86%
2 GTS : 5.31 GB/s = 72%
SDR MX: 2.66 GB/s = 36%
DDR MX: 2.29 GB/s = 31% (or 86% of the SDR MX)

This does not begin to include the differences in core clock speed
(2 GTS/Pro vs. Ultra) OR differences in design (GeForce2 vs. MX).
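
The table can be recomputed in a few lines of Python (a quick sketch using the clocks given in this post; rounding may differ by a point here or there):

# Recompute the memory bandwidth table: GB/s and percent of the Ultra.
cards = [
    ("Ultra",  128, 2, 230),
    ("Pro",    128, 2, 200),
    ("2 GTS",  128, 2, 166),
    ("SDR MX", 128, 1, 166),
    ("DDR MX",  64, 2, 143),
]
ultra = 128 * 2 * 230 * 1e6 / 8 / 1e9  # 7.36 GB/s
for name, bits, pumps, mhz in cards:
    gb_s = bits * pumps * mhz * 1e6 / 8 / 1e9
    print(f"{name}: {gb_s:.2f} GB/s = {100 * gb_s / ultra:.0f}%")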



<<
On the recommendation of an engineering friend, I followed up on the Radeon and compared cards for about a month before I settled on the 32 MB DDR Radeon. My son loves it.
128 * "2" * 183 Mbits/sec, about 220% more memory BW than the MX.
>>



Again the numbers are wrong. The 32 MB DDR version of the Radeon is clocked at 166 MHz, NOT 183 MHz. The 64 MB DDR version is clocked at 183 MHz. The math is left up to the reader this time.
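
(For the reader who wants to check their work anyway, the same Python sketch with the corrected 166 MHz clock:)

# 32 MB DDR Radeon at its actual 166 MHz memory clock, vs. the SDR MX.
radeon = 128 * 2 * 166 * 1e6 / 8 / 1e9  # 5.31 GB/s, same as the GTS
mx_sdr = 128 * 1 * 166 * 1e6 / 8 / 1e9  # 2.66 GB/s
print(radeon / mx_sdr)  # exactly 2.0x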

Still, even this does not compare the differences in architecture of the Radeon vs. the MX. In essence, the Radeon has features (a partial DirectX 8 implementation and 3-texel-per-clock pipelines) that the GeForce 2 line does not have. Of course, the features are also wasted, as no games take advantage of them yet.

Now, I own a 32 MB DDR Radeon. I bought it from Buy.com for $147. Last week, with the $30-off coupon, free shipping, etc., it was available for $132. Either way, it is a great deal. I love the card. I can o/c it to 200 MHz, which allows me to play at 60+ fps @ 1024x768x32 in Quake III on my Celeron 300A o/c'ed to 450.

However, the MX is a good deal too, not to mention being less expensive at about $130~$150 retail at B&M stores. The Radeon is about $150 (Best Buy) to $200 (CompUSA) retail. (Not everyone knows about OEM & on-line buying yet, where you can find cheaper prices.) It has its own niche, and fills it very well.

 

smp

Diamond Member
Dec 6, 2000
5,215
0
76
Okay.. I think that everyone failed to mention that the ATI card has superb video digitizing qualities.. I believe it is the only card on the market (the All-in-Wonder Radeon 32 MB, not just the Radeon 32 MB) that digitizes at full video resolution.. correct me if I'm wrong.. cause I'm about to buy that card.. and what's more.. for a whopping 400 CAN in my system.. which I'm getting a good deal on..
I know you all were talking about the 32 MB Radeon, and not the All-in-Wonder Radeon..
I wish I could go for the cheaper MX but it's no good for me.. I am doing multimedia stuff and it helps to have a card that is capable of full-screen digital output.. the All-in-Wonder is more expensive than the plain Radeon...
ATI All-in-Wonder Radeon 32 MB DDR
ATI Radeon 32 MB DDR

 

smp

Diamond Member
Dec 6, 2000
5,215
0
76
The good deal I was talking about is on my whole system.. which is based on a 900 TBird.. and an A7V... I'm getting gypped on the card though, cause I saw it advertised for 350 CAN up here... (for you Canadian forum junkies)...
 

audreymi

Member
Nov 5, 2000
66
0
0

Thanks, Lifeguard, your posts definitely expanded upon
my original posts and refined some of the numbers.
Are you sure about the default clock rate for the Radeon 32 DDR?
I'm asking because I purchased OEM on the advice of my dealer,
who told me the only difference was that OEM products
were clocked at 166 MHz instead of retail at 183 MHz. He said
the combination of the low heat and the fan allowed all of the
OEM units he had sold to be overclocked to the retail
value of 183 MHz, which is what he did when he installed
the card into my system.

I actually like the idea of a fanless video card, as my
computer is noisy enough on its own. Only the
MX cards had this option, but I was warned not to
try overclocking, as they run quite hot to the touch.
Is this why the Creative products are clocked at 143 MHz?


The recent incremental improvements in
2D quality in recent Radeons are discussed in the link
below:

link for 2D Patch
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
The only Radeon that comes clocked at 183 MHz by default is the Retail 64 MB card. The rest come clocked at 166 MHz. That includes the OEM Radeon 64 MB, the OEM and Retail 32 MB DDR, the 32 MB SDR, and the AIW Radeon card.

Using Powerstrip, you can o/c most of the cards to 183 MHz. A few will go higher if you get lucky. I was a lucky one since I can o/c to 200 MHz, but 205 MHz is worthless. I have read on these message boards that 183 MHz cards can o/c to 210 MHz.

One interesting note: I have read where some of these "166 MHz" cards are clocked lower at default, say at 160 MHz. Mine came at 164.x MHz.

Finally, I have no idea why Creative clocked their DDR MX at 143 MHz. I would guess that it is to save on costs while scoring a marketing point. Originally I had bought one from Buy.com for $95 when it was on sale with a great coupon + free S&H. The downside was that it was on backorder for several weeks. I gave up on it and cancelled the order after reading the reviews.
 
