1) VirtualLarry: because every time you move the mouse one "tick", it moves the crosshair several screen pixels
True that. More Hz. The PS/2 port can run at 200 Hz - 200 updates per second.
It's an uber-Pentium 4 effect.. except, with no exaggerated pipeline.. heh..
It definitely affects your capabilities.
USB operates at 125 Hz - 125 updates per second.
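To put those two rates side by side - the gap is in the time between updates, which is just the inverse of the rate. A quick sketch (the 125 Hz and 200 Hz figures are the ones from this thread):

```python
# Interval between position updates at a given polling rate.
# 125 Hz is the default USB HID rate cited above; 200 Hz is the
# PS/2 sample rate cited above.
def poll_interval_ms(rate_hz):
    """Time between mouse position updates, in milliseconds."""
    return 1000.0 / rate_hz

usb_ms = poll_interval_ms(125)   # -> 8.0 ms between updates
ps2_ms = poll_interval_ms(200)   # -> 5.0 ms between updates
print(f"USB:  {usb_ms:.1f} ms/update")
print(f"PS/2: {ps2_ms:.1f} ms/update")
```

So every single movement can sit up to 3 ms longer in the pipe on USB before the game ever sees it.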
2) USB components and peripherals are a tad more expensive, but you get more with USB. Faster information exchange is the primary bonus. Instead of transmitting information one character or byte at a time, USB components transfer pre-determined sizes of packets of information that are typically 8, 16, 32 or 64 bytes in size.
Yes, they transfer more data per second, by far.
Yep. A lot of good that does when we're only sending Button, X, Y, WheelDelta, Battery Status..
It's like sticks of memory.. higher MHz = lower latency... BUT, when you compare DDR1 to DDR2 at 'relatively' equivalent bandwidths, you have DDR1 at 200 MHz internal (effective 400 MHz), and DDR2 at 100 MHz internal (effective 400 MHz) - because of its extra "width" per clock cycle.. However, you will get lower performance from the DDR2, because your CPU will want more commands executed than the memory can handle, so it queues them and then executes them en masse - whereas with DDR1, the queue wait time is half of that, so you can normally get more from DDR1 than DDR2, clock for clock.
This isn't as big of an issue if you're pulling huge textures, as your CPU will know to queue large blocks. It's not so much a problem with audio either - again, large blocks. But now, let's get down to mouse movements..
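The DDR1-vs-DDR2 latency point can be sketched in numbers. The CAS latencies below are my assumption of typical parts for that era (DDR-400 at CL3, DDR2-400 at CL4), not figures from this thread; both run a 200 MHz I/O bus at "effective 400", and CAS latency is counted in I/O bus clocks:

```python
# First-word access delay for an assumed pair of "equal bandwidth" parts:
# DDR-400 CL3 vs DDR2-400 CL4 (typical-for-the-era values, assumed here).
def cas_ns(bus_mhz, cas_cycles):
    """CAS delay in nanoseconds: cycles divided by bus clock."""
    return cas_cycles * 1000.0 / bus_mhz

ddr1 = cas_ns(200, 3)   # DDR-400, CL3  -> 15.0 ns
ddr2 = cas_ns(200, 4)   # DDR2-400, CL4 -> 20.0 ns
print(f"DDR-400:  {ddr1} ns")
print(f"DDR2-400: {ddr2} ns")
```

Same bandwidth on paper, but the DDR2 part waits longer before the first word comes back - which is the whole latency-vs-throughput point.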
PS/2 > USB, even if USB ((~100 kilobits) * (~125 cycles/sec)) comes to ~12 MEGAbits/s (12,000,000b), and PS/2 ((~128 bits) * (~200 cycles/sec)) comes to only ~25.6 KILObits/sec (25,600b).. because of the Hz.
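Checking that arithmetic with the rough per-cycle sizes above (the ~100 kilobit and ~128 bit figures are this thread's estimates, not measured values):

```python
# Rough throughput check, using the per-update sizes quoted above.
usb_bits_per_sec = 100_000 * 125   # ~100 kilobits * 125 updates/s
ps2_bits_per_sec = 128 * 200       # ~128 bits * 200 updates/s

print(usb_bits_per_sec)  # 12,500,000 b/s (~12.5 Mbit/s; nominal USB full speed is 12 Mbit/s)
print(ps2_bits_per_sec)  # 25,600 b/s (~25.6 kbit/s)
```

USB wins raw throughput by nearly 500x, and it doesn't matter one bit for a mouse, because the payload is tiny and the update *rate* is what you feel.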
It's all about the Hz, baby. We need MEGAHERTZ. We do not have even a 1 MHz (1,000,000 Hz) input device, yet we have 3 GIGAHERTZ (3,000,000,000 Hz) processors. If we had a 1 MHz human interface device, then we would not have this mouse-sensitivity-per-resolution problem that, at present, only high-resolution interfacing, gaming enthusiasts, and graphical artists (WHO, I MIGHT ADD, USING WACOM TABLETS, USE THE PS/2 PORT, or even rarer input devices and interfaces) are able to pick up on.
Not everyone is a sniper, but there are a few.
I, for example, am not a sniper. I use an MX900, on a custom hack-job laptop, and write applications, modifications, and other little doohickeys - mostly for work.
But then I go play Far Cry, or Doom 3, and I expect to play at 1024x768 minimum resolution - even if it means I must turn off the oh-so-beautiful antialiasing, anisotropic filtering, extra lighting, etc. - and still be able to tag big baddies in their foreheads.
At 125 updates a second through USB, in front of a 2.4 GHz, 1 Mb/s Bluetooth buffer, being filled by a single internally buffered, antialiased, and over-motion-compensating 500 Hz optical sensor.. there are times I miss not only because I suck at aiming (although I am no reeling freebirth lancer)..
.. but because I was one frame short - out of 30 visual frames per second - under a correct physical mouse movement. I was whacked by a blunt or clawed object that could have been avoided.
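That "one frame short" feeling can be put into a toy model. This is my own simplification, not anything measured: a movement that happens right after a poll is sampled waits up to one poll interval before the game sees it, plus up to one more frame before it is drawn. Using the 30 fps and poll rates from this thread:

```python
# Worst-case input-to-screen lag under an assumed simple model:
# one full poll interval (movement just missed the poll) plus one
# full frame (movement just missed the frame's input sample).
def worst_case_lag_ms(poll_hz, fps):
    return 1000.0 / poll_hz + 1000.0 / fps

usb = worst_case_lag_ms(125, 30)   # 8 ms poll + ~33.3 ms frame
ps2 = worst_case_lag_ms(200, 30)   # 5 ms poll + ~33.3 ms frame
print(f"USB @30fps:  {usb:.1f} ms worst case")
print(f"PS/2 @30fps: {ps2:.1f} ms worst case")
```

Only a 3 ms difference in this sketch - but when the whack lands inside that window, 3 ms is the frame you didn't get.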
I don't realise it when it happens at first, I just go 'dammit'. And try again.
After multiple occurrences - possibly because of the way my mind buffers these frames, since I programmed the actions per frame - I am able to pick up on this effect.
Latency is the bane of the sniper, the close quarter combatist, the artist, the enthusiast, and on rare occasions, me.
If you can't feel the hert, we're not on the same frequency; suck, my clock, twitch.
I have a serious optical tick. You don't know how much it hertz.