AnandTech covered the Maxx's lag rumour in their Rage Fury Maxx review (here is the page concerning lag). AnandTech ends up describing how the Maxx uses AFR as something like a complex version of triple-buffering. As for tearing, so long as V-sync was enabled, I never noticed any tearing at all. Since I see the same behaviour with my single-core Radeon LE (some tearing with V-sync disabled, none with it enabled), tearing was not a problem caused by AFR.
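To make that triple-buffering comparison concrete, here's a minimal sketch of the idea behind AFR scheduling: even frames go to one core, odd frames to the other, and finished frames flip to the screen in order at the vsync boundary. Every name in it (gpu_render, present_at_vsync) is made up for illustration; this is not ATi's driver code, just the principle under those assumptions.

```c
/* Minimal sketch of Alternate Frame Rendering (AFR), assuming two
 * GPUs and an in-order present queue. All names are hypothetical. */
#include <stdio.h>

#define NUM_GPUS 2

/* Stand-in for kicking off a render on one core. */
static void gpu_render(int gpu, int frame)
{
    printf("GPU %d renders frame %d\n", gpu, frame);
}

/* Stand-in for flipping a finished frame to the screen at vsync. */
static void present_at_vsync(int frame)
{
    printf("frame %d presented\n", frame);
}

int main(void)
{
    for (int frame = 0; frame < 8; frame++) {
        /* Even frames go to GPU 0, odd frames to GPU 1, so each core
         * only has to finish every other frame - roughly two
         * frame-times of work per presented frame, much like a deeper
         * triple-buffer pipeline. */
        gpu_render(frame % NUM_GPUS, frame);

        /* Frames flip in order at the vsync boundary, which is why
         * V-sync hides tearing on the Maxx just as it does on a
         * single-core card. */
        present_at_vsync(frame);
    }
    return 0;
}
```

The same picture explains the lag rumour: a frame that flips now was started roughly two frame-times earlier, so the deeper pipeline adds about one extra frame of input latency compared to a single core, just as triple-buffering does.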
I own a Maxx (though it's been sitting on my shelf for most of 2001, since I bought my LE), and I was very impressed by the card when I used it. Image quality was excellent for the day (nearly as good as the Radeon's), and the card handled every game I had - it was fast, especially in games like Unreal Tournament, and it was excellent for EverQuest when Kunark was released. While playing EverQuest I'd hear constant complaining about the GeForce cards choking on the massive, detailed (non-hardware-T&L) zones in Kunark, yet I was fine with my clipping plane set out past the maximum allowable for most zones. My highest 3DMark99 score (the last 3DMark not to use T&L) at the default setting (800x600x16) was 9033 marks with a 1 GHz processor - an impressive score from any graphics card.
To say that AFR was rough around the edges is rather like saying a car engine sounds rough after removing the exhaust pipe. With V-sync enabled, AFR was a dream and a half - it took two scrawny little Rage 128 Pro cores and helped them compete with the brand-new (at the time) GeForce cards. I've said it before and I'll say it again: the Rage Fury Maxx was the Rodney Dangerfield of graphics cards. It had hardware DVD playback, excellent 32-bit performance, line and edge anti-aliasing, bump-mapping, and the first (and so far only) use of AFR. It was made at a time when ATi knew it had a winner in the yet-to-be-released R100 (Radeon) core, and the Maxx bought ATi enough time to finish the R100 properly rather than rush it out for fear of having nothing that could compete with the original GeForce.
If the R100 had been rushed out the door, ATi would most likely have spent its time fixing the kinds of problems that arise when a product is released prematurely <cough> P3 1.13 GHz <cough>, and would not have been able to concentrate on making the R200 the powerhouse core that it is today. That's why I say I don't think ATi will use AFR for the R200 - not because it doesn't do its job, but because it did its job back in 1999 when it had to, and allowed ATi to make cores powerful enough on their own, without double-teaming the competition.