Whatever happened to OfficeBench in AnandTech's testing suites? Edit: Anand "investigating Sysmark issues"?

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
It's always wise to keep the pool of benchmarks fresh with new tests. Otherwise, manufacturers tend to optimize their hardware for particular tests rather than general usage. And the larger benchmark houses like Bapco and Ziff-Davis can grow arrogant when their tests become the de facto standard, as seems to be the case these days. Not to mention the sometimes questionable independence of these companies (apparently Bapco's headquarters were inside an Intel building at one point). Often a new version is designed to "highlight" the particular architecture of a new CPU, becoming almost an extension of its marketing campaign, instead of just forcing it through important computationally-intensive tasks.

Anyways, what about OfficeBench by CSA Research? The company is known to be independent, and AnandTech seemed quite satisfied with the software up until about a year ago. We don't want the benchmarks we rely on to become stale, so let's bring it back.

Modus
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
I have to say that Anand's persistent heavy use of SysMark distresses me.

There is little doubt that SysMark is housed within Intel's Santa Clara domiciles, and the fact that Bapco went to such great lengths to cover up their ties to Intel makes it doubly suspicious.
This is a moot point though, as my reasons for discounting SysMark have little to do with Intel or any ties Bapco may have with them.

I see four serious problems with SysMark 2002:
- Bapco's documentation does not disclose what it does.
- Results cannot be adequately reproduced in the real world, or in other office application benchmark suites.
- Bapco's documentation does not disclose how the results are measured and computed.
- It doesn't reflect real-world multitasking behaviour; precious few people multitask in a manner comparable to the way in which Sysmark does.

They publish no data on how the applications were chosen for inclusion in the benchmark, and little data on how they are tested.

I presume that both eTesting Labs (Winstone) and BAPCo (Sysmark) did studies to determine which commercial applications were most popular based upon the install base (i.e. units shipped) vs. doing a study on which applications people actually use.

Of course, the next question is how they developed their usage scenarios.
eTesting Labs has some information on their site regarding a survey they did asking their readers to provide some feedback on various applications and what features/functions they were most concerned with regarding performance. You can find their articles here: http://www.etestinglabs.com/bi/contents/pcbmks_contents.asp

The results are almost at complete odds with what little data we've seen for SysMark's chosen applications, and Bapco themselves do not present any data on how the applications were chosen for inclusion, how they are tested, or how the results are computed.

I strongly believe AnandTech is in desperate need of another office application test with which to correlate SysMark's results.
 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
Yeah, when you pull on the Sysmark thread, it starts to unravel in a pretty ugly way. Funny that it needed to be "registry patched" to enable SSE support on the Palomino, when the software ought to detect such instructions independent of the CPU make or model. Funny too that Intel chips have almost consistently dominated the benchmark for the past two years.

Interestingly, OfficeBench grew out of an Intel-sponsored project to design an intense MS Office benchmark to show off the new P4. "Unfortunately," the test showed the P4 falling behind its competitors. Likely, the software was not tailored to the P4's specific architecture, but was written as a legitimate simulation. It's like designing a sports car that thrives on straightaways, and then getting Road & Track to heavily weight their straight-line acceleration tests, when in fact the majority of the target market doesn't drag race.

You're absolutely right that AnandTech and the other online reviewers that follow its lead need to adopt other productivity performance tests. I don't believe there's any nefarious intent here, just a kind of malaise that sets in after a while, causing community leaders to become set in their ways. Look how long it took StorageReview to refresh their meager benchmark offerings.

Benchmarking of general-purpose CPUs is by nature aggregate. No one test should be the final say on a chip's performance in a certain field. Look at Anand's gaming tests -- at least four separate titles in every review. But for productivity performance, still a major buying criterion for IT managers, it's basically "here are the Sysmark and Winstone scores".

I commend AnandTech on its innovative, homegrown and highly relevant server performance benchmarks. If only they could do something like that for office productivity tests.

Modus
 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
I emailed some of the AnandTech staff about this, maybe we'll get a reply later.

Modus
 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
Interesting quote from Anand's latest CPU review:
You may have noticed that SYSMark 2002 is absent from our performance comparison in this review; we're currently investigating some issues with the benchmark that we will be reporting on publicly shortly.

Modus
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I want to commend the reviewers here for using individual, unpatched, off-the-shelf programs like LAME in benches, to balance the synthetics that, as stated above, can be "influenced" to favor the P4 with a little extra funding from Intel. Any weighted stat can be abused, especially if the weightings are secret.
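
For what it's worth, timing an off-the-shelf encoder is about as simple as benchmarking gets. Here's a minimal sketch in Python -- the lame flags and file names are just placeholders, not anyone's actual test setup:

import subprocess
import time

# Time a stock LAME encode of a placeholder WAV file at 192 kbps.
start = time.perf_counter()
subprocess.run(["lame", "-b", "192", "test.wav", "test.mp3"], check=True)
print(f"encode time: {time.perf_counter() - start:.1f}s")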

It was helpful (and amusing) to see the Celeron 1.3 with SDRAM beating the P4-Cel-1.7 in LAME and some games when I was choosing a processor for my MP3 jukebox upgrade.
 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
AnandTech's server benchmarks use an actual script taken from a typical "day in the life" of the web, database, and forum servers. This script is replayed against an old database using the hardware in question. Such a method is about as real-world as you can get, and highly relevant.
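
To illustrate, here's a rough Python sketch of what such a replay harness might look like -- the Apache-style log format and target host are my own assumptions, not AnandTech's actual tooling:

import time
import urllib.request

TARGET = "http://testbox.local"  # placeholder for the machine under test

def replay(logfile):
    # Re-issue every request from a recorded access log and time the run.
    start = time.perf_counter()
    done = 0
    with open(logfile) as f:
        for line in f:
            # An Apache log line looks like: ... "GET /path HTTP/1.0" ...
            path = line.split('"')[1].split()[1]
            try:
                urllib.request.urlopen(TARGET + path, timeout=10).read()
                done += 1
            except Exception:
                pass  # skip requests the test box can't serve
    elapsed = time.perf_counter() - start
    print(f"{done} requests in {elapsed:.1f}s ({done / elapsed:.1f} req/s)")

replay("access.log")  # hypothetical recorded log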

Now why not try something like that for office productivity? The script could be something like:

* begin with a moderate load on the system -- 3 open browser windows, Outlook checking email every five minutes, MSN Messenger online, Norton AntiVirus 2002 running in aggressive mode, some useless personal firewall active, etc.

* then fire up Office and open a large Word form letter along with an Access DB; perform a 500-document merge using contacts from an Outlook address book

* export the Access DB to Excel, sort by certain fields, and create a bunch of pie charts

* fire up PowerPoint and export a large presentation to HTML format

* perform a Windows file-search to locate all .doc, .xls, .mdb, and .ppt files

See, it's not hard to simulate real-world usage; a minimal sketch of one step appears below. This may not be the most stressful test, but I guarantee it's closer to what your average office worker does. Note too that load times are important. And although the hard drive plays a large role here (as it should, this is a simulation after all), CPU/RAM speeds still affect application load times.
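
To make the idea concrete, here's a minimal Python sketch of timing just one of those steps, assuming Windows, MS Office, and the pywin32 COM bindings -- the file path and dataset are placeholders:

import time
import win32com.client

def timed(label, fn):
    t0 = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - t0:.2f}s")

def excel_sort_and_chart():
    # Drive Excel over COM: open a workbook, sort it, build a pie chart.
    xl = win32com.client.Dispatch("Excel.Application")
    xl.Visible = False
    wb = xl.Workbooks.Open(r"C:\bench\contacts.xls")  # hypothetical dataset
    ws = wb.Worksheets(1)
    ws.UsedRange.Sort(Key1=ws.Range("A1"))  # sort by the first field
    chart = wb.Charts.Add()
    chart.ChartType = 5  # 5 = xlPie
    wb.Close(SaveChanges=False)
    xl.Quit()

timed("Excel sort + pie chart", excel_sort_and_chart)

String the steps together, run the whole script a few times, and the wall-clock totals are your benchmark.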

Modus
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
One thing I find mildly disturbing- 85% of reviewers are still using Intel's recommended scene to benchmark LightWave.

I found it fascinating when, in an article at Ace's Hardware, alternative scenes were benchmarked and the AXP jumped from 20% slower to above 5% faster than equivalent P4s.
I'm not picking on AnandTech when I mention LightWave, as I've no idea what scene(s) AnandTech may be using.

PhotoShop 5/6/7 is another benchmark whose results can vary DRAMATICALLY depending upon which filters you test, and on what scene.
If you know what you're doing, you can pretty easily pick out the appropriate filters to show an AXP crushing the P4 or vice versa, or pick out a decent set of filters that perform extremely well on a G4.
Hell, given enough time one can pick out a handful of filters in which even the VIA C3 performs respectably.
One has to be very cautious about which filters are applied and on what scene if you wish to benchmark PhotoShop performance fairly.


Even worse- a while back some sites started using nBench 1 as a supposedly reliable indicator of processor performance.
While nBench isn't as blatantly biased as one might expect, it still clearly and strongly favors AMD- as one would expect, given that it was designed by AMD.



AnandTech's server benchmarks use an actual script taken from a typical "day in the life" of the web, database, and forum servers. This script is replayed against an old database using the hardware in question. Such a method is about as real-world as you can get, and highly relevant.

I agree, I wholeheartedly applaud AnandTech for their server benchmark based on recordings from their own web server. That was a 100% true real-world test of performance, and an extremely valuable benchmark.
I put more faith in that one benchmark than in virtually any benchmark AT has ever used.




I've only briefly looked at CSA Research's OfficeBench, and have precious little first-hand experience with it and what it does, so I've no idea whether it's a suitable candidate for use in reviews, but it sounds as though it may be worth looking into at least.




Funny that it needed to be "registry patched" to enable SSE support on the Palomino, when the software ought to detect such instructions independent of the CPU make or model. Funny too that Intel chips have almost consistently dominated the benchmark for the past two years.

I don't believe that was an example of bias at all; the version of WME used in SysMark 2001 was a perfect example of poor coding that queried only Intel processors for SSE capabilities.
The retail version experienced the same problem.

It's not Bapco's fault, but poor coding practices on the part of MS. Bapco merely happened to use the latest version of WME available at the time, which happened to have the problem.
What I do find amazing is that simply making sure that the Athlon's SSE units are recognized pushes the Sysmark 2001 Internet score up by 18%!

The Sysmark 2001 Internet score is a combination of SIX applications. Simply improve the media encoder performance somewhat (most of the time SSE improves an application's score by 5-30%) and you get an 18% boost even though the rest of the benchmark does not change at all. It's not hard to see that WME must have had an INCREDIBLY high weighting for one application to have such a huge impact.
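
As a rough sanity check -- assuming the composite is a weighted geometric mean, which Bapco doesn't actually document (that's part of the complaint) -- you can back out the implied WME weight in a couple of lines of Python:

import math

# Only WME speeds up, by some factor g, yet the composite rises 18%.
# Under a weighted geometric mean, 1.18 = g**w, so w = ln(1.18)/ln(g).
for g in (1.30, 1.50, 2.00):
    w = math.log(1.18) / math.log(g)
    print(f"WME gain {g:.2f}x -> implied weight {w:.0%}")

An even weighting across six applications would be about 17%; even a generous 2x WME speedup implies a weight of roughly 24%, and a 30% speedup implies over 60%.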
My question would be why did Bapco put such a hefty weight on WME performance?
Why even include the WME encoder at all? According to research by eTesting Labs, very few people reported using WME. I presume Bapco must have done some research, but they've never publicized any.

To me that has always spoken volumes about how poorly designed SysMark 2001 was.
WME is still weighted WAY too heavily even in SysMark 2002, from what I've seen. I recall Bapco used to have WME run in the background throughout the entire test; I wonder if that's still true for SysMark 2002.

Funny too that Intel chips have almost consistently dominated the benchmark for the past two years.

In my experience Bapco SysMark has favored Intel chips through revisions 98/2000/2001/2002. I did find it quite interesting that a new version of SysMark was released almost immediately after AMD processors caught up to the competition from Intel in the previous version, and the next revision always dramatically favored Intel processors initially.

That said, it may be a simple matter of progression; for the last few years Bapco has revised their benchmarks on a fairly regular basis.
Hence I do not believe it is an indication of bias.


My reasons for disliking Bapco are that I find their testing methodology to be of dubious merit, that their tests do not accurately reflect the manner in which people typically multitask, and that they provide almost no documentation on the operations and inner workings of SysMark, and no documented research into people's application usage scenarios.

This, combined with their potentially dubious background, makes SysMark of very questionable reliability.

Winstone was once a viable alternative but they've let the suite age, and the majority of the tests are no longer applicable for modern software.


 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
Very nice post, Rand.

I knew about Photoshop benchmarking issues (who hasn't explained it to a Mac zealot at least once?) but I had no idea that most Lightwave benchmarks were using Intel's recommended scene. That's just absurd.

My email to the staff included a link to this thread; hopefully they'll stop in and read it.

Modus
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
I posted this a while back, and it is probably common knowledge, but I feel it is relevant.

Although Sysmark 2002 is supposed to enable SSE for the Athlons, when compared to Sysmark 2001 (patched by Anandtech to use SSE) the Athlon loses considerable performance. For instance, the Athlon XP 1800+, which beat the P4 2.0 by ~4% in Overall System Performance in 2001, now loses to the same P4 2.0 by ~8% in 2002. That is a 12% performance delta. A 14% variance in Sysmark 2002 is equivalent to the difference between a P4 1.6 and a P4 2.0. So between versions of the Sysmark suite, Intel gained a ~20% advantage (by clock speed).
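
A quick back-of-the-envelope in Python, treating the quoted percentages as simple ratios against the same P4 2.0 baseline (a multiplicative reading gives ~13% rather than the additive 12%):

xp_2001 = 1.04  # XP 1800+ vs. P4 2.0 in Sysmark 2001 (SSE-patched): +4%
xp_2002 = 0.92  # the same pair in Sysmark 2002: -8%
swing = xp_2001 / xp_2002 - 1
print(f"relative swing toward the P4: {swing:.1%}")  # ~13.0%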

It has been argued that this deviation in performance is due to optimized coding of applications for Intel processors (SSE, SSE2, etc). It would be interesting to test this theory, as it would help silence Bapco's detractors as well as enlighten the hardware community.

I am curious how often companies recompile their code to include optimizations for specific processors. It seems hard to imagine that each year Intel chips are getting ~20% faster in standard office applications.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
I just thought I'd note that Bapco is made up of volunteer representatives from a group of member companies. Each company has one representative and one vote in all of Bapco's policy decisions on how to structure/design their benchmarks; the representative of any given company votes once on behalf of that company, and the votes are tabulated and taken as consensus.
Until recently AMD was not a part of Bapco, though they've now paid the fee needed to be a part of Bapco and should be part of the voting board in the next iteration.

Just pointing it out in case anyone here didn't know how Bapco was structured.


My email to the staff included a link to this thread; hopefully they'll stop in and read it.

Please do let us know if/when they reply. I'd be interested to know what thoughts the staff of AnandTech and specifically Matthew Witheiler/Anand Lal Shimpi might have on the matter.

On another note- I wonder whatever happened to Henry Kuo?
I've not heard of him on AnandTech in quite some time. I'd developed quite a lot of respect for him; he was (is?) one of the better motherboard review editors on the Internet.
 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
Bump, this is important.

No reply yet from AnandTech. I think I'll cross-post this to the talk-back thread of the latest CPU review.

Modus
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
It seems some awfully serious allegations of bias are about to hit the forefront, and word is there is some pretty damning evidence against SysMark being passed around.


AMD has released a .pdf with some potentially serious evidence that SysMark 2002 has been intentionally reworked and redesigned from SysMark 2001 to be biased towards the Pentium 4 platform.
They certainly list some pretty damning inconsistencies in SysMark 2002's application loading, and a good portion of evidence to back it up.

AMD .pdf

Van Smith, formerly of Tom's Hardware and now hosting his own site 'Van's Hardware' has an article up on AMD's .pdf, you can see the article here
I should mention that I have precious little respect for Van Smith, and I firmly believe he is biased towards certain companies.... and he's long displayed a consistent disdain for Intel.

I can't say as I put much faith in anything Van puts forth, but this article is worth reading even for those that hold Van in low regard.
As is his norm he's thrown forth some pretty blatant accusations against Intel, but he nicely sums up AMD's .pdf on SysMark 2002.
Van Smith is a very intelligent guy, and he carries a bit of influence in the industry. It's unfortunate he chooses to use his knowledge to mislead people and present often unfounded claims against Intel.
For once he has an article worth reading though.

Perhaps most interesting of all though, is that Dean Kent of RealWorld Tech has apparently uncovered some data of his own regarding Bapco SysMark 2002. He's commented a few times in the past half year or so that he's been studying various industry standard benchmarks, SysMark 2002 among them.
Dean Kent is fairly highly regarded by many in the industry, and he's undeniably authored some fascinating technical articles in his time.
Up until now Dean Kent had questioned the methodology of Bapco SysMark 2002, but had come out strongly against those that believed it was intentionally designed to favor Intel.
He's got a solid history in hardware/software technical analysis and I personally hold him in high esteem.

Therefore his recent post on Ace's Hardware's forum was both surprising and extremely interesting.
Linky to his post
He makes it abundantly clear in subsequent posts in the thread that it is certainly possible Bapco has a reasonable and entirely logical explanation for their choices in regards to SysMark 2002.

He also stated this however:

You will see it on several sites before long - I am quite sure of that. I am very definitely not the only one that has seen it. The 'evidence' is pretty solid.

Now - the evidence *is* based upon observation, and not an admission of wrong-doing. Therefore it is *possible* that someone could provide some credible explanation based upon some survey results or software manufacturer's recommendation.

However, my position is still that if one can identify a methodology that will identify exactly what is being measured by each benchmark, then it doesn't matter how the benchmark was developed - each person will have all the knowledge necessary to determine if that benchmark applies to his/her own usage. This is, after all, the ultimate goal of a benchmark... or so it seems to me.

His comment that other sites will be displaying this "evidence" along with AnandTech's own comment recently that they were investigating certain issues with SysMark 2002 makes it seem quite likely to me that AnandTech is one of those "several sites" he mentions.

All of the above information taken together certainly makes it seem as though some extremely serious allegations of bias, backed up by strong evidence, are likely to be presented for public consumption soon.

 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Modus, as a followup to my earlier comments about how LightWave results could vary dramatically depending upon the scene rendered....


XBit Labs has a review of AMD's AXP 2400/2600+; contained within are two LightWave benchmarks.
Link

They nicely illustrate how different aspects of an application, and different data utilized within it, can have a very dramatic impact upon relative performance.
One benchmarked scene in LightWave shows the Pentium 4 leading by almost 20%; another, "Sunset," shows the AthlonXP favoured by almost the same margin.

It's a good example of how one application could strongly favor either processor depending upon your specific usage requirements.
 

Macro2

Diamond Member
May 20, 2000
4,874
0
0
Humm, I thought everyone knew B0pco's benches were rigged toward Intel? Maybe Anand and Van's will finally expose it for good.
The big jump in P4 performance from Sysmark 2001 to 2002 means something is "funny".

I won't even mention other benches that are thinly veiled bandwidth measurements.
 

Anand Lal Shimpi

Boss Emeritus
Staff member
Oct 9, 1999
663
1
0
I just thought I'd chime in here now that data seems to be leaking out about this.

Here's where things stand currently:

1) AMD went around to reviewers and distributed the PDF that you've seen posted in this thread. The data AMD produced is verifiable as I have done my own verification of the tests in house.

2) Here's the main problem: SYSMark 2001 ran a certain set of tasks but in the move to SYSMark 2002, a good deal of the tasks that AMD's Athlon was faster at were removed and replaced with tasks that the Pentium 4 was faster at. Both sets of tasks are perfectly valid tests of CPU performance (it's not like BAPCo just stuck in random tasks that don't do anything) but the point that must be made is that the changes were made seemingly without any user-level research to back them up. If there was some research that said "this is how most people use their systems" that caused BAPCo to change their methodology then this wouldn't have been a problem, but without that backing for their decision then it just seems as if BAPCo optimized the benchmark for the Pentium 4.

3) AMD's secondary complaint is that the benchmarks now use much larger datasets (e.g. Excel). This is more of a minor complaint since it penalizes the Athlon XP for having a smaller cache than Intel's Northwood. AMD would not have made the same complaint had their Hammer already been out since the larger datasets would mean that Hammer's on-die memory controller would give it the advantage.

4) I've been working with AMD on analyzing this information, it's very simple to obtain but requires a bit of effort to analyze. Even AMD today sent me an email saying that they had to order some special software in order to fully understand what's going on in the benchmark. It's too early to make any complete conclusions but what can be said is that SYSMark 2002 can no longer be used as a sole measurement of application performance.

It's pretty sad that it has come to this, but what I can envision happening (at least on AnandTech) is a larger set of office application benchmarks just as Modus and Rand have suggested in this thread. I would like to put together our own tests but it is definitely not an easy task; in light of these discoveries I will have to put much more thought into doing just that however.

The good news is that now that AMD is a part of BAPCo, SYSMark 2003 should become a much better and more balanced benchmark. Before, the only real input from a major CPU vendor was coming from Intel (I was always afraid that SYSMark 2003 would be released and it would show an incredibly unrealistic gain with HyperThreading enabled) but now with AMD involved things will hopefully become more balanced. According to AMD, BAPCo is infinitely more responsive to their needs now that they're a part of the organization and they should be having a formal meeting to discuss this issue very soon (if they haven't already).

I'll keep you posted on what's going on as soon as I get the info I need from AMD/BAPCo.

Take care,
Anand
 

Modus

Platinum Member
Oct 9, 1999
2,235
0
0
Edit: While I was typing my post, the Man spoke

Still. . . curiouser and curiouser.

I do hope the hardware community that is about to crucify Bapco (deservedly IMO) will at least give them a chance to explain themselves before running the article. I'd hate to see a big conflagration end with a sheepish reply from Bapco that explains everything and embarrasses the hardware community: "Bad Anand, bad Tom, you have to *enable* your L2 cache in the BIOS."

It's nice that AMD is now on the Bapco board. Unfortunately, the structure of their administration makes it unlikely that we'll see a change in benchmarking software for at least a year. Even then, I don't see how AMD, as a single member, could sway Intel's influence away from the other dozen or so members. Most of those members have little or no stake in an accurate CPU benchmark of productivity performance, and some, like Dell, have a vested interest in keeping Intel's chips on top.

Frankly, I don't think any organization comprised of product manufacturers should purport to provide objective performance benchmarks of their own products. If AMD led the organization, they would probably be giving us the same nonsense. We need an independent organization, something like Spec, only with tests geared more toward real-world end-user performance. The board would be elected from a larger membership, and would rotate out every six months, to keep pace with technology. Board directors would have to approve every policy decision on benchmark creation, and the approval would have to be nearly unanimous, forcing them to achieve a consensus. Funding for software development would come from membership dues, sponsorship, and sales of the software. You could make it a non-profit, so that members could write off their dues.

Now that's probably pie-in-the-sky, but at least it shows what an ideal arrangement might look like, and how we have nothing close today. The other model is even better, and is favored by some of the lesser-known community reviewers like Tech Report. They favor obscure benchmarks developed by independent persons to test a specific task that is known to be computationally expensive. There are always little benchmarks like that coming out, some quite well done too. The only problem is, office productivity is essentially limited to a dozen or so common commercial apps, which makes a scripted approach like I outlined above the best solution.

BTW, that PDF by AMD is unbelievable. How could Intel be so blatant about this? In some cases, an entire slew of tasks that favored the Athlon in Sysmark 2001 were replaced in Sysmark 2002 with ones that favored the P4. And I tend to trust the accuracy of AMD's PDF. They are going out on a limb here to expose corruption, so they have a strong incentive to keep their accusations factual.

Modus
 

MadRat

Lifer
Oct 14, 1999
11,924
259
126
The best way to keep a secret is to stay mum on the subject. BapCo's secrets are finally going to be exposed. I feel that AMD's presence in BapCo will encourage Intel to spin off another make-believe company to do a whole new slew of benchmarks. Why would Intel want to stoop down to AMD's level and play in an arena where some third party's decision can make or break its whole investment?
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
I just thought I'd chime in here now that data seems to be leaking out about this.

It was bound to come out at some point, and with Van Smith and Dean Kent both hinting at this, along with AMD's PDF becoming publicly accessible, it had to happen.

I'm sure I speak for all of us when I say I sincerely appreciate your taking the time to answer our questions. Doubtless school and AnandTech must leave you precious little time for a personal life. I imagine it puts a great deal of stress on you, and certainly can't make it easy for you and your girlfriend to spend some quality time alone together.

If not for you none of us would be here right now, or have these forums to discuss such topics with like-minded individuals.

Regardless of what happens with reference to SysMark and Bapco, I've no doubt that you will institute a fair, unbiased, and realistic office application benchmarking suite.
Be it created by CSA, Bapco, AnandTech or any other entity.


Humm, I thought everyone knew B0pco's benches were rigged toward Intel?

Most of us with more experience are well aware of the questionable background of Bapco; however, there are a substantial number of those with less experience in the industry, or who may not have the time and inclination to study the available background.
Many will simply look at the results given and take them at face value, without taking into account the benchmark utilized or whether it is applicable to their personal usage criteria.


I feel that AMD's presence in BapCo will encourage Intel to spin off another make-believe company to do a whole new slew of benchmarks.

I think we need a little more evidence before going so far as to essentially say Bapco is Intel and is designed for the sole purpose of benefiting Intel.
I've never put much faith in Bapco's testing methodology and I agree they have a potentially dubious background, but none of that shows with any certainty that they exist purely for Intel's benefit.

I think the biggest issue is that Intel has a little too much sway in the decisions reached by Bapco: their own vote, along with the votes of certain companies such as Dell that have a strongly vested interest in seeing Intel succeed.
AMD's entrance into Bapco should serve to at least partially offset this.

I doubt any of this would be an issue if Bapco were more forthcoming in documenting their testing methodology and what each specific sub-section of the benchmark does, along with what data/research they use to judge what should be included in the suite to best simulate end-user application usage.

I've precious little doubt that AMD can and would be every bit as underhanded as Intel is accused of being if the opportunity presented itself.

Remember we have yet to see Bapco's response to these accusations, and it is quite possible they have a perfectly plausible reason for the changes they've made, and solid research into end-user application usage to back up those changes.
Admittedly the evidence seems quite substantial, but Bapco deserves the right to defend itself against its critics.
 

sprockkets

Senior member
Mar 29, 2001
448
0
0
Gee, anyone remember me pointing out the difference between 2001 and 2002 in the CPU review, where the 2.0A went from being a lamer to the best?
 

Burmese

Junior Member
Apr 9, 2002
8
0
0
The information in the .pdf speaks for itself. I'm having a hard time imagining any explanation from Bapco that would do anything but leave me ROFL. I'm guessing they will keep mum until someone bigger than Van or Dean turns up the heat; or until someone says something inaccurate enough for them to attack, and thus divert attention from the real issue. What really interests me is why this information is just now coming to light. Did AMD, by joining Bapco, get access to the internal workings of Sysmark? If so, was it under some sort of NDA which it subsequently broke? Did -all- the prior members of Bapco have a 'vested interest' in keeping mum on the workings of the benchmark? Or did some clever independent analyst figure out how to dissect Sysmark and alert AMD?

 

JustBrewIt

Junior Member
Aug 27, 2000
5
0
0
Did AMD, by joining Bapco, get access to the internal workings of Sysmark?

If you look at page 13 of the AMD PDF, it is clear that they had not yet joined BAPCo when it was written. I think what happened was that AMD figured out what was going on (not sure how), and decided to join BAPCo as a result.
 