igor_kavinski
That sweetens the deal even more!
> 5. People still complain...

> It could cure Cancer, Baldness and ED, and some would still complain.

Back in my day people didn't complain as much! Kids these days.

As far as the internet is concerned, complaining is the new porn.
> Intel is probably losing money on this product, something they can't really afford anymore.

I swear to god, man, this argument is so wrong I feel stupid even having to hear it.

> I swear to god, man, this argument is so wrong I feel stupid even having to hear it.

Correct. According to the Intel naysayers around here, anything Intel ships, they will lose money on. Well, most people will write that they will "loose" money on it, which I guess, with twice the "o's", is twice the losses.
On a plus note, I have received a ping from the retailer that my card is ready and shipped. I should have it Thursday.

(I could have gotten it earlier, but I'm in Morocco enjoying the weather and the ridiculously cheap food.)

I have seen the benchmarks, and I'm fairly satisfied with what I see.

Every reasonable person should also expect post-launch driver improvements that seasoned cards like the 4060 will not be getting. This isn't me hallucinating performance that isn't there on a "just wait and see", but small bumps up are almost a given.

I do not have a 1440p monitor, so I am getting less of the benefit, but I am very happy with a functional card for that little money.

I noticed that amazon.co.uk has already knocked £20-30 off the 4060, probably because the various vendors are worried about getting stuck with a whole bunch of very expensive bricks.

I currently own a very old, very used RX 590 FatBoy 8GB that BSODs the instant it's pushed. It's like a wall: the moment you go above what it can do, you don't see single-digit framerates, it just dies.

I really just want to play the occasional AAA title for s&giggles, and I care far more about the gameplay than the graphics. I'm the kind of guy that turns off bloom, HDR, god rays... before even starting the game.

And frankly, I'm just curious to see what an Intel GPU is like. I've had Nvidia, I've had AMD (I liked Nvidia better); now I want to try the new guys.

I just hope Stalker 2 and Witchfire work fine.
> Most critical review I've read: https://www.pcgamer.com/hardware/graphics-cards/intel-arc-b580-review/

Dom had issues too. Since the team is aware of them, they will get sorted quickly. Maybe they weren't using updated drivers, because other reviewers didn't mention CP2077 issues.
I think Intel, which of late has been the proverbial blind squirrel, has finally stumbled on a nut with Battlemage.
> When the Intel Board of Directors finds out Intel is buying expensive dies from TSMC and then selling them at a loss, they will agree - what Intel stumbled upon is nuts.

Is AMD's board going nuts as well, because they are using TSMC and "loosing" money? Or can everybody except Intel make money with TSMC?

> Is AMD's board going nuts as well, because they are using TSMC and "loosing" money? Or can everybody except Intel make money with TSMC?

Are you going to tell on them to the BOD? I think someone should, in all fairness. You know what they say: if you see something, say something.
> Intel is selling a 279 mm² N5 die for less money than AMD is selling a 204 mm² N6 die.
> Intel's BOM costs are >50% higher, while it's selling for less money.
> AMD is barely breaking even, so imagine how much Intel's graphics division is losing...

Is it possible Intel has other production efficiencies AMD does not have, or is AMD simply economically more efficient than Intel across the board?

> Is it possible Intel has other production efficiencies AMD does not have, or is AMD simply economically more efficient than Intel across the board?

It's almost like Intel is more desperate than AMD, and especially than Nvidia, and so will tolerate a different "investment" into entering the market.
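For a rough sense of scale on those die-cost claims, here is a minimal dies-per-wafer sketch in Python. The wafer prices are placeholder assumptions (they are the very figures disputed below), not confirmed numbers:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard dies-per-wafer approximation with edge loss."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Assumed wafer prices in $ (illustrative only; contested in the posts below).
wafer_price = {"N5": 15_000, "N6": 10_000}

for card, (node, area) in {"Intel B580": ("N5", 279.0),
                           "AMD RX 7600": ("N6", 204.0)}.items():
    dpw = dies_per_wafer(area)
    print(f"{card}: ~{dpw} candidate dies/wafer, "
          f"~${wafer_price[node] / dpw:.0f} per (pre-yield) die")
```

Even under these assumed prices, the per-die gap is tens of dollars, which is why the thread then argues about everything else in the BOM.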
> Intel is selling a 279 mm² N5 die for less money than AMD is selling a 204 mm² N6 die.
> Intel's BOM costs are >50% higher, while it's selling for less money.
> AMD is barely breaking even, so imagine how much Intel's graphics division is losing...

Where do you get the 50% higher from? From rough calculations about die size and N5 vs N6? The die would have to cost twice as much as AMD's to account for a 50% increase in BOM cost by itself. Also, AMD is not "barely breaking even"; if they were, then based on your calculations for the RX 7600 they'd be losing money on quite a few of their products.

For example, using your die-size analogy, the RX 7700 XT has almost 2x the silicon at 350 mm², with 200 mm² of that on N5, and the entire architecture uses special packaging techniques to get the chips to work together. Yet even with a bigger cooler/heatsink/shroud, 4 GB more RAM, and more fans, the card sells at $400 compared to the ~$270 of the RX 7600. If the RX 7600 were barely at break-even (costing ~$250 to produce and sell), the RX 7700 XT would be losing AMD $75-100 on each card sold.

The BOM and profitability of these products come down to a lot more than die size and node. Sure, Intel might actually be losing money on these products even excluding R&D, but it's not as huge as you claim it might be. Also, losing a bit of money to push out another generation to iterate from is just part of the game sometimes, especially when you are on generation 2 versus generation 2X from the competition. They gain tons of data and experience, and shipping a product gives them another milestone on the road to competitiveness.

> It's even worse than it looks based on size - the N4 wafers that Intel uses are priced at $20k whereas N7/N6 are $9,500 (Tom's Hardware estimates), plus yields will be different.

You are making so many assumptions and mistakes in this statement that it's hilarious. Arc Battlemage uses 5 nm, not 4 nm, and you quote N7/N6 at $9,500 when N7 itself is already listed at $10k, with the RX 7600's N6 more expensive than that; at the same time you pull the $20k N4 price out of thin air. You also literally jumped two nodes with your assumptions, from (N6 vs N5) to (N7 vs N4) - a huge difference. And using your own quoted Tom's Hardware estimates, TSMC N5/N4 is put at close to $15,000 in an article from October 10th of this year, and $16k in 2022 - so where did you get $20k from?
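The RX 7700 XT comparison a couple of paragraphs up is just ratio arithmetic; here is a minimal sketch of it in Python, where the ~$250 cost and the ~2x multiplier are the post's own assumed figures, not official BOM data:

```python
# Figures from the post above - forum estimates, not official BOM data.
rx7600_cost = 250         # assumed "barely break-even" cost for the ~$270 RX 7600
rx7700xt_price = 400

# "Almost 2x the silicon" plus extra RAM, cooler, and packaging:
for multiplier in (1.9, 2.0):
    implied_cost = rx7600_cost * multiplier
    print(f"x{multiplier}: implied RX 7700 XT cost ${implied_cost:.0f}, "
          f"implied loss ${implied_cost - rx7700xt_price:.0f} per card")
```

That is where the "$75-100 loss per card" figure comes from: if the break-even premise held, scaling cost with silicon would push the 7700 XT underwater at $400.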
> at the same time you pull $20k for N4 price out of thin air.

- Bigger die size: multiply by 1.36x
- More expensive process technology node: multiply by ~1.50x
- Lower yields: multiply cost by ~1.05x

And I get 1.36 x 1.5 x 1.05 = 2.14, or +114% higher cost just on the die alone, plus +50% on memory. Other component prices are probably comparable.

Yea, you are correct about the die cost calculation, pretty much. Still, that doesn't explain how the 7700 XT is not a loss at $400 despite having over 1.7x the die size, 200 mm² of its die on N5, and extensive use of special packaging for the MCD/GCD architecture. Clearly the die is only part of the equation, and there are a ton of other factors to take into consideration. SemiAnalysis estimates the BOM of Navi 32 - not just the die - at around 2.2x that of Navi 33. I know the 7700 XT is a cut-down version of Navi 32, so AMD is willing to take lower margins to still be able to sell some defective dies, but still: the 7700 XT isn't built purely from defective Navi 32 chips - otherwise there'd be no volume - but also from ones that simply fell slightly short of their 7800 XT targets and were binned down. Unless AMD is willing to take a huge loss on those slightly binned Navi 32 dies, the 7700 XT is still making a profit at $400 (1.48x the RX 7600's price) despite costing 2.2x more to produce. If AMD can still make a profit on the RX 7700 XT like this, I'm pretty sure Intel isn't losing boatloads of money on a product that is only 1.3x more expensive at most.
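For what it's worth, the multiplier arithmetic in that breakdown does come out as claimed; a minimal sketch, using the poster's own estimated factors (the 1.50x node premium and 1.05x yield penalty are their assumptions, not published figures):

```python
die_size_ratio = 1.36    # 279 mm² vs 204 mm²
node_price_ratio = 1.50  # assumed N5 vs N6 wafer price premium
yield_penalty = 1.05     # assumed extra cost from lower yields on the larger die

die_cost_ratio = die_size_ratio * node_price_ratio * yield_penalty
print(f"Die cost ratio: {die_cost_ratio:.2f}x "
      f"(+{(die_cost_ratio - 1) * 100:.0f}% on the die alone)")
```

This prints 2.14x, or +114%, so the disagreement in the thread is really about the input factors, not the multiplication.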
> N6 is a continuation of N7 - they stopped making N7, and it's now N6. Same thing goes for N5-N4.

What are you talking about?

> 10 July 2024 - "Negotiations with AI and HPC customers, such as Nvidia, suggest these clients can tolerate approximately 10% price hikes for 4nm-class wafers, from around $18,000 per wafer to around $20,000 per wafer. As a result, the 4nm and 5nm nodes, primarily used by companies like AMD and Nvidia, are expected to see an 11% blended average selling price (ASP) hike." Source: https://www.tomshardware.com/tech-industry/tsmc-may-increase-wafer-pricing-by-10-for-2025-report

Did you even read what they said? It says they can theoretically tolerate 10% price hikes on 4 nm wafers, from $18,000 (the current price) to around $20k (a theoretical price). NOWHERE does it say that the price of 4 nm is $20k. Also, like I said, what are you talking about with the N7=N6 and N5=N4 BS? They are not the same at all; it's like saying that since TSMC developed N3X, they discontinued N3, N3E, and N3P, and all wafers are now N3X.
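As a small sanity check on that exchange, the $18,000-to-$20,000 step in the quoted report is roughly an 11% move, which is exactly the "blended ASP hike" figure it mentions; a one-liner, assuming only those two quoted numbers:

```python
current_price, negotiated_price = 18_000, 20_000  # $/wafer, from the quoted report
hike = (negotiated_price - current_price) / current_price
print(f"Implied hike if prices actually reached $20k: {hike:.1%}")  # ~11.1%
```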