My problem is their coolers - they're ugly compared to the reference model and always so large, while having the gall to ask for more money. Nvidia shrunk back down to 2-slot coolers with the 50 series, and yet the vast majority of AIB coolers are still oversized.
There's still the PowerColor Reaper/Sapphire Pulse. Looks fine, and the most like the reference model - https://www.powercolor.com/product-detail224.htm
The TSMC situation is especially hellish. By the time the West started to think that maybe it was not a good idea to have all the production of the chips we use in literally everything in one place, it was already too late.
This is the price that consumers pay when there is no competition between companies. Both Nvidia and TSMC have no proper competition, so they jacked up prices as much as their greed desired.
This. While NVIDIA is certainly a greedy company, TSMC isn't blameless at all. They're the ones jacking up prices across the board, because they're by far the best and most effective supplier. Their supply can't even meet the demand, so they ask for even more. There's a reason the US government invested billions in building that Micron plant in Idaho.
I'm honestly starting to wonder what's the point of many of these discussions over semiconductors and fabs.
Don't get me wrong, I'm sure they're important, since they're needed to even make this tech in the first place - but don't these cards usually cost around $300 or so to manufacture, on both companies' ends?
If AMD wanted to, they could release the 9070 XT for $449 and gain easy market share while still making OK margins. And Nvidia could drop their cards to Turing-era pricing and still make a healthy profit on their end too.
Isn't this more so on Nvidia/AMD than TSMC?
Not very knowledgeable on semiconductors/GPU manufacturing here, which is why this is a question.
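To put rough numbers on that, here's a minimal sketch of what different MSRPs would imply for gross margin under the poster's unverified ~$300 build-cost assumption (AIB and retail cuts not modeled):

```python
# Back-of-envelope check on the ~$300 build-cost claim above. All numbers are
# hypothetical, and AIB/retail cuts are ignored entirely.
def gross_margin(msrp: float, build_cost: float) -> float:
    """Gross margin as a fraction of the selling price."""
    return (msrp - build_cost) / msrp

BUILD_COST = 300.0  # the poster's rough manufacturing estimate, unverified

for msrp in (449, 549, 649):
    print(f"${msrp}: {gross_margin(msrp, BUILD_COST):.0%} gross margin")
# $449: 33% gross margin
# $549: 45% gross margin
# $649: 54% gross margin
```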
The most expensive part in a GPU is the chip. So if wafer prices go up, the cost of the chips goes up.
And that means the cost of the GPU has to go up.
Even if Nvidia maintained their margin of profit, they would still have to increase prices.
I'm aware of all that. It's the margin of profit that's the issue here. I doubt these GPUs actually cost upwards of a thousand dollars for Nvidia to make, but they still charge that much for them. Whatever TSMC charges Nvidia is probably not that crazy, considering how much cheaper a lot of other general tech products are by comparison - iPhones are still 800 bucks even despite TSMC's monopoly.
It's a bit more complicated than that. You can't look at the price of the chip AMD is paying and just assume what the margins are.
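For what it's worth, the pass-through argument above can be sketched with invented numbers: if the wafer price doubles and the margin percentage is held constant, the implied retail price still rises. Every figure here is illustrative, not a real cost:

```python
# Sketch of the wafer-cost pass-through argument; every number is invented.
def gpu_price(chip_cost: float, other_bom: float, margin: float) -> float:
    """Price needed so that `margin` (as a fraction of price) is preserved."""
    return (chip_cost + other_bom) / (1 - margin)

DIES_PER_WAFER = 200  # hypothetical good dies per wafer (ignores yield detail)
OTHER_BOM = 150.0     # hypothetical board + memory + cooler cost
MARGIN = 0.40         # hypothetical margin, held constant across both cases

for wafer_price in (10_000, 20_000):  # e.g. wafer cost doubling between nodes
    chip_cost = wafer_price / DIES_PER_WAFER
    price = gpu_price(chip_cost, OTHER_BOM, MARGIN)
    print(f"wafer ${wafer_price:,}: chip ${chip_cost:.0f} -> GPU ${price:.0f}")
# wafer $10,000: chip $50 -> GPU $333
# wafer $20,000: chip $100 -> GPU $417
```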
Wow, this puts the 9070 XT in the 4080 Super and 7900 XTX ballpark, just 10% slower than a 5080.
As a 7900 XTX owner, I'm glad to see more people getting this performance level for cheaper.
The numbers are a mix of RT and non-RT, which could mean anything really, as the 7900 XTX is below a 3070 in path tracing.
I'm just worried about the XT price now; a lot of people I know are waiting for this card.
Comparing vs the 7900 GRE suggests $550, but at the last second AMD could make it $600.
And that's not counting the big image quality gains FSR 4 will provide, if we judge by that Ratchet & Clank comparison; it looked pretty solid even in Performance mode, which was kinda unusable with FSR 3.
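For context on why Performance mode is the hard case: these are the published per-axis scale factors for FSR 2's presets (whether FSR 4 keeps the exact same presets is an assumption):

```python
# Render resolutions implied by FSR 2's preset scale factors at a 4K output.
# Performance mode gives the upscaler only a quarter of the output pixels.
MODES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

TARGET_W, TARGET_H = 3840, 2160  # 4K output
for mode, scale in MODES.items():
    w, h = round(TARGET_W / scale), round(TARGET_H / scale)
    share = (w * h) / (TARGET_W * TARGET_H)
    print(f"{mode:>17}: {w}x{h} ({share:.0%} of output pixels)")
# Performance: 1920x1080 -- 25% of the pixels; the other 75% is reconstructed.
```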
This is with RT games as well, so pure raster should be lower. Also, this comes from AMD, and they lie in their info when it comes to GPUs.
You can take just this and calculate
![]()
and compare with this
![]()
I got 18-19% for the non-XT and 37-38% for the XT, raster only.
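A sketch of the kind of calculation being described, with placeholder games and numbers standing in for AMD's actual slide data (I'm not reproducing their figures):

```python
# Average AMD's claimed per-game leads over the 7900 GRE, raster titles only.
# Games and percentages below are placeholders, NOT AMD's actual slide data.
claimed_lead = {  # game: (percent lead vs 7900 GRE, tested with RT?)
    "Game A": (42, False),
    "Game B": (35, False),
    "Game C": (51, True),
    "Game D": (33, False),
    "Game E": (60, True),
}

raster = [pct for pct, uses_rt in claimed_lead.values() if not uses_rt]
print(f"raster-only average lead: {sum(raster) / len(raster):.1f}%")
# raster-only average lead: 36.7%  (with these placeholder numbers)
```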
This is interesting:
![]()
![]()
They used some gimped GRE...
Trying to understand the image, but I don't get it. Can you explain?
Maybe he didn't notice that it's the 9070 non-XT vs the 6800 XT.
6800 XT vs 7900 GRE is 15% on TPU; 21% + 15% = 36%, so almost the same as AMD's claimed 38%, give or take different games.
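One nitpick on the arithmetic: relative-performance gaps chain multiplicatively rather than additively, though at these sizes the difference is small. A quick check, treating both quoted gaps as same-direction uplifts (an assumption about what the poster meant):

```python
# Adding the gaps gives 36%; compounding them gives ~39%.
uplift_a = 0.21  # 9070 non-XT step, as quoted above
uplift_b = 0.15  # 6800 XT vs 7900 GRE gap per TPU, as quoted above

additive = uplift_a + uplift_b
compounded = (1 + uplift_a) * (1 + uplift_b) - 1
print(f"additive: {additive:.0%}, compounded: {compounded:.0%}")
# additive: 36%, compounded: 39% -- both near AMD's claimed 38%.
```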
Why not name them the 9080 and 9070, AMD?
Because there's no competition for the 5080. But there's also Kepler saying 3.4 GHz may be possible; it seems AMD left that for user OC.
Thank Lisa for canceling the 144 CU / 550 W halo part. Worst decision of her career.
For RDNA4, wafers are not a problem, even if there were a halo part.
We don't know what happened there. That's company secrets.
The 9070 non-XT temps seem good:
![]()
Someone calculated the data using TPU/HUB:
![]()
How can they calculate that when no one knows how they even tested those games (and in what places)?
We have data from AMD, and people calculate from it. People can have fun until the reviews.
There's no "up to" (if they're not lying again).
You guys really love to be scammed by Nvidia.
That is not what I was talking about.
In 2020, AMD had a very good lineup of products: consoles, the best CPUs on the market, and competitive GPUs on price/performance.
And a ton of demand due to Covid and mining.
But AMD didn't book enough wafers for all of that, so they lost out on a ton of sales.
Not only that: they incorrectly bet that raster performance was the end-all-be-all and let features like DLSS/RT/FG be monopolized by Nvidia, and only now, with FSR 4, does it seem like AMD is finally getting serious.
If FSR4 is on par with DLSS3 and this card is priced at $600 or less, then it is indeed a winner of a card.
I'm following the developments here (a little anyway, with my very limited tech knowledge). My current rig is over 10 years old, with only a mid-cycle upgrade to a base Nvidia 1080 GPU. Any upgrade will be huge for me, but I'm not looking to break the bank, and I don't care about high-end performance. I brought up getting a new rig to a much more knowledgeable friend of mine, who suggested waiting for the new AMD models next month. My hope is that they'll be midrange cards that can compete on price and availability against Nvidia's equivalents (we'll see, I guess).
He told me that years ago AMD's drivers were kinda shit, so compatibility was a problem, but that this has apparently improved. I'm curious if anyone with experience can testify about newer AMD cards. My googling/ChatGPTing suggests AMD's drivers have gotten better for mainstream gaming and emulation.
I'm using an XTX, and the only problem I've had recently was with Delta Force, which got fixed by a new driver.
I can't imagine getting a card without DLSS and all the Nvidia features...
And I used to rotate AMD/Nvidia every generation...
Except one of Nvidia's 'features' is low VRAM.
Yep.
Drivers haven't been an issue for a very long time; you don't even have to reboot after installing them anymore.
The worst decision of her career was not booking a lot more wafers at TSMC for 2020, when they had the PS5, Series S/X, RDNA2, and Zen 3 launches.
So many good products, and so few wafers to go around.
And then the mining craze and Covid hit, making it even worse.
You act like you can just get in line at TSMC for more wafers. TSMC gives more wafers to whoever is willing to pay more. AMD was never able to pay more than the likes of Apple, Qualcomm, and Nvidia. Their wafer allocation is in line with their ability to pay.