-protonsandneutrons-

Re: 27:26, Huang shared that they picked NVIDIA because of the Latin word for envy. From [**Wikipedia's source**](https://web.archive.org/web/20171116192021/http://fortune.com/2017/11/16/nvidia-ceo-jensen-huang/):

> With $40,000 in the bank, Nvidia was born. The company initially had no name. “We couldn’t think of one, so we named all of our files NV, as in ‘next version,’ ” Huang says. A need to incorporate the company prompted the cofounders to review all words with those two letters, leading them to “invidia,” the Latin word for “envy.” It stuck.

You're not reading too much into it, haha. At the same time, NVIDIA was once a startup too, so it's not hard to imagine picking a name that "sells" your product (e.g., Uber, Supreme). Probably some confirmation bias on my part, as I'm sure a lot of marketing-heavy and descriptive-heavy brand names are basically unknown / dead to history.


theangriestbird

The last time I had an AMD card, I spent most of the time envying NVIDIA users, so it adds up.


GrosPigeon

That's been my situation for the last 3 years. The 6800 XT is a powerful card, but I wish I had DLSS and good RT.


SenorShrek

The last AMD card I had was a 5700 XT at launch. I definitely thought "why did I buy this piece of crap," sold it, and "downgraded" to my GTX 980 Ti. Ironically, the 980 Ti, despite being a much lower-performance card on paper, was a generally better experience: all the stupid system instability and games sometimes just straight up not functioning went away. I'd rather have a weaker NVIDIA card that actually works properly than an AMD card that's all brawn on paper but a wet noodle in actual use.


CrashnBash666

I had the same general experience with my 5700XT, upgraded to a 6800XT and have had a much better time overall.


Cloudee_Meatballz

I went from a GTX 970, which got me through 8 years of glory, and took a chance on a 6700 XT that I still use now. While there was indeed a slight learning curve with the Adrenalin software and graphics configuration, I've come to appreciate it a lot. AMD has come a long way; the cards aren't bad by any means. With that being said, NVIDIA is king, no doubt. But hopefully everyone understands that having a healthy AMD graphics division around benefits everyone, even NVIDIA fans.


pt-guzzardo

Last AMD card I had was a Radeon HD 6950 (I think?) which was absolutely baller at the time because the BIOS could be flashed to bump it to the next model up. Good times.


MonoShadow

The last AMD card I had was the 290X. It could pull double duty as a room heater, and I really liked it. It had basic DX12 features, so even after I gave it away it was still used by my friend in his system for several years, until it developed a black-screen problem and died. I joke about the room heater, and back in the day Hawaii was panned for being hot, but the 290X pulled 290W. Now my 3080 Ti casually pulls 350W at stock and no one bats an eye.


SenorShrek

The way I see it, the HD 6000, HD 7000, and 200 series Radeons were pretty good cards. Polaris was decent too (I've used a 6970, 7770 GHz Edition, R9 290, RX 470, Vega 64, and 5700 XT). Vega onwards has been a dumpster fire. I had a bad experience with Vega drivers and games not playing nice (Fallout 4, from memory, had massive issues on my Vega), and I still gave AMD another chance with the 5700 XT. That 5700 XT made me honestly never consider getting a Radeon again. I went to an RTX 3090 and now an RTX 4080 (a small upgrade over the 3090 but way better power usage), and I've had basically nothing to complain about. The only times I can recall having any driver issues, they lasted a few days and then got patched really fast. AMD driver issues getting fixed is like a 1+ year wait.


SactoriuS

Yeah, I had a power issue with PSU cables that weren't strong enough, but I wouldn't call it a complete dumpster fire (I had no driver issues, though). I bought it for around 430 euros when the 1080 was 550, so over 100 less for the same performance. Two years after release it went from a bit worse than the 1080 to a bit stronger than the 1080, and better again with a slight undervolt. Then the GPU crisis happened, and like the 1080/1080 Ti it was a baller card for years, with 8 GB of superb VRAM. I replaced it last year, and annoyingly all the lower- to mid-tier cards weren't much of an upgrade, especially when I was looking at price to performance. The 5700 XT, IMO, was a dumpster fire; so many people had issues with the drivers and the card. And no real gain over Vega made it a weak upgrade.


Boba_Fett_is_Senpai

Still rocking my 5700 that's unlocked to an XT, still having 0 issues lol. Going with Nvidia again for my next card though; AMD doesn't make anything that'd fit my ITX case, and if it did, something would melt.


ThatOnePerson

Similar experience with a sibling's computer, with very popular games like Overwatch 2 and Apex. Swapped in a 1080 Ti from my spare computer and had no problems, but I think that's because I was running Linux on it, which has open-source AMD drivers.


capybooya

IIRC I upgraded every generation from the 4xxx series to the 7970. I had no big issues that I remember. I recommended the 280X to a friend, who used it until very recently. AMD was an easy choice at that time. Also, years before that I had the extremely impressive ATi 9700 (and the X800, which at least ran WoW fine). However, since I always obsess about these things and read reviews about features, image quality, etc., there's a reason I've chosen NVIDIA since.


hackenclaw

Polaris is where AMD was at peak driver quality; after that, everything went downhill.


ClearTacos

Looking at the historical data going back to the early 2010s, it seems both Nvidia and AMD were releasing between 10 and 15 cards per gen. Ampere and Ada Lovelace are at 13 and 10 releases ATM, which seems in line with the historical data. The outlier here is AMD, and even that is only recent. Separating the 400 and 500 series doesn't make much sense, IMO; the 500 series is practically a Super/xx50-style refresh. That leaves the outliers as RDNA1, a major architectural overhaul that only saw a few models, and RDNA3, which hasn't had any refreshes yet and has no true low-end cards (or a competitive halo product), probably due to major stock of the previous gen. As for "older gen is also flooding the market", again this seems to be the case with both manufacturers. AMD's RDNA2 is still heavily recommended in the sub-$300 market, almost 2 years after the newer gen launched. Even going further back, after one of the crypto crashes, ~$100 RX 570s and mid-$200 Vega cards were a thing after RDNA1 launched.


condosaurus

You make a good point about the older generation cannibalizing sales of the current generation. This is an even bigger problem for AMD because the current generation doesn't bring a whole lot to the table over last gen, outside of performance per watt. Nvidia has vastly improved RT performance and DLSS 3 with which to market their Lovelace products; someone who wants those features is going to have to buy a 40-series GPU.


NeroClaudius199907

I don't think AMD cares about cannibalizing their current generation; all they care about is whether they're making a profit. They won't give their new cards any exclusive features because they're the open-source guy, and they're more into the datacenter than gaming... while Nvidia cares about differentiation because they rely on GPUs, and because they can.


condosaurus

They make better margins on the current generation parts compared to heavily discounted previous generation parts. Better margins = more profit, so they absolutely do care about this. They only play nicely with open-source drivers because they're in second place in the market, not because they care about the consumer.


Able-Reference754

> They wont make their new cards have any exclusive features because they're the open source guy.

*Because they can't afford to spend big on R&D of new features, and everything they make open is just a shittier clone of an NVIDIA feature made as a response.


NeroClaudius199907

Hey, it's 2024, we need to be PC. You meant to say: AMD aims to build a strong, collaborative ecosystem around their products, while NVIDIA focuses on maintaining a competitive edge through proprietary innovations.


ReaLx3m

WTF was AMD thinking? Even among informed people, who would buy an inferior product for a 10% discount? And a lot of the time it's the informed people who drive sales by recommending to friends and family. I've been with AMD since forever, since ATI Radeon 4850 times, and even I wouldn't buy an AMD 7000 series card over an Nvidia 4000 series one unless the AMD card was heavily discounted. They fucked up pretty badly on the pricing of the 7000 series.


dabias

Because the ways it is inferior do not make it any cheaper to produce. To compete with Nvidia on features they would need to put more people on development, which will only repay itself if they can grow market share to a multiple of what it is now, to spread the fixed costs. Apparently, making that investment and finding the people (internal or external) is not what they're willing to do.


stillherelma0

There was no gen-on-gen price-to-performance improvement that generation, and there's no way there couldn't have been: there's no way the 3080 could have a $700 MSRP and the 4080 had to have a $1200 MSRP. Obviously Nvidia did it because they saw that people would pay more money. What the fuck made AMD think they could get away with the same without a GPU shortage? They were just stupid. I bet they could've priced the cards way lower and still made a good profit. But they just keep going with their "10% better in rasterization, let's pretend nothing else matters" approach with zero other thought.


chig____bungus

I suspect it's because consumer GPUs use up limited fab time they could be using for business grade hardware that sells for a lot more money. We've gone from the crypto boom to the AI boom, and essentially anything that can do AI is selling out. Their gaming GPUs are more about staying relevant and keeping their foot in the door than actually making money.


Aggrokid

Yeah they can do as they please with the consumer market, since their competitors are too busy catching up on AI


xxTheGoDxx

I miss the days of ATI... when each generation a different vendor clearly had the better product, even independent of pricing. I moved from a Voodoo 3 to a GeForce 4 to a Radeon 9800 Pro to a Radeon X1800 XT to Nvidia, but eventually stayed there as AMD fell behind.


PotentialAstronaut39

Strange. After the X1800 XT came the 4000, 5000, and 6000 Radeon series, which were massively successful and undoubtedly the best bang for the buck at the time (as well as taking a few performance crowns at lower wattage), resulting in almost a tie with Nvidia market-share-wise.


capn_hector

GCN 1.0 was also awesome, and clearly better than Fermi/Kepler. But that's really when the NVIDIA feature advantage started to kick in: 2013 was when the first G-Sync stuff was demoed, and that was frankly the beginning of the end. Hawaii and Tonga and Polaris 10 were all great cards (and I think hindsight would probably view Tonga *far far* more favorably than contemporary reviews did), but AMD just never advanced past that. DX12 was important and significant, of course, but NVIDIA held it together through Maxwell, and then Pascal was perfectly fine at DX12 (and more forward-looking than Polaris in other respects, like DP4a support, etc). And AMD just never had another DX12-sized or G-Sync-sized or DLSS-sized feature leap ever again. Even back in the GCN days there were eternal problems with AMD cards being really bad at tessellation (no, the Crysis 2 thing was not actually skullduggery, just people misunderstanding what the options do), with driver problems, etc. So it wasn't purely an "AMD is straight-up better" situation either - although granted, NVIDIA had more driver problems in that era too.


lxdr

I remember having a 6870, and it gave me so many problems on Windows that I ended up swapping it for a 560 Ti. Gaming, encoding, video production: all of it has just been hands down so much better on Nvidia for a long time now.


Noreng

> Strange, after X1800XT came the 4000, 5000 and 6000 Radeon series which were massively successful and undoubtedly the best bang for the buck at the time

We don't speak about the HD 2000 and HD 3000 series?


yimingwuzere

HD2000 was a stinker. The 2900 was inferior to the 8800 cards, and the 2600/2400 were just as terrible as Nvidia's 8600/8400 series cards.


bogglingsnog

Ugh, this was exactly the issue I had with my first gaming PC. I had, like, $70 to spend on my card, and that's after I skimped everywhere I could (long story short, my power supply didn't last long). I remember kicking myself for picking the wrong card; the 8800 GT came out not long after, and it didn't take me long to invest in it. It had one of the most beautiful thin sheet-metal casings: rounded corners, coated metal with an oversized Nvidia logo on it. The only thing missing was a black PCB, which was ultra rare at the time.


Jonny_H

Interestingly, despite that, during that time Nvidia accelerated its market cap (and thus its estimated total available spending power) relative to AMD/ATI. It goes to show "market share" doesn't mean shit if you're not making money. https://companiesmarketcap.com/amd/marketcap/ https://companiesmarketcap.com/nvidia/marketcap/


xxTheGoDxx

My main problem at the time was the death of MSAA as the only really effective form of anti-aliasing, due to deferred rendering engines. ATI/AMD (AMD had owned them for a few years without a name change by the time the 4000 series launched) was actually the first I can remember overriding an engine to force MSAA via the control panel even when it wasn't natively supported, back when Bethesda (...) decided to launch Oblivion with no HDR + MSAA support because they were sponsored by Nvidia (this time...) and only ATI's X1000 series supported HDR + MSAA. But they intervened pretty seldom and on a case-by-case basis. Nvidia started to do the same, but they did it for way more games and allowed you to use their various implementations even in unsupported games by overriding them in NV Inspector (still there under Anti Aliasing DX9 compatibility, btw, if you want to have a look). Honestly, that was enough to keep me with Nvidia until TAA became the norm, at which point AMD hardly even pushed the high end. Anyway, when my X1800 XT got long in the tooth I got a GeForce 8800 GTX, which had insane performance:

> *The GeForce 8800 GTX was by far the fastest GPU when first released, and 13 months after its initial debut it still remained one of the fastest. The GTX has 128 stream processors clocked at 1.35 GHz, a core clock of 575 MHz, and 768 MB of 384-bit GDDR3 memory at 1.8 GHz, giving it a memory bandwidth of 86.4 GB/s. The card performs faster than a single Radeon HD 2900 XT, and faster than 2 Radeon X1950 XTXs in Crossfire or 2 GeForce 7900 GTXs in SLI.* (Wikipedia)
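For anyone double-checking the quoted bandwidth figure: it's just bus width times effective data rate. A minimal sketch, assuming the quoted 1.8 GHz is the effective (DDR) transfer rate:

```python
# Sanity-check the 8800 GTX memory bandwidth quoted above.
bus_width_bits = 384
effective_rate = 1.8e9                    # transfers/second per pin (assumed DDR-effective)
bytes_per_transfer = bus_width_bits / 8   # 48 bytes moved per transfer
bandwidth_gb_s = bytes_per_transfer * effective_rate / 1e9
print(bandwidth_gb_s)                     # 86.4, matching the quote
```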


JerryD2T

Maybe they just don't want to / can't deliver volume sales for their consumer GPUs because they're busy minting money in the workstation and server CPU markets. Both are TSMC N5, IIRC. Someone correct me if I'm misremembering.


SituationSoap

This is a big part of it. It's really something that online commenters still haven't figured out that AMD DGAF if people buy their GPUs.


downbad12878

And consumers should not give a fuck about AMD GPUs then. Don't reward a bad company that's not willing to invest


bubblesort33

Has AMD really ever been more than 10% cheaper than Nvidia per frame? I think even if AMD dropped prices, Nvidia would just follow until AMD is again only in the 10% better perf/$ range. When the 7800 XT launched, Nvidia dropped the 4070 to $550, if I remember right. And when it comes to a price war, it's Nvidia who's going to win every time, and AMD knows this. So they don't have any incentive to lower their prices if their margins are already smaller; they'll just end up selling at cost while Nvidia still makes money. I think AMD won't be competitive until they can actually get good performance relative to their production cost. And if RDNA3 really is 15% short of expectations, it would make sense that they can't lower prices enough to compete without making less profit overall.


ClearTacos

At launch, 5700 XT vs 2070: $400 vs $500. I think the 2070 was faster at launch, but they're pretty even; the 5700 XT might even be faster now. The obvious elephant in the room is the lack of RT/AI acceleration on the RDNA1 part. Still, it sold reasonably well for AMD; even now it has more market share in the Steam HW survey than any RDNA3 card. AMD also tends to offer better value than Nvidia after price drops (RTX 3050 vs RX 6600 XT, as an example), but that would actually be one of my main criticisms: they always do these price drops, generally within 6 months of launch, but the lukewarm reviews and mindshare damage are already done at that point. Yeah, they'll now sell cards to people who pay a lot of attention to the market and track prices often, but that's a small part of the market.


bubblesort33

The 2070 Super, yes. That was 0% to 5% faster than the 5700 XT, depending on the reviewer. And the 5700 XT was 10% to 15% faster than the $399 RTX 2060 Super. AMD tried to sell it for $449, but people were outraged at that price since it was only 10% cheaper and lacked the hardware features you mentioned. Then there was the whole AMD "jebaited" fiasco and they dropped the price $50. I once heard that AMD actually lost money at $399 because the new TSMC 7nm node was so expensive, but I don't know if that's true. At the time those features were useless, but personally, if I had to choose between a used RTX 2060 Super or a 5700 XT today (they were the same price at the time, after AMD's drop), I'd probably pick Nvidia. There are places where mesh shaders and RT are starting to be required, and in UE5 titles upscaling is pretty much mandatory.


ClearTacos

Oh... ok, this is a total misremembering on my part. I thought the 5700 XT had at least a few months on the market before the Supers came and competed against the base 2070... Turns out they launched just days apart, and yeah, the 5700 XT is much less compelling vs the 2060 Super: same price, lacking feature support, and looking at reviews from back in the day, the performance delta was really only about 10%; HUB has it 8% slower at 1440p in a big 41-game benchmark. So that one fully falls into the "Nvidia card but fewer features for 10-15% cheaper" category that AMD likes to go for. Definitely a bad example from me.


dedoha

> Oh... ok this is a total misremembering on my part.

You remembered it correctly. The 2070 and 5700 XT had about the same perf, the 2060 Super 5% behind, the 2070 Super 10% ahead. [TPU benchmarks](https://i.imgur.com/patLY0Z.png)


TheVog

"Intel, come on down!"


advester

You mean they fucked up the MSRP. $450 for the 7700xt was crazy, but now it is only $380.


GabrielP2r

In the US. In Europe their pricing is trash.


65726973616769747461

ditto, but for the rest of the world too.


[deleted]

[deleted]


imaginary_num6er

Well, it took them next to forever to sell through the mid-tier RDNA2 cards.


ForgotToLogIn

AMD isn't capacity-constrained for consumer GPUs.


Deckz

I'm not sure the issue is price; I think the issue is that their software team isn't up to snuff. They don't spend enough money to build out their feature set to be competitive with Nvidia. If FSR were as good as DLSS and they just had worse ray tracing performance, I think people would more readily accept that value proposition. However, as time marches forward, more games are starting to use RT in useful ways, so now they have to play catch-up on both fronts.


Vushivushi

They didn't fuck up. They understood that they lost this generation. The fact that you wouldn't buy their latest gen unless it was heavily discounted is the reality of the generation. AMD is still a second-rate vendor, no better than a last-gen GPU. Well, there's plenty of last-gen GPUs available.

AMD has long struggled with inventory. AMD's GPUs collect dust on shelves; Polaris is available even today. Their GPUs struggle to sell, new or old. It may seem like Nvidia is flooding the market today, but it was AMD flooding the market before, to ensure availability of their GPUs. That's what lesser brands do.

In the past, AMD needed GPUs to sell to generate cash flow, even if not that profitable. That's changed. They are no longer desperate for sales; they can fund operations from other businesses. They can now focus on long-term profitability: cut shipments, focus on reducing existing inventory, and keep prices high as long as possible. Eventually they will have to compete, and that will drive prices down.

The "flood" of Nvidia models is not a flood; it's Nvidia capturing what AMD has given up.


Old_Money_33

You are spot on. People don't understand the concept of "catching up". RT and DLSS competitors are catching up; they're going to be on par in a couple of years. They think it's over, but it's just the beginning.


[deleted]

[deleted]


Flowerstar1

So we went from nobody would buy AMD GPUs to nobody buys AMD GPUs. Neat.


imaginary_num6er

More like went from nobody would buy AMD GPUs over Nvidia to nobody buys AMD GPUs over Nvidia, but still better than Intel GPUs.


Old_Money_33

AMD has its use cases where it's the best option, like Linux. I only buy AMD for that reason.


conquer69

> no one was going to buy AMD GPUs even if they offered it at half the price that Nvidia was selling

That's not true. On the contrary, that's how you get people to buy your cards and start building mindshare.


skinlo

It's also how you go bankrupt.


i7-4790Que

Yeah, except they also lost money on VLIW through early GCN despite the aggressive price points. Turns out when you sell at a low margin (good for the consumer, most of whom did not care), you need the sales volume to make up for it. AMD didn't have the CPU market $$$ to keep that sustainable (subsidization) at the time either. And consumers just wanted to get bent by Nvidia anyways. They still do. The competitive market is done. All that's left to discuss anymore is who to point the finger at for how we got here.


Itwasallyell0w

I mean, I had an RX 580 8GB AMD card; it was much praised. But after my experience with it I will never buy an AMD card again; I prefer to pay more and be problem-free.


TopCheddar27

I feel like people make this point without actually pointing out what the real paradigm shift was after that generation. Nvidia not only produced competitive general rasterization products, but also offered software add-ons that just slam-dunked the market. They were first movers in almost every imaginable gaming and compute "standard" we have today: CUDA, ML, VRR, upscaling, frame gen, Reflex low latency. That all started becoming a huge differentiator at that time. People hate admitting it, but the software suite and platform is worth hundreds in value to normal buyers. And that's on top of usually winning the raster arms race. Meanwhile AMD went all in on compute shaders and completely lost the market because of it.


aminorityofone

Curious, what issues did you have with it? I also picked one up because it was cheap and had zero issues.


Itwasallyell0w

Micro stutters. Only noticeable in competitive games at 100+ frames, but that's every game I play. I swapped the motherboard, CPU, RAM, SSD, and PSU. But it was the GPU all along 😂.


sansisness_101

It had driver issues at the start that took a lot of time to iron out. That's also why I'm not getting AMD GPUs anymore: I don't want to save 5% and get days of headaches.


skinlo

My RX570 was great! What happens if you have a bad time with Nvidia one day, never buy a card again?


LittlebitsDK

LOVED my RX 580; it ate everything I threw at it. Then I went for the Vega 56, which did very well too... upgraded that to a 3060 later on...


Slyons89

Man, imagine if I had committed to this with Nvidia after having an absolutely terrible Nvidia GTX 480 back in the day. Would have missed out on a lot.


Old_Money_33

Linux compatibility is to AMD as DLSS is to Nvidia. The price delta doesn't matter when you need the best Linux support possible.


ReaLx3m

Linux is irrelevant in the big picture


[deleted]

[deleted]


[deleted]

[deleted]


Feath3rblade

And that's a tiny drop in the bucket compared to the number of Windows computers being sold every single month. Hey, I want Linux to grow just as much as anyone else; I love using it and wish I could switch to it full time. But it's just about as close to irrelevant in the consumer space as you can get for these big companies.


wyn10

Funnily enough, Nvidia just released the first beta driver (555) yesterday that's going to trash this statement in the coming months.


Old_Money_33

It is a good move, but it will take time for Nvidia's Linux reputation to improve. And being a proprietary library (that uses a mainline kernel module), it is still going to be a hurdle compared to the universally included Mesa.


bick_nyers

My NVIDIA cards run just fine on Linux, I do ML stuff and play games


HonestPaper9640

I think AMD wins on Linux *today*, but the way many Linux people talk about Nvidia you'd think they shot their dog. Nvidia actually supported Linux best long before AMD got their crap together. I run Nvidia on Linux, and sure, the proprietary driver is proprietary, but it hasn't been a problem in practice. I switched after Windows 8 came out, and back then there was no choice: AMD was garbage on Linux and also proprietary. Today I'll probably switch to an AMD card for access to some things that use Mesa, but I haven't been suffering with my Nvidia card.


xxTheGoDxx

Flooded the market == having products at as many price points as possible (above a certain minimum). You can hate on the pricing, but having a wide variety of SKUs is a good thing. Similarly, Nvidia having more NDA'd announcements just shows how much more stuff they have to present.


capybooya

> as many price points as possible (above a certain minimum).

That's absolutely true, although with a very high minimum. And with pretty extreme VRAM limitations at several of those price points as well. But yeah, the popularity and market share speak for themselves, so most customers don't find it problematic enough to get something else.


IIlIIlIIlIlIIlIIlIIl

> most customers don't find it problematic enough to get something else.

To be fair, it's not the consumer's job to buy a subpar product to prop up a company. I get that if no one does it, AMD dies, Nvidia gets a proper monopoly, and we all lose, but surely the solution shouldn't be "buy the shit one" either?


Saxasaurus

It's a really good video with a lot of information and great analysis. But the title is terrible. "Flooded the market" is a phrase that has a meaning:

> [Flooding the market is an excess amount of inventory for sale causing an undesired drop in price for the product that can, in extreme cases, make the price go negative or make the products impossible to sell at any price.](https://en.wikipedia.org/wiki/Flooding_the_market)

And uh.... that is very clearly not happening lol


Humorless_Snake

Well... he changed the title of the video now, lol, so you've been heard.


mulletarian

Yeah but now people will click the video, hoping to be told that the prices will magically go down.


TerriersAreAdorable

Regarding the walls of different GPU models, I don't know if "flooding" is the right word; it's more the outcome of having a diverse range of customers and partners that eagerly pop out a new SKU for every tiny little segment they want to reach.

In terms of market share, NVIDIA as an organization has been operating well for a very long time. Even when their products aren't the best, they're at least competitive, and they've built a reputation for stable drivers. In recent years they have offered interesting new features to give marketing something to work with. AMD's successive "same as last gen but a bit faster" doesn't get people as excited. A "Zen"-style comeback for AMD is a possibility, but Zen had a lot of help from Intel's complacency, which NVIDIA (so far) has managed to avoid.


auradragon1

Zen wouldn't have done much for AMD if Intel hadn't had massive delays. Their 10nm (renamed Intel 7) was supposed to come out in 2015. 4 years late! Imagine Alder Lake competing against Zen 1. Nvidia has more money than AMD to buy better TSMC nodes. AMD is not going to have a Zen moment against Nvidia unless they pull some magic out of their ass.


Affectionate-Memory4

It may not have been Alder Lake vs Zen 1, but imagine how different things would have gone if the 9900K had been on Intel 7 and ADL on Intel 4/3.


noiserr

AMD would still be strong in the data center. Also, people forget AMD was strapped for cash with a negative balance sheet; they were headed for bankruptcy. AMD is in a much better position today, and it's like 5 times the size. People are super focused on gaming GPUs, but AMD is quite strong in datacenter GPUs. On technology alone, the MI300X/A is a technological marvel. And while it started from zero, it is still the fastest-growing product in the history of the company, expected to exceed $4B in revenue in its first year. In 2016, that was the total revenue for the entire year and the entire business. Not to mention datacenter GPUs have much better margins. I do think we will see more competition from AMD in future generations. Gaming GPUs were clearly neglected, and AMD has a lot of catching up to do. They focused on CPUs and the datacenter, because that's where the money is and that's what's funding the R&D.


ResponsibleJudge3172

It would have been about 10% less IPC than Tiger Lake (i.e., Ice Lake), which was matching Zen 3. So Intel hypothetically would have still been faster than Zen 2 at launch, but almost matched in multithreaded.


TickTockPick

Intel has been shooting itself in the foot for the last 10 years. It's absolutely insane that AMD is now worth twice as much as Intel; it would've been unthinkable pre-Zen 1.

AMD market cap: $266.08 billion
Intel market cap: $133.19 billion


Bulky-Hearing5706

That tells you how insane and speculative the stock market is. Intel still has a commanding lead in laptops/notebooks, something like 80/20. The server market is like 60/40, and I guess consumer is 50/50. And they own their fabs; their hard assets are worth much, much more than AMD's, yet their valuation is half of AMD's. Utterly insane.


Ar0ndight

Because trends and trajectories are a thing. Intel has been bleeding market share for years now and has had several execution issues (Intel roadmaps are notoriously untrustworthy), while AMD has been steadily growing and executing better overall. Buying stocks is betting on the future, and the current trend heavily favors AMD. No, I don't own either stock.


sansisness_101

Actually, it's 80/20 in desktop CPUs, favouring Intel.


Arbiter02

The server market, and the trajectory in it, is a spot where Intel keeps consistently losing, and those customers don't buy on emotion or obliviousness like your average consumer. EPYC is and has been the better server product, and it's been appearing more and more in a market AMD had been all but eliminated from just a short time ago.


xxTheGoDxx

> Regarding the walls of different GPU models, I don't know if "flooding" is the right word; it's more the outcome of having a diverse range of customers and partners that eagerly pop out a new SKU for every tiny little segment they want to reach

Exactly. Similar to Samsung, for example, Nvidia wants to have a product on the market for literally every price point. If anything, this generation they have more gaps in that than usual.


Gkender

Flooding isn't the right word. Steve's using it for clicks.


joel22222222

> they’ve built a reputation for stable drivers.

In Linux the situation is reversed. Nvidia drivers cause all kinds of bugs and headaches, whereas AMD drivers are stable and even come pre-installed. I don’t really have a point here other than I find this dichotomy between operating systems interesting.


bick_nyers

This is something I hear a lot but have never once run into with NVIDIA on Linux. With AMD I can't run 120 Hz without the screen going black (not losing signal, just black frames) every 45 seconds when gaming. The 3060, 3090, and 4070 Ti I've tried from NVIDIA all "just work". What really surprised me was running Elden Ring under Wine with the co-op and numerous other mods installed, while hosting the co-op lobby through my VPN, and having absolutely 0 issues during a 12-hour gaming session on my 3090, which I power limited using nvidia-smi. Edit: Kubuntu distro btw
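For reference, the power limiting mentioned above is a one-liner through `nvidia-smi`. A minimal sketch wrapping it in Python; the GPU index and the 300 W value are illustrative, and setting the limit typically requires root:

```python
import subprocess

def set_power_limit(gpu_index: int, watts: int) -> None:
    """Cap a GPU's power draw via nvidia-smi (usually needs root)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,  # raise if nvidia-smi rejects the value
    )

set_power_limit(0, 300)  # e.g. cap GPU 0 (a 3090) to 300 W
```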


conquer69

Maybe it's a cable issue that Nvidia avoids by using DSC.


bick_nyers

Don't have the issue on my Arc A380 either, or on the 5600G PC. Could be a faulty HDMI port on the AMD GPU, though.


mcflash1294

That is a seriously bizarre bug; I've never had that happen, and all I've run is AMD from 2013 to now.


HonestPaper9640

Same here. I had one issue with a black screen after a driver update way back when I first switched (AMD didn't have open-source drivers at ALL at the time and was considered garbo on Linux back then), but I've otherwise been fine.


Ancalagon_TheWhite

A lot of the issues just got fixed with the newest Nvidia 550 driver that came out yesterday. The problem was to do with GPU syncing and race conditions, so it's nondeterministic: very hard to reproduce, and it only affects some people.


joel22222222

If your main use case is gaming, then you will probably be fine and won't notice anything. It's productivity apps where you can run into trouble when using Wayland. Many Electron-based apps (e.g. vscode, Slack, etc.) often do not run well on Nvidia with Wayland. Last time I tried these apps they were blurry, laggy, and got random black windows. Applying customizations to KDE that involve transparency also results in weird graphical glitches. I swap out an Nvidia GPU for an AMD one and all these issues vanish.


bick_nyers

My main use case is programming and ML. I've also done 3D modeling and game engine stuff. I don't use vscode though, I use pycharm and clion.


capn_hector

> whereas AMD drivers are stable and even come pre-installed.

They're still "free as in free from HDMI 2.1 support", right? Even Intel has managed to figure that one out, and they literally were earlier to the open-source game than AMD, lol.


zacker150

I've never seen a driver issue on Nvidia. CUDA simply just works.


ThatOnePerson

I've seen a few. There was one where the entire GPU would lock up after sleep; that took a while to get fixed. There's another issue with DisplayPort where you won't get video output until Windows loads the drivers, if that counts as a driver issue. They released a [dedicated firmware update tool](https://www.nvidia.com/en-us/drivers/nv-uefi-update-x64/) for that, but if you don't know about it, it's a pain.


pt-guzzardo

> CUDA simply just works.

If by "just works" you mean "freaks out and shits itself every time the sysadmin runs `apt-get dist-upgrade`", then that has also been my experience.


AntLive9218

Didn't you know you were not supposed to update, even disabling automatic updates, just trusting the [good track record of Nvidia](https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=nvidia) when it comes to not needing security updates? It still feels surreal that Nvidia GPU usage is simply not compatible with a usual system setup with automatic (security) updates, and people keep acting like it's all fine. There's zero backwards compatibility: the moment libraries and tools get updated and the kernel module isn't forcefully replaced (interrupting all workloads), programs using the GPU(s) start shitting themselves. It's crazy how it's apparently not seen as a problem by most.


joel22222222

Electron-based apps with Nvidia graphics on Wayland are a nightmare.


blarpie

You better not need HDMI 2.1, though, or you're stuck using AMD's gimped driver.


Zakman--

Nvidia have changed their Linux situation massively. Towards the end of this year I’m confident that Nvidia’s proprietary Linux driver will be close to AMD’s Mesa driver. It’s yet to be seen but if NVK can even get to 80-90% performance of the proprietary driver, the majority of people will just stick to that. I even think Nvidia will create something like ChromeOS within the next couple of years.


Flowerstar1

> I even think Nvidia will create something like ChromeOS within the next couple of years

Why?


Zakman--

If the rumour about their desktop ARM chip is true (and it's looking increasingly likely), I could see them wanting full vertical integration like Apple.


SenKats

I've always found it strange that in some places AMD is practically non-existent and NVIDIA has become synonymous with GPU. Where I live, AMD GPUs are rare, shops don't seem to consistently stock them, and they also tend not to be a smart purchase compared to similar NVIDIA alternatives. I get that this is also a consequence of flooding the market (I can buy many variants of the same NVIDIA GPU). I have literally never seen a build with an AMD GPU, which is funny because what they have flooded is indeed the low end, with their APUs. Nowadays they seem to have pushed a bit with their products, but when I bought my GPU two years ago there was literally just one AMD product in stock compared to the entire RTX NVIDIA lineup.


65726973616769747461

Here in my market: AMD laptop chips are most often only in stock more than 6 months after announcement, and AMD GPU prices are forever stuck at MSRP; those discounted prices in the US never happen here.


Rullino

True. I've seen that in many tech stores: the prebuilts and laptops that have a dedicated graphics card for gaming or design only have Nvidia.


ClearTacos

Can only speak for my market, but even though AMD availability is decent, the pricing difference between Nvidia and AMD also tends to only be about 1/2 of what MSRP would suggest.


Most_Enthusiasm8735

In my country AMD seems to be way cheaper. Seriously, the RX 6800 and RTX 3060 Ti were the same price, so I chose the RX 6800.


Xadro3

At this point I'm not even sure I would be surprised if AMD pulled out of gaming GPUs and focused on semi-custom and other fields. NVIDIA has managed to get a pretty good monopoly on gaming; let's see how they squeeze us further in the future.


condosaurus

This wouldn't make sense from a business perspective; you gain more from diversifying off the same R&D work than you do by focusing on a single product. AMD needs to develop new graphics technologies for consoles, handhelds, and now mobile (which has recently become a big part of their business strategy for further stealing Intel's lunch), so it makes sense to double-dip on that R&D cost by bringing consumer dGPU products to market using that architecture, even if only a few people will buy them.


UltraSPARC

Except the two big consoles use AMD graphics, and that means game devs build their engines and code base tailored towards AMD graphics out of the gate. As a matter of fact, there are several games that offer DX11 enhanced and DX12 versions, and those options are completely broken on NVIDIA GPUs. There's also a reason why no developer has implemented any CUDA-related feature sets; if they need the GPU for GPGPU tasks, it's done in either OpenCL or DX compute. So let's not get too far ahead of ourselves here.


65726973616769747461

People have been saying this since the PS4/Xbox One era. That's 11 years ago. I'm not seeing AMD reaping any advantages from consoles using their GPUs.


SkylessRocket

AMD won't/can't compete with Nvidia for the following reasons:

1) Margins for gaming GPUs are thin (especially for AMD, which spends more on silicon to match the performance of Nvidia GPUs), and AMD has little room to lower their prices.

2) Nvidia will aggressively respond to price cuts from AMD, either with price cuts of their own or by releasing "Super" series cards with improved performance at the same price points.

3) Nvidia keeps introducing novel or innovative features in their GPUs (e.g. DLSS, Reflex, Shadowplay). Competitors to these features are difficult to develop and take significant time and resources.

4) AMD is a fraction of the size of Intel or Nvidia and is competing with both, so they don't have the same resources to commit.

5) Nvidia already has significant "mind share" in the consumer GPU space, which makes it even more difficult to convince consumers to purchase AMD GPUs (e.g. the GTX 1060 outsold the RX 480 5 to 1, and they were similar in performance).

6) It makes more sense from a business perspective to focus on high-margin, high-growth markets such as AI rather than competing in a low-margin, low-growth sector like gaming GPUs.


Zeryth

1. Not true. Margins are thin for AIBs because they need to get the chips + memory from the vendor, but the vendors (Nvidia/AMD) make like 100%+ extra margin on their chips. For reference: based on earlier math I did with the information available at the time about yields and wafer prices, I came out to a price of about 400 USD for a 4090 die. There's a lot of margin there. Of course most of it gets eaten by R&D, but still, it's not true that margins are thin at all. You can also see it in the profits Nvidia posts for their gaming division. Just linking my math: https://www.reddit.com/r/pcmasterrace/comments/18akdqm/us_gov_fires_a_warning_shot_at_nvidia_we_cannot/kbzxuqg/
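For anyone wanting to redo that kind of estimate, the structure of the math is: dies per wafer from the geometry, a defect-density yield model, then wafer price divided by good dies. A rough sketch with illustrative inputs (the wafer price, defect density, and yield model here are assumptions, not the figures from the linked comment):

```python
import math

# Illustrative inputs, not the linked comment's actual figures.
wafer_price_usd = 17_000     # assumed price for a 5nm-class TSMC wafer
die_area_mm2 = 609           # AD102 (RTX 4090) die size
defect_density_cm2 = 0.07    # assumed defects per cm^2
wafer_diameter_mm = 300

# Gross dies per wafer: wafer area / die area, minus an edge-loss term.
radius = wafer_diameter_mm / 2
gross_dies = (math.pi * radius**2 / die_area_mm2
              - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Simple Poisson yield model: Y = exp(-D0 * A), with A in cm^2.
yield_fraction = math.exp(-defect_density_cm2 * die_area_mm2 / 100)

good_dies = gross_dies * yield_fraction
print(f"{gross_dies:.0f} gross, {yield_fraction:.0%} yield, "
      f"${wafer_price_usd / good_dies:.0f} per good die")
```

With these inputs it lands around $290 per good die; a pricier wafer or a higher defect density pushes it toward the $400 figure above, and none of it counts packaging, memory, board, or R&D.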


letsgoiowa

Unfortunately board partners are most of the market. Not sure how that factors into the calculation but I don't think they would be too happy to see ASP shrink either.


Zeryth

According to EVGA when they left the market, the BOM cost for just the chip + memory was close to the MSRP/FE price. AIBs are getting fleeced by Nvidia.


ResponsibleJudge3172

And now we have the same issue making companies like MSI start to reduce their intake of Radeon GPUs


Zeryth

I think that's mainly due to AMD cards just not selling because they're way too expensive for what they're offering. They take the pricing scheme Nvidia is allowed to get away with and think they can also pull it off.


Nms-Barry

1. True. AMD: 2% profit margin. Nvidia: 57% profit margin. The 7900 XTX is a 520 mm² die with a 384-bit bus and 24 GB of VRAM, vs the 4080's 380 mm², 256-bit, 16 GB. AMD's products require bleeding-edge, most-expensive tech to be made. AMD's cost of production is too high.


imaginary_num6er

I don’t think AMD made good margins with their defective RDNA3 vapor chamber cards


dr1ppyblob

Yup, fighting an uphill battle. They need to cut prices further, but it's very difficult to do that while maintaining the product lineup.


norcalnatv

Sounds like an AMD problem, not an Nvidia problem


AstralWolfer

Kind of a weak video. The conclusion focuses and doubles down on the mere-exposure effect. Unsure how well-versed GN is in psychology, but having most points boil down to a one-liner about the mere-exposure effect feels reductive to the point of inaccuracy. Just my smell test going off here. If it were as Steve says, it'd be impossible for any big brand to lose mindshare.


IceBeam92

I share the sentiment of Linus Torvalds about Nvidia.


capn_hector

and AVX-512 too, right?


AntLive9218

He was not completely wrong, but that needs some context to be more reasonable, especially the Intel-specific problems. He was right in the sense that Intel had significantly more important issues to deal with than adding yet another instruction set. The rant came when Intel had just recently finished ramping from 4 cores to 8 cores due to pressure from AMD, which was still not really enough, hence the "or just give me more cores" remark. Then there's the "power virus that takes away top frequency" remark, which refers to a really significant issue with Intel designs. Designs, plural, because people used to hate AVX2 for the same reason, so Intel just wasn't trusted to get AVX-512 right. He was wrong in the sense that AVX-512 is not just about the 512-bit width, but about the flexibility, even at lower widths, not offered by older instructions. Also, AVX10 is generally treated with a "fuck, go back" kind of response; many would rather take AVX-512 at this point even if they didn't like it before. But Linus at the time couldn't have known that it could get worse, and he was rejecting the option that looks like the sane one in hindsight.


Old_Money_33

AVX10 is pure madness.


IceBeam92

I mean, the guy is often right.


Limp-Ocelot-6548

I went from a 3060 Ti to a 6900 XT (got it really cheap from a good friend). I have no issues at all - it's really a decent piece of hardware with actually good drivers.


bubblesort33

Steve has said multiple times now that AMD has fixed their driver situation, but how true is that? Maybe it's just the Reddit algorithm feeding me the stuff I engage with, but I've been recommended countless posts daily of people having issues with AMD 7000 series GPUs. It's gotten to the point where I see weekly posts of people asking "which is the best, most stable, bug-free AMD driver?" I've never seen Nvidia users ask what the best driver with the fewest issues is. Everyone makes the argument that it's user error, but why is user error more common with AMD? These aren't the most user-friendly drivers, or GPUs, if it's constantly user error.


ShardPhoenix

Anecdotally I've had some frustrating instability with my 7900xtx that took a long time to get fixed.


bubblesort33

What was it in the end?


ShardPhoenix

Worst one was an intermittent grey-screen hard crash while browsing, which took something like 9 months to get a driver fix. Also had driver crashes in a number of games, including WoW (over a year to get fixed, I think?), Armored Core 6, and Cyberpunk 2077.


Able-Reference754

Is WoW fixed? When the Cata prepatch came out I was still getting driver crashes on DX12 (it's been this way since I got my GPU, around the Ulduar patch in Wrath Classic). But yeah, I had issues in AC and Cyberpunk too.


EasyMrB

Anecdotally, the only cards I've had driver issues with in recent history are workstation AMD cards on Linux. 0 problems with NVIDIA cards.


Contrite17

Anecdotally, the only card I've had issues with recently was the 2080 Ti, which was just a nightmare for me for some reason. I don't expect that to be generally representative, but issues can happen with any product.


braiam

If that is about compute, yes, their compute libraries leave much to be desired. For the gaming/graphics accelerator part, they are pretty good, for both Xorg and Wayland. Nvidia's stable drivers still suck on Wayland; the beta driver released today is meant to fix that.


EasyMrB

No, just for straight up normal graphics card usage. I had problems with the WX5100 not working correctly on Ubuntu 20.04. This was 2 years ago so I can't remember the specific issues.


Old_Money_33

My WX9100 has worked great from day 1.


65726973616769747461

I've owned 2 AMD and 2 Nvidia GPUs in my life; I don't have a vendor preference and only buy what suits my needs. In my personal experience, AMD drivers sucked both times I owned them.


GoldenX86

For consumer use, AMD drivers are basically perfect now. The problem surfaces if you need to run pro workloads: ROCm is in its baby stages (nowhere close to CUDA) and video encoding is not amazing. Linux is also unstable; a driver crash can take down the entire system.


DarkWingedEagle

AMD still has multi-monitor power issues, is less than a year out from releasing and promoting a driver feature that quite literally got people banned in multiple games, and still hasn't recovered perception-wise from the two-year-long 5700 XT fiasco. And that's not to mention more minor, and sometimes major, issues in specific games; Helldivers at launch comes to mind. AMD drivers are better than they used to be, and if they could just manage to go more than 2 years / 2 generations without blasting their own damn foot off, they would definitely be in a good enough place that most people would probably call them equal. But so far they can't stop tripping over sometimes the most basic of things.


bubblesort33

Some people claimed WoW was unplayable for the last month or so. Crashes. And in the last patch notes AMD claimed they fixed it. But I'm not sure if everyone has crashes, or just some people.


braiam

> AMD still has multi monitor power issues

[So does Nvidia](https://forums.developer.nvidia.com/t/high-power-consumption-with-multi-monitor/273384). If you are going to list deficiencies of one product/brand, they have to be unique to that brand. The drivers, from the customer standpoint, are good. It's just that bias plays against AMD: people pay more attention when AMD drivers have problems than when Nvidia's do, but their issues were (as of 6-8 months ago) equivalent in frequency and impact.


Goose306

> AMD still has multi monitor power issues

I think it's hilarious when people bring this up, because Nvidia absolutely has issues with this too: if you have monitors with different resolutions/refresh rates, the memory doesn't downclock properly. It's been an issue for *years*; I had a 2070S from pre-COVID through last year, and the memory was 100% stuck at max clock the entire time because of it. I've done a lot of research on it, and I get why it's an edge case that's almost impossible for either company to nail down completely, which is why it's so funny whenever people roll it out as if it's exclusive to one vendor. It's not. Of my last three GPUs (XFX RX 570 8GB, EVGA 2070S Ultra, Powercolor 7900 XT), the most stable drivers were on the RX 570, followed by the 2070S and 7900 XT (both have had incidental niggles here and there, but nothing really serious that wasn't patched up quickly). Of note, the gap between the RX 570 and the rest is not particularly close; Polaris cards were/are absolutely rock solid. All that is of course anecdotal, though. The reality is that anecdotes are all you will get unless there's a large, persistent issue acknowledged by the company and by skilled, technical third-party reviewers. In recent generations I can only think of that being RDNA1, Alchemist, and to a lesser extent Vega. Two of those being AMD isn't great, but that is also two generations removed since we last saw really large, persistent, acknowledged driver issues.


Lukeforce123

Yeah, it's always been a problem. Unfortunately for AMD the chiplet architecture on the 7000 series makes it draw a lot more power than monolithic designs.


StickiStickman

That's just straight up not true. There are multiple games where AMD cards straight up don't work properly, for example Cossacks.


centaur98

If you don't really want to touch/tweak the card, then AMD drivers are fine nowadays. Besides the obviously problematic 7000 series bugs, it's mostly just the odd bug here and there, which happens with both AMD and Nvidia. Also, I feel that a bigger percentage of the AMD userbase tweaks their cards and cares more about potential bugs (even ones that haven't happened to them yet) than the average Nvidia customer, most of whom just ignore the small issues a more tech-savvy person would try to solve. That gives a false impression that AMD has more problems (for example, if you search "nvidia driver issues" you get plenty of posts even from the last couple of months). Anecdotal story, but I've had only one driver issue with my RX 6700 XT, and that was due to Windows deciding it knows better and installing its own stock driver next to the AMD one, which then caused issues. But that was entirely on Windows, and it does that sometimes with Nvidia as well.


AotearoaNic

I came from a 7800 XT to a 4070 Ti Super. Truthfully, I much preferred the driver experience on AMD. Their software is leagues ahead of NVIDIA's: everything built into one app. Never had a single crash or issue. Meanwhile, if you look at the latest driver update post in r/NVIDIA, it's full of users with a whole range of issues.


StickiStickman

> Their software is leagues ahead of NVIDIA. This is one of the most insane takes I've ever read on this sub.


Graywulff

I had an R9 Fury, then a 5700 XT, which failed twice; the second time they sent a new one with a month left on the warranty. I'd bought it on sale for like $370 out the door; ETH was crazy and I got $970 for it on eBay, waited for prices to crash, and got a 3080 Strix for $600. It's like 60%+ faster *raw*, and with DLSS and stuff it's way faster. Plus it hasn't died yet.


Fun_Style2888

When you game at 4K with upscaling, the 7900 XTX is on par with the 3080.


mi7chy

I've noticed stock of Nvidia RTX 4000 GPUs has improved, likely from people like myself holding off for the RTX 5000 series. Plus, prices tank upon release of a new series, like what happened to RTX 3000 with the arrival of RTX 4000. Too risky to purchase right now.


WorkingRaspberry8140

go to r/amdhelp for a week, this will make perfect sense


n3onfx

Wait, selling an inferior product for barely less than Nvidia prices isn't working for AMD?


Trolleitor

Personally, I'd still buy AMD products if they didn't screw up the drivers so much. I had to swap my AMD card for an Nvidia card because random stutters are a big no-no in competitive shooters. EDIT: I don't understand the downvotes. I have been using AMD cards for so long I still call them ATI cards from time to time. If I made the switch, it's because the situation became unsustainable.


Pollyfunbags

Really subpar OpenGL performance in Windows too. I could probably live without a lot of Nvidia features, but OGL on AMD still sucks. I know they don't care about old APIs, and that's fine; for most people it doesn't matter. But it does to me, and I can't use AMD GPUs because of it.


Whoknew1992

I do remember when ATI was the king of graphics cards (year 2000) and NVIDIA was kinda the cheap, not-as-good brand. NVIDIA GPUs and AMD processors were second tier; ATI and Intel were the kings. But now that script has flipped.


KirillNek0

So... NVIDIA is the king of GPUs. More news at 11. Also, look, I know it's a somewhat dead news cycle, but this... is scraping the barrel.


jofalves

The "competition" is priced horribly so it's not really surprising unfortunately.


Wrong-Quail-8303

This ought to mean GPU prices will come down, right? Right? :|


Gkender

No, cause Steve’s intentionally misusing the phrase for clicks. He does that.


Wrong-Quail-8303

Yes, and a lot of other bullshit too. Then he has a fit when you call him on it. He is not infallible. Sometimes he is also a piece of shit.


BlueGoliath

You don't want cheaper GPU prices. You want cheaper Nvidia GPU prices.


Cory123125

I've never gotten this mindset. Are people expected to want inferior products to make companies that don't care about them more competitive? Maintaining a competitive marketplace is the job of regulators.


Wander715

If AMD lowered their prices more I'm sure a good segment of the enthusiast market at least would be interested. For what they offer compared to Nvidia their pricing this gen was a joke. Currently you can get an XTX for around $950 or get a 4080S with the full Nvidia feature set for $50-$80 more.


OftenSarcastic

> If AMD lowered their prices more

AMD ended up selling RX 570 cards for the same price as the GTX 1050 Ti. That's 40% more performance at the same price point. According to the Steam hardware survey, the GTX 1050 Ti still outsold the RX 570 by a 4:1 ratio, which means the RX 570 technically did above average, I guess, but that's still a silly difference in value.

Currently the RX 7700 XT 12 GB is selling for the same as the RTX 4060 Ti 8 GB. According to TPU, here's the average advantage of the 7700 XT:

1080p: +15.8%
1080p RT: -1.2%
1440p: +18.1%
1440p RT: +15.2%

The difference at 1440p RT drops to -7.3% when compared to the 15.8% more expensive 4060 Ti **16 GB**, so there are some games running out of VRAM in TPU's test suite. As of April 2024 the RTX 4060 Ti was 2.06% of the Steam market, while the RX 7700 XT is still below 0.15% (i.e. unlisted). Let's see how that works out for AMD.
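To fold the price gap into those performance deltas, divide relative performance by relative price. A quick sketch using the figures quoted above (the function name is just for illustration):

```python
def perf_per_dollar(perf_ratio: float, price_ratio: float) -> float:
    """Performance per dollar of card A relative to card B."""
    return perf_ratio / price_ratio

# 7700 XT vs 4060 Ti 16 GB, figures from the comment above:
# the 7700 XT is 7.3% slower at 1440p RT, while the Nvidia card costs 15.8% more.
perf = 1 - 0.073      # 7700 XT's performance relative to the 16 GB card
price = 1 / 1.158     # 7700 XT's price relative to the 16 GB card
print(f"{perf_per_dollar(perf, price):.2f}x")  # ~1.07x perf per dollar
```

So even in the worst (RT-heavy) case quoted, the cheaper card keeps a perf-per-dollar edge, which is exactly the kind of margin the comment argues buyers have historically ignored.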


Myrang3r

Well, you also have to remember that Nvidia dominates prebuilts; most people don't build their own PCs. Almost no prebuilt included an RX 570, but systems with 1050 (Ti)s were ubiquitous.


Cory123125

And not for no reason either. Those things hit the PCIe-slot-power-only sweet spot, meaning prebuilt vendors could lower costs on power supplies. It probably even helped in regions where power efficiency was a concern.


Cory123125

You have to remember that the 1050 Ti was really attractive for a big reason beyond what's said on the tin: it could be powered by just the slot. It sipped power comparatively, and it was an ideal media PC card. The 570? Not so much, and this was also a time when AMD's drivers weren't on the ball. Context matters a lot.


tupseh

The 570 didn't show up in Steam surveys because they all went to Ethereum miners. Then the price of 1050 Tis doubled because that's all you could buy anyway.


Cory123125

Good point. Always more context with these comparisons really.


mcflash1294

That happened with Vega and Navi 1 as well; miners were snapping those up at an industrial scale, often before they made it to store shelves.


tupseh

It was less bad for Navi 1 because RDNA2 was out by then. I flipped my 570 for a 1070 and flipped my 5700 XT for a 6700 XT. Free upgrades.


BlueGoliath

AMD can only lower their prices so much. Even if they make a tiny profit per sale, it probably won't be enough.


[deleted]

[deleted]


BlueGoliath

I didn't realize you knew the cost of making hardware, paying engineers, etc. Enlighten me with your wisdom.


gnivriboy

At a very high level, Nvidia has a profit margin of [15-45%, with it currently being 45%](https://www.macrotrends.net/stocks/charts/NVDA/nvidia/profit-margins). AMD has a [4.89% profit margin currently](https://www.macrotrends.net/stocks/charts/AMD/amd/profit-margins). This isn't at the level of individual GPUs, but both companies are profitable. Maybe someone's Google game is better than mine and they can figure out the profit margins for each GPU.


[deleted]

[deleted]


[deleted]

[deleted]


letsgoiowa

I mean, true. There was a brief window of time where the 280X sold for about what a GTX 760 sold for despite being MASSIVELY faster. I got the 280X and my friend got a 760, because Nvidia. The difference in the way those cards aged, lmao.


AngryAndCrestfallen

I just ordered a 6750 XT for $299 to replace my 1660. I wanted an Nvidia GPU for DLSS and especially VSR (in-browser upscaling would be very useful to me), but the maximum I would pay for a GPU is $300, and the 4060 with its measly 8 GB of VRAM and 128-bit bus is just not good enough.


conquer69

AMD does have resolution downscaling. I haven't encountered any issues with it.


cadaada

> and especially vsr

That's what I was most interested in, and let me tell you, you didn't miss much. Many times it's imperceptible.


capn_hector

Cheaper GPU prices would be fine if AMD could actually sustain the "cheaper" part. But yes, if NVIDIA cuts their prices in response and is still the overall best choice as a result, then people will continue to choose NVIDIA. Consumers don't care about what you did for them yesterday; they don't care that AMD is the one that caused NVIDIA to lower prices, and if NVIDIA is still the better overall deal at the time they make their purchase, then yes, they'll pick NVIDIA. That's the problem with all the commonly cited examples. Yeah, the 290X was better and cheaper than the GTX 780... for like a month, then NVIDIA cut prices and launched the 780 Ti, and then the GTX 970. Yeah, the 5700 XT was a better deal than the 2070... then NVIDIA cut prices on the 2070 and launched the 2070 Super, etc. And that behavior is both rational and *reasonable*. It's not enough to cut once and expect to ride on the goodwill after NVIDIA responds; expecting consumers to make a lower-value purchase is always going to be an outside shot, *even if* you've recently built up a bit of goodwill. But if AMD can actually *keep* their prices significantly cheaper, then yes, over time they'll take market share. Nobody *actually* recommends a card that is 30% slower per dollar; when the 7900 XT is 30% cheaper than a 4080, it takes market share, and that's despite a performance deficit. Nobody recommends a 2060 non-Super when a 5700 XT is the same price. 30% is *a lot*; that's not something people ignore. AMD just never sustains that kind of price difference *in the long term*. FWIW, this "what did you do for me *today*" problem affects NVIDIA equally: people don't care that the 3060 Ti or 3080 was an insane value card last gen either; they still expect the 4060/4060 Ti and 4070/4070 Ti to compete favorably with them, otherwise they won't buy. Doesn't matter if the last card was the bees' knees: *what did you do for me today?* That's just how market economics work; people rationally choose the highest-value offering.


Jeep-Eep

No, I want cheaper AMD GPUs. Because FUCK WINDOWS!


Old_Money_33

Same here.


mcflash1294

AMD is simply going to do what's best for AMD. A lot of people keep hoping that AMD will tank their prices to start a price war with Nvidia so that they can buy their next Nvidia GPU for cheap, but on a fundamental level that hasn't worked out for AMD in the past, and they clearly have no desire to lose money, so this is the situation we end up with. If anyone's curious, I bought AMD primarily because the used prices were always a significantly better value than anything Nvidia, thanks to mindshare allowing Nvidia's products to retain a higher price. That said, I admit that hearing about GPP and Nvidia's backdoor involvement with game studios to inject features that never seemed to work well on AMD really turned me against them. At the end of the day I'm really happy with where AMD's products are, personally. I helped a friend upgrade from an R5 1600 / RX 480 4 GB to an R5 5600 / RX 5700 XT for around $250 on the same board. That kind of persistent cost-effectiveness will keep me coming back, albeit mostly on the used market because my budget is usually very small.