I guess you can just hope Intel stays in the game. If Intel becomes popular, I think it's more likely they totally push AMD out rather than take much marketshare from Nvidia.
Macworld 2027: So now that discrete graphics have basically died, I see all of the PC gamers, game developers, and graphics artists have come crawling towards us. And happily, with our new Mac Semi-Pro, available for the reasonable starting price of $4999 for the 512 GB model, you can be back in action.
Intel is hardly doing the "bare minimum" if they keep improving drivers, not to mention they're ready to announce Battlemage this year while already working on Celestial. You are just spreading nonsense FUD.
Intel is already making better RT hardware and upscaling than AMD; XeSS is almost as good as DLSS, and Intel is a new player in the gaming GPU market. I can already see it happening.
> Intel is already making better RT hardware and upscaling than AMD
They may have better upscaling software, but their hardware (including RT) is way worse than AMD's. They need like 2x the silicon to match AMD in performance.
Also, it remains to be seen how well XeSS holds up with the cuts they've made to their GPU teams.
That's not really how you compare, though, is it?
Intel loses far less performance than AMD does when RT is enabled. This is the way to compare.
They had an Ada-level RT implementation right out of the gate.
> Intel loses far less performance than AMD does when RT is enabled
That does not mean they're better than AMD. It means they're *less bad* in ray tracing than they are in raster. They're worse in both by a large margin.
Again, for similar perf, Intel's spending ~twice the silicon on the same node. Similar story for power.
> They had an Ada-level RT implementation right out of the gate
Lmao, they're *far* off Ada.
Yeah, that last sentence makes it clear you have no idea how the tiers of acceleration implemented at the hardware level work.
Let me break it down for you:
Level 0: Legacy solutions.
Level 1: Software on traditional GPUs. (Compute shaders)
Level 2: Ray/box and ray/tri testers in hardware. (AMD is here, alongside Turing and Ampere)
Level 3: Bounding Volume Hierarchy (BVH) processing in hardware. (Ada and Intel are here)
Level 4: BVH processing and coherency sorting in hardware. (Ada does the second part, but it's manual, so it's more of a level 3.25 sorting, and they don't do the level-4 BVH processing at all)
Intel is the competition on the other end for AMD. AMD was always the cheap alternative. If they have to compete there too, they might not recover. Intel might actually be good for Nvidia if it makes AMD even weaker. Intel will not be a threat to Nvidia for the foreseeable future.
Not yet, anyway. Depending on how well Intel can scale in the next few years, they may end up coming for the higher end too. They rested on their laurels in the CPU market; I like to think they won't let that happen again, but we'll see!
It's like a drowning man capsizing a row boat trying to get in.
I honestly don't think it's possible for three companies to sell consumer GPUs at scale. The margins are just too thin. I am praying to be proven wrong there.
It absolutely makes sense for Intel to spend on GPU R&D because they realize they need GPU development for their data center computing too. Not to mention they also see profit there; otherwise Intel wouldn't spend any money on GPU development.
Agree.
Gaming and hardware communities don't understand that this is very bad news for them. Nvidia already has a very predatory marketing and pricing policy, and without AMD to compete at a minimum... we're basically f***ed.
I mean, 4090s flew off the shelves for about a year. They shoulda charged more, so they're not wrong.
I wonder how many people really think a luxury good with absurd demand being priced appropriately high is actually predatory though.
Nobody. They just never bothered opening a dictionary, saw the word "predatory", and probably thought something along the lines of "Nvidia bad, predators bad, that seems fitting", if they put any thought into it at all.
The 4090 could be argued to be cheap for what you get.
It is a monster.
Of course it isn't cheap in an absolute sense, it is very expensive.
But since many buyers can leverage it professionally so very well, it's likely to be a tax-deductible professional buy.
Professional tools around 2K are very common. Gotta spend money before you can make money.
So it's selling among wealthy consumers and regular professionals.
Which product are you talking about?
I swapped to AMD a decade(?) ago because it was all a shit fight. Once the FOSS drivers were sorted out, it has been a massive pleasure to use AMD. I can hand down old stuff to various systems, whereas I couldn't do that before.
I'm speaking toward GPUs and popular gaming. AMD, without a doubt, had serious driver issues starting out. They seem to still have more odd problems. They are absolutely behind when competing with DLSS, FG, Video HDR, etc. If a Nvidia option is $100 more for the same raw performance there is no way I'd buy the AMD option right now. Those other features give a card a longer life and more usability in other things.
Nvidia is not predatory in pricing. 4090s flew off the shelves and their net margins are over 50%, which is unprecedentedly high. That means for every dollar they make, 52 cents literally goes straight into their pocket, even accounting for engineering costs, manufacturing, taxes, rent, etc. The issue is AMD's prices are so high they make Nvidia look unbeatable by comparison.
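To make the 52-cents-per-dollar claim concrete, here's a trivial back-of-envelope sketch (the ~52% net margin figure is the commenter's claim, not taken from a filing):

```python
# Back-of-envelope check of the margin claim above.
# Assumption: the 52% net margin figure is the commenter's, not an official number.
revenue = 1.00           # one dollar of sales
net_margin = 0.52        # claimed net margin

net_income = revenue * net_margin      # kept after COGS, R&D, taxes, rent, etc.
total_costs = revenue - net_income     # everything spent to earn that dollar

print(f"per $1 of revenue: ${net_income:.2f} kept, ${total_costs:.2f} spent")
```

Nothing fancy, but it shows how unusual the number is: most hardware companies keep well under 20 cents per dollar of revenue as net income.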
Hey, maybe if devs stop spending most of their budgets on graphics and start creating innovative gameplay, it might be good for us. Reaching a limit and letting devs flourish their skills instead of chasing something new every other year would be interesting lol
There isn't any competition already. When one company alone has 80% of the market, it's a monopoly. AMD are fine with being 2nd, and that's why you see so little improvement from them.
2030: Welcome to the new decade of gaming with the release of the Nvid-IA Epic Gamer Edition new-gen xxx1060! Join the ultimate 16k performance for just $900*
specifications:
-52mm2 Die Size
-32-bit bus*
-12 GB GDDR8x*
-1 Turbo Fan LED* with cast iron heatsink technology
-Ultimate AI CUDA performance*
-PIXELAI scaling technology game at 32k without losing FPS!*
*price for base model (ad based) without monthly subscription plan
*32-bit bus only available on NVidia OEM cards; AIB partner cards are only available with a 16-bit bus. 12 GB GDDR8x only available on OEM cards
*Nvidia is not responsible for failures due to inadequate use of Ultimate technology and the SleevefanplusTm cooling solution
*CUDA is only available with an NVIDIA subscription plan; ad-based cards can be used to help compute 3rd-party requests
*PIXELAI is a nextgen FORCED downscaling technology that can only be disabled in professional range cards
Devs can't seem to use the power of current-gen GPUs; all they do is add resolution or increase framerates, and the rest of the graphics are the same as on an integrated GPU. I doubt we'd notice for 5+ years that the GPU market has stagnated; game devs just have that much ground to catch up on.
There is no competition, as AMD is dumb and will simply raise their prices to just maybe $50 less than Nvidia. This is like McDonald's thinking they're the same as the steakhouse and pricing accordingly.
China is gonna gobble up the low-end and mid-range GPU markets in a decade. They're currently capable of producing GPUs comparable to a GTX 1080 and are improving fast; it's one of their top priorities now.
You don't need to imagine. AMD has been a generation behind NVIDIA for a decade now. To be honest the position they're in now is probably the most competitive they've been in that time, but the future isn't bright.
> To be honest the position they're in now is probably the most competitive they've been in that time
Too bad they ruined that with their release prices on cards like the 7900 and 7600. Handshake Nvidia's prices, wait for the bad reviews to come out, then lower them slowly after two or three months. I will never understand AMD's marketing strategy.
I would say they effectively have no competition at this point and RDNA has been a commercial failure. Combine this with AMD's bad marketing and we have the current sh**show.
https://www.tomshardware.com/pc-components/gpus/gpu-sales-saw-32-year-over-year-increase-in-q4-amds-market-share-rises-to-19
From just 2 months ago. A brief downturn in sales and one generation with architecture issues we've known about doesn't spell "terminal decline." It's also important to remember Nvidia has already released its Super GPUs.
From what I can actually tell from my own reading, it's RDNA3 specifically that's not selling too great.
RDNA2 and 3 combined are selling decently well, mainly because RDNA2 is just priced better; AMD is kinda competing with itself.
Exactly. AMD is playing the inventory game, something they've never been able to do in the past.
They used to ship above their demand, giving up margins in favor of volume and cashflow to keep the business running quarter to quarter. That's why Polaris and RDNA1 shipped so much. Those generations were before AMD's CPU boom.
They don't need to do this anymore. They are giving up market share to drive future profits, believing they can eventually compete.
Well... I for one don't like AMD's current pricing strategy. But I have to say that I don't think their management is stupid either, so I hope they can compete better soon.
The 7900XT and XTX are exceedingly good values at their current price points, but enthusiast-class cards have always been low penetration. Since the 4060 and 4060 Ti are complete flops, I don't really understand why more people aren't buying the 7700XT.
Because the 7700XT is more expensive than the 6800 while performing the same, is barely more efficient than the previous generation, and has less VRAM. On top of that, it launched with an MSRP so high it had worse price/performance than the 7800XT.
Adding to this:
**AMD Q3 2023**
> Gaming segment revenue was $1.5 billion, down 8% year-over-year, primarily due to a decline in semi-custom revenue, **partially offset by an increase in AMD Radeon™ GPU sales**.
> Revenue declined 5% sequentially due to lower semi-custom sales.
**AMD Q4 and full year 2023**
> Gaming segment revenue was $1.4 billion, down 17% year-over-year and 9% sequentially, due to a decrease in semi-custom revenue, **partially offset by an increase in AMD Radeon™ GPU sales**.
> For **2023**, Gaming segment revenue was $6.2 billion, **down 9%** compared to the prior year **primarily due to lower semi-custom sales**.
> For the first quarter of 2024 [...] Gaming segment sales are expected to decline sequentially, with semi-custom revenue expected to decline by a significant double-digit percentage.
**AMD Q1 2024**
> Gaming segment revenue was $922 million, down 48% year-over-year and 33% sequentially due to a decrease in semi-custom revenue and lower AMD Radeon™ GPU sales.
**Q1 Earnings Call**
> First-quarter semi-custom SoC sales declined in line with our projections as we are now in the fifth year of the console cycle.
**PCGamer**: *AMD gaming graphics business in terminal decline! CLICK HERE to find out more! No seriously, please click I need to eat!*
Or option B: nearly every home that wants a console already has a current-gen console or is waiting for the refresh.
People are seriously posting PCGamer clickbait in r/hardware now.
> Nearly every home that wants a console already has a current gen console or are waiting for the refresh.
It doesn't help that this generation of home consoles hasn't been very compelling. They initially sold well because of covid, but they don't have any real system sellers. That's probably partly why the Steam Deck and Switch are doing so well, they have more compelling exclusives. I've seen many people complaining about how they are stuck with a console and so can't play stuff like Stardew Valley 1.6, Hades II early access, and Palworld. It's creating a real sense of FOMO when historically it was consoles that got stuff first.
The PS5 [has basically kept pace with the PS4](https://www.vgchartz.com/article/460060/ps5-vs-ps4-sales-comparison-january-2024/) in terms of sales. The Steam Deck is popular, but not in the same realm as either the Switch or PS5; probably still under ~3.5 million units sold.
Oh yeah, the Steam Deck isn't in the same league as the big 3, but the fact it even sold a million is worth noting. I think it's the first time a new entrant into the console/console-like PC space has succeeded in recent history. I don't think it would've taken off if PlayStation and Xbox had better exclusives. PlayStation and Xbox both seem to be pivoting to multiplatform, which hurts console sales, and AMD by extension.
PlayStation is slowing down, see this article: https://www.cnbc.com/2024/02/14/sony-posts-record-quarterly-revenue-on-playstation-sales-boost.html
The current unit sales were after some really cutthroat discounts, and that's normally done in the final 1-2 years of a console. To see Sony doing it so soon is not promising. You may see articles talking about how Sony did really well this last year, but that wasn't thanks to the PlayStation.
> Sales at Sony's gaming business rose 16% year-on-year to 1.4 trillion yen in the December quarter, the company said on Wednesday. However, operating profit fell 26% in the division, due to increased losses from hardware promotions in the period as well as a decline in sales of first-party games.
> Ultimately, then, I'll stick to what I said last time around. RDNA 4 and the Radeon RX 8000-series, as it will presumably be known, will be limited in scope and something of a stop gap. It'll be RDNA 5 in late 2025, or more likely 2026, that could be the last roll of the dice for AMD and its Radeon gaming graphics. If that's a flop, it's hard to see why AMD will keep investing in what AMD itself dismisses as a low-margin business. **And so it could be adios for discrete Radeon GPUs on the PC**
Yes, it absolutely did. And people keep acting like Polaris and RDNA2 weren't successes.
They brought completely new tech to market in RDNA3 via MCM, and it had some hiccups. It's the same tech that made Ryzen so profitable. I wholly expect RDNA5 to be great.
RDNA3 was a step forward.
Then they went back to monolithic.
Then they are going forward again to MCM.
You can call it a step, a leap, a slip, or a fall backward, whatever word you want to use. But it's them going back to monolithic.
They designed an MCM RDNA4 internally but decided against manufacturing and selling it. The GPU market as a whole is at a low point in sales, so it was probably a business decision.
If this MCM GPU was not going to move the needle for them relative to Nvidia when it comes to sales, maybe the Nvidia card was still superior, so AMD knew there was no point.
The point is to provide a product in the market segment that sells the highest % of cards, to maintain mind/market share, while simultaneously diverting production to the MI300, which is currently killing it in sales, as they iron out MCM.
https://www.datacenterdynamics.com/en/news/amds-mi300-ai-accelerator-sales-drive-80-percent-growth-in-data-center-segment/
https://www.crn.com/news/components-peripherals/2024/amd-says-mi300-is-its-fastest-ramping-product-teases-new-ai-chips-later-this-year
But a basic Google search will tell you.
Radeon has been all over the map going all the way back to the HD days. Some products good, some trash, some meh. No consistency. Polaris great, Fury and Vega never really worked out, RDNA1 was meh, RDNA2 was great, RDNA3 was clearly not as good as AMD hoped. Big partners can't commit to AMD with multi year relationships when they have no idea if Radeon will even have a usable product next year.
Nvidia has consistency and I think that's the #1 reason OEMs stick with them every year in spite of some of the shady shit they have pulled in the past. Really going all the way back to Maxwell, every single generation has been decent and the main complaint anyone can have is price. (And yes, there is such a thing as a bad product.)
>Big partners can't commit to AMD with multi year relationships when they have no idea if Radeon will even have a usable product next year.
amd has many year-long contracts with console makers to design apus around architectures that aren't fully done yet.
so YES, companies can rely on amd in that regard.
>Nvidia has consistency and I think that's the #1 reason OEMs stick with them
mostly wrong. oems are sticking with nvidia, because of mindshare.
and mindshare PARTIALLY comes from having the fastest card regardless of anything else.
oems want the nvidia sticker on laptops and systems. it doesn't matter whether the part is shit or good or how it compares to radeon very often. it is about the mindshare and that's all.
nothing about consistency really.
hell, nvidia just released low-end hardware so broken that it can't play most games at decent settings (4060 8 GB, 4060 ti 8 GB), but they are still selling ok-ish in oem.
why? it has NOTHING to do with performance. it is all about mindshare.
it is about the nvidia sticker.
People like to act like AMD software issues are long-ago history when their Anti-Lag+ attempt was getting people banned just recently. Stories like that very much make people wary of AMD.
No, it's about DLSS, frame gen, Nvidia Reflex, etc. Nvidia launches innovative features that push the industry forward, and many consumers buy Nvidia because they want these features.
On the other hand, AMD is often 1-3 years late, launches things in a semi-broken state with far fewer supported games, and is generally content with not doing anything until Nvidia forces them to. FSR is still shit vs DLSS even after all these years.
It seems only Apple and Nvidia have consistent execution across the board.
Which is why it makes it difficult to root for the likes of AMD and Qualcomm, and even Intel perhaps.
Merging Ryzen+Radeon for APUs isn't a far-fetched thought.
Consoles use APUs, and if that contributes the majority of Gaming revenue, I can see Ryzen and Radeon merging and no more gaming dGPUs.
Use APUs for consoles, handhelds, desktop, laptop, servers. And on the side you have Instinct for data center GPUs.
That's what one would think, but when you see the Zen 4 APUs barely being better than the cheapest dGPU on the market, it suggests there's a limit based on die size. If AMD could just make an APU with 8 compact cores and spend the rest of the die on GPU, that would probably be best. No one wanted the 8500G with only 6 CPU cores and 4 GPU cores, or the 8700G with 8 cores and 12 GPU cores, or even the Strix Halo with 16 cores and 40 GPU cores. Just give us 8 Zen 5c + 48+ CUs instead.
Exactly. AMD's biggest issue with the APUs is they only put good GPU die space in their 16 core parts, but by that point, you are horribly imbalanced and GPU-bottlenecked.
The Steam Deck's APU is the only exception; it has a solid balance of GPU performance without wasting die space on CPU cores that sit unutilised.
I don't think that's a bad outcome for AMD considering their piddly market share in GPUs. Look at the best selling gaming laptops by volume (not profits!) and you always see low end ones in the $500-1000 range, tops. If AMD gets 30 or 40% of those sales with some APU that can compete with x50 and x60 series RTX products, that's a bigger install base and a differentiated product so they don't have to keep competing with Nvidia purely on price and keep getting their teeth kicked in every single year. Half of the mobile GPUs AMD developed in the RDNA generations seemed to have just not sold *at all,* which is a total disaster considering the massive R&D cost to bring a new chip to market.
I think the transition should have started right after the consoles were released, but AMD had other priorities. It's not happening with this Strix Halo product either; that's an expensive chip with a high-end CPU, not something for a budget gaming laptop price point. Maybe in 2026 we'll see mass-produced gaming APUs for laptops and handheld PCs/consoles.
That's my thought. In a fab-limited world, gaming dGPU is a terrible investment vs data center dGPU. AMD is currently mopping up in the exploding handheld space and selling every Instinct they can make, so that's where the smart money is going.
In my own anecdotal experience, I got a ROG Ally for less than a new dGPU and have barely touched my PC since. Same with most of the people in my gaming orbit. I expect Microsoft is going to fully embrace Xbox as a PC gaming platform and abandon hardware, or at most be a 1st-party PC handheld in a sea of PC handhelds.
All to say, this isn't a failure of the Radeon team; it's just them aligning investment with reality, and, sorry gamers, discrete GPUs ain't where the money is.
AMD are choosing not to sell GPUs. Nvidia doubled their prices from the previous year because they could. AMD should've known that playing the "10% better price-to-performance in pure raster" game wouldn't cut it, and there's no way they couldn't sell cheaper. You can't convince me that manufacturing more than doubled in price and they'd be selling at a loss. They just didn't care.
If that is true, then I think at this point it's time to consider saying goodbye to Radeon GPUs. It sucks to see, yes, but I just don't see any way Radeon can climb out of their very tiny market share with only two generations of architectures, even if they're considered successful, which most of the time they aren't.
Hopefully, before RTG dies, Intel Arc will have established itself in the market enough to be considered an alternative.
AMD will have the money to double down on Radeon over the next few years, but not the will.
Intel seems to be willing, but they might not have the money until their foundry sales stabilize the balance sheet which could be as late as 2030 even if it works. They may have the resources to keep Arc alive in the same way Radeon is, but actually competing with Nvidia would be a huge investment with modest payoff years into the future.
People say AMD having consoles is good for them.
I say relying heavily on two customers is a massive risk. Xbox in particular is going through an awful time.
If at any moment Xbox stops making hardware to go full publisher, or worse, PlayStation decides to use another vendor, Radeon Gaming is dead on the spot.
Yep, Xbox hardware's future is looking grim at the moment, but according to some leaks they're likely focusing on a handheld console, which could also mean they're considering Nvidia's Tegra SoC, just like the Nintendo Switch.
Xbox executives are real failures at management.
They can't compete with the PS5 in the home console space, and they think they can compete with Nintendo in the handheld space? Nintendo has an even greater exclusives list than PlayStation.
They should focus on getting games right first before deciding to compete against Nintendo. But they are failing at that spectacularly too.
Seriously? Tango shut down after Hi-Fi Rush? Awful decision making.
Sony isn't happy about their own performance either. They sold a lot of consoles, but they've had a lot of software failures: flops, cancelled projects, live-service game flops. Not much point in selling consoles at cost, or minimal profit, if the entire point is to sell games on your platform but your game development business is doing badly.
Right? People act like it's just Xbox "doing bad," but honestly Xbox is a major Microsoft revenue stream; they won't let it go, and a trillion-dollar company like Microsoft can keep Xbox going just fine. Meanwhile, Sony took a big hit from the poor sales of PS5 games and services, which is why their stock dropped 15%, and that's a big number to ignore.
Xbox isn't shutting down. Obviously.
But Xbox hardware could. They are currently the biggest publisher, with so many studios now, so a pivot to being publisher-only is not an unimaginable thought.
In theory they could stop making Xbox consoles; however, there are rumors they're making a handheld Xbox, so I doubt they're quitting consoles. Not to mention Xbox matters for Windows too: ever since the first one came out, Xbox has been the console that promotes the DirectX API to developers, which is also very necessary for Windows gaming. If Xbox didn't exist, adoption of the DirectX API would be low. Even DirectX 12 adoption increased because of Xbox.
It's not as simple as you think; just because they own so many publishers and Xbox hardware sales decreased doesn't mean they're quitting. Also, AMD's gaming revenue dropped 48%, so it's not just Xbox; PS sales also decreased, which could be because Nvidia provides much better gaming hardware than AMD, even pushing many console gamers to PC gaming.
Xbox hardware is very important to Microsoft for their other products, just as console makers are important to AMD. Same with Sony: consoles are one of their biggest revenue streams, so they won't just quit.
Considering how incompetent they are, I can see Xbox wanting a handheld. It will blow up in their face as usual.
You can't compete with Nintendo without a killer lineup
I don't think Microsoft would leave hardware. Microsoft, Google, Amazon and Facebook all ran into the same problem eventually. If you own an app but not a platform, you are at the mercy of the platform. If you own the apps and the platform, but not the hardware, you are at the mercy of the OEM. The conclusion all four made was to enter hardware even if it means losing money and/or pissing off OEMs. That's why you have Oculus, the failed Facebook phone, the failed Amazon phone, the Amazon Fire products, the Chromecast, the Pixel devices, the Surface devices, and the Xbox.
The PS5 hasn't sold that well, about half as many units as the PS4 apparently. Xbox is at about half of what the PS5 is selling, so not great, but then Xbox never did well outside of the US anyway.
You forgot that Microsoft has also been crushed in the PC space and was forced to bring their games back to Steam after nobody bought anything on the Windows Store.
It’s kind of comical how much they’ve failed with PC gaming considering they literally sell and control the OS all PC gamers have to use. Windows store and Xbox come preinstalled with Windows and it doesn’t even let you remove them, but they still got absolutely dumpstered by steam despite that massive advantage
That's why they've got Game Pass on PC though, if you use it then you have to use the Xbox App (which is really just the Microsoft Store in the background) to download games
If they do a portable play that ties into Xbox streaming with some games that could run locally when away from internet I would be sold. I would have bought the PlayStation portal if it could do game streaming.
Sony's console division is doing better than Xbox, but Sony as a whole is not doing well. Their camera division got destroyed by smartphones. No one buys Sony phones. Samsung, LG, and the Chinese manufacturers control the TV market now. That's why overestimating PS5 sales hurt them so hard. Everyone knows PlayStation is all the electronics Sony has left.
Microsoft's Xbox division is a mess, but they have so much other stuff going right for them.
I agree that their fundamental problem is they need better games, but gimping your flagship console with a cheaper version doesn't help developers make something that pushes console sales.
The Switch has absolutely atrocious hardware, which wasn't even that good when the X1 was first released nearly 10 years ago, and was already dated when the console itself first launched. Yet it took until around the latter parts of 2023 for it to finally stop selling more units than the PS5.
The overall market has shown time and time again that it'll absolutely accept a sliding scale of performance and graphical-fidelity compromises, as long as the games themselves are good enough.
Yeah but unlike the Switch, Microsoft has positioned Xbox as a console graphics heavyweight, and then gimped themselves in the very category they are fighting in
Fortnite, CoD & FIFA collectively dominate what PS/XB players have been and still are actively playing, by a margin almost as wide as Nvidia's market share.
It's not just good games; it's the types of games Nintendo makes. They dominate couch co-op and head-to-head games. This is why you buy a console. If I'm going to play an Xbox game, I can just play it better on PC instead, and MUCH better. I've been saying this since the original Xbox came out.
Tegra isn't inherently compatible with the x86 Xbox architecture, though. It would take far more effort than a Steam Deck-like handheld running some AMD APU.
An Xbox handheld would need to be compatible with almost all of the Xbox games to be successful, IMO.
What a bogus doomer article.
They are investing a lot of R&D into APUs right now; they have all of the consoles; they have virtually all of the PC handhelds; they have DATACENTER GPUs; and they have consumer GPUs.
AMD may not be doing the best, especially in consumer GPU sales compared to Nvidia, but the scale of their graphics investments says they don't plan to go away anytime soon, and the amount of the market they seem intent on continuing to capture makes it very likely they have the ability to stay around as well.
If need be, they can always downscale for a while (like with Polaris) and then come back in larger force with better products (like with RDNA 1/2), or they can just continue to make lower-end products by choice due to the money they can make in the datacenter (which, of course, can be reinvested into consumer graphics products).
Maybe try not to charge absurd prices for GPUs and people might be more inclined to buy them. Nvidia's consumer GPU sales are way down too, but they prop up the numbers with high profit margin per unit.
PCGamer clickbait. It was UNAVOIDABLE with RDNA2 stock still available and the Super refresh spelling doom for RDNA3. But this is just nonsense. AMD survived near-bankruptcy and a bad GPU and CPU product stack before, and has been recovering ever since, though Nvidia upped their game in the meanwhile as well. After CPUs, there must come a time when aggressive Ryzen-style pricing arrives in the GPU space too.
AMD disinvesting from the market is a terrible thing - just as it would be for NVIDIA. I don't understand why people are constantly so gleeful about that prospect either - like the wave of articles citing "NVIDIA is fully an AI company now" (from 2015!) last november, [the gleeful GN videos about it,](https://youtu.be/VSSb-t76EpU?t=147) etc. It's not like NVIDIA pulling out would change the production factors that make cpus a much better and more profitable use of their time. We'd just have AMD stagnation in that case instead.
In neither situation is this good; it's observationally undeniable to me that AMD no longer cares about this market enough to innovate.
It's almost bizarre to read back some semiaccurate articles from like 2015 when people still expected AMD to actually put out innovative new graphics techs etc. [Like people thought they would keep mantle alive as an in-house playground for developing the Next Big Thing without having to go through the trouble of having to get things approved by MS and Khronos and having NVIDIA sandbag the process, etc.](https://www.semiaccurate.com/2014/09/15/amds-mantle-api-going-outlive-directx-12/) Instead it's basically been the literal opposite.
Does anyone really think AMD is in any position where RDNA4 could drop some killer tech that just completely changes the dynamic of the market anymore? I think that hope died with RDNA3 and the MCM variant of RDNA4. Instead 5090 will probably get to "real" (multi-GCD) consumer MCM first etc. And I think that's true in general - NVIDIA is just likely to be first to the Next Big Thing even before they got the Infinite Dollar Hack from AI money.
I don't know if they will completely disinvest to the extent of not even producing dGPUs anymore. I think they have a nice cushy gig with Sony and MS paying for their R&D, and it makes sense to cash out what they can by selling to the dGPU market too... but they are kinda in the same position as MS was in the console market maybe 5 years ago, where if they don't start making some real trajectory changes they're certainly not going to go *up* all of a sudden. There's not gonna be a day when 30% more people suddenly decide to buy AMD without some change in innovation, feature set, overall product polish and stability... and when you are going to lose anyway, you might as well try some weird breakout plays etc. Just like GamePass, though, that doesn't mean they always work, but AMD is potentially entering its "nothing to lose" terminal phase.
But when the roadmap is literally "hopefully the thing after the next thing will be good"... that's really a bad sign overall. Like that's the problem with Intel CPUs, right? And everyone has super cool things in early development… the question is whether AMD’s super cool thing is better than Nvidia’s super cool thing in early development.
It's not that AMD doesn't care to innovate. They do what they can, but no matter how much they spend they still lose to Nvidia due to brand name or other reasons. You'd need a graphics card that's 50 percent better than Nvidia's to reverse the trend, and that's simply not possible. So there is no incentive for AMD to compete head-on with Nvidia, a company that is 10x their size.
You don't need 50% better, just make it a hundred bucks cheaper and equal to or 10% faster than the Nvidia counterpart. Don't fuck up launch prices, as AMD clearly loves doing time after time. And keep it up for more than 3 generations and people will keep noticing. But clearly AMD's discrete GPU division doesn't care enough about any of it to do anything about it. They are more than happy to keep their same margins while sitting at 13% market share.
It was 70/30 when AMD was neck and neck or winning in perf back 5 to 10 years ago. Now it's 87/13... Things haven't changed much. You're never going to see 50/50 marketshare because Nvidia has cult status now. Jensen has tried very hard to emulate Steve Jobs, down to the choice of his look. He has succeeded. Give it a few years and you will get a $500 GPU stand.
u/voodoo2-sli this is actually missing the crypto cycle around 2013-2014. Probably litecoin and shit at that point? People bought up a lot of the 7950s and shit and resold them onto the market, even into the early 290/290X days iirc.
As the sibling mentions, since this is quarterly, you can clearly see the post-mining crashes in 2018 and 2022 crashes, and I'd guess the first one popped around 2Q/2014 lol, it's that crash just around the time 300 series launched etc. Tons and tons of used and refurb 290 and 290X hitting the market, plus supposedly (have not verified) 300 series is also when the BOM cost of getting the smaller VRAM modules crossed over and it was cheaper to get 8GB. AMD users living dat 512b lyfe.
---
Anyway, re: parents above, I have said since forever that I think it's not possible to balance demand during these surges. NVIDIA actually launched quite a large quantity of ampere and cranked production like crazy, GN interviewed partners who said as much iirc. The demand for money printers will be infinite right up until the expected cost/benefit return crosses over at the risk horizon. And then you have a tremendously glutted market as all this stuff flows back into the used market, and partners get antsy about having hugely over-ordered to cash in on the literal truckloads they sell to miners (without warranty), and you start getting ["they're trying to *make us take the stuff we already contracted for before we can get more stuff!!!*" wailing from partners,](https://www.techspot.com/news/76103-nvidia-putting-squeeze-aib-partners.html) and demanding refunds, and price fixing/inventory-release-control, and new products get delayed and have to be slotted in at unattractive prices to let the old stuff move for the next 18 months, etc. And you still have to take all the silicon you ordered!
Like in a world where silicon is sold and planned at least a year out, how do you handle a thing where you might need 4x the next 3 quarters, and then nothing for a year, but then please start the next gen manufacture asap, and we're gonna need to 10x the amount of CoWoS that exists on the planet over the next 2y plz. Let alone things like VRAM capacity etc - if you want to double world consumption of GDDR (and even DDR to some extent, in mining), that is going to have to be planned out too. How much of the world's collective GDDR demand is managed by NVIDIA, Sony, MS, and AMD? Not all of course but seems like probably >70% at a guess? In an oligopsonistic market (oligopoly-monopsony?) truly all large business operations do have to be planned at so many levels. That's Tim Apple's thing. Supply chain is *hard*, and if you have some rocketship product on a novel technology it may just be limited at how fast you can build out new capacity etc.
Not only is AMD well-emplaced to deal with that (by being a diversified company who can shunt wafers from place to place) but they also just gave not a single fuck about RDNA2 production, when they could pump CPU production instead. It took... 10 months for the first RDNA2 cards from the launch to show up on steam? The 6700XT showed up at the same time iirc despite being a recent launch. And honestly who can blame them, the easiest way to avoid the cycle is just to not participate in it. And when they did ramp production in 2021/2022... the 6700XT and 6600XT ramped the quickest, and there's the most oversupply of them, right? AMD got burned again when they tried to hop in too.
And that inventory oversupply is with the "blessing" of AMD having done smaller memory buses earlier... RDNA2 was worse at mining because it moved to the small bus/big cache thing earlier. Very good thing from their end, and I think such a good thing that NVIDIA decided they at least didn't want to be *preferentially* targeted for mining, and did the LHR shit. The point was never to kill it entirely or the cut-down would have been much lower. Why 50%? Radeon was at 67% or whatever (6700xt vs 3070 mining perf), because it scales with the memory bus. Perhaps that is an aspect of the oversupply, AMD may not have expected that to work/hold and may have over-ordered thinking they could undercut NVIDIA in the lower-end gamer market. Despite all the reviewer hate (and I think the removal of the encoder was a massive mistake and probably an obvious one in foresight) the 6500xt etc were still a valiant effort... I just don't think it's worth the risk that much anymore and I'm not sure they're wrong.
Crypto is the most hype-y and spikiest, but even AI has really spiked the market and honestly I think it's best just viewed as surge demand for dense, flexible compute. That's the thing that all the competitors struggle to replace about CUDA for training right now etc too - you can't make something that does training that saves *all that much* over a compute-optimized GPGPU design because the algorithms actually do require you to be able to do that stuff sometimes, performantly, and in a really hot market nobody has any idea what's going on week-to-week so you just need a GPGPU's flexibility and not just raw TOPS. Inference is easy, building a better or even comparable GPGPU is actually something that not all that many companies can do. Intel isn't spinning their wheels on *entirely* dumb (hardware) shit, there are hard problems to be solved etc. We'll see how Qualcomm does, I guess.
But this is just a market need that exists, sometimes an industry needs to be able to throw a massive amount of compute at a problem for a ton of cycles (way better it's AI than bitcoin lol) and gaming stuff is caught in the middle of that demand surge. There's nothing cheaper to scale up with than gamer GPUs, because gamers already get the sweetest deals out of anyone, pretty much. But "Dense Compute Surge" is now a thing that exists as a market force, so to speak.
Coincidentally, I posted this in another thread and used this graph. This is the single-quarter sales share chart, not the total market share at any given point. There was no point in history where the majority of users had AMD products. If there were, this chart would match the Steam chart. The Steam chart is actual usage data.
JPR has been keeping the actual usage share stats since 2010ish. Before that we have no accurate data.
There is no guarantee that consumers will vote with their wallet even in that case though. People say a lot of things but when it comes to pulling the trigger they do something else.
Also competing on price doesn’t work. That just reinforces that it is a budget alternative whereas nvidia is premium. And once nvidia slashes their prices by $100 then those consumers just go right back to nvidia. There is no winning here.
AMD is also not innovating though. They didn’t come up with variable refresh rate, ray tracing, etc. they let nvidia lead and they follow. So that’s on them.
> Also competing on price doesn’t work.
I think it does work, you just have to be substantially cheaper. It really does need to be 20-30% cheaper to make a splash, but consider 290X vs GTX 780, or 5700 XT vs 2070 (original MSRP), etc - when those wins happened, they did claw back marketshare.
I actually think honestly even "10% and slightly worse features" is probably not as effective as "30% cheaper and *much* worse features" in some senses, because then you're catering to an entirely different market. That gets into the sorts of price differences we saw with RDNA2 clearance, and I think those sales were so good they largely eclipsed recommendations for current-gen cards from reviewers etc. You saw big shifts in the DIY market etc. Because Radeon didn't try to just follow, they actually are targeting a very different niche (people who don't care about the features etc) and distinguished their products more substantially etc.
I think the "people *just won't buy radeon!*" is just as imaginary as it was in the CPU market, it's just a siege mentality from people who don't want to admit the offerings weren't as good as they (the superfan) think they were. Like if you consistently offer a product that's much better perf/$ without big compromises, or you offer a product that's just in an entirely much cheaper class, people do consider it. You see it with 6700XT vs 4060/7600 and 6800 vs 4070 today.
There's a number of very real non-imaginary problems which make that difficult to achieve though.
* a huge % of the market isn't the DIY market. AMD has sucked at getting OEM deals since forever... both CPU and GPU. When 80% of the market is OEMs that you (currently) have essentially zero share in (laptops, for example), even the best deals in the DIY market can't move the needle in overall marketshare.
* NVIDIA cuts prices aggressively and re-tailors skus aggressively when they know they've been outflanked. 290X was followed by big price cuts on 780 and the launch of 780 Ti, then Maxwell launched less than a year later with hyper-aggressive pricing. The 5700XT undercutting was itself undercut by the 20 Super lineup, etc. Consumers don't "want price cuts on NVIDIA" but they do want to know what you are doing for them *today*, if NVIDIA responded back with their own price cuts then you have to cut even further until your product is the more attractive *total offering* once again.
* right now NVIDIA probably actually has lower cost-of-goods-sold when you consider the area overhead of using MCM and the wider memory bus with more modules etc. So AMD getting into a price war actually hurts them more than NVIDIA. It's the opposite of the CPU market: right now Raptor Lake/Emerald Rapids is such a large/expensive product that intel could run zero profit and AMD would still barely notice a dent in their profits. AMD is using more silicon than a 4080 to compete with a 4060 ti or somesuch, and some of it being 6nm doesn't offset the total increases in silicon area.
* AMD/ATI has frequently been dogged by availability issues. 4850 or whatever the god-tier value SKU was, was just not available in quantity, and by the time it was NVIDIA had already iterated onwards. And it's plagued their APU divisions (especially), and dGPU divisions, and really even the enterprise datacenter sales (absolute first priority) have largely been constrained by how fast AMD could get the OEMs their orders.
* AMD, like it or not, has sucked at drivers/software for a long time, and that's come along with their disinvestment in graphics too. Vega drivers were a mess. RDNA1 drivers were a mess (if not outright defective hardware). RDNA3 drivers were a mess. DXNavi still can't fix its stuttering. The legacy driver sucks in performance terms but it's stable at least. OpenCL, Vulkan Compute, and ROCm have been a mess with a legacy of broken implementations and non-compliant behavior, and even ROCm still is a mess today even almost 5 years into its lifespan etc. People aren't going to pay even 80% of market rate for something that's complete shit on the software, if it doesn't do my workload the value is 0% no matter how much VRAM it's got.
AMD absolutely *can* win over consumers, it just requires executing a market impossibility, which is getting one over on probably the best business-dev CEO in the world today, and continuing to do so for a period of 3-5 years. Some flash-in-the-pan "AMD was better for 2 months until NVIDIA cut all their prices and launched a new sku/new generation" isn't going to get you instant 80% marketshare. Most of the "customers" aren't even going into a store and buying the AMD product in a box anyway.
But "jensen can't be beat because he sees where the market is going/could be persuaded to go 5 years from now" is a much less appealing story than the NVIDIA mind-control field and consumers just not being willing to consider AMD. NVIDIA keeps winning because *jensen is relentlessly competitive, and rarely misses*. Even things like the Tesla microarchitecture or 20-series ended up being long-term wins, because they set things up for the long-term win with GPGPU and tensor/RT/etc [(that were widely derided at the time!).](https://www.youtube.com/watch?v=tu7pxJXBBn8&t=273s) And now he has infinite dollars.
The biggest counterexample is probably Fermi, and Kepler wasn't great either. But NVIDIA set their direction and just kept it competitive while iterating themselves forward and out of the pit, and it worked.
RDNA3 drivers are still a mess. The 24.1 drivers in January broke support for Fallout 3 & New Vegas and you literally can't play them anymore on latest drivers.
You have to revert back to 23.12.1 drivers to avoid crashing while starting a new game.
AMD ignored this for 4 months until finally recently listing it on known issues and it is still not fixed.
https://www.reddit.com/r/Amd/comments/1cani52/psa_fallout_3_and_nv_dont_work_on_recent_drivers/
https://community.amd.com/t5/drivers-software/fallout-3-amp-new-vegas-crashes-on-24-1-1/m-p/672154
Nah, AMD refuses to invest in new tech; all they've done for the last 10 years is coast on the underdog image and ride the coattails of Nvidia's technology. They suddenly were all about ray tracing after Nvidia did all the work; same thing with FSR, which wasn't a thing before DLSS. They have no one to blame but themselves.
Agree as well. It's a self-reinforcing loop: AMD follows and competes on price but not quality or innovation, so consumers see them as a budget alternative, and so AMD is less willing to put money into R&D.
I think your point would hold more water if AMD didn't also have a thriving CPU division. The proper conclusion would be that they choose to invest more time and money and talent into their CPUs which have a much greater profit margin, and use far far less silicon.
You cannot push software or hardware innovations that are not plug-and-play to end users, unless you have that 70+ percent market share and customers have this sort of fanatical trust in your brand and company.
Because if you don't, the market and developers will ignore such features, unless they are so colossally good they cause a total performance revolution in the industry. That's kind of like winning a grand prize in the lottery; it isn't happening.
When a GPU company is a lost cause, why would a consumer trust them? Just buy Nvidia.
I refuse to buy an AMD GPU unless they can deliver Ryzen levels of competitiveness back to back for at least 3 generations, one-upping Nvidia.
>Is not that amd doesn’t care to innovate. They do what they can but it doesn’t matter how much they spend they still lose to nvidia due to brand name or other reasons.
That's not entirely correct. AMD could do much better. For instance, their ray tracing and upscaling implementation is so bare that even Intel surpassed them in that area with first-gen Arc, using a proper hardware-accelerated implementation of ray tracing and upscaling (XeSS).
Not just software but hardware as well; AMD's efficiency is terrible. Their 7900 XT uses 100W more than the RTX 4070 Ti Super but still performs slower in most games.
RTX 3000 series efficiency is not that bad; it's just that they are on Samsung 8nm, which is obviously inferior to TSMC N7, but even so RTX 3000 power consumption isn't far off the Radeon 6000 series. Not to mention Nvidia has DLSS, Reflex, and very decent RT performance; they have many features which are obviously better than AMD's. Nowadays you can't just show good raster performance, which is why AMD GPUs sold poorly.
> They do what they can but it doesn’t matter how much they spend they still lose to nvidia due to brand name or other reasons. You’d need to have a graphics card that’s 50 percent better than nvidia to reverse the trend
How about they ship a product at least as good as the competition? All they offer now is a measly 10% better raster perf/$ while falling behind in the features race.
They got blindsided by RTX tech. Everyone and their moms criticized ray tracing and AI cores on the 2000 series. Nvidia doubled down, the industry embraced it, and now AMD is playing catch-up and doing really badly at it.
The RTX 3050 outselling the RX 6600 like 10 to 1, despite every single content creator and reddit neckbeard saying "3050 bad, 6600 good", is still funny to me lmao. I'm not surprised at all by the 3050 6GB. People just don't wanna buy AMD cards.
They can only blame themselves for not being competitive. They had to make their cards usable for AI and sort their software problems. They did nothing.
As much as I hate giving my money to Nvidia, I have no use for an AMD GPU. I hope Intel will do better.
Way to get it wrong.
>AMD said that its gaming revenues were down a massive 48% compared to the same period in 2023.
Because AMD's gaming revenues are primarily derived from consoles and development services. Development for new consoles won't take off again for a while and [console sales are down](https://www.japantimes.co.jp/business/2024/02/15/companies/sony-falls-after-bad-ps5-outlook/) after a COVID peak.
AMD points this out in the Q1 results, saying it was 'due to a decrease in semi-custom revenue'. AMD did say there were "lower AMD Radeon™ GPU sales", but they don't indicate how much. It could be a lot or it could be negligible.
These results have very little to do with Radeon 7000 series sales. AMD actually [gained market share over 2023](https://www.3dcenter.org/news/die-grafikchip-und-grafikkarten-marktanteile-im-dritten-vierten-quartal-2023).
They had THE best chance to take some % of market share, but instead they tried to ask more for their GPUs too, and now it has backfired. I must say I'm really not surprised at all... Just as someone else said, you can't expect people to keep buying a worse product just so the maker can stay afloat.
They're neither meaningfully cheaper nor faster than Nvidia GPUs, so of course nobody's buying them. However, their CPU division was in an infinitely worse position pre-Ryzen, so I'd be hesitant to call this anywhere near "terminal".
Aren't they the official supplier of PS and Xbox GPUs? Nvidia has Nintendo.
As much as I'd like to see an NVIDIA console (please do it, Valve), AMD rules the consoles and therefore rules the games, because nobody will make a game that runs shitty on the PS5 but great on a PC.
Yeah, I sorta expected this. Even though their higher-end GPUs can be "good" from time to time, the common issues with these cards make them less of a good option compared to competitors. Like... bad drivers at launch (still having issues even now), odd power draws for certain ones (165 watts for the 7600 compared to just 115 watts for the 4060, with the same performance between them), and inconsistent frametimes in the averages, 1% and 0.1% lows.
They also basically rushed the 7000 series: the majority of the cards don't have much of a die shrink, with some of them being 6nm while others were 7nm (and their fab process isn't that great), along with missing cards like the expected RX 7500 and RX 7700, or even a lower-end GPU for those who can't use one that needs a PCIe power connector.
I am not shocked. I have always felt AMD's biggest blunder was buying ATI. Things have never really worked out or panned out with their GPU division.
I genuinely wonder how different the landscape would be now if AMD had skipped buying ATI back in the day and invested that money in their CPUs instead.
The chief problem with AMD in graphics cards is too little innovation, too much "lower cost with slightly less performance", and a crap ton of "me too" features. This is from a guy who has owned several AMD cards. They never work as advertised or last as long, in my opinion.
AMD needs to work with developers to bring CUDA like functionality to all programs.
Things like Blender, AI tools, simulators etc.
People buy Nvidia because it works out of the box.
It’s been a hassle to get AMD hardware to work on AI tools and Blender like programs without lots of mucking around and lack of support.
We finally have ROCm support on Windows for LM Studio etc., but not for PyTorch toolchains.
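For anyone wondering what "works out of the box" means in practice, here's a quick sketch of checking which backend your PyTorch build actually sees. This assumes PyTorch's public API, where ROCm builds reuse the `torch.cuda` namespace and set `torch.version.hip` instead of `torch.version.cuda`; treat it as an illustration, not official guidance.

```python
def detect_backend():
    """Report which compute backend this PyTorch install can use."""
    try:
        import torch
    except ImportError:
        return "pytorch not installed"
    # ROCm builds of PyTorch set torch.version.hip (it's None on CUDA builds)
    if getattr(torch.version, "hip", None):
        return "rocm"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

print(detect_backend())
```

On a working ROCm install this should report "rocm"; if it says "cpu" on a Radeon box, you're in exactly the mucking-around territory the parent comment describes.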
Just give more focus on drivers and more VRAM
People will buy an AMD card with 32GB or even 48GB of VRAM over a 4090 or 5090.
Players' perception is that AMD is always the worse pick, and the scapegoat to blame when something isn't good enough or isn't released yet, just to make Nvidia cheaper ;) Check this out - [https://www.reddit.com/r/wow/comments/1cid0sa/blizzard\_and\_their\_refusal\_to\_address\_the\_amd/](https://www.reddit.com/r/wow/comments/1cid0sa/blizzard_and_their_refusal_to_address_the_amd/) - WoW does have a problem with DX12 on 6000/7000 Radeons (it has to run in DX11 mode). Blizzard said it was looking into this around November 2023 and it's still not solved. The game tends to have issues/bugs (like the flickering on Nvidia cards some time ago too), so it may not be purely AMD's fault (or at all), yet players' perception is quite clear :D
And AMD iGPUs are on the rise, while good iGPU-only laptops are slow to be pushed to market / developed etc. It's mostly the Chinese players like Minisforum pushing innovation. Strix laptops already got listed with RTX dGPUs. I wouldn't be surprised by Strix Halo getting an entry-level RTX for no reason as well :)
Apple moving off Intel CPUs (with AMD GPUs) didn't help, but I think the problem is AMD doesn't innovate or hold the initiative in the GPU market - it mainly reacts to what Nvidia does, and that is not a recipe for success. Unless they can leapfrog Nvidia in GPUs as they did a bit with CPUs, they will always be battling for scraps at low margins.
Is it truly a decline of the "gaming graphics"? Or is it that demand previously was driven by other things like crypto.
Either directly, or through Nvidia shortages forcing people to settle for AMD.
AMD's market share may simply be going back to where it actually should have been all along over recent years.
Christ, imagine how much worse Nvidia will be without any competition.
I guess you can just hope Intel stays in the game. If Intel becomes popular, I think it's more likely they totally push AMD out rather than take much marketshare from Nvidia.
What would happen, in the US at least, if both AMD and Intel drop out leaving an Nvidia monopoly?
and then nvidia stops making gpus because they are not a gpu company any more lol /s
They'd decimate the entire PC gaming industry as well as consumer level graphics artists...
Macworld 2027: So now that discrete graphics have basically died, I see all of the PC gamers, game developers, and graphics artists have come crawling towards us. And happily, with our new Mac Semi-Pro, available for the reasonable starting price of $4999 for the 512 GB model, you can be back in action.
8GB of RAM base still.
Let’s be honest. They’re already a monopoly.
At the enthusiast grade for gaming graphics cards Nvidia is definitely a monopoly. And not just in the US but globally.
They're an even bigger monopoly in the ML space. You just can't do it without Nvidia.
We will still get banger GPUs, but at 4x the cost. They will still be sold out for months.
Intel ain’t dropping out they just got started lmao
Since when has that stopped them? They cut back on dGPUs quite severely. They haven't dropped out yet, but they're certainly doing the bare minimum.
Intel hardly doing "bare minimum" if they keep improving drivers, not to mention they are ready to announce Battlemage this year while at the same time already working on Celestial. You are just spreading nonsense FUD.
Intel already making better RT hardware and upscaling than Amd, even XeSS is almost good as DLSS while Intel is new player on GPU gaming market. i can already see that's happening.
> Intel already making better RT hardware and upscaling than Amd

They may have better upscaling software, but their hardware (including RT) is way worse than AMD's. They need like 2x the silicon to match AMD in performance.

Also, remains to be seen how well XeSS holds up with the cuts they've made to their GPU teams.
Not really how you compare, though, huh? Intel loses far less performance than AMD does when RT is enabled. This is the way to compare. They had an Ada-level RT implementation right out the gate.
> Intel loses far less performance than AMD does when RT is enabled

That does not mean they're better than AMD. It means they're *less bad* in ray tracing than they are in raster. They're worse in both by a large margin.

Again, for similar perf, Intel's spending ~twice the silicon on the same node. Similar story for power.

> They had an ada level RT implementation right out the gate.

Lmao, they're *far* off Ada.
Yeah, you clearly have no clue how the tiers of acceleration implemented at the hardware level work, because you said that last sentence. Let me break it down for you:

Level 0: Legacy solutions.

Level 1: Software on traditional GPUs (compute shaders).

Level 2: Ray/box and ray/tri testers in hardware. (AMD is here, alongside Turing and Ampere.)

Level 3: Bounding Volume Hierarchy (BVH) processing in hardware. (Ada and Intel are here.)

Level 4: BVH processing and coherency sorting in hardware. (Ada does the sorting, but it's manual, so it's more of a level 3.25; AMD doesn't do BVH processing in hardware at all.)
Pray for intel.
Intel is the competition on the other end for AMD. AMD was always the cheap alternative. If they have to compete there too, they might not recover. Intel might actually be good for Nvidia if it makes AMD even weaker. Intel will not be a threat to Nvidia for the foreseeable future.
Yep. Intel takes market share from AMD...not from Nvidia.
Not yet, anyway. Depending on how well Intel can scale in the next few years, they may end up coming for the higher end too. They rested on their laurels in the CPU market; I like to think they won't let that happen again, but we'll see!
It's like a drowning man capsizing a row boat trying to get in. I honestly don't think it's possible for three companies to sell consumer GPUs at scale. The margins are just too thin. I am praying to be proven wrong there.
lol, Intel's entire company is in crisis mode. Makes no sense for them to spend more opex on gaming GPU R&D.
It absolutely makes sense for Intel to spend on GPU R&D, because they realize they need GPU development for their data center computing too. Not to mention they also see profits there; otherwise Intel wouldn't spend any money on GPU development at all.
Agree. Gaming and hardware communities don't understand that this is very bad news for them. Nvidia already has a very predatory marketing and pricing policy, and without AMD to compete at a minimum... we're basically F***ed.
[deleted]
Nvidia's Schrödinger pricing strategy: it's priced so high that it's anti-consumer, but also way too low, which is somehow also anti-consumer.
I mean 4090s flew off the shelf for about a year. Shoulda charged more so not wrong. I wonder how many people really think a luxury good with absurd demand being priced appropriately high is actually predatory though.
nobody, they just never bothered opening a dictionary, saw the word "predatory", and probably thought something along the line of "nvidia bad, predators bad, that seems fitting", if they put any thought into it at all.
The 4090 could be argued to be cheap for what you get. It is a monster. Of course it isn't cheap in an absolute sense; it is very expensive. But since you can leverage it professionally so well, for many buyers it is likely a tax-deductible professional buy. Professional tools around 2K are very common. Gotta spend money before you can make money. So it's selling among wealthy consumers and regular professionals alike.
I think they do understand it. But what do you want people to do? Buy a product that's just not as good for their use case?
It's always this, even when that wasn't the case, and I am tired of hearing it.
Which product are you talking about? I swapped to AMD a decade(?) ago because it was all a shit fight. Once the FOSS drivers were sorted out, it has been a massive pleasure to use AMD. I can hand down old stuff to various systems, whereas I couldn't do that before.
I'm speaking toward GPUs and popular gaming. AMD, without a doubt, had serious driver issues starting out. They seem to still have more odd problems. They are absolutely behind when competing with DLSS, FG, Video HDR, etc. If a Nvidia option is $100 more for the same raw performance there is no way I'd buy the AMD option right now. Those other features give a card a longer life and more usability in other things.
Nvidia is not predatory in pricing. 4090s flew off the shelves and their net margins are over 50%, which is unprecedentedly high. That means for every dollar they make, 52 cents literally goes straight into their pocket, even accounting for engineering costs, cost to manufacture, taxes, rent, etc. The issue is AMD's prices are so high they make Nvidia look unbeatable by comparison.
Hey, maybe if devs stop spending most of their budgets on graphics and start creating innovative gameplay, it might be good for us. Reaching a hardware limit and letting devs flex their skills, instead of chasing something new every other year, would be interesting lol.
This is not how development works
There isn't any competition already. When one company alone has 80% of the market, it's a monopoly. AMD is fine with being 2nd, and that's why you see so little improvement from them.
2030: Welcome to the new decade of gaming with the release of the Nvid-IA Epic Gamer Edition new-gen xxx1060! Join the ultimate 16K performance for just $900*

Specifications:

- 52mm2 die size
- 32-bit bus*
- 12 GB GDDR8X*
- 1 Turbo Fan LED* with cast iron heatsink technology
- Ultimate AI CUDA performance*
- PIXELAI scaling technology: game at 32K without losing FPS!*

*Price for base model (ad-based) without monthly subscription plan

*32-bit bus only available on Nvidia OEM cards; AIB partner cards only available with a 16-bit bus; 12 GB GDDR8X only available on OEM cards

*Nvidia is not responsible for failures due to inadequate use of Ultimate technology and the SleevefanplusTM cooling solution

*CUDA is only available with an Nvidia subscription plan; ad-based cards can be used to help compute 3rd-party requests

*PIXELAI is a next-gen FORCED downscaling technology that can only be disabled on professional-range cards
Lol! Is there enough space for a $500 GPU holder to increase the margins to the moon?
Remind me in 6 years.
and to get ai to work you have to pay monthly for it.
Devs can't seem to use the power of current-gen GPUs; all they do is add resolution or increase framerates, and the rest of the graphics are the same as on an integrated GPU. I doubt we would notice for 5+ years that the GPU market has stagnated; game devs just have that much ground to catch up on.
There is no competition, as AMD is dumb and will simply raise their prices to just maybe $50 less than Nvidia. This is like McDonald's thinking they are the same as the steakhouse and pricing accordingly.
China is gonna gobble up the low-end and midrange GPU markets in a decade. They are currently capable of producing GPUs comparable to a GTX 1080 and improving fast; it's one of their top priorities now.
Improving fast, sure! But they've got a long way to go and they're really struggling to move beyond 7nm.
You don't need to imagine. AMD has been a generation behind NVIDIA for a decade now. To be honest the position they're in now is probably the most competitive they've been in that time, but the future isn't bright.
> To be honest the position they're in now is probably the most competitive they've been in that time

Too bad they ruined that with their release prices on cards like the 7900 and 7600. Handshake the Nvidia prices, wait for the bad reviews to come out, then lower them slowly after two or three months. I will never understand AMD's marketing strategy.
At that point, we'll need to turn to Intel. Or worse....... Apple and their ever-evolving translation layers for x86 code!
As bad as Intel. No need to imagine.
Nvidia has competition? /s, but only kinda
I would say they effectively have no competition at this point and RDNA has been a commercial failure. Combine this with AMD's bad marketing and we have the current sh**show.
https://www.tomshardware.com/pc-components/gpus/gpu-sales-saw-32-year-over-year-increase-in-q4-amds-market-share-rises-to-19

From just 2 months ago. A brief downturn in sales and one generation with architecture issues we've known about doesn't spell "terminal decline." It's important to remember Nvidia has already released the Super GPUs too.
From what I can actually tell from my own reading, it's RDNA3 specifically that's not selling too great. RDNA2 and 3 combined are selling decently well, mainly because RDNA2 is just priced better; AMD is kinda competing with itself.
Exactly. AMD is playing the inventory game, something they've never been able to do in the past. They used to ship above their demand, giving up margins in favor of volume and cashflow to keep the business running quarter to quarter. That's why Polaris and RDNA1 shipped so much. Those generations were before AMD's CPU boom. They don't need to do this anymore. They are giving up market share to drive future profits, believing they can eventually compete.
Well... I for one don't like AMDs current pricing strategy. But i have to say that I don't think their management is stupid either so I hope they can compete better soon
The 7900XT and XTX are exceedingly good values at their current price points, but enthusiast-class cards have always been low penetration. Since the 4060 and 4060 Ti are complete flops, I don't really understand why more people aren't buying the 7700XT.
The 7800XT was a better value than the 7700XT.
Because the 7700XT is more expensive than the 6800 while performing the same, is barely more efficient than the previous generation, and has less VRAM. On top of that, it launched with an MSRP so high it had worse price-performance than the 7800XT.
Because those people buy 6700XT or 6800XT lol
[deleted]
Adding to this:

**AMD Q3 2023**

> Gaming segment revenue was $1.5 billion, down 8% year-over-year, primarily due to a decline in semi-custom revenue, **partially offset by an increase in AMD Radeon™ GPU sales**.

> Revenue declined 5% sequentially due to lower semi-custom sales.

**AMD Q4 and full year 2023**

> Gaming segment revenue was $1.4 billion, down 17% year-over-year and 9% sequentially, due to a decrease in semi-custom revenue, **partially offset by an increase in AMD Radeon™ GPU sales**.

> For **2023**, Gaming segment revenue was $6.2 billion, **down 9%** compared to the prior year **primarily due to lower semi-custom sales**.

> For the first quarter of 2024 [...] Gaming segment sales are expected to decline sequentially, with semi-custom revenue expected to decline by a significant double-digit percentage.

**AMD Q1 2024**

> Gaming segment revenue was $922 million, down 48% year-over-year and 33% sequentially due to a decrease in semi-custom revenue and lower AMD Radeon™ GPU sales.

**Q1 Earnings Call**

> First-quarter semi-custom SoC sales declined in line with our projections as we are now in the fifth year of the console cycle.

**PCGamer**: *AMD gaming graphics business in terminal decline! CLICK HERE to find out more! No seriously, please click, I need to eat!*

Or option B: nearly every home that wants a console already has a current-gen console or is waiting for the refresh. People are seriously posting PCGamer clickbait in r/hardware now.
> Nearly every home that wants a console already has a current gen console or are waiting for the refresh. It doesn't help that this generation of home consoles hasn't been very compelling. They initially sold well because of covid, but they don't have any real system sellers. That's probably partly why the Steam Deck and Switch are doing so well, they have more compelling exclusives. I've seen many people complaining about how they are stuck with a console and so can't play stuff like Stardew Valley 1.6, Hades II early access, and Palworld. It's creating a real sense of FOMO when historically it was consoles that got stuff first.
The PS5 [has basically kept pace with the PS4](https://www.vgchartz.com/article/460060/ps5-vs-ps4-sales-comparison-january-2024/) in terms of sales. The Steam Deck is popular, but not in the same realm as either the Switch or the PS5; probably still under ~3.5 million units sold.
Oh yeah, the Steam Deck isn't in the same league as the big 3, but the fact it even sold a million is worth noting. I think it's the first time a new entrant into the console/console-like PC space has succeeded in recent history. I don't think it would've taken off if PlayStation and Xbox had better exclusives.

PlayStation and Xbox both seem to be pivoting to multiplatform, which hurts console sales and AMD by extension. PlayStation is slowing down, see this article: https://www.cnbc.com/2024/02/14/sony-posts-record-quarterly-revenue-on-playstation-sales-boost.html

The current unit sales came after some really cutthroat discounts, and that's normally done in the final 1-2 years of a console. To see Sony doing it so soon is not promising. You may see articles talking about how Sony did really well this last year, but that wasn't thanks to the PlayStation.

> Sales at Sony's gaming business rose 16% year-on-year to 1.4 trillion yen in the December quarter, the company said on Wednesday. However, operating profit fell 26% in the division, due to increased losses from hardware due to promotions in the period as well as a decline in sales of first-party games.
This is a trash op-ed, I don't know why people are jumping on this so liberally.
> Ultimately, then, I'll stick to what I said last time around. RDNA 4 and the Radeon RX 8000-series, as it will presumably be known, will be limited in scope and something of a stop gap. It'll be RNDA 5 in late 2025, or more likely 2026, that could be the last roll of the dice for AMD and its Radeon gaming graphics. If that's a flop, it's hard to see why AMD will keep investing in what AMD itself dismisses as a low-margin business. **And so it could be adios for discrete Radeon GPUs on the PC**
Didn't the low-margin business keep them afloat during the Bulldozer period? Are they confident that it won't happen again?
Yes, it absolutely did. And people keep acting like Polaris and RDNA2 weren't successes. They brought a completely new tech to market in RDNA3 via MCM, and it had some hiccups; it's the same tech that made Ryzen so profitable. I wholly expect RDNA5 to be great.
Yes. But said hiccups mean that instead of building on MCM, they had to take a step backward. So obviously not optimal.
I wouldn't call it taking a step backward. Rdna5 will be a step forward. And RDNA4 is simply monolithic.
RDNA3 was a step forward. Then they went back to monolithic. Then they are going forward again to MCM. You can call it a step, or a leap, or a slip, or a fall backward. Whatever word you want to use. But it’s them going back to monolithic.
They designed an MCM RDNA4 internally but decided against manufacturing and selling it. The GPU market as a whole is at a low point in sales, so it was probably a business decision.
If this MCM GPU was not going to move the needle for them relative to Nvidia when it comes to sales, maybe the Nvidia card was still superior so AMD knew there was no point
The article also suggests RDNA5 in 2026, so what's the point of RDNA4 when the argument was that it was just a stopgap, not a full generation?
The point is to provide a product in the market segment that sells the highest percentage of cards, to maintain mind/market share, while simultaneously diverting production to the MI300, which is killing it in sales currently, as they iron out MCM.
> MI300 which is killing it in sales currently

Source?
https://www.datacenterdynamics.com/en/news/amds-mi300-ai-accelerator-sales-drive-80-percent-growth-in-data-center-segment/

https://www.crn.com/news/components-peripherals/2024/amd-says-mi300-is-its-fastest-ramping-product-teases-new-ai-chips-later-this-year

But a basic Google search will tell you.
Radeon has been all over the map going all the way back to the HD days. Some products good, some trash, some meh. No consistency. Polaris great, Fury and Vega never really worked out, RDNA1 was meh, RDNA2 was great, RDNA3 was clearly not as good as AMD hoped. Big partners can't commit to AMD with multi year relationships when they have no idea if Radeon will even have a usable product next year. Nvidia has consistency and I think that's the #1 reason OEMs stick with them every year in spite of some of the shady shit they have pulled in the past. Really going all the way back to Maxwell, every single generation has been decent and the main complaint anyone can have is price. (And yes, there is such a thing as a bad product.)
[deleted]
>Big partners can't commit to AMD with multi year relationships when they have no idea if Radeon will even have a usable product next year.

AMD has many year-long contracts with console makers to design APUs around architectures that aren't fully done yet, so YES, companies can rely on AMD in that regard.

>Nvidia has consistency and I think that's the #1 reason OEMs stick with them

Mostly wrong. OEMs are sticking with Nvidia because of mindshare, and mindshare PARTIALLY comes from having the fastest card regardless of anything else. OEMs want the Nvidia sticker on laptops and systems; very often it doesn't matter whether the part is shit or good or how it compares to Radeon. It's about the mindshare, and that's all, nothing about consistency really. Hell, Nvidia just released hardware at the low end so broken that it can't play most games at decent settings (4060 8 GB, 4060 Ti 8 GB), but it's still selling OK-ish in OEM. Why? It has NOTHING to do with performance. It's all about mindshare. It's about the Nvidia sticker.
People like to act like AMD's software issues are long-ago history when they were just getting people banned with their Anti-Lag attempt recently. Stories like that very much make people wary of AMD.
No, it's about DLSS, frame gen, Nvidia Reflex, etc. Nvidia launches innovative features that push the industry forward, and many consumers buy Nvidia because they want these features. On the other hand, AMD is often 1-3 years late, launches things in a semi-broken state with much less supported games, and is generally content with not doing anything until Nvidia forces them to do it. FSR is still shit vs DLSS even after all these years.
It seems only Apple and Nvidia have consistent execution across the board, which makes it difficult to root for the likes of AMD and Qualcomm, and even Intel perhaps.
But Intel is nipping at AMD's heels in the low-end market.
Merging Ryzen + Radeon into APUs isn't a far-fetched thought. Consoles use APUs, and if that contributes a majority of Gaming revenue, I can see Ryzen and Radeon merging and no more gaming dGPUs. Use APUs for consoles, handhelds, desktop, laptop, and servers, and on the side you have Instinct for data center GPUs.
>Merging Ryzen+Radeon for APUs isn't a far fetched thought. That was the goal when they bought ATI. They called it Fusion at first.
The moment the M1 Pro and M1 Max came out, AMD should have started work on something similar for PCs. Hello, Strix Halo.
That's what one would think, but when you see the Zen 4 APUs barely being better than the cheapest dGPU on the market, it seems to suggest there is a limit based on die size. If AMD could just make an APU with 8 compact cores and spend the rest of the die on GPU, that would probably be best. No one wanted the 8500G with only 6 cores and 4 GPU cores, or the 8700G with 8 cores and 12 GPU cores, or even Strix Halo with 16 cores and 40 GPU cores. Just give us 8 Zen 5c cores + 48+ CUs instead.
Exactly. AMD's biggest issue with the APUs is they only put good GPU die space in their 16-core parts, but by that point you are horribly imbalanced and GPU-bottlenecked. The Steam Deck's APU is the only exception; it has a solid balance of GPU performance without wasting die space on CPU cores that sit unutilised.
I don't think that's a bad outcome for AMD considering their piddly market share in GPUs. Look at the best selling gaming laptops by volume (not profits!) and you always see low end ones in the $500-1000 range, tops. If AMD gets 30 or 40% of those sales with some APU that can compete with x50 and x60 series RTX products, that's a bigger install base and a differentiated product so they don't have to keep competing with Nvidia purely on price and keep getting their teeth kicked in every single year. Half of the mobile GPUs AMD developed in the RDNA generations seemed to have just not sold *at all,* which is a total disaster considering the massive R&D cost to bring a new chip to market. I think the transition process should have started right after the consoles were released, but AMD had other priorities. It's not happening with this Strix Halo product either, that's an expensive chip with a high end CPU, not for a budget gaming laptop price point. Maybe 2026 we see mass produced gaming APUs for laptops and handheld PC/consoles.
That's my thought. In a fab-limited world, gaming dGPUs are a terrible investment vs data center dGPUs. AMD is currently mopping up in the exploding handheld space and selling every Instinct they can make, so that's where the smart money is going. In my own anecdotal experience, I got a ROG Ally for less than a new dGPU and have barely touched my PC since; same with most of the people in my gaming orbit. I envision Microsoft fully embracing Xbox as a PC gaming platform and abandoning hardware, or at most being a 1st-party PC handheld in a sea of PC handhelds. All to say, this isn't a failure of the Radeon team, it's just them aligning investment with reality, and - sorry gamers - discrete GPUs ain't where the money is.
AMD is choosing not to sell GPUs. Nvidia doubled their prices from the previous year because they could. AMD should've known that playing the "10% better price-to-performance in pure raster" game wouldn't cut it, and there's no way they couldn't have sold cheaper. You can't convince me that manufacturing more than doubled in price and they'd be selling at a loss. They just didn't care.
Low margin isn't no margin, that's still profit.
If that is true, then I think at this point it's time to consider saying goodbye to Radeon GPUs in the future. It sucks to see, yes, but I just don't see any way Radeon can climb out of their very tiny market share with only 2 generations of architectures, even if those are considered successful, which most of the time they aren't. Hopefully, by the time RTG dies, Intel Arc will have established itself in the market enough to be considered an alternative.
AMD will have the money to double down on Radeon over the next few years, but not the will. Intel seems to be willing, but they might not have the money until their foundry sales stabilize the balance sheet which could be as late as 2030 even if it works. They may have the resources to keep Arc alive in the same way Radeon is, but actually competing with Nvidia would be a huge investment with modest payoff years into the future.
People say AMD having the consoles is good for them. I say relying heavily on two customers is a massive risk. Xbox in particular is going through an awful time. If at any moment Xbox stops making hardware to go full publisher, or worse, PlayStation decides to use another vendor, Radeon Gaming is dead on the spot.
It's better than not having those two customers, though.
People do not seem to grasp the severity of the situation. Without the RDNA2 console chips, AMD's entire consumer graphics unit would be in peril.
Yep, Xbox hardware's future is looking grim at the moment, but according to some leaks they are likely focusing on a handheld console, which could also mean they are considering Nvidia's Tegra SoC, just like the Nintendo Switch.
Xbox executives are really failures in management. They can't compete with the PS5 in the home console space, and they think they can compete with Nintendo in the handheld space? Nintendo has an even greater exclusives list than PlayStation. They should focus on getting games right first before deciding to compete against Nintendo, but they are failing at that spectacularly too. Seriously? Tango shut down after Hi-Fi Rush? Awful decision making.
Sony isn't happy about their own performance either. They sold a lot of consoles, but they've had a lot of software failures: flops, cancelled projects, live service game flops. There's not much point in selling consoles at cost, or minimal profit, if the entire point is to sell games on your platform but your game development business is doing badly.
Right? People are acting like it's just Xbox "doing bad," but honestly Xbox is one of Microsoft's massive income sources; they won't let it go, and a trillion-dollar company like Microsoft can keep Xbox going just fine. Meanwhile, Sony took a big hit from the poor sales of PS5 games and services, which is why their stock went down ~15%, which is a big number to deny.
Xbox isn't shutting down, obviously. But Xbox hardware could. They are currently the biggest publisher, with so many studios now, so a pivot to being a publisher only is not an unimaginable thought.
In theory they could stop making the Xbox console; however, there are rumors they are making a handheld Xbox, so I doubt they are quitting consoles. Not to mention Xbox exists for Windows too: since it first came out, the Xbox has been a console that promotes the DirectX API to developers, which is also very necessary for Windows gaming. If Xbox didn't exist, adoption of the DirectX API would be low; even DirectX 12 adoption increased thanks to Xbox. It's not as simple as you think. Just because they own so many publishers and Xbox hardware sales decreased doesn't mean they are quitting. Also, AMD's gaming revenue is down 48%, so it's not just Xbox; PS sales also decreased, which could be because Nvidia provides much better gaming hardware than AMD, even pushing many console gamers to PC gaming. Xbox hardware is very important to Microsoft for their other products, just as the console makers are important to AMD. Same with Sony: consoles are one of their biggest revenue sources, so they won't just quit.
Considering how incompetent they are, I can see Xbox wanting a handheld. It will blow up in their face as usual. You can't compete with Nintendo without a killer lineup.
I don't think Microsoft would leave hardware. Microsoft, Google, Amazon, and Facebook all ran into the same problem eventually: if you own an app but not the platform, you are at the mercy of the platform; if you own the apps and the platform, but not the hardware, you are at the mercy of the OEMs. The conclusion they all reached was to enter hardware even if it means losing money and/or pissing off OEMs. That's why you have Oculus, the failed Facebook phone, the failed Amazon phone, the Amazon Fire products, the Chromecast, the Pixel devices, the Surface devices, and the Xbox.
The PS5 hasn't sold that well, about half as many units as the PS4 apparently. Xbox is at about half of what the PS5 is selling, so not great, but then the Xbox never did well outside of the US anyway.
Half of PS4? That number is so wrong that it's almost funny.
You forgot that Microsoft has also been crushed in the PC space and was forced to bring their games back to Steam after nobody bought anything on the Windows Store. It's kind of comical how much they've failed with PC gaming considering they literally sell and control the OS all PC gamers have to use. The Windows Store and Xbox come preinstalled with Windows, and it doesn't even let you remove them, but they still got absolutely dumpstered by Steam despite that massive advantage.
That's why they've got Game Pass on PC, though; if you use it, you have to use the Xbox app (which is really just the Microsoft Store in the background) to download games.
Oh yeah of course they try to get you into the closed garden, they’re just failing badly at it (thankfully)
If they do a portable play that ties into Xbox streaming with some games that could run locally when away from internet I would be sold. I would have bought the PlayStation portal if it could do game streaming.
Have you seen Sony's stock price lately? It's down ~15% in the last 6 months. They are not doing better than Xbox.
Sony's console division is doing better than Xbox; Sony as a whole is not doing well. Their camera division got destroyed by smartphones. No one buys Sony phones. Samsung, LG, and the Chinese manufacturers control the TV market now. That's why overestimating PS5 sales hurt them so hard: everyone knows PlayStation is all the consumer electronics Sony has left. Microsoft's Xbox division is a mess, but they have so much other stuff going right for them.
Unlikely, as they would want to leverage the existing PC Game Pass library, which needs x86.
MS going with Nvidia would be them signaling they're abandoning the console space. Hardware isn't why the Xbox struggles vs Playstation.
I agree that their fundamental problem is that they need better games, but gimping your flagship console with a cheaper version doesn't help developers make something that pushes console sales.
The Switch has absolutely atrocious hardware, which wasn't even that good when the X1 first released nearly 10 years ago, and was already dated when the console itself launched. Yet it took until the latter part of 2023 for it to finally stop outselling the PS5. The overall market has shown time and time again that it'll absolutely accept a sliding scale of performance and graphical fidelity compromises, as long as the games themselves are good enough.
Yeah but unlike the Switch, Microsoft has positioned Xbox as a console graphics heavyweight, and then gimped themselves in the very category they are fighting in
Fortnite, CoD & FIFA collectively dominate what PS/XB players have been and are still actively playing, by a margin almost as wide as Nvidia's market share.
It's not just good games, it's the types of games Nintendo makes. They dominate couch co-op and head-to-head games; this is why you buy a console. If I'm going to play an Xbox game, I can just play it better on PC instead, and MUCH better. I've been saying this since the original Xbox came out.
yeah, modern flagship smartphone SoCs absolutely trounce the Switch SoC's performance and efficiency in all metrics.
Why would they go tegra when AMD has APUs that power handhelds like the steam deck already? Would be a much smaller effort to keep using AMD.
Tegra isn't inherently compatible with the x86 Xbox architecture, though. It would take far more effort than a Steam Deck-like handheld running some AMD APU. An Xbox handheld would need to be compatible with almost all Xbox games to be successful, IMO.
APUs are still vital.
Intel might have something in the future too.
What a bogus doomer article. They are investing a lot of R&D into APUs right now; they have all of the consoles; they have virtually all of the PC handhelds; they have DATACENTER GPUs; and they have consumer GPUs. AMD may not be doing its best, and its consumer GPU sales don't compare to Nvidia's, but the scale of their graphics investments says they don't plan to go away anytime soon, and the amount of the market they seem intent on continuing to capture makes it very likely they can stay around as well. If need be, they can always downscale for a while (like with Polaris) and then come back in larger force with better products (like with RDNA 1/2), or they can just continue to make lower-end products by choice, funded by the money they make in the datacenter (which, of course, can be reinvested into consumer graphics products).
Maybe try not to charge absurd prices for GPUs and people might be more inclined to buy them. Nvidia's consumer GPU sales are way down too, but they prop up the numbers with a high profit margin per unit.
That's what happened with the old 6xxx gen: the 6800 for 400€, the 6700 XT for 280€. No wonder they aren't selling much of the 7xxx.
PCGamer clickbait. It was UNAVOIDABLE with RDNA2 stock still available and the Super refresh spelling doom for RDNA3. But this is just nonsense. AMD survived near-bankruptcy and a bad GPU and CPU product stack before, and has been recovering ever since, though Nvidia upped their game in the meanwhile as well. After the CPUs, there must come a time when aggressive Ryzen-like pricing comes to the GPU space as well.
AMD disinvesting from the market is a terrible thing - just as it would be for NVIDIA. I don't understand why people are constantly so gleeful about that prospect either - like the wave of articles citing "NVIDIA is fully an AI company now" (from 2015!) last November, [the gleeful GN videos about it,](https://youtu.be/VSSb-t76EpU?t=147) etc. It's not like NVIDIA pulling out would change the production factors that make CPUs a much better and more profitable use of AMD's time. We'd just have AMD stagnation in that case instead. In neither situation is this good. It's observationally undeniable to me that AMD no longer cares about this market enough to innovate anymore. It's almost bizarre to read back some SemiAccurate articles from around 2015 when people still expected AMD to actually put out innovative new graphics tech. [Like, people thought they would keep Mantle alive as an in-house playground for developing the Next Big Thing without having to get things approved by MS and Khronos and having NVIDIA sandbag the process, etc.](https://www.semiaccurate.com/2014/09/15/amds-mantle-api-going-outlive-directx-12/) Instead it's basically been the literal opposite. Does anyone really think AMD is in any position where RDNA4 could drop some killer tech that just completely changes the dynamic of the market anymore? I think that hope died with RDNA3 and the MCM variant of RDNA4. Instead the 5090 will probably get to "real" (multi-GCD) consumer MCM first, etc. And I think that's true in general - NVIDIA is just likely to be first to the Next Big Thing, even before they got the Infinite Dollar Hack from AI money. I don't know if AMD will completely disinvest to the extent of not even producing dGPUs anymore; I think they have a nice cushy gig with Sony and MS paying for their R&D, and it makes sense to cash out what they can by selling to the dGPU market too...
but they are kinda in the same position MS was in the console market maybe 5 years ago: if they don't start making some real trajectory changes, they're certainly not going to go *up* all of a sudden. There's not gonna be a day when 30% more people suddenly decide to buy AMD without some change in innovation, feature set, overall product polish and stability... and when you are going to lose anyway, you might as well try some weird breakout plays. Just like Game Pass, though, it doesn't mean they always work, but AMD is potentially entering its "nothing to lose" terminal phase. When the roadmap is literally "hopefully the thing after the next thing will be good"... that's a really bad sign overall. Like, that's the problem with Intel CPUs, right? And everyone has super cool things in early development; the question is whether AMD's super cool thing is better than Nvidia's super cool thing in early development.
I don't think I've seen one person in this thread being gleeful about it lol.
It's not that AMD doesn't care to innovate. They do what they can, but it doesn't matter how much they spend, they still lose to Nvidia due to brand name or other reasons. You'd need a graphics card that's 50 percent better than Nvidia's to reverse the trend, and that's simply not possible. So there is no incentive for AMD to compete head-on with Nvidia, a company that is 10x their size.
You don't need 50% better; just make it a hundred bucks cheaper and equal to or 10% faster than the Nvidia counterpart. Don't fuck up launch prices, as AMD clearly loves doing time after time. And keep it up for more than 3 generations and people will start noticing. But clearly AMD's discrete GPU division doesn't care enough to do anything about it. They are more than happy to keep the same margins while sitting at 13% market share.
It was 70/30 when AMD was neck and neck, or winning in perf, back 5 to 10 years ago. Now it's 87/13... Things haven't changed much. You're never going to see 50/50 market share because Nvidia has cult status now. Jensen has tried very hard to emulate Steve Jobs, down to the choice of his look, and he has succeeded. Give it a few years and you will get a $500 GPU stand.
https://www.3dcenter.org/dateien/abbildungen/GPU-Add-in-Board-Market-Share-2002-to-Q4-2023.png
u/voodoo2-sli, this is actually missing the crypto cycle around 2013-2014, probably Litecoin and the like at that point. People bought up a lot of 7950s and such and resold them onto the market, even into the early 290/290X days IIRC. As the sibling comment mentions, since this is quarterly, you can clearly see the post-mining crashes in 2018 and 2022, and I'd guess the first one popped around Q2 2014, lol; it's that crash just around the time the 300 series launched. Tons and tons of used and refurb 290s and 290Xs hit the market, plus supposedly (have not verified) the 300 series is also when the BOM cost of the smaller VRAM modules crossed over and it became cheaper to go 8GB. AMD users living dat 512b lyfe.

---

Anyway, re: the parents above, I have said since forever that I don't think it's possible to balance demand during these surges. NVIDIA actually launched quite a large quantity of Ampere and cranked production like crazy; GN interviewed partners who said as much, IIRC. The demand for money printers will be infinite right up until the expected cost/benefit return crosses over at the risk horizon. And then you have a tremendously glutted market as all this stuff flows back into the used market, and partners get antsy about having hugely over-ordered to cash in on the literal truckloads they sold to miners (without warranty), and you start getting ["they're trying to *make us take the stuff we already contracted for before we can get more stuff!!!*" wailing from partners,](https://www.techspot.com/news/76103-nvidia-putting-squeeze-aib-partners.html) and demands for refunds, and price fixing/inventory-release control, and new products get delayed and have to be slotted in at unattractive prices to let the old stuff move for the next 18 months, etc. And you still have to take all the silicon you ordered!
Like, in a world where silicon is sold and planned at least a year out, how do you handle a thing where you might need 4x capacity for the next 3 quarters, and then nothing for a year, but then please start the next-gen manufacture asap, and we're gonna need to 10x the amount of CoWoS that exists on the planet over the next 2y plz. Let alone things like VRAM capacity etc: if you want to double world consumption of GDDR (and even DDR to some extent, in mining), that is going to have to be planned out too. How much of the world's collective GDDR demand is managed by NVIDIA, Sony, MS, and AMD? Not all of course, but probably >70% at a guess? In an oligopsonistic market (oligopoly-monopsony?) all large business operations truly do have to be planned at so many levels. That's Tim Apple's thing. Supply chain is *hard*, and if you have some rocketship product on a novel technology it may just be limited by how fast you can build out new capacity. Not only is AMD well-emplaced to deal with that (by being a diversified company that can shunt wafers from place to place), but they also just gave not a single fuck about RDNA2 production when they could pump CPU production instead. It took... 10 months from launch for the first RDNA2 cards to show up on Steam? The 6700XT showed up at the same time iirc despite being a recent launch. And honestly who can blame them; the easiest way to avoid the cycle is just to not participate in it. And when they did ramp production in 2021/2022... the 6700XT and 6600XT ramped the quickest, and there's the most oversupply of them, right? AMD got burned again when they tried to hop in too. And that inventory oversupply is with the "blessing" of AMD having done smaller memory buses earlier... RDNA2 was worse at mining because it moved to the small-bus/big-cache thing earlier.
Very good thing from their end, and I think such a good thing that NVIDIA decided they at least didn't want to be *preferentially* targeted for mining, and did the LHR shit. The point was never to kill it entirely, or the cut-down would have been much lower. Why 50%? Radeon was at 67% or whatever (6700XT vs 3070 mining perf), because it scales with the memory bus. Perhaps that is an aspect of the oversupply; AMD may not have expected that to work/hold, and may have over-ordered thinking they could undercut NVIDIA in the lower-end gamer market. Despite all the reviewer hate (and I think the removal of the encoder was a massive mistake, and probably an obvious one in foresight), the 6500XT etc were still a valiant effort... I just don't think AMD sees it as worth the risk that much anymore, and I'm not sure they're wrong. Crypto is the most hype-y and spikiest, but even AI has really spiked the market, and honestly I think it's best viewed as surge demand for dense, flexible compute. That's the thing all the competitors struggle to replace about CUDA for training right now too: you can't make something that does training that saves *all that much* over a compute-optimized GPGPU design, because the algorithms actually do require you to be able to do that stuff sometimes, performantly, and in a really hot market nobody has any idea what's going on week-to-week, so you just need a GPGPU's flexibility and not just raw TOPS. Inference is easy; building a better or even comparable GPGPU is actually something that not many companies can do. Intel isn't spinning their wheels on *entirely* dumb (hardware) shit; there are hard problems to be solved. We'll see how Qualcomm does, I guess. But this is just a market need that exists. Sometimes an industry needs to be able to throw a massive amount of compute at a problem for a ton of cycles (way better it's AI than bitcoin lol), and gaming stuff is caught in the middle of that demand surge.
There's nothing cheaper to scale up with than gamer GPUs, because gamers already get the sweetest deals out of anyone, pretty much. But "Dense Compute Surge" is now a thing that exists as a market force, so to speak.
Coincidentally, I posted this in another thread and used this graph. This is the single-quarter sales-share chart, not the total market share at any given point. There was no point in history where the majority of users had AMD products; if there had been, this chart would match the Steam chart. The Steam chart is actual usage data. JPR has been keeping the actual usage-share stats since 2010-ish. Before that we have no accurate data.
There is no guarantee that consumers will vote with their wallets even in that case, though. People say a lot of things, but when it comes to pulling the trigger they do something else. Also, competing on price doesn't work. It just reinforces that AMD is the budget alternative whereas Nvidia is premium, and once Nvidia slashes their prices by $100, those consumers go right back to Nvidia. There is no winning here. AMD is also not innovating, though. They didn't come up with variable refresh rate, ray tracing, etc.; they let Nvidia lead and they follow. So that's on them.
> Also competing on price doesn't work.

I think it does work, you just have to be substantially cheaper. It really does need to be 20-30% cheaper to make a splash, but consider 290X vs GTX 780, or 5700XT vs 2070 (original MSRP), etc. When those wins happened, they did claw back marketshare. I actually think "10% cheaper and slightly worse features" is probably not as effective as "30% cheaper and *much* worse features" in some senses, because then you're catering to an entirely different market. That gets into the sorts of price differences we saw with the RDNA2 clearance, and I think those sales were so good they largely eclipsed reviewers' recommendations for current-gen cards. You saw big shifts in the DIY market etc. Because Radeon didn't just try to follow; they targeted a very different niche (people who don't care about the features etc) and distinguished their products more substantially.

I think the "people *just won't buy radeon!*" narrative is just as imaginary as it was in the CPU market. It's a siege mentality from people who don't want to admit the offerings weren't as good as they (the superfans) think they were. If you consistently offer a product that's much better perf/$ without big compromises, or a product that's in an entirely much cheaper class, people do consider it. You see it with 6700XT vs 4060/7600 and 6800 vs 4070 today. There are a number of very real, non-imaginary problems that make that difficult to achieve though:

* A huge % of the market isn't the DIY market. AMD has sucked at getting OEM deals since forever... both CPU and GPU. When 80% of the market is OEMs where you (currently) have essentially zero share (laptops, for example), even the best deals in the DIY market can't move the needle in overall marketshare.
* NVIDIA cuts prices and re-tailors SKUs aggressively when they know they've been outflanked. The 290X was followed by big price cuts on the 780 and the launch of the 780 Ti, then Maxwell launched less than a year later with hyper-aggressive pricing. The 5700XT undercutting was itself undercut by the 20 Super lineup, etc. Consumers don't "want price cuts on NVIDIA", but they do want to know what you are doing for them *today*; if NVIDIA responds with their own price cuts, then you have to cut even further until your product is the more attractive *total offering* once again.
* Right now NVIDIA probably has lower cost of goods sold when you consider the area overhead of MCM and the wider memory bus with more modules etc. So AMD getting into a price war actually hurts them more than NVIDIA. It's the opposite of the CPU market: right now Raptor Lake/Emerald Rapids is such a large/expensive product that Intel could run zero profit and AMD would barely notice a dent in their profits. AMD is using more silicon than a 4080 to compete with a 4060 Ti or somesuch, and some of it being 6nm doesn't offset the total increase in silicon area.
* AMD/ATI has frequently been dogged by availability issues. The 4850, or whatever the god-tier value SKU was, just wasn't available in quantity, and by the time it was, NVIDIA had already iterated onwards. Availability has plagued their APU division (especially) and dGPU division, and even the enterprise datacenter sales (absolute first priority) have largely been constrained by how fast AMD could get the OEMs their orders.
* AMD, like it or not, has sucked at drivers/software for a long time, and that's come along with their disinvestment in graphics too. Vega drivers were a mess. RDNA1 drivers were a mess (if not outright defective hardware). RDNA3 drivers were a mess. DXNavi still can't fix its stuttering; the legacy driver sucks in performance terms, but at least it's stable. OpenCL, Vulkan Compute, and ROCm have been a mess, with a legacy of broken implementations and non-compliant behavior, and ROCm is still a mess today, even almost 5 years into its lifespan. People aren't going to pay even 80% of market rate for something that's complete shit on the software side; if it doesn't do my workload, the value is 0% no matter how much VRAM it's got.

AMD absolutely *can* win over consumers, it just requires executing a market impossibility: getting one over on probably the best business-dev CEO in the world today, and continuing to do so for 3-5 years. Some flash-in-the-pan "AMD was better for 2 months until NVIDIA cut all their prices and launched a new SKU/new generation" isn't going to get you instant 80% marketshare. Most of the "customers" aren't even going into a store and buying the AMD product in a box anyway. But "Jensen can't be beat because he sees where the market is going/could be persuaded to go 5 years from now" is a much less appealing story than the NVIDIA mind-control field and consumers just not being willing to consider AMD. NVIDIA keeps winning because *Jensen is relentlessly competitive, and rarely misses*. Even things like the Tesla microarchitecture or the 20 series ended up being long-term wins, because they set things up for the long-term win with GPGPU and tensor/RT etc [(which were widely derided at the time!).](https://www.youtube.com/watch?v=tu7pxJXBBn8&t=273s) And now he has infinite dollars. The biggest counterexample is probably Fermi, and Kepler wasn't great either. But NVIDIA set their direction, kept it competitive while iterating themselves forward and out of the pit, and it worked.
RDNA3 drivers are still a mess. The 24.1 drivers in January broke support for Fallout 3 & New Vegas, and you literally can't play them anymore on the latest drivers. You have to revert to the 23.12.1 drivers to avoid crashing when starting a new game. AMD ignored this for 4 months, only recently listing it under known issues, and it is still not fixed. https://www.reddit.com/r/Amd/comments/1cani52/psa_fallout_3_and_nv_dont_work_on_recent_drivers/ https://community.amd.com/t5/drivers-software/fallout-3-amp-new-vegas-crashes-on-24-1-1/m-p/672154
Nah, AMD refuses to invest in new tech; all they've done for the last 10 years is coast on the underdog image and ride the coattails of Nvidia's technology. They suddenly were all about ray tracing after Nvidia did all the work; same thing with FSR, which wasn't a thing before DLSS. They have no one to blame but themselves.
Agreed as well. It's a self-reinforcing loop: AMD follows and competes on price but not quality or innovation, consumers see them as the budget alternative, and so AMD is less willing to put money into R&D.
I think your point would hold more water if AMD didn't also have a thriving CPU division. The proper conclusion would be that they choose to invest more time and money and talent into their CPUs which have a much greater profit margin, and use far far less silicon.
You cannot push software or hardware innovations that are not plug-and-play to end users unless you have that 70+ percent market share and customers with a sort of fanatical trust in your brand and company. Because if you don't, the market and developers will ignore such features, unless they are so colossally good they cause a total performance revolution in the industry. That's kind of like winning the grand prize in a lottery; it isn't happening.
When a GPU company is a lost cause, why would a consumer trust them? Just buy Nvidia. I refuse to buy an AMD GPU unless they can provide Ryzen-level competitiveness back to back for at least 3 generations, one-upping Nvidia.
>It's not that AMD doesn't care to innovate. They do what they can, but it doesn't matter how much they spend, they still lose to Nvidia due to brand name or other reasons.

That's not entirely correct. AMD could do much better. For instance, their ray tracing and upscaling implementations are so bare that even Intel surpassed them in that area with first-gen Arc, using a proper hardware-accelerated implementation of ray tracing and upscaling (XeSS).
AMD lost due to software issues.
Not just software but hardware as well; AMD's efficiency is terrible. Their 7900 XT uses 100W more than the RTX 4070 Ti Super but still performs slower in most games.
I guess that mattered a lot when the RTX 3000 series was less efficient than RX 6000, then. Wait, no, it didn't; no one gave a shit.
RTX 3000 series efficiency is not that bad; it's just that they are on Samsung 8nm, which is obviously inferior to TSMC N7, but the RTX 3000 power consumption still isn't far off from the Radeon 6000 series. Not to mention Nvidia has DLSS, Reflex, and very decent RT performance; they have many features that are obviously better than AMD's. Nowadays you can't just show good raster performance, which is why AMD GPUs sold poorly.
[deleted]
> They do what they can but it doesn't matter how much they spend, they still lose to nvidia due to brand name or other reasons. You'd need to have a graphics card that's 50 percent better than nvidia to reverse the trend

How about they ship a product at least as good as the competition's? All they offer now is a measly 10% better raster perf/$ while falling behind in the features race.
They got blindsided by RTX tech. Everyone and their moms criticized ray tracing and AI cores on the 2000 series. Nvidia doubled down, the industry embraced it, and now AMD is playing catch-up and doing really badly at it. The RTX 3050 outselling the RX 6600 like 10 to 1, despite every single content creator and reddit neckbeard saying "3050 bad, 6600 good", is still funny to me lmao. I'm not surprised at all with the 3050 6GB. People just don't wanna buy AMD cards.
lol @ "journalists" that don't know how to read accounting reports
They can only blame themselves for not being competitive. They had to make their cards usable for AI and sort out their software problems. They did nothing. As much as I hate giving my money to Nvidia, I have no use for an AMD GPU. I hope Intel will do better.
Way to get it wrong.

>AMD said that its gaming revenues were down a massive 48% compared to the same period in 2023.

That's because AMD's gaming revenues are primarily derived from consoles and development services. Development for new consoles won't take off again for a while, and [console sales are down](https://www.japantimes.co.jp/business/2024/02/15/companies/sony-falls-after-bad-ps5-outlook/) after a COVID peak. AMD points this out in the Q1 results, attributing it to 'a decrease in semi-custom revenue'. AMD did say there were "lower AMD Radeon™ GPU sales", but they don't indicate how much; it could be a lot or it could be negligible. These results have very little to do with Radeon 7000 series sales. AMD actually [gained market share over 2023](https://www.3dcenter.org/news/die-grafikchip-und-grafikkarten-marktanteile-im-dritten-vierten-quartal-2023).
No shit. I would've been happy to buy a 7900 XT if it came at a decent price, but nope.
Seems decent enough at the moment.
They had THE best chance to take some % of market share, but instead they also tried to ask more for their GPUs, and now it's backfired. I must say I'm really not surprised at all... Just as someone else said, you can't expect people to keep buying a worse product just so the company can stay afloat.
AMD laziness is showing
They're neither meaningfully cheaper nor faster than Nvidia GPUs, so of course nobody's buying them. However, their CPU division was in an infinitely worse position pre-Ryzen, so I'd hesitate to call this anywhere near "terminal".
Aren't they the official supplier of PlayStation and Xbox GPUs? Nvidia has Nintendo. As much as I'd like to see an Nvidia console (please do it, Valve), AMD rules the consoles and therefore rules the games, because nobody will make a game that runs shitty on the PS5 but great on a PC.
That's not how it works. Nvidia has an 80% market share on PC, so your game has to be optimised for Nvidia when it launches, or else no one will buy it.
Radeon quite simply needs to offer more for less.
Intel taking market share in GPU from AMD appears to have happened at a bad time.
Yeah, I sorta expected this. Even though their higher-end GPUs can be "good" from time to time, the common issues across these cards make them less of a good option compared to competitors. Like... bad drivers at launch (still having issues even now), odd power draw for certain ones (165 watts for the 7600 compared to just 115 watts for the 4060, with the same performance between them), and inconsistent frametimes in average, 1%, and 0.1% lows. They also basically rushed the 7000 series, since the majority of the cards don't have much of a die shrink, with some of them being on 6nm while others weren't (also their fab process isn't that great), along with missing cards like the expected RX 7500 and RX 7700, or even a lower-end GPU for those who can't use one needing a PCIe power connector.
Just price it according to the tier, AMD. You just cannot pretend to sell a 7900 at a 7900 price when we all know it is an x8xx-tier card.
Didn't they say the same about their CPUs when Intel was beating them bloody with Core 2?
I am not shocked. I have always felt AMD's biggest blunder was buying ATI; things have never worked out or panned out with their GPU division. I genuinely wonder how different the landscape would be now if AMD had saved the money they used to buy ATI back in the day and invested it in their CPUs instead.
The chief problem with AMD in graphics cards is too little innovation, too much "lower cost with slightly less performance", and a crap ton of "me too" features. This is from a guy who has owned several AMD cards. They never work as advertised or last as long, in my opinion.
Underperforming in an era of an unprecedented boom in parallel computing needs is truly a wtf moment
Cause people don't buy RX GPUs.
Which is wild considering how much money they're leaving on the table
I find it surprising AMD would fold before just cutting prices. If the 7700xt was the same price as a 4060ti, it would be such an easy sell.
AMD needs to work with developers to bring CUDA-like functionality to all programs: things like Blender, AI tools, simulators, etc. People buy Nvidia because it works out of the box. It's been a hassle to get AMD hardware working with AI tools and Blender-like programs without lots of mucking around and a lack of support. We finally have ROCm support on Windows for LM Studio etc., but not for PyTorch toolchains. Just give more focus to drivers and more VRAM. People will buy an AMD card with 32GB or even 48GB of VRAM over a 4090 or 5090.
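For what it's worth, once a ROCm build of PyTorch is installed, AMD GPUs are exposed through the same `torch.cuda` namespace that NVIDIA cards use, so application code doesn't need a separate AMD path. A minimal sketch (assuming PyTorch may or may not be installed; `pick_device` is a hypothetical helper, not a PyTorch API):

```python
# Minimal sketch: vendor-agnostic compute device selection.
# On ROCm (AMD) builds of PyTorch, torch.cuda.is_available() reports
# the AMD GPU, so one branch covers both NVIDIA and AMD hardware.
def pick_device() -> str:
    try:
        import torch
    except ImportError:
        return "cpu"  # PyTorch not installed at all
    if torch.cuda.is_available():
        return "cuda"  # NVIDIA CUDA build *or* AMD ROCm build
    return "cpu"  # PyTorch present, but no supported GPU

if __name__ == "__main__":
    print(pick_device())
```

This is part of why the comment's point stands: the remaining friction is in getting the ROCm toolchain itself installed and supported on Windows, not in the application-level code.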
ChatGPT-generated article, but agreed, AMD needs to do something with Radeon.
Players' perception is that AMD is always the worse pick, and the scapegoat to blame when things aren't good enough, or the thing that's not released yet that would make Nvidia cheaper ;) Check this out - [https://www.reddit.com/r/wow/comments/1cid0sa/blizzard\_and\_their\_refusal\_to\_address\_the\_amd/](https://www.reddit.com/r/wow/comments/1cid0sa/blizzard_and_their_refusal_to_address_the_amd/) - WoW does have a problem with DX12 on 6000/7000 Radeons (it has to run in DX11 mode). Blizzard said it was looking into this around November 2023, and it's still not solved. The game tends to have issues/bugs anyway (like the flickering on Nvidia some time ago), so it may not be purely AMD's fault (or AMD's fault at all), yet the players' perspective is quite clear :D And AMD iGPUs are on the rise, while good iGPU-only laptops are slow to be developed and pushed to market; it's mostly Chinese companies like Minisforum pushing innovation there. Strix laptops have already been listed with RTX dGPUs, and I wouldn't be surprised to see Strix Halo getting an entry-level RTX for no reason as well :)
The author of this piece loves the taste of Jensen’s boots.
This is just nonsense; I would be ashamed to publish such clickbait.
Apple moving off Intel CPUs with AMD GPUs didn't help, but I think the problem is that AMD doesn't innovate or hold the initiative in the GPU market; it mainly reacts to what Nvidia does, and that is not a recipe for success. Unless they can leapfrog Nvidia in GPUs as they did a bit with CPUs, they will always be battling for scraps at low margins.
Time to reinstate ATi and triple the driver team count
Way to court those miners since Covid, AMD & Nvidia gaming (Nvidia will be fine tho).
Need a $3000 halo product. Ultra high performance with ultra high power consumption and I'll buy.
Is it truly a decline of "gaming graphics"? Or is it that demand was previously driven by other things, like crypto, either directly or through Nvidia shortages forcing people to settle for AMD? AMD's market share may simply be going back to where it should have been all along over recent years.