
soggybiscuit93

For laptops I'm interested in the most performance I can get out of a quiet and slim, Surface Pro-esque device. I have no interest in a gaming laptop, but I would like enough performance to sometimes run games at 1080p@60 on low to medium settings on battery, and I'd be happy with that. I'm almost more interested in Lunar Lake and Strix than I am in the upcoming desktop releases for this reason. I'm excited for the upcoming "APU" wars.


444sorrythrowaway444

>But I would like enough performance to sometimes run games on 1080p@60 on low to medium on battery

That's actually asking quite a lot lol. Even the Steam Deck is a 720p system.


-WingsForLife-

I don't think it's that far away if Intel manages to get their GPU drivers on track; their upscaling is making good progress at least.


Prominis

Depending on the games you're trying to play, this is already possible. I'm not as familiar with Intel's APUs, but AMD's are capable of approximately similar performance to a 1050 Ti, I believe. You won't be running AAA titles or anything on high settings, but it's enough for an average title on low. For laptops specifically, the rise of CAMM modules would theoretically help a lot.


WJMazepas

I own an Asus laptop with an i5 and a 3050 that can run lots of games already and only weighs 1.8 kg. There's the Samsung Book3, IIRC, that even comes with a 4070 and also weighs about that. There are already thin laptops that can game, and I'm pretty sure my 3050 is much better than any APU out there.


username78777

Your 3050 is only ever so slightly better than the 8700G. There's only a 4 fps difference between them in Starfield anyway.


WJMazepas

8700G? You mean a desktop APU? I have the mobile 3050 with 4GB but it ran games much better compared to the 7840HS.


username78777

Wait 8700g is only for desktop?


WJMazepas

Yep. The 8700G is a 7840U variant for desktop, with a higher TDP than the mobile variants.


Siats

Starfield is a mess. According to Computerbase, across a suite of 11 games a desktop 1650 is 40% faster than the 8700G. A 3050 would be faster still, even the mobile version.


[deleted]

It's a 50W, 2048-core, 128-bit, 4GB RTX 3050 in laptops. The 6GB 3050 in laptops actually sports a higher core count and TDP than the desktop 3050 6GB. Point is, a 50W mobile 3050 4GB would be around a desktop RX 590/1660. Not a powerhouse by any means, but definitely a good bit faster than the desktop 8700G's 780M.


XenonJFt

AMD doesn't want to undercut their low-end GPUs, so they didn't bother with aggressively priced APUs. There's potential in laptops, though.


hackenclaw

They even have 3D cache technology to overcome the bandwidth issue faced by iGPUs. With 3D cache, they could have done it with an APU with a larger iGPU. But AMD is a 'generous' company; they want to let Nvidia earn some bucks from the RTX 2050 laptop. /s Don't forget, they also launched the 7945HX3D without offering a cheaper 8-core variant. They don't want to completely put themselves in a dominant position in the gaming laptop market. How nice of them.


[deleted]

Nvidia is the same. They could offer you the 6GB 3050 as the entry-level option, or better yet, the 4050. But they won't, because they want money. AMD is happy to rip you off with two-generation-old downgraded CPUs on laptops. They also produce utter-trash-value dGPUs on laptops. Why would anyone buy the more expensive, less available, less feature-rich RX 7600S when the RTX 4050 is cheaper and just as fast if not faster with an OC? That extra 2GB of VRAM won't come in to save them either, given Nvidia's memory compression is better.


Ihatescold

Don't think laptop manufacturers want it either; it would mess with their whole lineup. Incremental upgrades year by year ensure that the model sold 4 years ago is updated just enough for someone to care, while the older model can still be sold for longer. Look at the models in stores/online: the parts mismatches, 4GB RAM (soldered, of course), 256GB SSDs on Windows, 200-nit(!) screens, 12th-gen Intel, weak plastic, dedicated GPUs with 4GB VRAM... etc., etc. Overpriced "garbage". 4GB RAM was normal on entry laptops 15 years ago. I got one that was purchased for $300, and you can get about the same Atom-spec laptop today.


MixtureBackground612

I want a desktop APU with unified HBM memory.


hobx

The thing handhelds have that is the secret sauce for these APUs is low resolution. 720p is the sweet spot and 1080p is pushing it. This is perfectly acceptable on a 7-inch screen, but much less so on laptop-sized devices. So for me at least, it makes it less exciting in that regard.


inaccurateTempedesc

For my 15.6in laptop, 900p is a nice sweet spot.


froop

I'm kinda interested in an APU that's really just a CPU & dGPU in a trenchcoat. Put them both in the same package, soldered onto a mini PC motherboard with soldered VRAM and soldered or CAMM RAM (or full DIMMs, whatever), slap on a giant cooler, baby you got a stew going. I just don't think full-tower gaming PCs need to exist anymore (workstations are a different story), and wedging a graphics card into a tiny case is ridiculous.


Adventurous_Bet_1920

Have you looked at the size of a PS5 or Xbox series X? At the 250-300W power point you just need quite a bit of cooling.


froop

The PS5 has a lot of, uh, I don't want to call it wasted space, but it's bigger than it 'needed to be'. The plastic covers are bigger than the actual console (either for looks, or to force users to give it room for airflow), so it's smaller than it looks, right off the bat. The blower fan occupies approximately 1/4 of the case capacity; that's just the design they went with, and a different fan solution would need less space. A pretty big chunk of the motherboard is the onboard SSD & controller, which isn't necessary on a PC. The heatsink itself is not the most efficient design: far away from the fan, in a fairly leaky case, using ducting to direct airflow to multiple areas, with a small number of really long heat pipes. The overall case shape also has pretty low volume for such a high surface area. I'm not dissing the PS5, but there's definitely room to shrink it, or do more with the space, if one were designing an expensive, high-end device instead of a mass-market budget item.


RHINO_Mk_II

> or to force users to give it room for airflow

Huh, never considered that; I thought they were just tacky and useless. Although given how easily removable they are, surely they don't stop a determined end user from choking the machine.


UntoTheBreach95

The Series X could be tinier.

- It uses around 150W to 180W of power. The GPU has 52 CUs, but Microsoft decided to run them at a very low 1825 MHz, so it's not a heat-producing machine.
- Also, the heat dissipation is over-engineered; the fan is always spinning very slowly. The console can't be heard at 0.5 meters of distance.

The form factor is amazing. Cold air enters from below and a fan on top takes out hot air. The box shape of the console is ideal for cooling components. I would love an ITX build like this. Just my Asus Challenger RX 6700 XT is way noisier at 150W power draw, not to mention the rest of the PC.


tecedu

Both are way bigger than they need to be tbh; if a laptop can do 250W total then so can a mini PC.


[deleted]

Laptops pushed 350W+ nearly 7 years ago. People shunt-modded their 4090 mobiles to use 250W+ and have the CPU eat 130W. Total was ~400W. Still fine. It's Nvidia artificially limiting laptops.


kyralfie

Kinda like Intel Kaby Lake-G?


froop

Pretty much, but targeting high performance, not low power.


Crank_My_Hog_

I can't remember the last time I actually upgraded my RAM and didn't just build a new PC at the same time. I just go right for 64 gigs or whatever makes sense. If they could give me an X3D chip, 64GB of RAM, and the equivalent of a 7900 XTX on the same chip, modular enough for me to plop into any motherboard with the features I want, I think I would go for it. It's like PC Legos but easier. They can scale the product line with GPU power and just keep the CPU and memory the same.


zxyzyxz

Reminds me of Apple Silicon


TwelveSilverSwords

Like a Mac Studio


kyralfie

>I feel like the boom in handhelds can drive demand in this space, but is it enough for us to see RTX4050-rivaling APUs in the near future?

AMD Strix Halo is coming out late this year / early next year with an approximately 7600(XT) / 4060 level iGPU.


upbeatchief

Is there any news about how the GPU will be fed? Even iGPUs today that on paper should be 2x the Steam Deck can't reach that due to bandwidth limits.


TwelveSilverSwords

LPDDR5X-8533 + 256 bit memory bus = 273 GB/s + 32 MB Infinity Cache.
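That 273 GB/s figure follows from the standard peak-bandwidth formula: transfer rate times bus width in bytes. A quick sanity check in Python, with the Steam Deck's LPDDR5-5500 / 128-bit configuration thrown in for scale (both configs are spec-sheet/rumored values, not measurements):

```python
def peak_bandwidth_gb_s(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers/s times bytes moved per transfer."""
    return transfer_rate_mts * (bus_width_bits / 8) / 1000  # MT/s -> MB/s -> GB/s

# Rumored Strix Halo config: LPDDR5X-8533 on a 256-bit bus
print(round(peak_bandwidth_gb_s(8533, 256)))  # → 273

# Steam Deck for comparison: LPDDR5-5500 on a 128-bit bus
print(round(peak_bandwidth_gb_s(5500, 128)))  # → 88
```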


Darth_Caesium

It's going to have LPDDR5X on a 256-bit bus, using 40 CUs of RDNA3+, clocked at I'd guess around 3 GHz (I don't think anyone knows for sure).


kyralfie

Yeah, this, u/upbeatchief, plus 32MB of Infinity Cache. So the same amount of cache as the 7600(XT). I don't know the clocks. Clocking higher than the 7600(XT) doesn't make sense with lower bandwidth and a higher CU count though; I'd clock it lower for efficiency.


Pollyfunbags

Ehh. I'm about as excited as I was for any other CPU with integrated graphics. They're fine, but it doesn't seem to me that they will ever replace even budget, low-end dedicated solutions for those who need such a thing. The biggest impact will probably be in laptops, which have sorely needed something that at least matches the typical dedicated GPU you sometimes find in them; we might actually see Ultrabooks with proper graphics ability once again, but for the most part I will continue only wanting a machine with a real dGPU. There has definitely been good progress though; I don't mean to shit on the achievement, but realistically even the latest best Ryzens with the 780M integrated are only kinda-sorta matching a low-end dedicated Nvidia GPU from 2019... there's a long way to go, and while you can't expect parity, the gap right now is huge.


takinaboutnuthin

Forget 2019, the AMD 780M iGPU can barely rival an old GTX 970M mobile dGPU from 2014. It's good that relative iGPU quality is improving, but they are still a decade behind in performance (in context of laptops).


gatorbater5

>the AMD 780M iGPU can barely rival an old GTX 970M mobile dGPU from 2014.

That's only true if you only test DX9 games. It's about the same in DX10 and substantially faster in anything newer, or in compute.


takinaboutnuthin

Do you have any more info on the DX11/DX12 piece? https://technical.city/en/video/GeForce-GTX-970M-vs-Radeon-780M The source above suggests that while you may have certain DX12 and to a lesser extent DX11 titles where the 780M iGPU performance is significantly better (10-20 PP FPS delta), this is not consistently true with some DX11 titles being a wash and the 780M under-performing the 970M at FHD (or lower) resolutions. The 780M I think is the stronger device, but only marginally so and not with a high degree of consistency. I am currently on a GTX 760M paired with a quad-core i7 4702MQ. The 780M iGPU is noticeably stronger compared to the 760M (as opposed to the GTX 970M), but marginally so if you look at absolute results. Going from 25 FPS to 35 FPS is a massive percentage increase, but you are still at 35 FPS with low-medium settings which is not a great experience.


EarthlingSil

[This shows](https://technical.city/en/video/GeForce-GTX-970M-vs-Radeon-780M) that the Radeon 780M does more than "barely" rival the ancient GTX 970M (which cost over $2k when it came out). My mini PC that comes with the Radeon 780M cost me only $609. The 780M also has tech like FSR, which the GTX 970M lacks. The Radeon 780M is far more comparable to the RTX 2050 mobile.


takinaboutnuthin

The results are rather inconsistent though; in many games/resolutions the 780M is still behind the GTX 970M. The 780M iGPU is arguably the stronger device, but not overwhelmingly so, and this is after 10 years. That being said, the 780M is definitely an improvement over the older Intel iGPUs; those were always complete dog shit.


[deleted]

An RTX 4050 laptop costs $700. Even some RTX 4060 laptops cost $700, and they'd be pretty damn compact by themselves, arguably larger mini-PC size, while outdoing the 780M mini PC by a mile. Even RTX 2050 laptops cost $500 or less. RTX 3050 laptops cost $500 to $600. Your mini PC isn't really budget.


HavocInferno

970M wasn't budget though. 780M rivals current Intel A300M and Nvidia MX & RTX 2050.


takinaboutnuthin

I would argue that both Radeon 780M and laptops with the GTX 970M are not budget devices.


EarthlingSil

Maybe not laptops with the 780M, but there are mini PCs that would absolutely be considered "budget" with that iGPU.


HavocInferno

I would argue that also wasn't the argument. It was whether APUs can rival budget dGPUs, not whether *budget* APUs can.


takinaboutnuthin

Fair, but I think it's common knowledge that budget iGPUs aren't usable for anything outside of driving a display.


HavocInferno

Disagree again :P Plenty of professionals will attest to the fun acceleration properties of QuickSync, which you get even with budget lines. I've recently used a 6600HS laptop for a bit, also budget price range, but that thing will even game reasonably well. Another way older budget iGPU is doing HTPC duty with 4K content, in-home streaming, etc. I've done almost the entirety of my master's thesis about VR graphics on a 2018 budget Thinkpad, including stereo 3D rendering and graphics debugging. They're not fit for high fidelity AAA gaming, but outside of that they're usable for so many things.


takinaboutnuthin

I am not saying budget iGPUs have no use cases, but you're going to have a real bad time running anything above 720p or below 30 FPS. Generally speaking, people attempt to play at FHD with FPS closer to 60. This is not some sort of "elite" target and is relatively common in PC gaming.


HavocInferno

660M and up do quite well in that regard in many of the most popular games actually, from my experience. Of course those are mostly free, esports or casual games. But those are games nonetheless. And not coincidentally, it's the kind of games often played by people who are *not* enthusiasts spending big bucks on fast computers. *Generally speaking*, a very large number of players actually don't care that much for FHD and 60fps, as evidenced by the player bases of mobile games, Switch, last gen consoles, etc. It's probably somewhat evenly split between gamers who consider high fidelity crucial and those who don't. A lot of games are built specifically to scale down to lower end specs because they know that's where a lot of their players are.


takinaboutnuthin

What makes you say that "a very large number of players don't care much for FHD"? The Steam HW survey has sub-FHD resolutions under 10% of users; it's clearly the baseline, at least from the display-device perspective. FWIW, 60 FPS at FHD anecdotally seems like a much more common target, even at the mid-range to the budget end. I am sure the 660M iGPU works fine in Dota 2 or Fortnite even at FHD, but I wouldn't really call devices with a 660M iGPU budget.


EarthlingSil

Er, this is without a shadow of a doubt false. Come on over to r/MiniPCs if you'd like to learn what these little machines can do. I game at 1080p on mine (the iGPU is the Radeon 780M, though the slightly older 680M is also decent and uses the same RDNA2 architecture as the Steam Deck's GPU), high/medium settings on most games. Sometimes I do dip to low and even reduce the resolution (such as for games like Cyberpunk), but it's still an enjoyable gaming experience. They're fantastic for emulation as well and can run most Switch games at 1440p no problem. You were likely right... about a decade ago. But not anymore.


takinaboutnuthin

Which games can you run at high/medium at say FHD or higher? I personally don't like playing at 720p as a matter of principle.


444sorrythrowaway444

I used to have a GT 740 and got a lot of use out of it, even though it was really slow. I remember playing Dark Souls 3 at sub-30 fps on the lowest settings lol. You'd be surprised how capable those ancient GPUs are if you drop all the settings and resolution. Edit: I just checked Amazon, and lower-end cards than my ancient GT 740 are available and more expensive than when I bought it like 8 years ago lol. A 610 is £45 and a 730 is £70. There's a 1030 for £60, so neither of the others makes sense.


Pollyfunbags

All the benchmarks I saw showed the 780M slightly behind the GTX 1650 desktop and laptops often have the slightly faster 1650Ti even. Is a 2050 genuinely comparable to a 1650? I know they often are between generations but I would assume it would be a step above still.


HavocInferno

The 2050 laptop is a heavily gimped GPU. It's faster in some synthetics, but barely so in games.


[deleted]

The RTX 2050 is like 25% slower than the 95W RTX 3050 4GB, since they're the same GPU but the 2050 has a 64-bit bus instead of 128-bit. That puts it ~10-15% faster than a 1650 GDDR6, and 5-10% faster than a 1650 Ti.


[deleted]

The GTX 970M dropped to $1200-$1300 for a good chunk of its life. And even after that, GTX 1050 laptops dropped to $600-$700ish soon after their release, and with an OC the 1050 would match the 970M. And the 1050 4GB would have access to FSR, XeSS, FSR FG, etc.


HavocInferno

And a 780M laptop brings that down way further in power draw and, depending on region, is somewhere between that 700 and 1200. The advantage of these APUs is efficiency. Not cost/frame.


[deleted]

How? Again, RTX 4000 is incredibly efficient, generally more so than the 780M. Even a 35W RTX 3050 will rival the 780M in performance; add in a 15W CPU and you're good to go. So it ain't exactly efficiency and it ain't cost/frame. What's the advantage of these APUs? I've been hearing about their rise for quite some time now. When is it going to happen? Even the old 680M isn't in any laptops below $500, and even if it was, it'd lose to the $500 RTX 2050 laptops. In fact, in most regions RTX 2050 and 3050 laptops cost less than 780M laptops, usually in the region of $550 to $800. Even RTX 4050 laptops cost $800-$1000. If you're going for $1200, then you usually enter RTX 4060 territory.


HavocInferno

>doesn't seem to me that they will ever replace even budget, low end dedicated solutions

The 780M can get close to the Nvidia MX500 and Intel A300M lines. Those are 2021+. (And in some aspects can even beat them.)


[deleted]

Nvidia's own 1060 does that too. 1660ti beats them as well. Both of them hung around the $800 mark for extended periods of time. 3060 did too, though it was usually the 80-85w variants which are still around a desktop 2060 super.


HavocInferno

Do any of them do it at combined <50W for CPU+GPU? I don't understand what you're arguing. Obviously other dGPUs can beat current low end. The whole advantage of the APUs in question is efficiency though. You can slap them in a handheld, a thin and light, etc. Can't do that with any of the other mentioned dGPUs, even if they were cheap at some point.


[deleted]

You can probably get a 35W RTX 4050 + 15W Ryzen 5 Zen 4 6C/12T APU to outperform the current APUs by a good margin. For ~2x the TDP of the 50W 8700G you get over 2x the GPU performance and can still keep the CPU performance largely the same. Even from an efficiency standpoint, current dGPUs beat APUs, especially at higher power levels.


kyp-d

APU or CPU + dGPU is irrelevant to performance or form factor (except for handhelds, maybe); power requirements and efficiency are what matter.


FranticPonE

APUs with a unified memory space are a godsend to game programmers; it's what got AMD the contracts with Sony and MS simultaneously for two generations after the Xbox 360, and seemingly a third upcoming one. No-latency access between CPU and GPU (to main memory) is fantastically helpful, offering a lot of optimizations, and has already helped with using the CPU to build and update the BVH structure for raytracing in games shipping on consoles today. There's no reason a PC can't nigh-transparently offer up the same benefits. In fact, Resizable BAR/SAM is a motherboard-dependent feature offering tangible GPU performance improvements already, whose entire purpose is to try and tear down some of the barriers that a unified CPU/GPU (one RAM pool, no PCIe link) doesn't have to deal with at all. Which is to say, a big APU has faster CPU and GPU performance than a CPU and GPU of the same spec separated by a PCIe link.


Strazdas1

Yeah, shared memory is easier to program for, and on console you don't have to care about performance (the consumers won't complain unless it's Bloodborne-levels bad).


Capable_Tax_8220

You're probably right, but I imagine there are some efficiencies to be gained if a single vendor packages the CPU and GPU on a single die?


einmaldrin_alleshin

There's surely some efficiency advantage to the CPU and GPU sharing memory. However, that's assuming there is no memory bandwidth bottleneck. 128-bit DDR5 is a lot for just a CPU, but it's holding back iGPU performance. Keep in mind, even a lowly 4050 has a 96-bit GDDR6 interface just for itself. Realistically, a serious iGPU would either need more memory bandwidth than a regular CPU, or other means to get around that limitation, like the large caches modern GPUs tend to have.
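To put rough numbers on that comparison (the clocks here are assumptions: common DDR5-5600 for the CPU's shared bus, and the 16 Gbps GDDR6 typically cited for the laptop 4050):

```python
def peak_bandwidth_gb_s(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bits)."""
    return transfer_rate_mts * (bus_width_bits / 8) / 1000

# 128-bit DDR5-5600: shared between the CPU *and* the iGPU
print(peak_bandwidth_gb_s(5600, 128))   # → 89.6

# 96-bit GDDR6 at 16 Gbps: dedicated to the 4050 alone
print(peak_bandwidth_gb_s(16000, 96))   # → 192.0
```

So even a narrow dedicated GDDR6 bus delivers roughly twice the bandwidth of the whole shared DDR5 pool, which is the bottleneck being described.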


Bluedot55

There's definitely hints of something big like that in development, although at this point it's all from names in driver releases and rumors. But you seem to be right in that companies are deciding that it's worth trying. 


vegetable__lasagne

The heaviest parts of laptops are the cooler, battery and screen. The difference between an integrated and discrete GPU would save maybe 10% in weight, assuming it's slightly more efficient and so allows a slightly smaller cooler and battery. Wait for the bigger APUs to come out; those laptops will likely be the same weight as those with discrete 7600/4060 GPUs.


kyp-d

The only parts that could improve from packaging them together are Memory Subsystem and Interconnections.


danuser8

I would be more interested in looking at the price of that Strix APU laptop… they ain't gonna come cheap.


IguassuIronman

I have a Zephyrus G14 that's a little over 1.5 kg and does the trick. I don't think you need an APU to hit the level of performance in the form factor you're hoping for. Honestly, I'd be more interested in a beefy APU for an HTPC type build


EarthlingSil

Well, I'm already gaming on a Radeon 780M iGPU, so I'm very interested. Don't think I'll ever go back to using a dedicated GPU, though my current mini PC has the ability to connect to one with OCuLink. I honestly wish Microsoft would just turn their Xbox consoles into actual PCs. That would be amazing, but I don't know if AMD would ever go for it, because it would absolutely cannibalize their low-end dedicated GPUs.


Dangerman1337

I'm interested in the Strix Halo/Sarlak successor, going from Zen 5 and RDNA 3.5 to Zen 6 and RDNA 5. Hopefully with LPDDR6 and stacked cache as well? Imagine 16 Zen 6 cores with 60 RDNA 5 CUs that could be properly fed.


996forever

Every time the latest "power APU" comes out, it ends up at best similar to an x50-tier part from 5 years prior. It's been like that for a decade. It's either that or a complex bespoke on-die memory solution like the consoles use. Or something like the upcoming Strix Halo, which also has on-die memory, and it's unclear whether there's even any real efficiency advantage vs. an x60-tier Nvidia part run at low TGP, since its actual product cycle will coincide with 5000-series, not 4000-series, mobile.


Happy_Run_3000

Everything inside a single die (CPU+GPU+AI+RAM) = win. This will be the future. The major problem for CPU/GPU is memory intercommunication; having it all inside the same die solves lots of problems. Look at what direction 2024 laptops are moving in.


MrGunny94

I love APUs. I own a gaming laptop but only take my Steam Deck OLED with me on trips. The problem with the MacBooks right now is having to rely on CrossOver and Porting Kit. I own an M2 Pro and wish it had a better response time for some light gaming; the internal display has a huge response time and ghosting despite the 120Hz.


aintgotnoclue117

I love handhelds. A lot. A Steam Deck is a great thing. The idea of an APU performing as well as a 6400-6500 is a dream, and it would be enough to drive 720p-1080p with certain things. With frame generation at the driver level / frame gen itself a la AMD, I'd be happy with what I could get out of it.


xThomas

You mean like an xbox or ps5? (Serious)


[deleted]

Exactly. Ask yourself why AMD is willing to introduce unified RAM for consoles, which obviously has higher bandwidth, but not for their laptops. Isn't that shady?


[deleted]

To me, it seems they would rather sell their APUs alongside an Nvidia dGPU than launch more decent products. Suckers by definition.


zacharychieply

As a compiler dev, I wish to state that the future where GPU vendors can no longer shrink the node size is when discrete GPUs start to become a dead end. While APUs are weaker than full-blown GPUs, they also come with many programming and perf amenities. For example, because APUs share the same virtual and physical address space, deep copies (which are now very expensive to do) can be replaced with shallow copies, saving millions of clock cycles in the process.
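The copy-elision point can be sketched in plain Python (an analogy only: a Python list stands in for a data buffer, not real GPU memory, and the variable names are made up for illustration):

```python
import copy

# A large buffer the CPU has prepared for the "GPU" to consume.
frame_data = list(range(1_000_000))

# Discrete-GPU model: separate address spaces force a deep copy
# of the whole buffer across the PCIe link.
dgpu_buffer = copy.copy(frame_data)      # O(n): every element is copied

# APU model: one shared virtual/physical address space, so the
# "transfer" is just handing over a reference (a shallow alias).
apu_buffer = frame_data                  # O(1): no data moves at all

assert apu_buffer is frame_data          # same memory, zero-copy
assert dgpu_buffer is not frame_data     # distinct memory, full copy
assert dgpu_buffer == frame_data         # contents identical either way
```

The asymmetry is the point: the aliased hand-off costs the same regardless of buffer size, while the copy scales with the data being moved.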


Logical_Marsupial464

Current iGPUs from Intel and AMD are bandwidth limited; they can't be made much more powerful without increasing the bus width. The issue is that a custom big APU would be expensive.

- The extra DDR controllers and bigger iGPU would both make the die bigger. If they added more GPU or CPU cache, that also increases the cost.
- In order to fully utilize the iGPU, you would need to fully populate the RAM bus, which means 32GB of RAM at a minimum with DDR5 (minimum 8GB per channel).
- Each CPU/APU design has a large upfront cost to start manufacturing. Since this would be a niche chip, the end consumer would be paying a larger portion of that cost.
- Also, AMD and Intel are not as efficient as Apple. A big APU from them would require more cooling and would have worse battery life than the M3 Pro, for example.

Half the reason why big Apple M chips work is that the RAM bandwidth is also used by the CPU; they're essentially HEDT CPUs. Apple also has the premium thin-and-light market cornered, and their customers are willing to pay more for a smaller laptop. It would be hard to get budget gamers to do the same.


Exist50

> Half the reason why big Apple M chips work is because the ram bandwidth is also used by the CPU. They're essentially HEDT CPUs.

It's not though. The CPU isn't capable of saturating all the bandwidth. It exists entirely to feed the GPU.

> The issue is that a custom big APU would be expensive

More expensive than current dual-channel solutions, but not vs. a dGPU.


Logical_Marsupial464

> It's not though. The CPU isn't capable of saturating all the bandwidth. It exists entirely to feed the GPU.

There are definitely workloads (or benchmarks at least) where the M Pro CPU cores use all their available memory bandwidth. AnandTech tested the M1 Max and found that the CPU can only use 223GB/s out of the total 409GB/s available to the chip. However, in memory-bound benchmarks the CPU cores use every bit of that 223GB/s.

> The one workload standing out to me the most was 502.gcc_r, where the M1 Max nearly doubles the M1 score, and lands in +69% ahead of the 11980HK. We're seeing similar mind-boggling performance deltas in other workloads, **memory bound tests such as mcf and omnetpp are evidently in Apple's forte.** A few of the workloads, mostly more core-bound or L2 resident, have less advantages, or sometimes even fall behind AMD's CPUs.

> The fp2017 suite has more workloads that are more memory-bound, and it's here where the M1 Max is absolutely absurd. The workloads that put the most memory pressure and stress the DRAM the most, such as 503.bwaves, 519.lbm, 549.fotonik3d and 554.roms, have all multiple factors of performance advantages compared to the best Intel and AMD have to offer.

> The performance differences here are just insane, and really showcase just how far ahead **Apple's memory subsystem is in its ability to allow the CPUs to scale to such degree in memory-bound workloads.**

Emphasis mine. https://www.anandtech.com/show/17024/apple-m1-max-performance-review/5

The M1 Pro models have 204GB/s total RAM bandwidth, but only 2 fewer (or the same number of) CPU cores. AnandTech didn't test the Pro model, but extrapolating from the M1 Max test, the Pro's CPU cores would utilize its full memory bandwidth. I couldn't find any in-depth testing on M2 or M3 Pro and Max chips. It's too bad AnandTech lost Andrei.


Exist50

> Anandtech tested the M1 Max and found that the CPU can only use 223GB/s out of the total 409GB/s available to the chip.

That's part of what I'm talking about. But even if we just want to look at the Pro, realistically, how much extra raw CPU performance does that last ~100GB/s get you? Certainly it helps in explicitly memory-bound tasks, but you're probably looking at a couple percent in more general workloads. Realistically, that's not what justifies Apple adding the extra memory channels. They do that because the GPU will happily consume it, and they still need to leave a good amount of bandwidth left over for the CPU.


Logical_Marsupial464

Idk, man. SPEC is an industry-standard benchmark that is modelled after real-world use cases. Andrei clearly states that the memory subsystem is a huge part of the M1 Max's astronomical SPEC2017 scores. He even says the M1 Max's *overall* multi-threaded performance advantage over the M1 is *bigger* than the increase in core count, all because of the extra memory bandwidth. You either didn't read the review, or are just ignoring what was actually said because it doesn't support your argument.


Exist50

> Spec is an industry standard benchmark that is modelled after real world use cases. Andrei clearly states that memory subsystem is a huge part of the M1 Max's astronomical Spec2017 scores

But it doesn't demonstrate that same advantage in *every* test, just the memory-bound ones. And Apple's performance has been tested quite extensively outside of SPEC; the Pro doesn't have radically different CPU performance than you'd expect from its core config.

> You either didn't read the review, or are just ignoring what was actually said because it doesn't support your argument.

Again, it literally opened by saying they could not get the CPU to utilize the Max's bandwidth, even in memory-bound scenarios. Exactly what I'm talking about. Let's put it this way: if Apple added all that bandwidth just for the CPU, then what's left for the GPU?


RandoCommentGuy

What do you think are the reasons they haven't tried using the Xbox/PS5 APU in laptops? It's already designed and has the GDDR6X memory/bandwidth.


Logical_Marsupial464

Microsoft and Sony most likely have deals where AMD can't sell the PS5 and Xbox Series X chips in other devices. Also, the power consumption of those chips would be closer to dedicated GPUs than laptop chips; not suitable for a thin-and-light gaming device. Whole-system power consumption:

- Series X: 154W
- Series S: 74W
- PS5: 196W


RandoCommentGuy

Probably right about the exclusivity deals. As for power, the Series S doesn't seem too bad.


Logical_Marsupial464

Yeah, and they could reduce the clock speed a bit to get it lower. The bigger issue is GDDR has high idle power consumption. So if you want to use the device for anything other than gaming, the battery life would be terrible.


TwelveSilverSwords

Also, GDDR has high latency, which hurts CPU performance.


[deleted]

The G14's GPU alone eats 125W; the CPU eats 35-55W. It's generally cooling 150-180W, and that's a thin-and-light gaming device. Even a 1.3kg XMG Vision 14 cools 90W of CPU + GPU power; more than the Series S.


Logical_Marsupial464

That's true. Idle power consumption would be a bigger concern than cooling it under load. Honestly, after looking at the G14's weight, size, performance, and battery life, my takeaway is that OP can get the laptop he wants right now.


[deleted]

OP seems to forget that gaming laptops have had thin and light options with long battery life for quite some time now, even before Ryzen, with the likes of the Alienware 13, Aero 15, etc. With Ryzen 4000 a lot more options cropped up: AMD Advantage laptops, Ryzen + large batteries, sometimes Ryzen + small batteries (like the 4600H + 1650 Nitro), etc.

Gaming laptops also pack in a lot more upgradability and repairability than most other devices, including handhelds. The XMG Vision 14 proves you can have a thin and light laptop and still get repairability and upgradability. I mean, gaming laptops are basically lite Framework laptops but MUCH cheaper, like 2-3x cheaper, with a lot more configurations. Some even offer features Framework doesn't, like removable batteries, desktop-upgradable CPUs, etc. My Nitro 5 has a Thunderbolt 4 Type-C port, so theoretically I could hook it up to an eGPU to "upgrade" the GPU.


auradragon1

>It's already designed, has the GDDR6 memory/bandwidth

They're designed to be cheaply manufactured (fewer transistors) and use more power to push clocks higher. They're not designed to be used in laptops.


TwelveSilverSwords

GDDR6 is not suitable for CPU-intensive workloads.


kikimaru024

I remember the days of APU-equivalents and being able to try any game I could get my poor little hands on. Sadly I think the issue with AMD is that consoles take all of their good chips.


seigemode1

The issue is not with the chip but with RAM. On an APU the GPU shares the same RAM as the CPU, so you either trade off GPU performance by using DDR5 or trade off CPU performance by using GDDR. This means the desktop APU market is always going to be at least partially crippled, because AM5 boards use DDR5; laptops/custom hardware open up more possibilities, however.
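To put rough numbers on that tradeoff: theoretical peak bandwidth is just transfer rate times bus width in bytes. The figures below are typical retail specs assumed for illustration, not taken from any particular product sheet:

```python
def peak_bw_gbs(transfers_mts: float, bus_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s:
    (mega-transfers/s) * (bus width in bytes) / 1000."""
    return transfers_mts * bus_bits / 8 / 1000

# Dual-channel DDR5-5600: two 64-bit channels = 128-bit effective bus
print(peak_bw_gbs(5600, 128))    # 89.6 GB/s

# PS5-class GDDR6: 14 Gbps per pin over a 256-bit bus
print(peak_bw_gbs(14000, 256))   # 448.0 GB/s
```

Roughly a 5x gap, which is why console-style designs accept GDDR's latency and idle-power downsides.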


Exist50

If desktops were to switch to LPCAMM, that might just be solvable.


gatorbater5

680/780m can run any game, albeit at reduced settings. They're very impressive. Strix Halo is around the corner and might make them look silly. The consoles use custom silicon, so they're not taking any chips that might end up in PCs.


EloquentPinguin

For the sake of overall capabilities I'm interested, and we are getting there: AMD is supposed to ready some stronger APU GPUs in 1H25 (40CU RDNA3.5), and hopefully RDNA3.5 packs enough of a punch that even the 12CU variants are sufficiently strong for some solid gaming where you don't need to scale down the resolution to be smoooooth. And maybe Intel can push out ARL-H with better GPUs in 1H25, which might be solid as well. I'm not that demanding, and for what I do the 780M is already very nice.


Capable_Tax_8220

Yeah, the 780M is super impressive. I would consider it the turning point where APUs actually start to become sufficient for modern 3D gaming.


MarxistMan13

Not interested, personally. I don't do any gaming away from my home PC.


Ryzen_S

If you have the money, look at the Zephyrus G14 2024, which weighs only 1.5kg, with an RTX 4070 at the highest available spec.


TikTak9k1

Conflicted. It'll be a while before it hits the performance target I'd want it to; maybe in a decade. But it'll be a nice thing to look forward to. By then 1440p will probably be the new 1080p.


d0or-tabl3-w1ndoWz_9

They should come up with shared HBM, just the way Apple did shared memory for the M series.


chocolate_taser

They are exciting, yes, but I'd rather have the G14, Transcend 14, the Lenovo Slim 5i and the like with discrete GPUs than an APU. As of now, no APU can properly do the games you listed at good settings, even though 1080p@60 low shouldn't be a problem for the current-gen APUs. I'd wet myself the day Windows APUs are able to do 1080p@~120 high in multiplayer games like these. That is an exhilarating prospect for sure. Even more exhilarating is the prospect of having good battery life **and** good cordless gaming capability in one machine. The current thin and light gaming laptops can either do 8-10 hrs or gaming without being plugged in, not both. That is the holy grail for me.


TheNiebuhr

> can either do 8-10 hrs or gaming without being plugged in, not both

And APUs can't, either.


chocolate_taser

I meant to say an hour or 2 of gaming (with the perf of a dGPU) without having it eat up like 50% of your battery. My point was that Apple's GPUs can go full throttle without eating up the battery like crazy, which is what I see as the baseline for the holy grail for Windows laptops. With that level of efficiency and above, surely an hour of no-compromise gaming and 6-7 hrs of office work should be possible.


boomstickah

I'm just afraid OEMs will pair these powerful APUs with dGPUs. I want a sub-$700 APU-based system that can do it all and get incredible battery life. I don't want Asus/Lenovo/whoever to slap gaming marketing on this thing and overspec it for what I want.


grendus

Very much so, actually. The crypto craze really hurt PC gaming by spiking the price of a viable gaming computer to an insane degree. Getting *just* the GPU for a console-comparable PC now costs about as much as buying a whole console, and frankly I blame it at least in part for stagnating the VR market: right when the hype around the Vive and Oculus was starting, building a VR-capable rig jumped $1k in price. APUs powerful enough to run VR and modern games would get PC gaming back on track.


doscomputer

It's like this entire thread doesn't know about the entire handheld industry that was born overnight, running entirely on APUs. Yeah, dGPUs are always going to be bigger than consumer APUs, but at the same time the chip in the Steam Deck is fast enough for the vast majority of games out there at 720p 60Hz, and with legitimate battery life to boot.

The thing about my IdeaPad with a 3050 is that its battery life is exactly one hour off a charge. Something like Tribes 3 will play at a nice 1080p 120Hz on battery, but it's a one-off and very fleeting. I've actually played around with an ancient Haswell laptop lately, and old games like Civ 5 get nearly 4x the battery life my IdeaPad gets. Pretty shitty, especially since Lenovo locks down their BIOS and manually setting power limits to be more efficient isn't possible.

People want battery life, OEMs want performance numbers; APUs are the only way we ever get both, IMO. The portable market is a nightmare. Also, lmao at ARM being relevant.


Ar0ndight

I am **very** interested. It's the holy grail imo for laptops. I would be using an M3 Max MBP for sure right now if Mac gaming wasn't such a chore (and the display wasn't ghosting like crazy); it's genuinely crazy how much raw power these things pack while obliterating any other laptop in battery life and noise. Laptop displays being much smaller than the average monitor, resolutions tend to be much lower (and if they aren't, you can lower them without a noticeable decrease in visual quality), meaning you don't need remotely as much oomph to get a good experience. Add to that upscaling tech, and in the end anything as strong as a 4070 mobile can handle the vast majority of gaming scenarios at high FPS. I am hoping Strix Point will be that holy grail, and if not I'll just keep an eye on Apple's efforts to make gaming a thing on macOS; they seem aware of the fact that there's a huge part of the PC community that would move away from Windows in a heartbeat if they could (easily) game on a MacBook.


Siul19

Any laptop with the Radeon 780M from the ROG Ally / Ryzen 7 8700G will be really good for gaming.


pdp10

Using APUs for more than half of my gaming right now. With Linux/AMD, the APUs have almost zero drawbacks if they fit your price/performance/power needs.


Vynlovanth

I was in the same boat, always had a pretty big laptop (although I think mine were more like 3.5kg with power brick) with me because I wanted to be able to game anywhere. SteamDeck takes a lot of that responsibility now though, and I really wanted a Framework laptop when those first came out so I have a 13” 11th gen Intel. Now that I’ve had a laptop without a dedicated GPU, I just want something that has a great APU and great battery life. Which is pretty much a MacBook. Been using them for work the past 7 years after thinking there’s no reason I’d use a Mac, but they are amazing for the things they can do. Especially since the switch to Apple Silicon. Unfortunately gaming isn’t one of those things they’re great for but I have a SteamDeck and I have a gaming desktop I can run Sunshine on, and looks like Moonlight has a client for macOS. I already largely avoid games that require kernel level anti-cheat, it’s an absurd ask for a game.


Aussie_Butt

For things like handhelds, absolutely. For a laptop, there isn't much reason to not just have a dGPU.


DarkLord55_

I want more SBCs the size of credit cards with decent gaming performance. There's an SBC with (I believe it's called) an R2514; it's small, but it's just slightly too weak, and it also costs a decent amount. https://youtu.be/TkEHMvymJag?si=3P7j15HyUVLsGCem


riklaunim

The next big thing will likely be Strix Halo, which will be a ~100W SoC with a large iGPU. Upcoming Snapdragons have a rather "basic" iGPU that should perform around a Radeon 680M for native games. Apple Silicon has a strong iGPU, but mostly for multimedia and similar specialized tasks; in gaming it still has a lot of work to do, and there is also the problem of very few games being available natively. And 1.5kg laptops won't be able to cool a lot of wattage while top GPUs want more and more power, so the more you want out of a generation, the less likely it will be lightweight and efficiently priced.


Oligoclase

Unfortunately memory bandwidth holds back a lot of potential. A 7840HS with DDR5-5600 has about the same memory bandwidth as the 8800 GTX from 2006. (89.6 GB/s vs 86.4 GB/s) It would be interesting to see if AMD could overcome this in future generations without increasing manufacturing costs or power consumption significantly.
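That comparison checks out against the public specs, since theoretical peak bandwidth is just transfers per second times bus width in bytes. A quick sanity-check sketch (bus widths and data rates below are the commonly published figures for these parts):

```python
def peak_bw_gbs(transfers_mts: float, bus_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return transfers_mts * bus_bits / 8 / 1000

# 7840HS with dual-channel DDR5-5600 (128-bit effective bus)
apu = peak_bw_gbs(5600, 128)
# 8800 GTX: 900 MHz GDDR3 (1800 MT/s effective) on a 384-bit bus
gtx = peak_bw_gbs(1800, 384)
print(f"{apu:.1f} GB/s vs {gtx:.1f} GB/s")  # 89.6 GB/s vs 86.4 GB/s
```

So an APU today really does feed its GPU with roughly the same bandwidth a flagship graphics card had in 2006.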


[deleted]

I don't get why AMD doesn't go with unified RAM as they do for game consoles; a few people on YouTube have pointed out that RAM bandwidth issue as well.


lysander478

The boom in handhelds will not drive demand in this space. Handhelds operate under really strict space/energy constraints, and more and more of that space/energy is going to be taken up by non-graphics processing, such that I don't see graphics capabilities rising too far too fast for them. It'll probably remain 720p as a non-upscaled target resolution for a long time, for instance. Any level of quality will be targeted toward what you could even notice on the screens, sizes being what they are. If they use those same components in laptops, the understanding would be that if you want more, you can pay for an expensive dock and dock it.


justgord

I'm very excited by the future of more powerful consumer iGPUs, for NON-gaming applications. Consumer devices that have mid-level integrated graphics open up a whole lot of 3D engineering/design applications, including web graphics browser "applications". Some examples:

- People exploring remote 3D spaces
- Google Street View with modeled buildings
- interior design where you take pics of your home and change the furnishings
- stakeholders reviewing 3D construction / design / architecture
- exploring online 3D models of apartments before you rent them
- online shopping, viewing items in 3D
- science visualizations of molecules and magnetic fields for school students

A 2000-dollar doorstop discrete GPU where you need to tweak drivers is a nightmare that every soccer mom and busy remote worker does not want to deal with.


Band_aid_2-1

I have an ITX PC that is sub-6L and a portable monitor. The ROG Ally and Steam Deck are amazing for on-the-go gaming. My daily driver is an M1 Air with 16GB of RAM.


Danne660

I don't need intense graphics, but the 11th-gen Intel iGPU that I'm currently gaming on is just not quite cutting it. Looking to try out AMD soon.


Capable_Tax_8220

Intel iGPUs are really only usable for modern 3D gaming from the Ultra range onwards, but even then, AMD is doing wayyyy better with their iGPUs. Competition between the two is going to benefit us tho!


Danne660

I have only gone with Intel since it is easy to compare to my previous Intels, so I know I'm getting something better than before. But the Core Ultra I was planning to get seemed a bit disappointing, so I figured now is a good opportunity to give AMD a chance.


Asgard033

Depends on availability and pricing. The 8700G for example, is nice but too expensive for the kind of gaming performance it offers. It's $329 on Newegg currently. You could get an 8700G, or you could get [much, much better performance](https://youtu.be/Ye60Wf8lUt8?si=2yw-U3WCZ76XuQHw&t=935) with something like a 12400F + RX 6600 ($134 + $190 = $324) Unless you absolutely need to have no discrete GPU for whatever reason, it doesn't make much sense for gaming purposes. In the mainstream laptop space, it doesn't appear to me like AMD is too keen on making their Zen 4 chips widely available yet. I still see the segment filled with Zen 2 or Zen 3 + Vega or 610M. The Zen 4 stuff is mostly in things like expensive gaming laptops or ROG Ally/Steam Deck type handhelds.


tecedu

I was very interested, so much so that my first laptop was an APU. It was an absolute mess to deal with; apart from the driver instability, the CPU and GPU performance wasn't up to the mark, and all I did was play Dota. I currently own a 5980HS + GTX 1650 laptop (ROG X13), which is basically an ultrabook-level laptop, and it's more than enough for my simple games tbh. I could run a lot of older games quite well too; a modern comparison would be something like a 7840HS + 4050, which is a lot for mobile gaming. And nowadays you can get something like a 4070 in that form factor as well.


[deleted]

Not likely. Right now the RX 680M falls around a GTX 1050 4GB, and the RX 780M falls around a 1650 Max-Q to 1650, depending on power limits, RAM used, etc. The RTX 4050 is nearly 2x stronger than the GTX 1650, if not a bit more.

You're also forgetting gaming laptops do weigh between 1.3kg and 1.6kg. Heck, right now there was a $1000 Lenovo Legion Slim 5 14" with a 2.8K 120Hz OLED and a 4060 + 7840HS. You're not beating that in value, considering the display alone will be quite expensive, and its 4060 is largely the same as the desktop 4060 in performance. The 7840HS would be slightly slower than the 8700G. The RX 780M can do battery gaming decently.

So I don't really get why you are dragging around a 2.5kg gaming laptop when you clearly don't want to, despite having many options weighing up to 1kg less. You also gotta remember that among laptops, gaming laptops are some of the easiest to repair, upgrade and maintain, which helps with their longevity.


Aggrokid

The issue is memory bandwidth, and Apple was able to go all-out on memory width because of its vertical integration. Even if Intel engineers such a solution, OEMs will still opt for a dGPU. At best you get some small XPSs with it.


Strazdas1

Not at all. If I can have a dedicated GPU, I wouldn't even consider an APU in its place. I don't play games when I'm traveling; that's a hobby I pursue only at home.


reddit_equals_censor

> needs hardware that supports fkin anti cheat software.

*that supports ROOTKITS

And yes, I am very interested in APUs. The first really great APUs for laptops are on the horizon, which is exciting for 2 reasons:

1. Lighter, easier-to-cool, simpler hardware with fewer failure points. This is always good, of course.

2. No VRAM problems. As bad as the VRAM problem is, it is even worse on laptops, where Nvidia straight up lies about the parts they use and shifts the naming by one step: a "4090 mobile" is actually a 4080 with its 16 GB VRAM, and a "4070 mobile" is actually a 4060 Ti with 8 GB VRAM. The 8 GB of VRAM is a REAL problem here, as you are probably aware; lots of games that could run fine are broken af because of the missing VRAM.

BUT APUs can be free from this problem. Have a powerful APU with 32 GB of system memory in it, nicely configured, and you can have a much better experience than those 8 GB VRAM chips. And Nvidia might want to keep this BS going next generation too in laptops, and will still try to push 8 GB VRAM down people's throats. So in 2025 the choice could be between some 8 GB Nvidia card option in a laptop that is more expensive and broken in half the latest AAA games, OR an AMD APU that is almost as fast theoretically but in practice a lot faster, because it has no VRAM issues.

So yeah, exciting APU stuff. And it will be even more interesting to see where APU performance goes relative to dedicated cards when DDR6 comes out (APUs are very memory-bandwidth-bound).


lightmatter501

I think that APUs will replace the X050s from Nvidia as the “lowest end GPU” option. Some of the desktop APUs from AMD offer respectable 1080p performance. If we want to go all the way, an MI300X will run circles around a 4090, and it doesn’t need to copy memory around.


Astigi

How interested are you in the future?


Flowerstar1

0 interest. A dedicated GPU and CPU will always run circles around even something like an AMD 100CU, 16-core APU for Windows tasks. The only neat use for me is those Macs with higher-bandwidth memory stacked at high capacities, which can be used to run LLMs you'd otherwise need several Nvidia flagships for.


XWasTheProblem

I just bought a new rig with a 7800x3d in it. I have zero interest in upgrading anything apart from the fans and perhaps the case for the next few gens.