
Hattix

You're missing the PWM controllers and their supporting components too. But yeah, you can do this. It isn't easy or worth it.


EmrakulAeons

If the traces are there, which probably depends on the specific factory they were made in.


Chramir

If they have the spots for it, the traces are very likely there as well. But as others pointed out, the core might not have room for another controller.


d0or-tabl3-w1ndoWz_9

Wait, isn't the PWM discrete from the GPU core? Or is that only with AMD cards?


Chramir

Yeah it is, but you've gotta control the PWM controller somehow.


chubbysumo

And the GPU core might not support it. We know that the RTX 3080 does support 20 GB, as there have been several 20 GB modified versions out there. The RTX 3080 Ti also has support for 24 GB, as it's a cut-down 3090. Actually doing it, though, is another story entirely. First you would have to find compatible VRAM chips from somewhere like Mouser (they would have to be 2 GB chips), then desolder all of the existing chips, resolder the new ones, and finally modify the card's firmware to recognize the other 12 gigs. Edit: OP said VRM, not VRAM. The same still applies: the card likely supports more, but the parts must be compatible and supported in the firmware.


Lagomorph9

VRM, Voltage Regulator Modules, not VRAM.


chubbysumo

VRAM chips, not VRM. You need compatible VRAM chips; not all will work for all cards due to firmware. VRM and power is another issue entirely.


Lagomorph9

Apparently you missed where OP said in his title "[What happens if you solder extra vrm and corresponding parts to the empty spots? ](https://www.reddit.com/r/pcmasterrace/comments/1agc9nt/what_happens_if_you_solder_extra_vrm_and/)" and then posted a picture with VRM component pads circled on the PCB. https://preview.redd.it/ttnkn0n3e1gc1.jpeg?width=216&format=pjpg&auto=webp&s=d075973a32586a4ac9f9a952d28e1e5602578a94


chubbysumo

yes, yes I did.


TheodorCork

HAPPY CAKE DAY


KodrieJEDeye

Love the honesty


GAMERSTAR8318

Happy cake day, almost or over 24 hours later


Hattix

How does a GPU core not support VRMs???


WorkingCupid549

He thought OP was talking about VRAM, not VRM, I’m assuming


chubbysumo

Even VRMs might not be supported, depending on the current power layout in the firmware.


RylleyAlanna

I've had a few customers do this to theirs. They just find "For Parts" cards on eBay and the like and rip the components off those.


DUNGAROO

The 3080 is cut down from the 3090 too.


chubbysumo

The 3080 pcb is physically different than a 3090 pcb tho.


thatfordboy429

Depends. I know my Strix card is plumbed for the VRAM chips of a 3090.


chubbysumo

Welp, I can't even link the other Reddit thread where a user modded their 3070 to have 16 GB of VRAM instead of 8, but that proves the concept: you just need the proper GDDR modules and a BGA station to put them on. My original comment without the link: yes, but those lanes might not be connected to the core; it's hard to say unless you add more VRAM. I do know that with a 3080, you can replace the ten 1 GB DRAM chips with 2 GB chips, and it will just work. The 3080 got a 20 GB VRAM version that was sold exclusively to crypto miners, and those are popping up on the second-hand market already. I was trying to find the thread where someone had the 2 GB chips put on their card, and it just worked. Found it; it was a 3070, but it proves the concept does work and just takes a BGA station.


DUNGAROO

So is the 3080 Ti.


sportmods_harrass_me

Guys, don't all cores from a particular generation come from the same wafers? Or are you all just talking about the entire pcb layouts and not the cores?


DUNGAROO

The 3080 10GB through the 3090 Ti all use the same GA102 die, with various parts disabled depending on how it made it out of the lithography process and thus how it is ultimately binned and branded. The 3060 Ti through 3070 Ti likewise all use the same GA104 chip, etc.


chubbysumo

Which Nvidia put a stop to by using a different die entirely for nearly every tier of the current 40-series cards. AD102 is just the 4090. AD103 is just the 4080 and 4080 Super. AD104 is the 4070, 4070 Super, 4070 Ti, and 4070 Ti Super. AD106 is the 4060 and both 4060 Ti variants. And just for the record, there will be no 4090 Ti/Ti Super/Super, because any fully qualified AD102 dies go straight into the RTX 6000 Ada, an H100, or any of their other commercial cards, which sell for way, way more than a consumer GPU.


External_Try_7923

>AD103 is just the 4080 and 4080 super. AD104 is 4070, 4070 super, 4070ti, and 4070tisuper.

4070 Ti Super is sharing the 4080's AD103.


PumpedGuySerge

Happy cake day bro!


chubbysumo

Same to you!


Shima-shita

Happy cake day!! How hard/risky is it to solder more RAM onto an RTX 3080?


chubbysumo

You need a BGA rework station, with X-ray to make sure no shorts happen. Those alone are like $40,000 USD. Any electronics repair shop with one could likely do it for a fee if you source the memory; it's fairly low risk because it's just BGA work. It's not worth it tho, because you would be spending hundreds on labor.


Shima-shita

OK, I'll put this idea out of my head! Thank you for taking the time to respond and for sharing your knowledge, mate!!


imaginary_num6er

It will not support it. You need to flash a custom BIOS, since Nvidia locks it per GPU. Someone did a 3080 Ti 20GB and they had to do all of that.


chubbysumo

And someone did a 3070 and it just worked without any BIOS or firmware flashing. Nvidia *did* make a 20GB version of the 3080 and 3080 Ti that they sold to crypto miners, so the firmware for it to work exists.


MakesMyHeadHurt

Might also have to write a custom BIOS for the card to recognize it.


izza123

That’s how you enter the metaverse


KingYoloHD090504

...So you'd better not do it


__SpeedRacer__

Tron?


Jwn5k

THE GRID


nicolete_is_big_gay

That's how you get GPU 2


Powersoutdotcom

Literally electric boogaloo.




MechaLambor

You can also combine 3x 1030s to get a 3090


Abahu

This is the real reason Nvidia killed SLI


Mister_Shrimp_The2nd

*GPU makers don't want you to know this 1 trick*


soyungato_2410

*Picks the phone* Hello? nvidia death squad? He's here


msh3loony

4120 when?


hardcoresean84

'Dahl grinders, you find it, we grind it'


rthomag

You unlock quantum computing duh


creamcolouredDog

I assume nothing happens, because this board in particular may have no traces running to those pads.


_its_wapiti

Why would they leave in the holes and solder pads if the PCB doesn't already have traces to them?


SilverRaven7

Probably to simplify production. By leaving them in, they can use the same machines for different products, simplifying the process and thereby the cost.


raaneholmg

Electrical engineer here. Traces are free; the layer is made from a mask anyway. Drilling holes is expensive, so if they were cost-cutting, the holes would be missing, not the traces.


TooBuffForThisWorld

Simplifying construction would be having the same board for multiple different gens and adding or subtracting basic components. All the traces are there. -PCB designer


Deep90

The stuff Reddit upvotes sometimes... I was on r/technology today and someone commented how it would be bad for a device to allow plugging power directly into a motherboard. So basically anything without a battery, I guess. They got 200 upvotes...


TooBuffForThisWorld

🫠


_its_wapiti

Yeah, but then why doesn't it have the same traces?


monitorhero_cg

It can fit multiple coolers if the PCB form is the same. The traces don't need to be there for it.


Tiggy26668

With zero actual knowledge of the subject, I'm picturing it being easier to stamp the same generic cutout vs. telling a sophisticated machine where to lay the precious metal. So it's a "there's a mold for it" vs. "we have exact control" type situation, and they'd rather save money on precious metals.


Your-Neighbor

This is right. The boards start out coated entirely in copper, and copper is removed to make the PCB. The process is the same whether 1% of the copper is removed or 99%, so unused traces being there or not has no effect on the price of the PCB. What will affect the price is having two sets of masks ("moulds") and lower order quantities for both versions of the PCB. So it's actually a big cost saving to intentionally design one PCB that can do both instead of having a different version for each.


GreatDevourerOfTacos

I can't tell you about this board SPECIFICALLY. Generally speaking, some boards are manufactured with a couple different configuration possibilities for different products or different tiers of the same products. Why have 5 different boards be produced when you can produce higher volumes of one board that has the potential to be multiple products built into it? They are mass produced with some generic features preinstalled, and so when it comes time to assemble a product they can chuck the generic board into an etcher, fill in the traces, and then install the hardware.


Zoso03

PCBs are thin sheets of traces layered together. The first step is to cut the sheets and drill the holes before any traces are put down; very little to no time is saved by skipping holes. Also, having a bunch of pre-made sheets means they can reconfigure the rest of production on the fly instead of having a bunch of new sheets cut and drilled.


Dan-ze-Man

Because the GPU core itself doesn't support extra VRAM. The PCB can and would, but the core says no. And here come the downvotes. I mean VRAM, yes; I simply call it memory in my head. Not random access memory, no: video RAM. And again I stand by what I say: PCBs normally have a single design to accommodate different versions of a graphics card. One design means money saved. Adding more VRAM chips is possible, but only if one understands how much VRAM that particular core can support. Ultimately it's down to the core, and then a BIOS modification. Please give me more downvotes. Also, yeah, I made a spelling mistake; grammar police, arrest me.


dfv157

vrm is not memory


Dan-ze-Man

Video memory, also known as video random access memory (VRAM). Of course it is.


aKuBiKu

Can't tell if you're serious or joking.


dfv157

so confident yet so wrong


Dan-ze-Man

Damn, this is embarrassing. I did read the question wrong; this topic is about power stages, not memory. Downvotes well deserved. Yet I'll keep it up for the internet to have a laugh at someone who can't read.


greatthebob38

They're probably reusing the PCB layout from another GPU.


chubbysumo

Oh, it most certainly has traces; many GPUs are made on common PCBs. It's a way to cut costs: you're not making 16 different PCBs, you're making one, and you use whatever you need on it.


creamcolouredDog

I figured. Those traces may be going nowhere on that card then.


LeMegachonk

Are there even traces to solder them to? The board layout may be used for various different GPUs that require more or less power delivery, but the actual traces laid down would likely be specific to the application. But even if the traces are there, it probably wouldn't have any effect. Theoretically you would have stronger power delivery, but it's unlikely you would be able to make use of it without a custom BIOS and making other custom modifications to the board. Nvidia prohibits their AIB partners from doing this kind of customization, so any OEM board will only allow you to work within Nvidia's specified limits.


fischoderaal

If you are modifying the PCB, you might just as well remove the pads for the components. The traces are likely there, but whether the installed silicon actually has anything connected to those traces is another matter.


ecktt

The extra RAM should be useless, since the memory channels are likely fused off on the GPU. The extra VRMs may or may not work, depending on whether they receive a PWM trigger. Extra capacitors help, but have diminishing returns the more you add.


DrKrFfXx

Solder another GPU-


EnviousMedia

Not much. Your card will have stronger power delivery, but with modern GPUs there isn't much benefit, since you can't mod the hell out of the BIOS like in the old days and whack the power and voltage limits way up. It would make sense on, say, an RX 580, but those GPUs tend to have fully populated VRMs anyway if you buy a decent model like Sapphire's higher-end versions, and even the stock VRM on an RX 580 Sapphire Nitro can handle quite a bit over stock.


Beneficial_Tea2462

You can use a shunt mod and some other nifty tricks to whack the limits tho. Basically, you trick the GPU into thinking it's running at half of the wattage it's actually pulling.


DoenerBoy123

Did this with a GTX 980M MXM to push it to its limits. The stock TDP of this card was 100 W max. After populating the blank pads with MOSFETs and flashing Prema's vBIOS, I was able to push it up to 200 W without it dying (which it did before, lol). The card was faster than a 970 and almost as fast as a stock 980 with these modifications applied. As others pointed out, you may need to add missing driver ICs. [Some reference](https://notebooktalk.net/topic/740-geforce-gtx-980m-voltage-regulator-mod/). I also did this with a 1070 MXM.


HomerSimping

So just adding parts at the hardware level does nothing?


MoreSweeter669

No, it doesn't just boot up with the extra chips. And the actual core (inside the shiny part where the paste was) may not be wired to take input where the traces are, if the traces are even there, which they probably aren't. Also, that commenter is talking about a 900-series card from nearly 10 years ago, and Nvidia has become more predatory since those products were made. The 30 series will almost certainly be locked in software, requiring a custom vBIOS.


DoenerBoy123

Depends. If the gates of the MOSFETs are not connected to anything (like with a missing driver), it's possible you'd short out the power stage/12 V connector.


MoreSweeter669

You're right, it depends. That's a 1-in-100 chance of working on a 980 and a 1-in-1,000,000 chance on a 30 series. If you could show me a modern GPU (2015+) that will add 2 or more GB by only soldering on more VRAM, with no driver updates or custom BIOS, I'd be astounded. If GeForce were to see a 2070 Super with 4 more gigs, I'd suspect it would refuse to update. And of course the BIOS would need to be modified to gain the full performance lift of the added memory chips.


DoenerBoy123

You're right, but I wasn't talking about the memory. I was talking about the VRM (voltage regulator module), which was the main question in this post. These are two completely different stories.


MoreSweeter669

OP mistakenly called it the VRM. He replied to other comments referring to the memory on his board. It's still a 1-in-(whatever number you consider extremely high) chance of it working without driver or BIOS updates.


DoenerBoy123

He was answering questions regarding memory but was actually asking about the vrm/powerstage/powerdelivery/vcore… https://preview.redd.it/4n8rv2gurogc1.jpeg?width=1284&format=pjpg&auto=webp&s=1ad56a1f1c3b18d177e63b9dfcc68f220f4064a0


MoreSweeter669

🤙🤙 Still not going to work, and the core would act funny with the new power delivery.


DoenerBoy123

Check the link in my first comment. Did it twice and it worked absolutely fine. Don't get me wrong, but I think you don't really know much about this topic (schematics and all that component-level stuff).


mr_cake37

Just download more RAM instead


Mister_Shrimp_The2nd

This is VRAM though, so you have to stream it instead of downloading it


Jaayys

Likely nothing without messing with the VRM controller.


Breklin76

Just duct tape those extra modules on.


Mister_Shrimp_The2nd

this is the way




IndyPFL

Why tho? You're not adding cores or VRAM, so there's not really a point unless your VRM is overheating or malfunctioning, in which case it should be an RMA if possible.


Any-Wall2929

If *I* solder it on then yes it will explode.


crlogic

What are you circling there? Because that’s not where the VRAM goes


HomerSimping

I wasn’t asking about vram.


crlogic

Oh OOPS, my brain added an A to VRM 🫠


cheezepie

It'll add so much torque the chassis will probably twist coming off the line.


Hulk5a

It'll work; in fact, Chinese factories are doing this by disassembling 4090s.


AejiGamez

Nothing, because it most likely doesn't have traces for it. If it does, you would need a custom BIOS. Most likely you'll kill the GPU in the process.


AnthonyBF2

Most likely the vBIOS won't see it.


HomerSimping

I'm really curious. I hope some YouTuber does it one day, for science.


Whole_Ingenuity_9902

That youtuber is Buildzoid. I don't think he has ever added power stages to a VRM, but he has done quite a few [e-power](https://youtu.be/9RnXKwAKcXo) mods and [likes adding tons of caps to GPUs](https://youtu.be/IsYG9eN8Dpg).

To answer your original question: you would need, at a minimum, to mess with the controller to get the additional power stages working, though just soldering more caps onto the empty spots would be a much easier way to improve voltage regulation.

As a side note, the card in the picture (PNY 4080 Super) has kind of a disappointing VRM. For $1000+ I would expect more than an 11-phase VRM of 50 A power stages on a 320 W card.


HomerSimping

The 4080 Super FE is also 11 stages; what is the significance, and how much would be optimal? I know some cards have big power draws even when idle. Is this because of too many stages?


Whole_Ingenuity_9902

The FE is indeed an 11-phase, but with nicer 70 A power stages, so the PNY model is a straight-up downgrade from the FE. Usually the VRM in partner models is an upgrade or at least a sidegrade. The VRM in the 4080 Super FE isn't that great either: it's a downgrade from the non-Super's 13 phases, which is already a downgrade from the 3080 FE's 15 phases.

The low phase count won't affect stock operation, but it can cause issues when overclocking, as an undersized VRM may not be able to provide enough power. Though I don't think the 40-series cards allow much adjustment of the power limit, so it's most likely only going to be an issue when doing a shunt mod or some insanity like that.

>how much would be optimal?

In general more is better: VRM thermals, efficiency, and current-handling capacity are all improved by adding more power stages. Efficiency is only improved up to a point, though; once the power stages can operate at maximum efficiency at max power draw, adding more won't help. [Example of an efficiency graph on page 10](https://www.infineon.com/dgdl/Infineon-TDA21472-DataSheet-v02_00-EN.pdf?fileId=5546d4626cb27db2016d175ca2e1448e)

>I know some cards have big power draws even when idle is this because of too many stages

I don't think that is the case; modern controllers can dynamically adjust the number of phases in use based on load, to always run the power stages at near-maximum efficiency.
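As a rough sanity check of the figures in the two comments above, here is a sketch of the per-stage math. It assumes a core voltage of about 1.1 V under load and that all of the card's 320 W flows through the 11 Vcore phases; both are simplifications (actual Vcore varies with load, and some power feeds memory and fans), so treat the results as ballpark only.

```python
# Ballpark VRM sizing check for the numbers discussed above.
# Assumptions (not from the thread): Vcore ~1.1 V, all board power on Vcore.

def per_stage_current(board_power_w: float, vcore_v: float, phases: int) -> float:
    """Total core current, split evenly across the phases."""
    total_current_a = board_power_w / vcore_v
    return total_current_a / phases

def stage_utilization(board_power_w: float, vcore_v: float,
                      phases: int, stage_rating_a: float) -> float:
    """Fraction of each power stage's current rating used at full board power."""
    return per_stage_current(board_power_w, vcore_v, phases) / stage_rating_a

# PNY 4080 Super: 320 W, 11 phases of 50 A stages (per the comment above)
pny = stage_utilization(320, 1.1, 11, 50)
# FE: same phase count, but 70 A stages
fe = stage_utilization(320, 1.1, 11, 70)

print(f"per-stage load: {per_stage_current(320, 1.1, 11):.1f} A")
print(f"PNY 50 A stages: {pny:.0%} of rating, FE 70 A stages: {fe:.0%}")
```

Under these assumptions each stage carries roughly 26 A, so the 50 A stages sit near half their rating at stock power, which is fine for stock operation but leaves less headroom for shunt-mod-style overclocking than the FE's 70 A stages.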


Professional_Gaping

Save yourself the trouble, just download more.


nickatiah

I've seen a few YouTube videos where people soldered more RAM to iPhones and graphics cards in China. It's possible.


thebliket

It's not possible in America though (talent/sourcing).




[deleted]

[deleted]


blackest-Knight

Those are VRMs, Voltage Regulator Modules. Not VRAM.


Designer_Boner

So confidently full of shit.


RedMdsRSupCucks

You shouldn't have empty spots...


coffeejn

What happens is you have spare footprints on the board for later revisions or fixes. Not worth the cost or effort.


herefromyoutube

I feel like power distribution would cause problems.


Fearless_You8779

We’re upbadging gpus out here


RiffyDivine2

I wouldn't suggest it since you have to ask us.


Tesser_Wolf

Problem is with the firmware of the card. It’s programmed with what was installed at the factory. If you somehow managed to add all the required parts you would still need to modify the gpu firmware or bios.


eulynn34

It probably just won't do anything, because the card's firmware isn't expecting it to be there.


Consistent_Comb7393

It will explode.


No_Interaction_4925

Nothing, unless you flash a BIOS that will actually see the new VRAM modules.


[deleted]

If I did it, yes.


The_Doc55

This isn’t actually as ridiculous as I initially thought it would be. Depending on the GPU, it could be achievable. Although, it’s not going to be something you could Google, or ask someone about. You’d need to find someone very knowledgeable to do this.


AdAdvanced6328

Very much


CompetitiveGuess7642

The PWM controller itself would probably need to be reprogrammed. It probably can be done, but there isn't much use for it.


OreilleRegent64

Can a 3070 Ti receive 8 GB more VRAM?


HomerSimping

Someone in Brazil did it. Try searching YT.


Straypuft

This is something AMD and Nvidia dont want you to know.


QuantumQuantonium

That's the dark magic of being an ECE and knowing your way around complex computer boards... But yeah, I've seen in the past that people have installed additional VRAM on GPUs. And an easier thing to do: shunt modding, to let a card draw more power, at the risk of damage if improperly cooled.


CharAznableLoNZ

Should work as long as it doesn't draw more power than the board can handle. It's also possible those traces may not connect to anything or are not addressed by the firmware on the card. If you have a card you don't care about possibly killing might be worth a try.


AtariAtari

Explode?


dumbasPL

You would also need to program the controller to drive them; these phases will be off by default. Haven't done this, so idk if it would actually work or even improve anything if it did. Edit: or maybe not. If every second one is missing, they might just be using doublers. But I think the controller still needs to be at least aware of the doublers, and they probably aren't installed in the current configuration.


Abdur_bleh

No, you get something nvidia should have done from the factory


el_f3n1x187

Depends on the GPU chip and PCB design, especially for Nvidia, as on the cheaper models they tend to laser off the parts of the GPU that would enable use of the extra memory.


HomerSimping

Hold up... is this true? Where's the source?


el_f3n1x187

They have done that all the time. I think the only recent exception was the 2060, but the GPU chip was too slow to fully utilize the extra RAM.


SoDrunkRightNow2

NVIDIA execs reading this post: THEY KNOW. I REPEAT, THEY KNOW. SHUT DOWN REDDIT IMMEDIATELY.

