
I'm from mainland China. We have some tricks for modifying VRAM.

 

I’ve already modified and tested a 3060 Laptop with 12 GB of GDDR6 instead of 6. It seems NVIDIA had the idea of launching a 12 GB model of the laptop 3060 but cancelled it for some reason. We have confirmed that all 30-series cards natively support 16 Gbit Samsung memory, so we did the swap on this particular card, and it has proven successful.
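For the curious, here is the arithmetic behind the swap (a rough illustrative sketch of mine, not from the original post): each GDDR6 package has a 32-bit interface, so a 192-bit card carries six chips, and moving from 8 Gbit (1 GB) to 16 Gbit (2 GB) chips doubles capacity without touching the bus.

```python
# Rough VRAM-capacity arithmetic for a GDDR6 card (illustrative helper,
# not from the original post). Each GDDR6 package exposes a 32-bit
# interface, so chip count = bus width / 32.

def vram_capacity_gb(bus_width_bits: int, chip_density_gbit: int) -> int:
    chips = bus_width_bits // 32            # one package per 32-bit channel
    return chips * chip_density_gbit // 8   # Gbit -> GB per chip

# Laptop RTX 3060: 192-bit bus
print(vram_capacity_gb(192, 8))    # stock 8 Gbit chips    -> 6 GB
print(vram_capacity_gb(192, 16))   # 16 Gbit Samsung chips -> 12 GB
```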

 

In theory, the 3070, 3080, and 3090 can be modified too. I’ll try them next.

 

With 12 GB of VRAM, the laptop 3060 proved to perform better than the desktop one.

[attached screenshots]


Well, that’s neat. I’d like to see some bar charts similar to what Linus Tech Tips puts out in reviews, comparing stock vs. modified in certain games, resolutions, etc.


3 hours ago, TCatTheLynx said:

I'm from mainland China. We have some tricks for modifying VRAM.

I’ve already modified and tested a 3060 Laptop with 12 GB of GDDR6 instead of 6. It seems NVIDIA had the idea of launching a 12 GB model of the laptop 3060 but cancelled it for some reason. We have confirmed that all 30-series cards natively support 16 Gbit Samsung memory, so we did the swap on this particular card, and it has proven successful.

In theory, the 3070, 3080, and 3090 can be modified too. I’ll try them next.

With 12 GB of VRAM, the laptop 3060 proved to perform better than the desktop one.

[attached screenshots]

My 2060, which I recently replaced with a 3060 Ti, also has two "free" spots for memory chips. I think I could technically modify it and give it 8 GB instead of just 6.


24 minutes ago, DreamCat04 said:

My 2060, which I recently replaced with a 3060 Ti, also has two "free" spots for memory chips. I think I could technically modify it and give it 8 GB instead of just 6.

Well, that's technically impossible, because the memory bus is locked to 192-bit at the NVIDIA factory. You can change the existing VRAM dies from 1 GB to 2 GB, though. That gives you 12 GB with no need to reflash the VBIOS, and the driver should be fine since the 12 GB 2060 actually exists. Remember to move the memory strap resistors after the VRAM swap. Done that way, it's technically possible.
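The same arithmetic as above shows why the empty pads can't help: capacity scales with the number of 32-bit channels the memory controller exposes, not with how many pads the PCB has. A rough illustrative sketch (mine, not from the post):

```python
# Why extra pads can't add capacity on a 2060 (illustrative sketch):
# a 192-bit controller drives six 32-bit channels, so only six chips
# are ever active no matter how many pads the PCB provides.
BUS_WIDTH_BITS = 192
active_chips = BUS_WIDTH_BITS // 32   # 6 channels -> 6 chips

print(active_chips * 1)  # 6 x 1 GB dies -> 6 GB (stock)
print(active_chips * 2)  # 6 x 2 GB dies -> 12 GB (matches the retail 12 GB 2060)
```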


1 hour ago, TylerD321 said:

Well, that’s neat. I’d like to see some bar charts similar to what Linus Tech Tips puts out in reviews, comparing stock vs. modified in certain games, resolutions, etc.

I’m just a senior high school student from China, so I don’t have the means to make such charts 😞 Maybe Linus can do a review on this?


10 minutes ago, TCatTheLynx said:

Well, that's technically impossible, because the memory bus is locked to 192-bit at the NVIDIA factory. You can change the existing VRAM dies from 1 GB to 2 GB, though. That gives you 12 GB with no need to reflash the VBIOS, and the driver should be fine since the 12 GB 2060 actually exists. Remember to move the memory strap resistors after the VRAM swap. Done that way, it's technically possible.

It was just an idea of mine; I don't think I'll actually do it, because I now have an RTX 3060 Ti and I'm planning to sell my 2060. I'm currently also repasting it. Thermal pads need to be replaced after the card has been opened, right? I've heard you should do that, but I've also heard it's fine to leave them. What should I do?


4 minutes ago, DreamCat04 said:

It was just an idea of mine; I don't think I'll actually do it, because I now have an RTX 3060 Ti and I'm planning to sell my 2060. I'm currently also repasting it. Thermal pads need to be replaced after the card has been opened, right? I've heard you should do that, but I've also heard it's fine to leave them. What should I do?

To me, whether to replace them or not depends on the condition of the pad. If it has cracks or has completely dried out, replacing it is recommended; otherwise you can keep it. For GDDR6X cards, replace them anyway.


I am his friend and collaborator. After testing, the graphics card can sustain 130 W power consumption, as shown in the figure.

Our next plan is to convert an RTX 3090 Ti to 48 GB of VRAM. Since the RTX 3090 Ti PCB carries memory chips on only one side, we plan to transplant the RTX 3090 Ti GPU onto the PCB of an RTX 3090, which has memory pads on both sides. Why not just modify an RTX 3090 instead? Some technicians in China have tried to modify the RTX 3090 and ultimately failed to get it running properly due to BIOS problems, which do not exist in the RTX 3090 Ti BIOS.
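For context, here is the arithmetic behind the 48 GB target (my own illustration, based on the published memory layouts: both cards use a 384-bit bus; the 3090 PCB runs memory in clamshell with one chip per side of each channel, while the 3090 Ti PCB is single-sided):

```python
# Capacity math behind the proposed 3090 Ti -> 48 GB transplant (illustrative).
CHANNELS = 384 // 32                 # 12 x 32-bit channels on GA102

rtx3090   = CHANNELS * 2 * 1         # clamshell: 24 x 1 GB chips   -> 24 GB
rtx3090ti = CHANNELS * 1 * 2         # single-sided: 12 x 2 GB chips -> 24 GB
proposed  = CHANNELS * 2 * 2         # clamshell PCB + 2 GB chips    -> 48 GB

print(rtx3090, rtx3090ti, proposed)
```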

Since I am not proficient in English, I am using a translator to help me communicate.

 

[attached screenshot]


1 hour ago, TCatTheLynx said:

To me, whether to replace them or not depends on the condition of the pad. If it has cracks or has completely dried out, replacing it is recommended; otherwise you can keep it. For GDDR6X cards, replace them anyway.

The VRM thermal pad I had to remove to get at a screw beneath it, because I wanted the shroud off to clean the dust out of the cooling fins. But it still looks around the same thickness, and the pads still seem to be in good condition, even after the three years I've had the card. Should I just put it back together and see whether anything goes wrong? Can anything bad happen? The 2060 uses "normal" GDDR6, right? The thermal pads I ordered have an estimated delivery time of around two weeks, and I kinda don't wanna wait that long... I also saw that Dell put two thermal pads where there are two empty spaces for memory chips. Perhaps I could use one of those to replace the VRM thermal pad. They're all the same thickness, 2 mm; I've measured it.


Just now, DreamCat04 said:

One pad I had to remove to get at a screw beneath it, because I wanted to get the dust out of the cooling fins. But it still looks around the same thickness, and the pads still seem to be in good condition, even after the three years I've had the card. Should I just put it back together and see whether anything goes wrong? Can anything bad happen? The 2060 uses "normal" GDDR6, right? The thermal pads have an estimated delivery time of around two weeks, and I kinda don't wanna wait that long...

It will be okay, but I still recommend swapping them once the new pads have arrived. The worst case is totally fried VRAM, but that's a really rare situation with GDDR6.


1 minute ago, TCatTheLynx said:

It will be okay, but I still recommend swapping them once the new pads have arrived. The worst case is totally fried VRAM, but that's a really rare situation with GDDR6.

Okay, then I'll buy the pad. Is 3 × 3 cm enough for six GDDR6 chips? I'll probably use one of the spare memory pads for the VRMs.


2 minutes ago, TCatTheLynx said:

You also need to pay attention to the thickness.

Yes, I know that 🙂 I think they're all 2 mm pads, but I'll have to measure them with a more precise tool than just my ruler when I'm back home. Thanks for your help in a post that was originally yours ❤️


Just now, DreamCat04 said:

Yes, I know that 🙂 I think they're all 2 mm pads, but I'll have to measure them with a more precise tool than just my ruler when I'm back home. Thanks for your help in a post that was originally yours ❤️

You are welcome, I like helping others. I have plenty of PC-building experience and I'm pretty bored, so I'd like to share it 🙂


22 minutes ago, TCatTheLynx said:

You are welcome, I like helping others. I have plenty of PC-building experience and I'm pretty bored, so I'd like to share it 🙂

I don't really have any experience building PCs yet; my first PC is a Dell G5 5090 (the small jet-engine boi, I think the 2019 model). I did help my dad build his new PC back in early 2019, but I haven't built my own yet. I'll definitely do that once my current PC gets too old. I also think that when I repasted my gaming laptop, I applied way too much paste to the GPU. I spread a worm all across the die lol, even though a dot would probably have been enough. But better too much than too little.
Also, how should I benchmark my 2060 once I've done the repaste? I wanna see if it improved anything. Previously, the hotspot was 17°C hotter than the core under load and 12°C warmer at idle. My current 3060 Ti is 12°C warmer at its hotspot under full load and about 9°C at idle.


4 minutes ago, DreamCat04 said:

I don't really have any experience building PCs yet; my first PC is a Dell G5 5090 (the small jet-engine boi, I think the 2019 model). I did help my dad build his new PC back in early 2019, but I haven't built my own yet. I'll definitely do that once my current PC gets too old. I also think that when I repasted my gaming laptop, I applied way too much paste to the GPU. I spread a worm all across the die lol, even though a dot would probably have been enough. But better too much than too little.
Also, how should I benchmark my 2060 once I've done the repaste? I wanna see if it improved anything. Previously, the hotspot was 17°C hotter than the core under load and 12°C warmer at idle. My current 3060 Ti is 12°C warmer at its hotspot under full load and about 9°C at idle.

Temperature benchmark: FurMark

Performance benchmark: 3DMark

Stability: 3DMark stress test / FurMark
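If you want comparable before/after numbers rather than eyeballing overlays, one option (my suggestion, not from this thread) is to log the sensors while FurMark runs. nvidia-smi reports core temperature and power draw; it does not expose the hotspot temperature, so use HWiNFO to log that. A minimal sketch:

```python
# Log GPU core temperature, power draw, and utilisation once per second
# via nvidia-smi while a FurMark/3DMark run is in progress.
import subprocess
import time

with open("gpu_log.csv", "w") as log:
    log.write("temp_c,power_w,util_pct\n")
    for _ in range(300):  # roughly five minutes of samples
        sample = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=temperature.gpu,power.draw,utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        log.write(sample + "\n")
        time.sleep(1)
```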


10 minutes ago, DreamCat04 said:

I don't really have any experience building PCs yet; my first PC is a Dell G5 5090 (the small jet-engine boi, I think the 2019 model). I did help my dad build his new PC back in early 2019, but I haven't built my own yet. I'll definitely do that once my current PC gets too old. I also think that when I repasted my gaming laptop, I applied way too much paste to the GPU. I spread a worm all across the die lol, even though a dot would probably have been enough. But better too much than too little.
Also, how should I benchmark my 2060 once I've done the repaste? I wanna see if it improved anything. Previously, the hotspot was 17°C hotter than the core under load and 12°C warmer at idle. My current 3060 Ti is 12°C warmer at its hotspot under full load and about 9°C at idle.


By the way, some thermal pastes are actually electrically conductive. Too much can cause motherboard failure, so be careful next time.


21 minutes ago, TCatTheLynx said:

Temperature benchmark: FurMark

Performance benchmark: 3DMark

Stability: 3DMark stress test / FurMark

Sorry, I forgot to add the most important part: I already know what tools I wanna use.

The "problem" I have is where to test the card, because I don't wanna take out my 3060 Ti and I don't know if my dad would lend me his PC to test the card in. But I'll figure that out later, when I actually need to do it. I have an old PC from a relative that had a broken fan, so we fixed it and kept it because they didn't want it anymore. Maybe I could use that? It's got a Core i5 4550 (or whatever the 4th-gen Core i5 was called), but the replacement PSU (the old one was busted) also lacks a PCIe power connector...

15 minutes ago, TCatTheLynx said:

By the way, some thermal pastes are actually electrically conductive. Too much can cause motherboard failure, so be careful next time.

Yep, definitely! Now I know it was way too much, but I'll do better on my 2060! It was my first-ever repaste, so I had no sense of how much to use. The die had a protective black tape-like thing around it, though, and the old thermal paste also went a bit under that. I used Arctic MX-4; that might be conductive, because Arctic says it has carbon particles in it or something like that.


3 minutes ago, DreamCat04 said:

Sorry, I forgot to add the most important part: I already know what tools I wanna use.

The "problem" I have is where to test the card, because I don't wanna take out my 3060 Ti and I don't know if my dad would lend me his PC to test the card in. But I'll figure that out later, when I actually need to do it. I have an old PC from a relative that had a broken fan, so we fixed it and kept it because they didn't want it anymore. Maybe I could use that? It's got a Core i5 4550 (or whatever the 4th-gen Core i5 was called), but the replacement PSU (the old one was busted) also lacks a PCIe power connector...

Just take out the 3060 Ti, it's not that hard 😉


6 minutes ago, TCatTheLynx said:

Just take out the 3060 Ti, it's not that hard 😉

I kinda don't wanna deal with the driver oddity that happened when I upgraded. For the first five minutes, only my first screen worked and the GPU wasn't recognised by Afterburner; it only showed "Microsoft Basic Display Adapter". I did wanna use DDU, but didn't really know what to do, so I exited safe mode again, and after a few minutes and a coincidental driver update from GeForce Experience, it was all fine again.


5 minutes ago, DreamCat04 said:

I kinda don't wanna deal with the driver oddity that happened when I upgraded. For the first five minutes, only my first screen worked and the GPU wasn't recognised by Afterburner; it only showed "Microsoft Basic Display Adapter". I did wanna use DDU, but didn't really know what to do, so I exited safe mode again, and after a few minutes and a coincidental driver update from GeForce Experience, it was all fine again.

Well, it's pretty simple. Windows 10 and 11 will automatically update the drivers once the new card is installed, so don't worry about it. Connect the computer to the internet, wait about five minutes, and the driver will be ready in no time. By the way, the 2060 actually shares the same driver as the 3060 Ti, so the setup time should be shorter and performance more stable.


1 minute ago, TCatTheLynx said:

Well, it's pretty simple. Windows 10 and 11 will automatically update the drivers once the new card is installed, so don't worry about it. Connect the computer to the internet, wait about five minutes, and the driver will be ready in no time. By the way, the 2060 actually shares the same driver as the 3060 Ti, so the setup time should be shorter and performance more stable.

Oh, that might explain the behaviour. I'll do it in my system then, thanks.


  • 1 year later...

Hello all, I just have a couple of questions.

I was wondering how one would choose the proper VRAM chips for the upgrade. Is it as simple as searching for 2 GB GDDR6 chips for a 192-bit card, or are there other specifications to consider? Also, was any additional soldering done on the board besides replacing the VRAM chips?

