Search the Community
Showing results for tags 'vram'.
-
I've heard that the 3070 is VRAM bottlenecked, but is the 12GB 4070? If it is, will it still be fine playing really VRAM-intensive games, or is the VRAM bottleneck that bad?
- 8 replies
-
- bottleneck
- vram
-
(and 2 more)
Tagged with:
-
Hey guys, I posted a few months back about using those cheap used Nvidia server-class GPUs in a workstation computer. I finally completed my build, and I am proud to announce that I have managed to use an Nvidia P40 in my workstation-oriented PC. The rest of the parts:
CPU: Ryzen 7700X
Mainboard: Asus Prime B650 Plus
RAM: 32GB DDR5 Crucial @4800
GPU cooler: Morpheus II Core Edition
SSD: 2TB Kingston M.2 SSD
Power supply: 700W BeQuiet! System Power 9
PCIe-to-CPU power adapter for the GPU
Anti-sag support for the GPU
Zalman S2 case
Alpenföhn Ben Nevis CPU cooler
3 extra case fans + 2 for the GPU cooler (P12 Slim)
I also had to buy some small heatsinks for the VRAM chips on the back of the GPU. So far, the system is very performant (I can post some benchmarks later if needed, but 60+ FPS in Minecraft with RTX shaders, max render distance, and all settings on ultra speaks for itself, especially for a $200 GPU). Of course, the total price is closer to $300, since the cooling solution and the adapter cable add around $100. I have connected the GPU fans directly to the mainboard, where I use FanControl to set fan curves - even under heavy stress (Furmark, Blender renders, Stable Diffusion) the card never exceeds 55°C at around 55% fan speed. Please let me know if you have further questions about my build or any complaints about my configuration.
- 9 replies
-
- server gpu
- nvidia
-
(and 2 more)
Tagged with:
-
I'm looking at getting a 3070 for my Ryzen 7 7700X build, but it only has 8 gigs of VRAM. Will that be enough? I've heard that some cards are too slow to take advantage of extra VRAM, but is the 3070 one of those?
- 7 replies
-
- question
- first pc build
-
(and 1 more)
Tagged with:
-
I'm building a PC with a Ryzen 7700X, but I'm looking for a graphics card to pair with it. I'm not going to do anything too GPU intensive, so a GPU bottleneck is fine - half the time I won't even use 60% of its power. I've heard that the RTX 3070 is really good value, but I want to future-proof myself with more VRAM than the 8 gigs it has, so I've been looking at the 4060 Ti because it has similar performance and price while also having 16 gigs. But I've heard that the 4060 Ti is really bad value. Why?
- 4 replies
-
- question
- first pc build
-
(and 1 more)
Tagged with:
-
The card is three years old. I haven't paid attention to the VRAM temps before, so I don't know if this is a new problem or if it has always been like this. Everything online is a little fuzzy, so I wanted to ask here: are these OK temps, and if they aren't, what can I do to fix it?
-
Hello there! Some context: I bought my 3090 when it came out and had no issues with it in the beginning. Great card. Roughly two years later it started cooking the VRAM, hitting the 110°C point before shutting down. I thought it was a thermal pad problem, so I sent it off to a company that replaced all the paste and pads. They told me that there was a bit problem with the memory, but that it shouldn't affect me. Now here I am and the problem is still there: playing Tarkov, Star Citizen, Valorant, and even simpler games, it still shuts the PC down and reboots, or it freezes... If this is caused by the bit problem, what should I do? If it's a cooling problem instead, should I just get water cooling?
-
I am making a new computer build and I am looking for a graphics card to include. After searching a little I chose the Sapphire Radeon RX 6650 XT Pulse Gaming OC 8GB because of its relatively low cost in my currency and because it matches the Ryzen 5 5600 I want to include in my build. But at a slightly higher price I also found the XFX Speedster SWFT 309 Radeon RX 6700, which, unlike the RX 6650 XT, has 10 GB of VRAM instead of 8 GB. In a few videos on YouTube, some people say that 8 GB is fairly enough for gaming in 2023 but that more VRAM will be needed in the future. So are these 2 GB of VRAM worth investing in and sacrificing a little boost clock speed (210 MHz in favor of the RX 6650 XT)? Also, on one site I saw that the Sapphire RX 6650 XT uses PCI Express 4.0 x8 instead of PCI Express 4.0 x16 - does that visibly decrease performance? If anyone has experience with these cards and would like to share it, feel free to comment and suggest!
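On the x8 vs. x16 question, a rough back-of-the-envelope comparison may help. The per-lane figure below is an assumption based on PCIe 4.0's 16 GT/s rate with 128b/130b encoding; real-world throughput is lower, and mid-range cards rarely saturate even x8 unless data spills out of VRAM.

```python
# Theoretical one-direction PCIe 4.0 link bandwidth.
# 16 GT/s per lane with 128b/130b encoding ~= 1.969 GB/s per lane
# (an assumption from the spec, not a measurement).
GBPS_PER_LANE_PCIE4 = 16 * 128 / 130 / 8  # GB/s per lane

def link_bandwidth(lanes, per_lane=GBPS_PER_LANE_PCIE4):
    """Peak one-direction bandwidth of a PCIe 4.0 link in GB/s."""
    return lanes * per_lane

print(f"x8:  {link_bandwidth(8):.1f} GB/s")   # ~15.8 GB/s
print(f"x16: {link_bandwidth(16):.1f} GB/s")  # ~31.5 GB/s
```

The x16 link has double the headroom on paper, but in practice the gap usually only shows up when the game overflows the card's 8 GB of VRAM and has to stream assets over the bus.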
-
Hey, I have a four-year-old HP Pavilion 15 laptop with a Ryzen 5 4000-series CPU. It's not gaming focused, but after college that's what I now use it for. In one game I play I'm getting only 20-30 FPS max and 15 on the low end. I used Task Manager and Windows Game Bar to view my hardware usage: CPU stays around 70%, iGPU around 35%, RAM around 85%, and VRAM at 90%. Will installing more RAM also increase my VRAM? In doing my own research I've heard that the iGPU can pull from the CPU, while others state it pulls from RAM. Any info at all would be helpful and is greatly appreciated. Thank you guys.
-
Hello, I have an RX 6600 and sometimes my memory clock does this. As you can see in the images, the memory clock drops to 0 MHz and stays there. My guess is that it goes into some kind of idle mode, but the thing is, this really affects me when alt-tabbing out of a game: when I return to the game, the image freezes for 2 seconds and then everything goes back to normal, including the clock speed, which is annoying (it happens at random; sometimes I only have to spend about 5 seconds out of a game for it to happen). The only workaround I have found is to force the memory clock to max out by either disabling FreeSync or using two monitors (I don't like this workaround because the temps go up a bit, which is why I turn off one monitor in the first place). I would like to know if there is any fix for this. It does not happen on Linux, so my guess is that this is just yet another AMD driver issue... I would appreciate any help, thanks.
-
I'm going to build a new PC for 1440p gaming. I'm considering an RTX 4070 Ti, but I'm worried that 12GB of VRAM won't suffice in 5 years. I want to play some really demanding games, which I can't even think of doing on my present PC (it barely handles CS:GO). If 12GB isn't enough, then I would save up for a 4080. I want my PC to last without any upgrades for at least 4-5 years. I'm also going to be playing in 4K sometimes, on the TV. Where I live, the 4070 Ti is cheaper than the 3090 Ti by a large margin; the 3090 is a bit cheaper.
-
It is well known that games created in the future will demand more and more VRAM. I want to buy myself a new GPU (at the moment I have a 2GB GTX 950), and I would like it to last at least the next 3 or 4 years, so I am aiming for 12GB VRAM models like the 6700 XT (or the 6750 XT, at a very similar price). I heard that AMD may announce new GPUs with 12 gigs in September, so my questions are: should I wait those two months, and most importantly, if they announce new 12GB GPUs, will the prices of the other models (6700 XT, 6750 XT) drop? I just upgraded my CPU to a Ryzen 5 5600 and I can run most of the games I play easily, but when I tried Squad, it could only run on very poor graphics due to the card having only 2GB of VRAM... So any recommendations on what to do? I was planning to buy a used 6700 XT for $300, but I want to avoid the problems of buying second hand, so if prices are going to drop after the announcement of new GPUs then I will definitely wait. Open to opinions and recommendations!
- 11 replies
-
- 6700 xt
- 12 gb vram
-
(and 1 more)
Tagged with:
-
I recently upgraded my graphics card to a Gigabyte 3070 8GB Gaming OC, only to later realize I am running short on VRAM with maxed-out settings in some newer games. Being a chip-level repair tech myself, I couldn't help but wonder if we can replace the 1GB chips with 2GB chips and tweak the BIOS, effectively doubling the VRAM. I found a guy who did exactly that and was successful, but he did it 2 years ago and faced some issues with the BIOS. I couldn't find any further development after him. I would love to know if there are any modders in the community who did this or want to pursue this goal for their GPU. Attaching the links I found below; if someone understands Russian, please help me translate his G-Drive and find the updated BIOS for me. Thanks. https://www.tomshardware.com/news/16gb-rtx-3070-mod https://www.pcgamer.com/modder-gives-rtx-3070-a-16gb-upgrade-before-nvidia-has-the-chance/ Link to his YouTube video: He has also done the same for many other cards if you check his page. His G-Drive link from the video (NSFW): -link removed by staff-
-
I got a notification pointing to an old Reddit post from someone who worked at AMD Radeon 10 years ago. It provides some useful insight into why each company chooses a certain amount of VRAM for a certain product. Before you begin commenting "Oh these companies are stupid and didn't give us enough VRAM..." - I agree, however keep in mind that development of RTX 4000/RX 7000 began as soon as the RTX 3000/RX 6000 series finished releasing all initial products, back when 8 or 12GB of VRAM was still plenty for the vast majority of gamers. This does not excuse the problem, but you should give credit where it is due...
COPY-PASTE FROM OLD REDDIT THREAD:
I work for AMD doing product marketing in the Radeon division. I can answer your question:
1) Memory quantity is initially chosen by us, and is influenced by only two factors. First, how much it costs to equip every product sold with XYZ quantity of memory. Second, how much memory the board can physically accept with respect to the capacity of each chip and the space available on the board. We let partners add more if they feel like footing the bill.
2) Bus width determines how much data can be moved into memory per 1Hz of memory clockspeed. The clockspeed determines how many times that can happen per second. At any given clockspeed, a 384-bit bus will be able to move 50% more data than the same RAM on a 256-bit bus.
3) The width of the bus does not determine how much RAM can be installed, or how much RAM can be utilized. As an aside, it *does* determine how the memory chips must be installed on the board (e.g. in groups of 2 or 3). The number of textures (and their size) and the number of special effects ("shaders") determine how much VRAM is used. Generally speaking, the better the game looks, the more VRAM it will use.
4) The priority of GPU performance is: compute shaders* > memory bandwidth > GPU clockspeed > memory quantity. The importance of memory quantity and GPU clock flip if the VRAM is < 2GB.
We equip our high-end boards with 3GB of GDDR5 on a 384-bit bus, and let partners add more. NVIDIA's guidance was to use 2GB of GDDR5 on a 256-bit bus. The end result is that at resolutions equal to or higher than 2560x1440, Radeon is the better performer, because the Kepler architecture runs out of VRAM *and* hits the limit of its memory bus. We can presume NVIDIA made this decision because most gamers are playing at 1080p, which doesn't highlight the limits of their RAM configuration. Our data suggests that our high-end customers are choosing higher resolutions, e.g. 1440p or 1600p (or Eyefinity), so we built the boards to fit those needs.
\* NOTE: We call them Stream Processors, NVIDIA calls them CUDA Cores.
//EDIT: Removed a statement irrelevant to the example.
SOURCE: https://www.reddit.com/r/hardware/comments/wtew1/could_someone_explain_the_significance_of_amds/c5gdv0v?utm_source=share&utm_medium=android_app&utm_name=androidcss&utm_term=1&utm_content=share_button
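Point 2 above is just arithmetic: peak bandwidth is the per-pin data rate times the bus width. A minimal sketch, using an illustrative 6 Gbps GDDR5 data rate (an assumption for the example, not a figure from any specific card):

```python
# Peak memory bandwidth in GB/s =
#   (effective data rate per pin, Gbps) * (bus width, bits) / 8
def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

b256 = mem_bandwidth_gbs(6, 256)  # 192.0 GB/s on a 256-bit bus
b384 = mem_bandwidth_gbs(6, 384)  # 288.0 GB/s on a 384-bit bus
print(b384 / b256 - 1)  # 0.5 -> the "50% more data" figure from point 2
```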
-
I've got an MSI GTX 1660 Super Ventus XS OC 6GB. I recently repasted the GPU while also replacing the thermal pads on the VRAM. The factory thermal pads were all worn out, so I decided to replace them with my own. The original VRAM thermal pads were 2.0mm thick, but all I had was 1.5mm pads, so I used those instead. For the thermal paste I used Gelid GC Extreme; I got the thermal pads off Amazon. My temps under full load have dropped drastically, from 83°C to 75°C, but I'm left wondering if the incorrect thermal pad thickness might leave air gaps between the chips and the heatsink, affecting VRAM temperatures negatively. I've tried both GPU-Z and HWiNFO64, but there's no thermal sensor on the VRAM, so I can't check its temperature. Is there an alternative way to check if my VRAM is overheating? Specs: Case: NZXT H210i Processor: Ryzen 5 5600X GPU: GTX 1660 Super Ventus XS OC 6GB RAM: 8x2GB 3200MHz Kingston Fury Case fans: 3x120mm Lian Li Uni Fans, 1x Arctic 120mm
- 4 replies
-
- msi graphics card
- vram temp
-
(and 3 more)
Tagged with:
-
Okay, so long story short: after starting a "bigger" game, my graphics card (RTX 3070 Founders Edition) was making a weird coil-whine-ish sound (it wasn't coil whine, trust me), and after weeks of trying to diagnose the problem we found out that it's something to do with the VRAM. I tried a lot of benchmarks, and the one that tests VRAM specifically showed a lot of ERRORS on the VRAM but nothing else. Before knowing the problem, I tried driver updates, troubleshooting and so on, but nothing helped, and I knew it was something to do with the hardware. Since I can't really do anything myself and there aren't any repair services nearby, I'm just curious: how can I check which VRAM chip might be the faulty one? Isn't there any program that might help find it?
-
48GB of VRAM (more capacity), or 24GB of GDDR7 VRAM (same capacity, but higher bandwidth) - which is faster for my use case? Gaming + OBS Studio + effects + pushing DSR resolution to the maximum extent of the code + video editing in Resolve!
- 3 replies
-
- 16k
- resalootion
-
(and 1 more)
Tagged with:
-
I'm strongly considering getting rid of my 3070 8GB, due to all the issues with it not having enough VRAM and seeing everyone online debating whether this card is going to last 3 more years. When I first built my PC in August, I had a 3060 12GB OC. A couple of months later, I sold it and bought a 3070 (bought both used on eBay). In hindsight, it kind of feels stupid, because I had more VRAM but a slightly less powerful GPU, and I didn't think about the VRAM. Now that I'm playing at 1440p, I don't have my textures on high but I'm still having issues. I feel like if I wait to get rid of this card, I'll get barely anything for it. If I move up to a 3080 it has to be used, because paying $1k for a new one is crazy - which means you'd have to buy a 4070 Ti to get a new card that doesn't cost over a thousand. What would you do if you were in my shoes? Yes, there are AMD cards, but I've heard Nvidia is better for streaming and video editing, which I plan to do eventually.
-
Around a month ago I decided to repaste the CPU and GPU in my Acer Nitro AN515-54. When opening up my laptop I noticed that the pink VRAM thermal paste looked dried out and didn't cover the VRAM chips evenly. I didn't have a replacement, so I decided not to do anything about it. After repasting the CPU and GPU the temps were great, but after a couple of weeks I noticed they were starting to get really high again. I thought it could have been because of my poor technique (it was my first time changing thermal paste), so I decided to repaste again, and I just did. The temps seem fine, but I wonder if I should do anything about the VRAM thermal paste. To be clear about what I mean, the pink thermal paste looks something like this: (source: https://www.reddit.com/r/IndianGaming/comments/tnp2tb/what_to_do_with_this_pink_clay_like_stuff_on_the/) If a replacement is necessary, I am thinking of getting thermal pads (as they are significantly cheaper than K5 Pro).
- 1 reply
-
- acer nitro 5 an515-54
- acer
-
(and 3 more)
Tagged with:
-
Hello, I bought my Gigabyte RTX 3070 Vision in September 2022. I mainly bought it so my flight sims run better: I play MSFS 2020, XP11 and P3D v5. Recently, every time I load in, the sim crashes with an error message saying "Your computer has run out of usable memory". It ran perfectly before. My specs: R7 5800X, 16GB 2666MHz, RTX 3070, Asus Prime B450M-A. I also suspect my motherboard, because it's a Prime mobo.
-
Summary Cadence, a vital IP provider for DRAM PHY, EDA software, and validation tools, is detailing the new memory standard set to debut with the next generation of GPUs. GDDR7 promises starting speeds as high as 36 Gbps, going beyond the 50 Gbps mark over its lifecycle. While JEDEC has not formally published the GDDR7 specification, this latest technical data dump comes as Cadence has launched its verification solution for GDDR7 memory devices. A report says that NVIDIA's next-generation GeForce RTX 50 series, probably slated for a late-2024 debut, as well as AMD's competing RDNA4 graphics architecture, could introduce GDDR7 at its starting speed of 36 Gbps. Quotes My thoughts These bandwidth numbers for GDDR7 are definitely appealing. One of my only concerns, though, is that if 128-bit-bus GPUs are capable of delivering 576 GB/s, it makes me wonder what NVIDIA might do with their low- to mid-tier video cards. As we saw with Ada Lovelace, bandwidth starvation has been a common theme for the 4070 Ti and below. While the rest of the Ada Lovelace lineup has yet to release, rumors are shaping up to show these cards being bus-width constrained. If GDDR7 allows them to continue this theme, I'm sure NVIDIA will release more cards that are 192-bit and 128-bit, even in tiers that wouldn't normally have such small bus widths. Besides that, it seems there might still be some life left in GDDR6, considering Samsung is working on GDDR6W, which doubles performance and capacity. Supposedly, GDDR6W is comparable to HBM2E in performance and outright bandwidth. Meanwhile, Micron is attempting to push the speeds of GDDR6X, as we have yet to see 24 Gbps utilized in GPUs. The scary part about all of this is thinking about what next-gen cards will cost. This new technology is of course great, but the thought of a 128-bit x60-series card for $500-600 seems outlandish. The craziest part is that people will still be willing to pay.
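The headline figures above can be sanity-checked with the standard formula: peak bandwidth is the per-pin data rate times the bus width, divided by 8 to get bytes.

```python
# Sanity check on the article's GDDR7 figures:
# peak bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8
def gddr_bandwidth(gbps_per_pin, bus_bits):
    return gbps_per_pin * bus_bits / 8

print(gddr_bandwidth(36, 128))  # 576.0 GB/s -- matches the 128-bit claim
print(gddr_bandwidth(36, 192))  # 864.0 GB/s for a 192-bit card
print(gddr_bandwidth(36, 256))  # 1152.0 GB/s for a 256-bit card
```

So even at GDDR7's starting speed, a 128-bit card would match the raw bandwidth of a 384-bit GDDR6X card at 12 Gbps per pin, which is what makes narrow buses plausible for NVIDIA's lower tiers.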
Despite everything, we have nearly two years before we see this technology utilized. I'm sure as time progresses we will get more concrete details of future GPUs that may use GDDR7. Sources https://www.techpowerup.com/305676/nvidia-geforce-rtx-50-series-and-amd-rdna4-radeon-rx-8000-to-debut-gddr7-memory https://wccftech.com/gddr7-memory-for-next-gen-gpus-enters-verification-stage-as-cadence-intros-first-solutions/ https://www.anandtech.com/show/18759/cadence-derlivers-tech-details-on-gddr7-36gbps-pam3-encoding https://www.techpowerup.com/305653/cadence-announces-the-first-gddr7-verification-solution https://news.mydrivers.com/1/896/896269.htm
- 57 replies
-
- memory
- amd rdna4 radeon rx 8000
-
(and 3 more)
Tagged with:
-
Let's let the cat out of the bag right away: I'm experimenting with LLMs (how daring, I know) and I would like to run and train them locally. I will not work from scratch (obviously), but I would like to try my hand at locally fine-tuning some existing LLM checkpoints. The end goal is to teach the LLM tool use (see a paper called Toolformer, which taught GPT-J to generate different API calls to turbocharge its answer accuracy) and to improve its conversational behavior in terms of time and person awareness. The checkpoint I would like to work off of is the Facebook Research 65B checkpoint, and from what I have seen, people have been successfully running it off a single A100. Now, I do not have the budget for an A100 (and my uni will very likely not give me the budget for one), so I wanted to ask the community for help. Here are the basic questions: 1) I know torch allows you to split a model across multiple GPUs if you trade off some performance. Would it make sense to get a couple of cheaper second-hand Quadros and divide the model up over those? 2) You could also load the model on the CPU, and (according to some people) it can still infer a couple of words per second, which is honestly not too bad (but what about the fine-tuning?). Would that make more sense? 3) If the verdict is to get a bunch of used GPUs with as much VRAM as possible, which GPUs would be the most affordable to go for right now? Looking forward to this discussion! And thank you all for the help!
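Worth noting for question 3: a rough sketch of the weight memory alone (ignoring activations, KV cache, and optimizer state, which dominate during fine-tuning) suggests the single-A100 reports almost certainly involve quantization, since 65B parameters at fp16 don't fit in 80GB.

```python
# Back-of-the-envelope VRAM needed just to hold the weights of a
# 65B-parameter model. These are rough assumptions: real usage adds
# activations, KV cache, and (for training) gradients/optimizer state.
def weights_vram_gb(n_params_billion, bytes_per_param):
    # n_params_billion * 1e9 params * bytes / 1e9 bytes-per-GB
    return n_params_billion * bytes_per_param

print(weights_vram_gb(65, 2))    # fp16: 130.0 GB -- exceeds one 80GB A100
print(weights_vram_gb(65, 1))    # int8:  65.0 GB -- fits on one 80GB A100
print(weights_vram_gb(65, 0.5))  # int4:  32.5 GB -- fits on a 48GB Quadro
```

This is why "how much VRAM total across all cards" is usually the first number to pin down before choosing between a few cheap Quadros and CPU offloading.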
-
Summary Gigabyte now lists as many as six different GeForce RTX 4070 graphics card models with various memory configurations on its Thai-language website. A Gigabyte game-bundle giveaway page in Thai now reportedly lists the GeForce RTX 4070 with 10GB, 12GB, and 16GB of GDDR6X memory. Quotes My thoughts The real question about this slip-up or leak is whether NVIDIA plans on releasing multiple 4070 SKUs with different memory options, or whether these are simply configurations that were once proposed but didn't make the final cut. If they are SKUs NVIDIA is planning, the big question is whether they will be like the original 4080 12GB and have different specifications alongside the different memory configurations (like different CUDA core counts). If that is the case, it would appear that NVIDIA didn't learn from the 4080 12GB backlash. It could also simply be a case of typos or old, invalidated info which made its way into the database. Regardless, it looks like we will have to wait until around April to find out what the RTX 4070's final specifications will be and whether it will launch alongside other models with different VRAM options. If this leak is accurate, hopefully these really are just different memory arrangements and there isn't an explicit change to specifications like we saw with the 4080 12GB.
Sources https://www.guru3d.com/news-story/gigabyte-rtx-4070-may-come-with-a-number-of-different-ram-configurations-to-choose-from.html https://www.tweaktown.com/news/90417/nvidia-rtx-4070-leak-shows-cards-with-different-vram-loadouts-whats-going-on/index.html https://www.techpowerup.com/304976/nvidia-readying-10gb-12gb-and-16gb-variants-of-rtx-4070-gigabyte-thinks-so https://www.tomshardware.com/news/gigabyte-lists-geforce-rtx-4070-with-different-memory-configurations https://videocardz.com/newz/gigabyte-lists-geforce-rtx-4070-graphics-cards-with-10-12-and-16gb-memory https://hothardware.com/news/geforce-rtx-4070-listings-for-10-12-and-16gb-memory https://www.eteknix.com/geforce-rtx-4070-may-get-multiple-memory-options/
-
Hi everyone! With the current state of the market, I'm not immediately looking to buy a GPU; my window is within the next year or so. I've recently dipped back into learning about the state of PC hardware, but I'm still not comfortable buying without a little help. My questions: 1- Is the 8GB/10GB of VRAM in the 3070/3080 looking like it will hold up with games in the near future? 2- As someone who will not be looking to do anything above 4K 60-120fps (I know these are fairly high settings), is there a card on the market that can hit these with high (not max) settings and have a little future-proofing? I ask based on watching performance reviews of the 3070/3080. Games like Cyberpunk seem to already take up 8GB of VRAM, and both cards seem to struggle with max settings even with DLSS on. I'm guessing that Cyberpunk has its own "future proofing" by letting things be cranked up way higher than is standard right now, but I haven't followed things closely enough to be comfortable assuming. I've also watched things on Unreal Engine 5's Nanite tech, which to my understanding renders very high-quality geometry with far fewer resources than normal. I'm not sure if optimizations like this will mean my concerns over VRAM and hitting high performance in general won't be as important going forward. My end goal is to get 4K 120fps with ray tracing and not pay 3090 prices (yes, I want my cake and I want to eat it too!). As I mentioned, I have no problem waiting for next gen if that's what makes sense here. I know people here will have a much better sense than me of whether even next gen is a realistic option for what I want. Thanks in advance!