
xFluing


Posts posted by xFluing

  1. On 5/20/2023 at 2:28 PM, thekingofmonks said:

    Yes and no:

     

    SLI was a technology pushed by Nvidia that allows multiple GPUs to work simultaneously on one display (AMD's was called Crossfire or mGPU).

    Officially, Nvidia doesn't allow SLI between different GPUs, let alone the two you have, which are two generations apart.

    SLI unfortunately died out as game devs got lazy and stopped working on it.

     

    Some games (very few, Ashes of the Singularity for example) do however come with what's called Explicit Multi-Adapter which allows architecturally different GPUs to do the same thing as SLI.

    So if you ever slot in both your 3060 and your 1060, you should be able to have it run with EMA.

     

    The big drawback would be that it performs poorly as the better 3060 would have to tune down its render speed to match the 1060.

     

    The other con is that it's only a thing in a select few games.

     

    So yes, it's technically possible. I do not recommend it, though.

    "Pushed by nvidia"

     

    Bro has never heard of 3dfx

  2. On 5/11/2023 at 2:41 AM, mishmish said:

    Hello all, I've long posted about upgrading my GPU, it just hasn't happened yet, but more recently I've seen prices drop. Here's my current specs:
     

    Ryzen 5 3600

    MSI B350m

    16GB Ram

    MSI RX 580 4GB

    EVGA G2 650w

    1440p, 144hz Monitor

     

    For some time I've had my mind set on an 8GB XFX Merc 308 RX 6650 XT. I wanted the Merc 308 because I had read that the fan/cooling is supposed to be better than the SWFT models; I do value cool temps and quiet fans.

    Recently I saw the price of the RX 6650 XT drop to $280 USD. Out of curiosity, I searched for an RX 6700, and saw a 10GB XFX Speedster SWFT 309 also for $280.

     

    I just wanted an informed opinion about which to choose. Is it better to get the 6650 XT for the better cooling, or is it more of a no-brainer to get the 6700 for the performance?

    I do have a 1440p 144Hz monitor, but I wouldn't say that I play too many AAA games, probably like Forza 4 and 5, some Planet Coaster/Zoo, and in general mostly emulation.

     

    Also, it looks like a 650w PSU is the minimum, should that concern me? Just wanted some realistic opinions, thank you all!

    Let me tell you, those PSU recommendations on the manufacturer's site are way overspecced. I'm rocking a 6700 XT with a 550W power supply, and according to MSI Afterburner I never use more than 200-250W, AS A WHOLE SYSTEM, smack dab in that 50% load efficiency sweet spot.
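
    Edit: if you want to sanity-check the draw yourself rather than take my word for it, here's a rough Python sketch of how you could read the GPU's own power sensor on Linux with the amdgpu driver. The hwmon file names (power1_average, sometimes power1_input) are an assumption and vary by card and kernel, and this only reads GPU board power, not the whole system.

        # Rough sketch: read GPU board power from the amdgpu hwmon sensor (Linux only).
        # Assumes the driver exposes power1_average or power1_input in microwatts.
        import glob
        import time

        def read_gpu_power_watts():
            for name in ("power1_average", "power1_input"):
                for path in glob.glob(f"/sys/class/drm/card*/device/hwmon/hwmon*/{name}"):
                    with open(path) as f:
                        return int(f.read().strip()) / 1_000_000  # microwatts -> watts
            return None  # no sensor found (not an AMD card, or driver too old)

        while True:
            watts = read_gpu_power_watts()
            print(f"GPU power draw: {watts:.1f} W" if watts is not None else "No AMD power sensor found")
            time.sleep(1)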

  3. 23 hours ago, johnno23 said:

    I would use the DP myself. 

    Not a fan of HDMI due to the bullshit over the years.

    Buy a Blu-ray player, then HDMI gets updated and you need a new player or your discs won't run, etc. Then they messed with hi-fi: I invested 5000 USD in a high-end amplifier and ran into the HDMI BS once again, since they used HDMI to lock people out of certain audio signalling. HDMI is simply criminal in my humble opinion. They can take a long walk off a short pier.

    Plus, in general situations, DP gives a better signal.

    My 34" UltraGear Nano IPS has a much better image and a perfect refresh rate over DP, not HDMI.

    If HDMI were so perfect, GPUs would have three HDMI ports and a single DP, not the other way around.

    And my apologies for the little rant, but seriously, just use the DP, and then you don't have your monitor suddenly going black by accident if you nudge that cable.

    All that just pales in comparison to what I think is the real killer feature of DP: the latching connector. No way in hell will a DP cable accidentally unplug itself.

  4. 22 hours ago, Tomberry said:

    Having used an Intel iGPU for gaming (on my laptop), the performance is very hit and miss. E.g. it could run Far Cry 4 at 900p medium-low and get 40-50 FPS with no issues, yet Max Payne 3 at 600p with everything low was getting like 15 FPS with many, many drops.

    Why the hell are you comparing apples and oranges? Of course an iGPU will be worse than a dedicated one, especially from before Intel started taking graphics drivers seriously with the Arc cards.

  5. On 4/13/2023 at 7:38 AM, barry hachelaf said:

    I can't run Valorant when using MSI Afterburner, and I can't lower the fan speed below 45 percent from the AMD software. I want to lower the fan speed from the AMD software directly.

    Then stop playing that game with its rootkit-malware level of anticheat (remember that hackers found an exploit in it, as I predicted would happen when Vanguard was new). I just hate it when games make it outright impossible for you to use any kind of useful program because "we don't allow overlays". Fuck Tencent to hell and back, who is with me.

  6. On 5/6/2023 at 4:56 AM, IEatglue said:

    wait what

    i take back what i said

    I don't think it will drop the price of the 6700 XT, or at least not immediately

    so either buy now or get ready to wait and spend more money

     

     

    What do you mean "wait and spend more money"? The 6700 XT surely won't increase in price.

  7. 9 minutes ago, Xteolz said:

    I was kinda scared of the heating problems with the connector, and this was my first all-out build. Just wanted to double-check and stuff.

    Those 12V cables are notoriously hard to plug in; make sure you push really hard on them. Watch the GamersNexus analysis of it, very informative. This cable spec is fucking awful.

  8. 18 minutes ago, DeerDK said:

    The 4070 "only" draws 200w. 

    More importantly, according to Hardware Unboxed, the maximum powerspike they saw brought it up to 230w. 

    That is IMO pretty impressive, given the other cards with simular performance (see the screendump I took from Tom's Hardware GPU Hierarchy). 

     

    Mind you, it annoys me, as I had talked myself into going for AMD or Intel for the next GPU, voting with my wallet and all that, but with an mITX case, efficiency is an important factor.

    [Attachment: Screenshot_20230505_081227_Samsung Internet.jpg]

    Fair enough

  9. I don't know about the audio, but the visuals sure do look as if the card itself is unstable (basically not enough voltage to properly drive the clocks).

     

    Since you most likely didn't overclock it yourself, I would suggest downloading MSI Afterburner and actually lowering the clocks by ~100 MHz and seeing where that takes you (since Nvidia locked down voltage control, thanks a lot).

     

    I am a bit worried that this means it came defective from the factory and the situation is only going to worsen (I had very similar experiences with two old Radeon HD 7850s).
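
    Edit: if you want hard numbers while you test, here's a rough sketch (assuming an Nvidia card with nvidia-smi on the PATH and Python installed) that just logs clocks, temperature and power once a second, so you can see whether the card is boosting erratically while the artifacts happen:

        # Rough sketch: poll nvidia-smi once a second and print clocks, temp and power.
        import subprocess
        import time

        QUERY = "clocks.gr,clocks.mem,temperature.gpu,power.draw"

        while True:
            out = subprocess.run(
                ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            print(out)  # e.g. "1830 MHz, 7000 MHz, 67, 180.50 W"
            time.sleep(1)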

     

  10. Sounds like coil whine to me, yes.

     

    You can alleviate this by limiting your FPS to about 5 below the monitor's refresh rate (so you are firmly within FreeSync range if you use AMD); it should also help with temps. Another thing you could try is undervolting the GPU.

     

    Edit: make sure that any competitive games you are playing don't get the FPS limit, as it will increase latency, if you care about that.

  11. 11 hours ago, emothxughts said:

    Beat me to it. But if you're using Afterburner now and would like to stick to it, pick "driver only" or "minimal install" when you install the AMD driver.

    The Crimson software is very good in its own right. Not for the GPU controls, because those suck ass, but for all the driver-level settings you can set per game, like image sharpening and frame limiting (so you can stay in FreeSync range, plus savings on power and temps).

  12. 5 hours ago, YoungBlade said:

    The reason that the top Intel card is $330 is because it competes with the RTX 3060 and RX 6650XT in terms of performance, and those cards sell around that price point (although the 6650XT has fallen in price since the A770 was released and now sells for less, but at the time it was around that price)

     

    I agree that the current generation is overpriced, but stupid high-end options have always existed. It's just that they used to call them "Titan" cards instead, so fewer people cared. Now that they're called "GeForce XX90" it's a problem.

    That's not the problem and you are completely missing the point. Furthermore, the "Titans" used to be regular-ass cards: look at the big GF110 in the GTX 580, then the midrange GK104 in the GTX 680 while the big GK110 got held back for the Titan. They intentionally offset the stack so they can charge more for an 80-class card without backlash. A 580 was never $700.

  13. Just now, YoungBlade said:

    Why?

     

    Look up the reviews of the GTX 1080Ti, a high end card that came out $50 more expensive than the launch price of the GTX 980 Ti and offering about 60% more performance with nearly double the VRAM. The 1080Ti is legendary for its great value at the $700 price point - a lot of people even call it a "mistake" on the part of Nvidia for releasing such a good value card. GN will often joke that "Nvidia's not going to make that mistake again."

     

    And yet, despite being legendary for its value, it was still more expensive than the previous gen by 8%.

     

    To say that the norm should be 50%+ performance for the same price as last gen, when you look at the context of the history of GPU pricing, is strange to me.

    That's just called corporate dumbfuckery. Any argument against cards staying the same price is just some suit trying to add one penny to the company's bottom line. Moreover, it's dumb to argue for THAT instead of following the trend of past generations. And it's not even 50%; in some cases it was a performance increase of 100%, but without competition Nvidia won't EVER do that again (especially since, technically, every x80 card since the 680 has been a midrange card sold at high-end prices).

  14. 2 minutes ago, RONOTHAN## said:

    Check the clock speeds while it's running CS:GO, I've seen it before where a 30 series card doesn't recognize a game as enough of a load and doesn't actually leave idle. In Minecraft, for instance, a 980 Ti would outperform my 3080 unless I turned on shaders. If this is the case (I'm not 100% on that, CS:GO was not a game this happened with), the trick should be to enable "Prefer Maximum Performance" in the Nvidia Control Panel

    Unless something changed, from my experience what "Prefer Maximum Performance" does is make the card run at 3D clocks no matter what, even on the desktop, which isn't exactly ideal. Maybe have him set "Prefer Maximum Performance" only as a per-game setting rather than globally.
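
    Edit: a quick way to verify whether the card is actually stuck at idle clocks in-game. This is just a sketch, assuming an Nvidia GPU and the nvidia-ml-py package; run it while the game is in the foreground and compare the current clock against the maximum:

        # Rough sketch: compare current vs. maximum graphics clock via NVML.
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)

        current = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        maximum = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"Graphics clock: {current} MHz (max {maximum} MHz)")

        if current < maximum * 0.5:
            print("Card looks stuck near idle clocks for this workload.")

        pynvml.nvmlShutdown()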

  15. Probably a dumb suggestion, but have you checked your resolution? Did you turn on Dynamic Super Resolution?

     

    It could very well be that CS:GO runs at a higher resolution somehow; otherwise I'm stumped too.

     

    Edit: I'm mentioning Dynamic Super Resolution because it basically makes the video card render at a resolution higher than native, then scale it down to your monitor. This used to be called supersampling anti-aliasing.

     

    It's also the case that when you enable it, some games see the new resolutions and try to run at the highest possible one. (This happened to me, but with AMD's equivalent: when launching Doom Eternal with my new 6700 XT, it auto-set the resolution to 8K or something, and of course it ran like ass.)
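
    Edit 2: if you're on Windows and want a quick way to see what resolution the OS reports for the primary display (to compare against what CS:GO says it's rendering at), here's a throwaway sketch using ctypes; the DPI-awareness call is there so Windows doesn't hand back a scaled value:

        # Rough sketch (Windows only): print the primary display resolution.
        import ctypes

        user32 = ctypes.windll.user32
        user32.SetProcessDPIAware()           # avoid DPI-virtualized (scaled) values
        width = user32.GetSystemMetrics(0)    # SM_CXSCREEN
        height = user32.GetSystemMetrics(1)   # SM_CYSCREEN
        print(f"Primary display resolution: {width}x{height}")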
