TECHNOMANCER303

Member
  • Posts: 70
  • Joined
  • Last visited
Reputation Activity

  1. Funny
    TECHNOMANCER303 got a reaction from RedhaRay in GPU to buy   
    Def EVGA. Asus is great too, but EVGA is usually cheaper or on sale. And Noctua, as much as I love them, you don't need that much cooling on a 3070. Get the Ti.
  2. Agree
    TECHNOMANCER303 got a reaction from ThatOneWhoRippedACpu in Fun fact   
    Context please
  3. Funny
    TECHNOMANCER303 got a reaction from da na in Intel 13th Gen 'Raptor Lake' Core i9-13900 ES performance looks promising, 33-50% performance uplift over 12th Gen ‘Alder Lake’ Core i9-12900   
    What is ES??? Is that like KS or KF or abcdefg? If it means the same as before, then why does Intel have to keep changing things? If it's different, then np.
  4. Agree
    TECHNOMANCER303 reacted to trevb0t in New PC Build - need review   
    Basically for a gaming PC, the GPU is going to be the main star of the show.
    Getting the best GPU you can at your budget is generally the thing to do on a gaming PC, and since NVENC (Nvidia) exists, unless you want to record your gameplay in UHD or something, the GPU is more important than the CPU for streaming as well.
    I generally agree. I'd personally be going for a 3070 in this budget range.
  6. Like
    TECHNOMANCER303 reacted to SpartanA259 in New PC Build - need review   
    Thanks for all the advice. At Microcenter I found a 12400 for $159.99. I'd assume that's better than the F version because it's the same but with graphics for a cheaper price. I don't know why you would need a better GPU; if someone could answer that, that would be great. If I do upgrade the GPU, I'd probably want a better monitor, like 1440p. The motherboard I would want to stay the same even with the 12400 because it has good IO and, most importantly, it has WiFi. I can't really use ethernet so I have to use WiFi. I'd preferably want ATX for future expansion. Due to popular demand and self-contemplation, I will probably change the cooler to a better non-RGB one. I would then be able to get rid of the RGB hub. I'd want this PC to last as long as possible, despite the rapid improvements lately. And RGB speakers :D.
  7. Like
    TECHNOMANCER303 reacted to thewelshbrummie in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    I'd refer you both to a fairly old Top Gear episode that's relevant. I managed to find my TV recording of the episode (series 11, episode 1, from June 2008), so I can quote the stats. Clarkson took a BMW M3 (4-litre V8) around the track for several laps, following a 1.5L 4-cylinder Prius being driven by the Stig as fast as possible (so Clarkson was effectively driving the BMW at the same speed as the Prius). The BMW did 19.4 mpg vs 17.2 for the Prius. So at an equivalent speed, the more powerful car was actually more efficient.
     
    That's the point @TECHNOMANCER303 that you've missed by creating this thread. You're looking at headline/max power draw and interpreting that as poor efficiency, while ignoring that at reduced load the parts in question will use a fraction of the maximum power they can draw under 100% load. You seem to have forgotten (or intentionally ignored to make a specific point) that components don't use their full power draw at less than 100% utilisation, so they may be just as efficient, or more efficient, than their predecessors. Just because AMD parts don't go much beyond 170W vs a 241W max for the 12900K at high loads doesn't necessarily mean they are more power efficient. As Clarkson put it, it's not what you use but how you use it that matters.
  8. Like
    TECHNOMANCER303 reacted to xAcid9 in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    Current gen is more efficient than last gen (the majority of them, anyway). TechPowerUp tested this in their review. 🤔
     
    Just because they're chugging 300-400 watts doesn't mean they're not efficient.
  9. Like
    TECHNOMANCER303 reacted to RONOTHAN## in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    This is a computer forum; most people who are serious enough to be on here regularly are also pretty into cars, since the two hobbies share a lot of the same draws (customization, crazy performance numbers, etc.). It's rarer to find someone who's not. Computer overclocking IMO is very similar to doing dyno pulls and drag race runs, except a lot more affordable.
     
    I do have both a 6900XT and a 3080 (long story) that I was planning on drag racing over the weekend to see which would stay in my main rig. I can (if you want me to) undervolt them to be running the same general performance (I'll target the same score in something like Time Spy, give or take 100 points) and see which card ends up being more efficient. The AMD card will likely be more efficient thanks to its stupidly aggressive sleep algorithm (the core will actually turn off in between frame renders then turn back on, affectionately referred to by some as "power naps"), but it still might be somewhat interesting to see.
     
    I could theoretically do the same thing with AMD vs. Intel. I do know a guy with a 12700K that I could drag race against the similarly performing 5900X, undervolt both and see which would draw less power under load, but at the same time I doubt he'd let me borrow his computer over the weekend.
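
A minimal sketch of how that matched-score drag race could be tallied, with made-up score and power figures standing in for real measurements:

```python
# Hypothetical numbers for illustration only, not measured results.
# Two cards tuned to roughly the same Time Spy score; the more
# efficient one delivers more benchmark points per watt.

def perf_per_watt(score: float, avg_power_w: float) -> float:
    """Benchmark points delivered per watt of average board power."""
    return score / avg_power_w

# Placeholder (score, average board power) pairs for the tuned cards.
cards = {
    "6900 XT (undervolted)": (18000, 230.0),
    "3080 (undervolted)": (17950, 270.0),
}

for name, (score, power) in cards.items():
    print(f"{name}: {score} pts @ {power:.0f} W -> "
          f"{perf_per_watt(score, power):.2f} pts/W")
```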
  10. Funny
    TECHNOMANCER303 reacted to Quackers101 in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    SLI gone, double the card, watts and price? yes pls.
  11. Like
    TECHNOMANCER303 reacted to KaitouX in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    https://www.forum-3dcenter.org/vbulletin/showpost.php?p=12852240&postcount=530
    https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-tested-at-various-power-limits/
    You can get some pretty huge power savings with both the 3090 and 12900K by just undervolting or power limiting them, and often the performance penalty is pretty small as long as you don't limit it too much.
     
    Nvidia, Intel and AMD all consider that power consumption doesn't really matter when it comes to their flagships; they will gladly increase the power consumption by 30%+ to achieve 5% higher performance in those products. The 5950X doesn't use more power at stock probably because many motherboards wouldn't be able to run it and it would be way too hot for normal air cooling, and they already announced that AM5 will have CPUs with a 230W PPT, so AMD is 100% going in the same direction. The 6950XT is basically the same as the 3090: similar performance and similar power consumption. The 3090Ti is still the worst offender though.
    Nvidia's Ampere is a bit weird, as even lower-end parts are way out of their sweet spot; some GPUs like the 3070/3060Ti are able to achieve similar performance to stock (<5% difference) while using 20% less power. AMD's RDNA2, on the other hand, seems to start losing performance earlier in my experience.
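
For the power-limiting half of that, a rough sketch of scripting it through nvidia-smi (power.draw/power.limit and the -pl flag are standard nvidia-smi options; -pl needs admin rights, and undervolting proper is still done in a tool like Afterburner):

```python
# A minimal sketch of reading and capping an Nvidia card's board power
# from a script by shelling out to nvidia-smi.
import subprocess

def read_power(gpu: int = 0) -> str:
    """Query current draw and configured limit for one GPU."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

def set_power_limit(watts: int, gpu: int = 0) -> None:
    """Set the board power limit. Must stay within the card's allowed
    min/max range and requires elevated privileges."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)],
                   check=True)

print(read_power())
# set_power_limit(280)  # e.g. cap a 3090 well below its ~350 W default
```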
  12. Informative
    TECHNOMANCER303 reacted to Somerandomtechyboi in So I was going to LC 2 3090ti but...   
    Why even watercool other than purely for aesthetic reasons?
     
    imo custom water is a complete joke, only for ppl that wanna show off their builds, though judging by this
    I'd assume you wanna have a go at OCing. Unfortunately CPU OC is dead and has been dead since 9th gen, though GPU and RAM OC are still very much alive and will give noticeable benefits (though for RAM OC that depends on application). Even then, GPUs should still OC fine on their stock coolers
     
    if you wanna have a noticeable OC boost from watercooling I'd suggest using a big car/truck rad or 2; no point going water when you aren't keeping your components near ambient or under 60C when loaded
     
     
    Basically don't bother with watercooling unless you wanna run a beefy car/truck rad or 2 for below-60C operation at all times, since normal air coolers are fine for CPUs and GPUs
     
    Because DDR5 is overpriced garbage that nets you very little extra performance over DDR4. Even cheap OC RAM like Ballistix that can do 4600+ CL18 will destroy even the high-end DDR5 kits, and even if you're lazy and just OC to 4000 CL16 it'll still wipe most DDR5 performance-wise
     
    There are also binned Samsung B-die kits that can do 5000 CL16 at near 1.75V, so let's just say DDR5 has got a lot of competition that's cheaper and straight up superior performance-wise
     
    If we're not talking ludicrous OC RAM, 3200 and 3600 DDR4 will perform similarly to 5200 DDR5, which is the main reason everyone is bashing the sht out of DDR5. Why pay a 2x markup for basically the same performance?
     
     
    Basically DDR5 is trash because it's a new standard that has not shown much performance improvement over DDR4, or gets straight up beaten by DDR4. Maybe wait a year or 2 for faster kits to become mainstream for DDR5 to become worthwhile
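
The latency side of that DDR4-vs-DDR5 argument can be sanity-checked with the usual first-word-latency formula, ns = CL × 2000 / data rate (MT/s); the kits below are illustrative timing combinations, not specific products:

```python
# First-word latency in nanoseconds from data rate (MT/s) and CAS latency.

def first_word_latency_ns(data_rate_mts: int, cas: int) -> float:
    # DDR transfers twice per clock cycle, hence the factor of 2000.
    return cas * 2000 / data_rate_mts

kits = [
    ("DDR4-3600 CL16", 3600, 16),
    ("DDR4-4600 CL18", 4600, 18),  # the Ballistix-style OC mentioned above
    ("DDR5-5200 CL40", 5200, 40),
    ("DDR5-6000 CL36", 6000, 36),
]

for name, rate, cas in kits:
    print(f"{name}: {first_word_latency_ns(rate, cas):.1f} ns")
```

Tuned DDR4 comes out well ahead on latency even where DDR5 wins on raw bandwidth, which is the gist of the complaint above.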
  13. Like
    TECHNOMANCER303 reacted to RONOTHAN## in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    Not exactly, for that analogy. Yes, the ol' school V8s with superchargers on them were hilariously inefficient, but that doesn't mean the turbo 4-bangers are actually more efficient.
     
    If you drive an EcoBoost Mustang like you're supposed to drive a Mustang (i.e. somewhat aggressively, not babying the throttle) you will actually be getting something more along the lines of 15-20 MPG, very similar to the Mustang GT of the same year driven the same way. Turbo 4-bangers have a very weird efficiency curve: they are very good between 1.5k and 3k RPM, but above that, when the turbo starts to spool, the fuel efficiency drops off a cliff (for a variety of reasons, one being that in order to prevent the engine from knocking to death they need to run very rich) and they become about as efficient as a V8 or worse, but without the gearing advantage those have, thus getting worse gas mileage. Turbos mean you're either getting power or fuel efficiency; you can't get both. If you're gonna drive a car relatively hard, you'd actually be better off getting a big NA motor if you care about MPG.
     
    Plus, thanks to the aforementioned gearing advantages of the V8s, when naturally aspirated they can still get very good gas mileage. The old C5 Corvette, for example, will get 40 MPG highway stock if you aren't dogging on it, because in 6th gear at highway speed the engine is only turning ~1k RPM thanks to all the torque they've got down low.
     
    If you want a more accurate analogy, it's more like turbocharging a car: the idle efficiency stays roughly the same, but when you hammer it, it's 10-20% faster for ~50% more fuel usage (numbers aren't 100% accurate, but still). It is still a viable strategy, since a 12900K, for example, will still be relatively tame power-wise in games (roughly in line with something like a 5900X); it's just that at 100% load they lose any semblance of efficiency and run balls to the wall.
  14. Like
    TECHNOMANCER303 reacted to dogwitch in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    When you have to undervolt... that's an issue in itself.
    The most powerful supercomputer, and many others coming online, are switching to AMD this time... and generally that's Nvidia's bread and butter (that's where their consumer GPUs come from).
    Nvidia knew they had heat/power issues right around the 1080 era / server GPUs.
    Native res for games hit a wall with tech around the 360 era... hell, current-gen games are still not using HD as an industry standard due to VRAM cost, bandwidth, and the needed amount of VRAM: 24GB at the bottom end, really 40GB.
    Hell, the new Unreal Engine is doing an old trick with a bit of AI to render games now.
     
  15. Like
    TECHNOMANCER303 reacted to xg32 in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    someone on YouTube did a test; the 3090's power draw and performance scale linearly from the 1080 Ti, so it's actually faster than the previous gens on all metrics.
     
    it's more efficient undervolted since stock sits in the exponential part of its own curve though; 350W is fine for me on a GPU, wouldn't consider it cheating.
     
    for CPU I'm leaning AMD + DDR5 + 16 cores.
  16. Like
    TECHNOMANCER303 reacted to maartendc in So I was going to LC 2 3090ti but...   
    40 series won't be long now, and is rumored to be MUCH faster than 30 series, so I would wait before spending THAT kind of money.
     
    If you are building midrange, you can buy now, since the first cards out will be 4090 and 4080 tier. The xx70 and xx60 series always come later.
  17. Like
    TECHNOMANCER303 reacted to Unedited_Mind in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    My RTX 3070 goes from a 220W GPU to a 260W GPU when I allow it to clock itself to like 2055MHz instead of 1965MHz, gaining like 9 FPS or less, which for the most part is never noticed in game.
     
    If I undervolt it and cap it to 1940MHz I lose around 0.5 FPS from stock and power drops to like 175W.
     
     
    Something like this...
     
    [screenshot of the undervolted power draw]
     
    167W vs 220W stock.
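
A quick back-of-envelope on those numbers (the baseline frame rate is an assumed placeholder, since the post doesn't give one):

```python
# Perf-per-watt change for the 3070 undervolt described above.
stock_w, uv_w = 220.0, 167.0   # board power, stock vs undervolted
stock_fps = 100.0              # assumed baseline FPS for illustration
uv_fps = stock_fps - 0.5       # "around 0.5 FPS" loss per the post

stock_eff = stock_fps / stock_w
uv_eff = uv_fps / uv_w
print(f"stock: {stock_eff:.3f} FPS/W, undervolted: {uv_eff:.3f} FPS/W")
print(f"efficiency gain: {uv_eff / stock_eff - 1:.0%}")  # roughly +31%
```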
  18. Like
    TECHNOMANCER303 reacted to IkeaGnome in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    That page actually brings up a very valid point.
    Seems like they are more efficient when locked to 60 FPS at the same settings as older cards.
    The 3090 uses less power than a 1650 and 75% of the power of a 1660S. The extra power they can pull is there to make more and more frames, because no one is happy with enough. They always need more.
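
One way to make that 60 FPS comparison concrete is energy per frame; the wattages below are placeholders, not the figures from the linked page:

```python
# Energy per frame (joules) = board power (W) / frame rate (FPS).
# At a fixed 60 FPS cap a faster card finishes each frame at lower
# clocks and voltage, so J/frame is the fairer efficiency measure.

def joules_per_frame(watts: float, fps: float) -> float:
    return watts / fps

for name, watts in [("1650", 70.0), ("1660 SUPER", 95.0), ("3090", 65.0)]:
    print(f"{name} @ 60 FPS cap: {joules_per_frame(watts, 60):.2f} J/frame")
```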
  19. Like
    TECHNOMANCER303 reacted to Zando_ in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    Have.... Have you seen their latest high-end cards? Y'know Vega and Vega II were the GPUs that started the trend of high transient spikes that tripped a shitload of PSUs, right? I used my Vega FE to heat my room in the winter; it was actually better than my 1080s at shoving heat into the room (and a bit slower to boot). Had the RVII for a bit till AMD fucked the drivers; it wasn't a light-sipping card either. From what I've seen the latest cards suck a lot of wattage as well. For even older stuff, a friend has a rig with an R9 290X; it's a bit faster than my 780s but actually even more power hungry, so AMD cards being power monsters goes back a while.
     
    AMD's Ryzen CPUs have been way more efficient than most Intel chips (almost all the past ones and all the current high-tier ones; the new i3-12100 can give 6c chips a run for their money though), that's for sure.
    More power usually = more faster if you have the cooling to handle it. Most PC builders want a faster PC, so they don't mind the power draw. I've never shot for more power draw in general use (especially as it's summer rn; my 250-300W of PC - under gaming load - heats my lounge too much already), but I did shove massive power targets through my RVII to try and get the highest clocks possible for benchmarks.
    I think you mean diminishing returns? Have to pump exponentially more power for each step in performance.
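
A minimal sketch of those diminishing returns, using the textbook dynamic-power relation P ≈ C·V²·f with made-up voltage/frequency points:

```python
# Dynamic power scales roughly with V^2 * f, and higher clocks need
# more voltage, so the last few percent of clock cost disproportionate
# power. The (GHz, volts) pairs below are invented for illustration.

def dynamic_power(c_eff: float, volts: float, freq_ghz: float) -> float:
    """P ~ C * V^2 * f, with the capacitance term folded into c_eff."""
    return c_eff * volts**2 * freq_ghz

points = [(1.8, 0.85), (2.0, 0.95), (2.1, 1.05)]  # (GHz, V) guesses
base = dynamic_power(100, points[0][1], points[0][0])
for f, v in points:
    rel_p = dynamic_power(100, v, f) / base
    print(f"{f:.1f} GHz @ {v:.2f} V: {rel_p:.2f}x power "
          f"for {f / points[0][0]:.2f}x clock")
```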
  20. Like
    TECHNOMANCER303 reacted to Montana One-Six in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    AMD isn't far off from NVIDIA, and in some load scenarios even surpasses NVIDIA in power consumption (talking about top of the line): https://www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/35.html . So I wouldn't really call either of them efficient; depending on the use case, one is more efficient than the other.
     
    I don't know anything about GPU architecture stuff, but I know architectures don't let you break the laws of physics, and I know from high school physics that no system on earth is 100% efficient. It also seems like the heat output is getting higher with each generation, meaning more of the power that goes in is "wasted". That makes it impossible at some point to get higher performance without increasing the power to get there. It's a rather crude explanation with basic physics, but my point is that at some point there is nothing left to do other than increase power to get more performance.
  21. Like
    TECHNOMANCER303 reacted to Middcore in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    Well this is a seriously silly way of thinking. There is no benefit to greater power consumption in and of itself, in fact it's a harm (albeit a very very marginal one) to you and everyone else on the planet. 
     
    In their own way some PC guys are no different from rednecks in big trucks "rolling coal" to "pwn the libs" when they go by a Prius. 
  22. Funny
    TECHNOMANCER303 reacted to AnonymousGuy in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    Why you gotta go write my biography like that?!
     
    But seriously in a desktop I couldn't give a shit about power consumption.  The more the better.
  23. Like
    TECHNOMANCER303 reacted to 8tg in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    This has gone back and forth for multiple decades. 
    How do you increase performance on the same technology when your new technology is taking a while? Add power.
     
    The Pentium 4 was technically slower than the Pentium III, because the Athlon XP showed up and shit on everything intel had done.
    And still, when the Athlon 64 dropped, the Pentium D was just two Pentium 4s stuck together and was a comparative housefire at the time, while that wattage became fairly normal later.
    Then came the core2duo and core2quad which owned the entire market while AMD was trying to keep up adding more cores and higher frequency to the Phenom line. 
    Intel i-series came out, here comes the AMD FX line, absolute disaster. More cores, more power, suddenly you’ve got the FX 9590 which was known to immediately destroy the VRMs on low end boards and caused multiple actual house fires.
    AMD can’t keep up, intel sits on effectively the same tech for 5ish years until Ryzen comes around, suddenly there’s competition, the solution from intel? 8th gen, more cores.
    Things stagnate a bit with Ryzen 2, intel goes into 9th and 10th gen with more cores, more power, Ryzen 3 shows its face, and the 11900K is born, drawing up to 400 watts or more when overclocked just to compete with the 5800X and up.

    You’re about to see the same thing happen again from one side or the other. Alder lake is really impressive and another generation of this P+E cores concept might cement intel this next generation, meaning Ryzen 6000/7000 might rely on cores + power again.
     
    Nvidia and AMD have been doing the same thing for ages. And with intel showing off a GPU that seems to be on par with an RTX 3060, which is just the first retail product disclosed, knowing full well they have products for the entire comparative lineup ready to go, there will be rushed launches from both AMD and Nvidia to compensate for that potential competition.
    Intel has the capability to upset that duopoly with massive scale domestic development and then the same thing all 3 companies can do and have done in the past:
    -more cores
    -more power
     
    Because the 4090 is basically that: it's just more cores and more power, but as a GPU instead of a CPU in concept. It's not like AMD didn't just do the same thing with the 6950XT; intel can do it just as well and pull a 2nd generation out of thin air drawing 500 watts from the wall and stomping on everyone.
  24. Like
    TECHNOMANCER303 reacted to RONOTHAN## in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    They are efficient; it's just that in order to get the most performance out of them, they need to push the power limits well past the peak of the efficiency curve. There's a reason why undervolting Nvidia cards is so popular: you can decrease the power draw by 20%+ while only dropping the performance 2-3%.
     
    Intel is the same way; their efficiency is actually basically identical to AMD's right now. Look at the 12400: it performs very similarly to the similarly positioned 5600X, load power consumption is basically identical, and idle consumption is slightly better than the 5600X's. Their top-end SKUs just scale with more power consumption, so in order to get the performance crown, they need to raise the power limits to do so.
     
    It would be nice to see stuff a little more efficient, especially in some areas, but with power being cheap where I live and me personally preferring large cases with better cooling, it's not that big a deal to me.
  25. Funny
    TECHNOMANCER303 reacted to Middcore in NVIDIA Cheating with 3090 and 40 series. Also Intel with 12900ks   
    It's not "cheating," it's just a design choice. 
     
    I do find it sort of amusing that power efficiency was something people always brought up as one of the reasons to buy an Nvidia card when they had the edge there, but nobody brings it up in discussions of what card to get now.
     
    Nvidia made a (probably correct) calculation that nobody really cares about power draw if throwing more juice at the problem results in suitably eye-popping performance numbers to be touted. 
     
    The truth is many people secretly or not-so-secretly like the idea of having a power-hungry card that needs a gigantor PSU, as long as they think it's getting them the "best" performance by whatever tiny margin. I've talked to people on subreddits and forums who would insist on buying overkill PSUs purely for e-peen and become actively hostile if you tried to show them power consumption calculators proving it was a waste.