
uzzi38
Member · 16 posts

Reputation Activity

  1. Informative
    uzzi38 got a reaction from flopana in [HardwareLeaks/_rogame] - Exclusive first look at Nvidia’s Ampere Gaming performance   
    https://hardwareleaks.com/2020/06/21/exclusive-first-look-at-nvidias-ampere-gaming-performance/
     
    Long story short, these are the key details from the article:
     
    The GPU core clock reports as 1935 MHz; the memory clock shows 6000 MHz, but that's likely a misreport due to the new memory type and early drivers.
     
    EDIT: Alright, if I'm to add in something original, then here we go:
     

    GA100, at least, doesn't show any major uplift in performance per FLOP, or IPC, whatever you want to call it. There seem to be few adjustments to the uArch on the shader side. I'd hope consumer-facing Ampere is different, but I can't see some revolutionary jump in performance coming as a result of this.
     
     

    The score is in line with what you'd expect from the same kind of clocks and 20-something% extra CUDA cores. I do hope the final clocks end up higher though, especially given the 350W rumour. At, say, 2.3 GHz it would become a roughly 50% lead over the 2080 Ti, which is more like what you'd expect generation on generation (a quick back-of-the-envelope on that scaling is sketched below). But in any case, the main point I'm trying to make here is that I'm fairly sure this is either the rumoured 3090 or a Titan GPU.
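
    A minimal sketch of that back-of-the-envelope, assuming the graphics score scales roughly linearly with GPU clock and taking a ~30% lead over the 2080 Ti at the leaked clock as the starting point (both are simplifying assumptions, not confirmed figures):

# Rough clock-scaling estimate. Assumes the 3DMark graphics score scales
# linearly with GPU clock, ignoring memory bandwidth and power limits.
leaked_clock = 1.935   # GHz, core clock reported in the leak
hoped_clock = 2.3      # GHz, hypothetical final clock discussed above
leaked_lead = 1.30     # assumed ~30% lead over a 2080 Ti at the leaked clock

scaled_lead = leaked_lead * (hoped_clock / leaked_clock)
print(f"Estimated lead at {hoped_clock} GHz: {scaled_lead - 1:.0%}")  # ~55%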
  2. Agree
    uzzi38 got a reaction from BiG StroOnZ in [HardwareLeaks/_rogame] - Exclusive first look at Nvidia’s Ampere Gaming performance   
    That 350W rumour exists for a reason.
     
    I highly doubt Nvidia will be happy with only a 30% uptick this generation. Not even slightly.
  3. Agree
    uzzi38 got a reaction from BiG StroOnZ in [HardwareLeaks/_rogame] - Exclusive first look at Nvidia’s Ampere Gaming performance   
    Yeah, so uh, we may or may not have used Guru3D's numbers originally (first search result that came up), but for some reason they compared multiple GPUs using overall performance ratings rather than the 3DMark graphics scores you'd expect.
     
    The new numbers are purely graphics scores, so they can be compared directly with standard 2080 Ti performance figures.
  4. Informative
    uzzi38 got a reaction from Taf the Ghost in [HardwareLeaks/_rogame] - Exclusive first look at Nvidia’s Ampere Gaming performance   
  5. Like
    uzzi38 got a reaction from Perrin in [HardwareLeaks/_rogame] - Exclusive first look at Nvidia’s Ampere Gaming performance   
  6. Informative
    uzzi38 got a reaction from Taf the Ghost in Here to make an impression - AMD to release integrated graphics desktop APU in July   
    Nah, the Vega uArch itself hasn't changed with Renoir beyond the media and display engines being swapped out for Navi's. The main uplift in performance really does come from the increase in sustained clocks: rasterisation performance scales essentially perfectly with clock, since the higher iGPU clocks also raise the throughput of the related fixed-function hardware. The CUs themselves are practically identical otherwise (a quick theoretical-throughput sketch is below).
     
    Also it may or may not be the final Vega iGPU product - only time or a damn good leak will tell.
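
    As a minimal illustration of that clock scaling, here's the standard theoretical FP32 throughput arithmetic for a Vega iGPU (64 shaders per CU, 2 FLOPs per shader per clock for FMA). The CU count and clocks below are illustrative assumptions, not confirmed Renoir specs:

# Theoretical FP32 throughput of a Vega iGPU: CUs * 64 shaders * 2 FLOPs (FMA) * clock.
# With identical CUs, throughput moves linearly with clock, which is the point above.
def vega_gflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz

# Illustrative configurations (assumed clocks):
print(vega_gflops(8, 1.4))  # 1433.6 GFLOPS at 1.4 GHz
print(vega_gflops(8, 2.1))  # 2150.4 GFLOPS at 2.1 GHz, +50% purely from clock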
  7. Like
    uzzi38 got a reaction from mr moose in [UPDATE - 220W CPUs? Get yo chiller] All Your + Are Belong To Us - Comet Lake S performance review leaks AND Power Consumption/Heat Output outted   
    Exactly. Nobody does, which is why nobody really tests battery life like that.

    I mean, they often do, but nobody cares about the sustained-performance battery life figures. Stuff like PCMark 10 results are worth talking about, because more often than not, if you're on battery you're doing short, bursty workloads, which is exactly what that suite tests.
  8. Like
    uzzi38 got a reaction from 5x5 in [UPDATE - 220W CPUs? Get yo chiller] All Your + Are Belong To Us - Comet Lake S performance review leaks AND Power Consumption/Heat Output outted   
    At this point I honestly don't know what's worse...
     
    If you want a datapoint for how bad Comet Lake is, then here's one for you:

     
    Cinebench R15 1T turbo on the 10875H runs at 4.9 GHz (reminder: it's rated for a 5.1 GHz 1T boost in the official specs; the extra 200 MHz on top of this comes via Thermal Velocity Boost), and it pulls 35 W doing it. And this is a relatively light workload that, IIRC, doesn't include any AVX. Certainly not AVX2.

    Imagine that: the entire power budget of the 4900HS spent on a single core at 4.9 GHz. In a laptop.

    But I guess it's all part of Intel's strategy to improve the gaming experience. I mean, think of all the RGB you get as your VRMs start glowing under multi-core workloads!
  9. Like
    uzzi38 got a reaction from porina in Userbenchmark does more shady stuff   
    It's also done to reduce idle power draw. SRAM is extremely difficult to power gate, so simply not having as much of it as the desktop parts do helps.
  10. Like
    uzzi38 got a reaction from leadeater in Userbenchmark does more shady stuff   
    It's also done to reduce idle power draw. SRAM is extremely difficult to power gate, so simply not having as much of it as the desktop parts do helps.