
Status Updates posted by Mira Yurizaki

  1. https://www.nytimes.com/puzzles/tiles

    This is a fun time killer. But you can only play it four times per day unless you subscribe.

    2. Mira Yurizaki


      Aha, I found a challenge: get the lowest possible perfect score. If you can clear out two or more patterns, it only counts as one combo.

       

      So far I got 41.

    3. Mira Yurizaki


      What the heck. They changed the patterns. 😧

    4. Jtalk4456

      The highest I could get was 41, the lowest was 38. It's interesting, but a bit too easy.

  2. Tom's Hardware reports that the Ryzen 9 3950X beats the Core i9-9980XE in Geekbench single core results.

     

    This is starting to confirm a theory I had about Geekbench: why it scores well on Apple SoCs compared to other ARM processors and manages to sneak up on Intel's lower-end offerings. The short of it is that Skylake really only has four execution ports, and they're shared between INT and FP operations. There are four other ports, but they're only for load/store operations.

     

    Contrast that with even the first Zen, which has four INT pipes and four FP pipes. So if Zen 2 has better IPC, I can see that making up for the 200 MHz turbo boost deficit the 3950X has.

    2. Sauron

      Whether it can exactly replace Geekbench isn't really relevant if Geekbench isn't good at what it does. If you want to measure SHA performance specifically, that's fair, but Geekbench is used as a catch-all CPU benchmark and doesn't accurately distinguish how much of the result is due to SHA and how much isn't. I would also argue SHA performance isn't that important beyond a given minimum in general use, particularly on a mobile device.

    3. TopHatProductions115


      @Sauron 

      Quote

      if geekbench isn't good at what it does

      Quote

      geekbench is used as a catch-all cpu benchmark and does not accurately distinguish how much of the result is due to SHA and how much of it isn't

      Quote

      I would also argue SHA performance isn't that important beyond a given minimum in general use

      Sounds like someone needs to start writing their own benchmarking software if that's the case. You can throw together a suite of tools to do the job if you're really that serious about accurately measuring overall CPU integer performance and compute capabilities. Otherwise, can't help you there. 

    4. TopHatProductions115
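The ports-versus-pipes argument in the status above boils down to simple arithmetic: single-thread performance is roughly IPC times clock speed. A minimal Python sketch, with made-up IPC figures purely for illustration (neither value is a measured number for either chip):

```python
def throughput(ipc, clock_ghz):
    """Crude single-thread estimate: instructions retired per second."""
    return ipc * clock_ghz * 1e9

# Hypothetical values for illustration only -- not measured IPC figures.
baseline = throughput(ipc=1.00, clock_ghz=4.5)   # higher boost clock
contender = throughput(ipc=1.10, clock_ghz=4.3)  # ~10% IPC uplift, 200 MHz lower boost

# A ~10% IPC advantage more than covers a 200 MHz (~4.4%) clock deficit.
print(contender / baseline)  # ~1.05
```

Under these assumptions the chip with the lower boost clock still comes out about 5% ahead, which is the shape of the argument being made about Zen 2 versus the 9980XE.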
  3. I was curious about something and dug around to see if I could find anything regarding performance loss between DirectX 9.0c and something older.

     

    Unfortunately the well is kind of dry here, but Splinter Cell: Chaos Theory provides a good place to look:

    https://www.beyond3d.com/content/articles/88/

    https://www.extremetech.com/computing/75570-benchmarking-splinter-cell-chaos-theory/3

     

    The tl;dr: enabling all the 9.0c features makes performance tank on the GeForce 6. Most of that was due to HDR rendering.

  4. Friendly reminder: The RTX 2070 starts at $500, not $600

     

    https://www.evga.com/products/product.aspx?pn=08G-P4-1071-KR

    1. Mira Yurizaki

      I suppose I'm going to sound like an NVIDIA apologist, but eh, if you're going to make a price comparison, at least do it right.

    2. Dan Castellaneta


      No, you make a completely valid point: AMD fanboys here love to twist shit to work in their favor. It's fine to compare GPUs from Nvidia and AMD, but motherfuckers need to get their facts straight before claiming things.

  5. Although I could take away something from the Project Scarlett announcement video: they mentioned hardware accelerated ray tracing. Did they just indirectly confirm Navi, or whatever AMD's future GPU will be, will have it?

    1. Dan Castellaneta


      ...potentially? It wouldn't shock me if Navi ends up with a proper hardware-based ray/path tracing pipeline, especially if developers aim for nicer and more realistic scenery in their games.

    2. fpo

      My personal favourite part is the SSD. They’re actually using it as virtual RAM. xD

  6. I just watched that Xbox Project Scarlett announcement trailer. My only thoughts are:

     

    • They basically said what Mark Cerny said about the PS5.
    • Have any of those people who were speaking been paying attention to what's been going on in the PC hardware space?
    1. fpo


      Did Microsoft announce the 4th Xbox? 

    2. Mira Yurizaki


      @fpo

       

    3. fpo


      Take a shot every time they say “we want” 

       

      All of the technologies they’ve listed have been around for a long time. 

  7. Dark mode appears to be a purely aesthetic choice, except on certain screens, where it's a battery-saving one: http://flip.it/lQERmP

    1. TopHatProductions115


      An article on Dark Mode, in blinding white :P 

       

      Evil...

  8. More Unreal Engine ray-tracing stuff :B

  9. I feel like I'm going to stir up the hive with this one. But oh well.

     

    So, while looking into whether AMD cards ever ran a DXR fallback layer compatible app, I came across this article (https://www.pc-better.com/dxr-on-radeon-vii/) from someone who ran a Radeon VII against the DXR samples that Microsoft has on GitHub. Then something interesting caught my eye:

     

    RadeonVII-DXR-2.png

     

    The two statistics I'm looking at are the FPS (at 10) and the Million Primary Rays per second, at 7.69.

     

    So why do I bring this up? Well, there was a thread on Reddit of people who ran the same demo on non-RTX NVIDIA hardware back in October 2018. The particular demo in the screen cap is the "reflection test". I don't think any GPU that was tested came in below 10 million primary rays per second.

     

    It makes me wonder what's going on here. My best guess at the moment is DXR is optimized to run on the graphics pipeline, which is a weaker point in GCN's architecture. I'm sure if you managed to convert this to compute functionality (which AMD's Radeon Rays ray tracing API leverages using OpenCL), the results would be much better.
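As a quick sanity check on the two figures pulled from the screenshot (10 FPS and 7.69 million primary rays per second), a couple of lines of arithmetic show what that works out to per frame and against the Reddit thread's rough floor:

```python
mrays_per_second = 7.69e6  # "Million Primary Rays/s" from the Radeon VII screenshot
fps = 10                   # frame rate from the same capture

rays_per_frame = mrays_per_second / fps
print(f"{rays_per_frame:,.0f} primary rays per frame")  # 769,000

# Compared against the ~10 Mrays/s floor seen in the Reddit thread:
print(mrays_per_second / 10e6)  # 0.769, i.e. about 23% below that floor
```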

     

  10. That MPX slot in the new Mac Pro reminds me of the VESA Local Bus.

     

    Funnily enough, VLB was also designed with graphics cards in mind.

    1. Mira Yurizaki


      Also Apple is reusing initialisms. MPX was the name for the CPU bus used in PowerPC G4 processors.

  11. Why are people using consumer hardware pricing to judge the Mac Pro?

     

    I mean, I'm looking around at all of the professional workstations that other system builders offer, and I can't even configure anything like the Mac Pro, due to it having two graphics cards. And even when I throw in the rest of the specs, the only one that seems to be cheaper is Lenovo, but they seem to stop at the 6-core Xeon W. Dell and HP both go past $5,000, and you still need another video card.

     

    I mean if you want to build a workstation for yourself with consumer grade parts, that's fine. But then the Mac Pro isn't aimed for you either.

     

    And no, I'm not defending its price. I'm arguing that if you're going to attack its price, attack it properly.

    2. FloRolf

      What are "professional grade" parts? A Xeon isn't any more professional than a Core i series; it's just the name. 

    3. TopHatProductions115


      @FloRolf I think Mira's referring to the intended audience/consumer. Xeon, Threadripper, and EPYC are all Enterprise/Professional parts, while consumer Core and Ryzen parts aren't a part of the same market segment...

    4. Mira Yurizaki

      "Professional/workstation grade" refers to its intended audience, like @TopHatProductions115 mentions.

      However, there are additional tweaks to said hardware because it's expected to run continuously and reliably for extended periods of time. Consumer-grade parts are not designed around this expectation.

  12. If anyone's curious about how bottlenecks work, I posted a blog about it: https://linustechtips.com/main/blogs/entry/1625-how-does-the-cpugpu-bottleneck-work/

     

    Note: This is not a troubleshooting guide. So don't ask me questions about "will X bottleneck Y?"

    2. LukeSavenije


      will my pentium 4 bottleneck my crossfired v340s?

    3. LukeSavenije


      seriously speaking tho, release it as a topic, I'll send some mods after you

    4. TopHatProductions115


      Will my overclocked Xeon W-3175X bottleneck mah NVlink'd Titan RTX's?
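For anyone who doesn't want to click through, the core idea of a CPU/GPU bottleneck can be sketched in a few lines: treat the CPU and GPU as stages in a pipeline, and the slower stage paces the frame rate. This is my own simplification (it ignores buffering and overlap), not code from the blog:

```python
def fps(cpu_ms, gpu_ms):
    """Simplified pipeline model: the slower stage sets the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=8.0, gpu_ms=16.7))   # GPU-bound: ~60 FPS; a faster CPU wouldn't help
print(fps(cpu_ms=20.0, gpu_ms=16.7))  # CPU-bound: 50 FPS; a faster GPU wouldn't help
```

The point of the model: which part is "the bottleneck" depends entirely on the per-frame cost of each stage in that particular game at those particular settings, which is why "will X bottleneck Y?" has no general answer.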

  13. If you wanted to get a reference point, is it fair to let a GPU that has boost clock capabilities be benched with said boost speeds enabled?

     

    Because boost speeds are variable and aren't exactly guaranteed.

    2. Dan Castellaneta

      I personally would just lock the card to its standard speed for consistency, and then note that your frame rates may be higher or lower depending on whether your GPU clocks up for a short period.

    3. Mira Yurizaki


      @Mr. horse "OCing" any part these days usually means raising the boost clock limit. If you are actually OCing the part, you would be changing the base clock speed from stock.

       

      Turbo boosting is not supposed to be guaranteed. Base clocks are guaranteed, provided the hardware is operating within the manufacturer's specifications.

    4. TopHatProductions115

      Lock it to stock clocks, like @Dan Castellaneta says. Either that, or set a range that goes from stock to 250 MHz above stock. Most GPUs can probably handle that.

  14. More evidence to throw in to the pile of why I question VRAM utilization reports as a measure of what a game needs in order to run smoothly (from https://www.guru3d.com/articles_pages/the_division_2_pc_graphics_performance_benchmark_review,6.html)

     


    If the game really needed and used 12GB of VRAM, why isn't it a slogfest on the RTX 2080 and GTX 1660, which have far less than that?

     

    Which also brings up another point: we assume that when a game maxes out the video card's VRAM and it wants/needs more, it'll just start swapping out data from VRAM to system RAM. And when that happens we assume it's like what happens when the computer itself starts swapping out data in system RAM to storage: it'll start hiccuping and becoming unresponsive.

     

    However, in modern games I don't think everything in VRAM is absolutely necessary at a given time when rendering a frame. The only things I know are absolutely necessary are the render targets and the base-quality assets for textures and meshes. The renderer doesn't strictly need the higher-quality assets if it has the base-quality ones to render the image; the image will just look fugly. So I think there is an absolute minimum amount of VRAM that's needed at a given resolution, but beyond that, the game uses the rest simply as cache space, because locality is still important, and if you can use the RAM, why not?
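The minimum-plus-cache idea above can be put into numbers. Every figure here is invented for illustration, not measured from The Division 2 or any real game:

```python
# Hypothetical working set at a given resolution -- invented numbers.
render_targets_gb = 1.5  # must be resident to produce a frame at all
base_assets_gb = 2.0     # lowest-quality textures/meshes the renderer can fall back on
vram_gb = 12.0           # what the card offers / what utilization tools report as "used"

hard_minimum = render_targets_gb + base_assets_gb
opportunistic_cache = vram_gb - hard_minimum  # higher-quality mips, prefetched assets

print(hard_minimum)         # 3.5 -- below this, the game can't render properly
print(opportunistic_cache)  # 8.5 -- evicting this costs image quality, not stability
```

Under this split, a utilization tool reporting 12GB "used" tells you almost nothing about the hard minimum, which is the crux of the complaint in the status.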

  15. The next time you want to ask "What happens if I do...?" and you can do it, why not do it and see what happens?

    2. ARikozuM


      I'll press all the "F" keys for you. 

    3. PacketMan


      "Hey guys can I run DOTA2 in this computer? I'm too lazy to try..." 🤦‍♂️

    4. fpo


      All programming section threads in a nutshell.

  16. When you play games on your local computer, but the files are stored on your NAS. 🤔

     

    (just showing that, yes, you can play games over the network, assuming the network path is mapped to a drive)

    call-of-duty-network.png

    2. ARikozuM

      You only need a 100 Mb/s dedicated line from your PC to the server (through a switch or otherwise). I've only had trouble with HuniePop. 

      Spoiler

      Which is to say no troubles. 

       

    3. TopHatProductions115


      @ARikozuM That's the problem - I don't have that available. I'm stuck with Mbps. Check my original reply...

    4. ARikozuM

      You don't have anything then? Poor guy... 

       

      F
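To put numbers on why the link speed in the comments above is what matters for playing off a NAS, here is a quick estimate of ideal transfer time. The 2 GB figure is arbitrary (real games load different amounts), and protocol overhead is ignored:

```python
def load_seconds(size_gb, link_mbps):
    """Ideal transfer time: payload bits divided by link bit rate."""
    bits = size_gb * 8e9  # gigabytes -> bits (decimal GB)
    return bits / (link_mbps * 1e6)

print(load_seconds(2.0, 100))   # 160.0 s for 2 GB of assets on fast Ethernet
print(load_seconds(2.0, 1000))  # 16.0 s on gigabit
```

That order-of-magnitude gap is why a dedicated 100 Mb/s line is tolerable for loading screens but anything slower starts to hurt.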

    2. PacketMan

      @WikiForce The ones that missed the news no longer have the right to know

    3. lewdicrous


      You heathens are not up to date with your news?!

       

      /s

    4. TopHatProductions115


      I like this story, no matter the age :3

  17. How to make Microsoft work on an older OS: find a vulnerability that can make their OSes slaves to a worm (https://arstechnica.com/information-technology/2019/05/microsoft-warns-wormable-windows-bug-could-lead-to-another-wannacry/)

     

    Non-silly text: Microsoft is patching Windows XP and Server 2003 to fix a vulnerability that could let another WannaCry worm spread. The vulnerability also affects Windows 7. It does not exist on Windows 8 or 10.

    1. Show previous comments  1 more
    2. Schnoz

      @Techstorm970 I once accidentally unleashed WannaCry on my school, but I closed it in Task Manager before it encrypted that many files...

       

      fear me mortals

    3. ARikozuM


      Windows 10 installation is a WannaCry unto itself. 

    4. fpo


      @Schnoz, earth just lost its best defender. 

  18. When you're watching a TV show that was produced in Japan and it's using background music from an anime. And you know this because you recognized it.

    1. TVwazhere


      Quote

      When you're watching a TV

      Stop watching me! Creep!

      One day I'll stop making these jokes. Today is not that day.

  19. AnandTech went back and retested the i7-2600K against the i7-7700K (Intel's last mainstream quad-core) and for craps and laughs, the i7-9700K: https://www.anandtech.com/show/14043/upgrading-from-an-intel-core-i7-2600k-testing-sandy-bridge-in-2019/4

     

    If you want the tl;dr version in picture form (note the baseline is the i7-2600K at 4.7 GHz):

    CPU Results.png

    Gaming Results.png

    Gaming Percentile.png

    2. Mira Yurizaki

      Oh, it's worth pointing out that the GPU they used in the gaming tests was an MSI GTX 1080 Gaming X.

    3. VegetableStu


      ... fair. 🤔

    4. Ashiella


      AMD needs a graph.

  20. Remember a few years ago when Tim Sweeney was criticizing Microsoft for trying to create a walled garden or whatever (https://gamerant.com/epic-games-ceo-critizes-microsoft-windows-platform-initiative/)?

    Quote

    Part of Sweeney’s argument can be seen from the fact that Remedy’s third-person action game, Quantum Break, is coming to the PC, but it will not be offered or supported on Steam, one of, if not the biggest distribution services on the platform. Instead, players will need to purchase the game through the Windows 10 Store or pick up a copy on Xbox One which also comes with a code for the PC version as well.

    🤔

    1. Dan Castellaneta


      Rules for thee but not for me.

    2. imreloadin


      That was before the hookers and blow money that came from Fortnite...

  21. > Sees ad for "Blink" eye drops on YouTube, showing how fast it acts.

    > Notices it's at 60 FPS

     

    I see what they did there.

  22. According to this Gizmodo article/video, we can blame the following for gamer industrial design:

    • Sega for making the Genesis/Mega Drive stand out instead of blend in
    • Sci-fi movies that designed things to have bright RGB lines and stuff (Alienware's general manager specifically called out Tron)

    And I suppose somehow Razer is the lord and savior for shifting some designers towards less gaudy designs.

    1. TopHatProductions115


      oof - no wonder I can't buy gamer gear nowadays. 2 much RGB...

    2. ARikozuM


      I wouldn't mind seeing the Super NES material design again. Something about gray (sorry, Siri, does grey~ bother you? Stupid cunt.) and purple just looks so good. 

  23. Dumb IT joke: what's a moth's preferred way to set up a server?

     

    Spoiler

    A LAMP stack

     
