About D13H4RD

  • Title
    The photography enthusiast
  • Birthday Aug 29, 1997

Contact Methods

  • Discord
  • Steam
  • UPlay
  • Xbox Live

Profile Information

  • Location
    Somewhere in the Abyss
  • Gender
  • Interests
    Photography, automobiles, technology, anime, exploring
  • Occupation


  • CPU
    AMD Ryzen 7 3700X
  • Motherboard
    ASUS TUF Gaming X570-Plus Wi-Fi
  • RAM
    32GB (16GBx2) XPG Spectrix D60G DDR4-3200
  • GPU
    MSI GeForce RTX 2070S Gaming X Trio
  • Case
    Cooler Master MasterBox NR600
  • Storage
    512GB XPG SX8200 Pro + 2TB Seagate Barracuda Compute 7200RPM
  • PSU
    750W Corsair RM750i
  • Display(s)
    25" Samsung S25HG50 + 24" LG 24MK600M
  • Cooling
    be quiet! Dark Rock 4 with Kryonaut
  • Keyboard
    Logitech G810 Orion Spectrum
  • Mouse
    Logitech G502 Proteus Spectrum
  • Sound
    ASUS Cerberus + Sony WH-1000XM2
  • Operating System
    Windows 10 Pro
  • Laptop
    Lenovo IdeaPad Y410P
  • PCPartPicker URL

Recent Profile Visitors

51,550 profile views
  1. My thoughts exactly. The only plausible reason I can think of for why they're doing this is to get some level of publicity. But I also really feel like Tiger Lake's marketing is overwhelmingly skewed toward comparisons with rival offerings, with very few against their own past-generation products. I get that companies like AMD and NVIDIA do the same, but at least AMD seems to be learning (at least on Ryzen) that you also have to compare against your past architectures to showcase how much of an improvement it's become. NVIDIA... hasn't rea…
  2. The 3080's encoder is the same as the RTX 20 series' encoder. And it has been reviewed very positively as the quality is pretty good, certainly more than sufficient for streaming.
  3. Intel... what are you doing, man?


    You finally make something I'm actually interested in, but your marketing men decided it'd be better to run a smear campaign against your main rival rather than push the reasons I should buy laptops with your Tiger Lake processors...

  4. I think a 5600X would be fine since you can take advantage of NVENC on the 3080 to help offload some of the encoding load from the CPU.
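    For what it's worth, offloading to NVENC is usually just a settings change (pick the NVENC encoder in OBS). For a command-line transcode, a rough ffmpeg sketch would look like this; the file names and bitrate are placeholders, and it assumes an NVIDIA GPU plus an ffmpeg build with NVENC support enabled:

```shell
# Sketch only: hand the H.264 encode to the GPU's NVENC block instead
# of the CPU. Assumes an NVIDIA GPU, its driver, and an ffmpeg build
# compiled with NVENC support; file names are placeholders.
#   -c:v h264_nvenc  selects the NVENC hardware encoder
#   -preset p5       a middle-of-the-road quality/speed preset
#   -b:v 6M          ~6 Mbps target, a common 1080p streaming bitrate
#   -c:a copy        passes audio through untouched (no CPU cost there)
ffmpeg -i gameplay.mkv -c:v h264_nvenc -preset p5 -b:v 6M -c:a copy stream.mp4
```

    The CPU then only feeds frames to the encoder rather than crunching the encode itself, which is why a 5600X has plenty of headroom left for the game.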
  5. And that's the really confusing part for me. When you get past all the "INTEL BAD!!11!" hate the internet seems to have these days (along with the genuinely bad naming scheme; what is 1185G7 even supposed to mean???), the Tiger Lake mobile CPUs really are interesting and potentially quite compelling. The Iris Xe graphics especially are a major marketing bullet point on their own. But much of the marketing material I've seen lately hasn't really done much to reflect Tiger Lake's big gains over past Intel architectures, and especially their 10nm proc…
  6. Okay, what the heck is up with me receiving DMs from bot accounts about explicit material? 



    1. Arika S
    2. Cyberspirit


      Huh, only had that happen once, like 3 years ago? Is it common nowadays?

    3. soldier_ph


      Wonder what happens if I take Linus in his LTTStore.com underwear as my pfp 🤔



  7. At this point, Tiger Lake is going to be a case of a potentially interesting product smeared by an exceptionally questionable (at best) marketing strategy. From the entire Tiger Lake launch presentation to this recent one, I really feel like Intel's current marketing strategy can be summed up as "mimic what our fanboys have been doing in YouTube comments". That is to say, find strawman arguments to depict competitors' offerings as inferior rather than trying to convince people that our products are superior. And that's exactly what Tiger Lake's marketi…
  8. Finally had a real-go with DLSS 2.0


    It's definitely something that really shows its worth at higher resolutions. At 1080p, while you get benefits, they are small and you do notice a difference in rendering. At higher resolutions, especially 4K... it's actually fantastic how good it is.

  9. Radeon: looks down on machine-learning driven upscalers like DLSS


    Also Radeon: So uh, DirectML?


    6800XT is really good tho

    1. AluminiumTech


      AMD has had really awesome raw compute performance for a while. That's not been a weak point of AMD. The weak point of AMD was how their architectures were too focussed on compute and not enough on gaming.

    2. D13H4RD


      I wasn't really talking about raw compute performance though, at least directly.


      Was more-or-less chuckling at how the marketing spokesperson seemed to scoff at machine-learning driven upscalers while the engineering team is already working with Microsoft on implementing DirectML.

  10. I think a lot of it is because people are drawn to graphs like the one shown below; I have always said to take any performance graph released by the manufacturer with a grain of salt, as it is most likely cherry-picked and comes with other limitations like arbitrary figures. However, I also feel like this extends to performance data from reputable reviewers like Hardware Unboxed and Gamers Nexus, where the graphs tend to make the product look better than it actually is in reality (that is, when you're actually using it). On the topic of…
  11. I've felt that way for a while now. I still use a phone from 2017 because I don't feel like I need anything newer. And even though I would love to upgrade my PC to a Ryzen 9 5900X and an RTX 3080, when I reflect on how my current system has handled my requirements, leaving me more than satisfied many, many times, I keep asking myself, "why upgrade when what you already have exceeds your requirements?" I think this is a sign of maturity for a lot of us tech enthusiasts: when we finally decide "what I've got is more than good enough for my needs" and tone down our cr…
  12. You heard it here first guys.


    CPU prices don't make a difference even if a $249 CPU goes up to $299.


    1. Den-Fi


      Have you not heard? You're spending $50 more, but AMD loves you $100 more. So it's like a HUGE discount.

      You've been on the forum way too long to not understand that this is how the market works.





    2. Senzelian


      Gonna buy a 5900X now for 700€ because prices don't matter.
      If someone tells me the same for GPUs, I might just buy a 3080 for 1000€.

    3. genexis_x


      My opinion:
      Yes, the 5600X and 5800X are slightly overpriced; non-X parts are needed. However, if we compare the 10600K vs the 5600X, the 10600K requires a Z490 mobo (to support high-speed RAM and have beefier VRMs) and a beefier cooler, even without OC. Meanwhile, the 5600X can run on a cheaper B450/B550 board with a cheap aftermarket cooler, also with no OC. The total cost is actually not that far off.


      He/she has a point, but saying "CPU price doesn't make a difference" isn't quite right either.

  13. A small build update


    Moved the case intake fans so they're mounted outside the chassis. Was a bit of a pain, but I managed to get there.


    1. Letgomyleghoe


      What case? That's likely incorrect and stifling airflow.

    2. D13H4RD


      Cooler Master NR600 


      It's actually recommended to mount them this way because of how the case is designed.

  14. Welp, color me interested in the MacBook Air...for once. Seems to hit the exact right notes I want out of my laptop, but I'm gonna have to wait for reviews.
  15. Welp, guess who's looking at buying the ARM MacBook Air