BlakSpectre

Member
  • Posts

    39
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Not Telling
  • Interests
    Spending too much time on tools.
  • Biography
I have loved all tech long enough to conclude that I hate all tech equally.
  • Occupation
    Software Developer

System

  • CPU
Ryzen 3800X (gaming and VM servers)
4770K (don't overclock; Dell paired it with a non-overclocking motherboard for some reason. I upgraded everything, so I ended up with 2 systems, this one dedicated to games with kernel-level anti-cheat due to privacy concerns)
  • Motherboard
MSI X570 (Gaming Pro Carbon)
Dell custom board from my old prebuilt
  • RAM
    DDR4 32GB 3600
    DDR3, 16 GB
  • GPU
    RTX 2080
    GTX 960
  • Case
Riotoro Morpheus / SilverStone GD09 HTPC (for future use)
    Dell prebuilt chassis
  • Storage
Samsung EVO NVMe 500 GB, Sabrent NVMe 1 TB, 6 TB WD 7200 RPM, 8 TB Seagate 5400 RPM
256 GB Samsung SSD, 2 TB HDD
  • PSU
Some EVGA 1000 W
Some EVGA 650 W
  • Display(s)
LG 43-inch 4K (43US79 or something), LG UltraFine 21.5-inch 4K, LG UltraFine 5K, Dell 27-inch 4K (U2718Q iirc), Dell 24-inch 1440p UltraSharp, some other random monitors
  • Cooling
    what came with the CPU, don't overclock so whatever
  • Keyboard
    Currently using: Massdrop CTRL

Logitech Pro (tenkeyless), Leopold FC660M, Microsoft Arc, Logitech trackpad keyboard. Others: Das Professional, Microsoft Wave, K811, K380, K480, Apple wired keyboard, etc.
  • Mouse
Currently using: MX Master, G602, MX Anywhere. Others: M65 Pro RGB, G502, G303, DeathAdder, and others I have acquired over time
  • Sound
Sennheiser HD 58X Jubilee, HD 6XX, Bose QC 35, AirPods, Sound Blaster Katana (and other headphones)
  • Operating System
Windows for gaming, Mac and Linux for work. The hardware info here is for my gaming machine.
  • Laptop
MBP 15 (2017), MBP 13 (2012), XPS 15 (2019, on Linux), XPS 13 2-in-1 (2016? Bought it used, don't really remember the model)
  • Phone
iPhone XS, Note 10+, Pixel 3 XL

Recent Profile Visitors

686 profile views

BlakSpectre's Achievements

  1. I understand the pain; your partner seems to be in a much worse situation than I am. I hope she gets out of it. Really sorry to hear that, mate.
  2. I got my dream job (or as close to it as I am in a position to get). Woot!
  3. It has been awesome seeing Taran on LTT, however sparingly we did see him. All the best, Taran, if you ever read this. And CONGRATS.
  4. Thanks, I am glad you are free of that too. Cheers!
  5. Ladies and gentlemen of LTT: as of this morning, I am no longer bound by my visa to my current company. I can switch jobs whenever I want; I did not have that freedom for 3.5 years. They have been underpaying me by about 40%, still "competitive" but lower than the offers I got. I am so happy and free. I sent some DMs and shared it with a few friends, but not many understand, so I just needed to share it with some people. Thanks for reading, and go about your day lol.
  6. Mate, can you re-read my comment? Because you are confusing me. My comment was in response to someone saying 8K decoding is stupid because there is no 8K content. 8K is useful to video editors now and, within 2-4 years, to general consumers. Therefore, if catering to a small number of users (by supporting 8K decoding) is stupid, then more than one USB port on a laptop is also stupid, since most people just use one (two if they need one for charging), and thus my disappointment with the paltry I/O bandwidth is stupid too. Don't make this about Mac vs non-Apple machines if possible.
  7. I don't entirely understand your comment. If you want to do a Mac vs PC thing, that does not add anything to the conversation in this thread. Comparing a 10-watt SoC to a dedicated graphics card is petty. And I agree more options are better, including 8K decoding.
  8. You are also on the LTT forum; you are a small percentage of the population. Most people don't need more than one USB port. By your logic, since most people don't need 8K playback (limited to the ultra rich and video editors) and adding it is of no use, adding more than one USB port for a small percentage of the population is also of no use. I strongly disagree with you regarding 8K playback being unnecessary.
  9. True. I forgot, even though the reviews of the device have been excellent (for the price and ignoring QA issues).
  10. What is the disadvantage of being future-proof, or of designing pipelines for video editors using the machine? Thinking along those lines, my disappointment is completely baseless... what is the advantage of having multi-monitor support when 90% of users will never use it, or having 2 Thunderbolt controllers, or better bandwidth for I/O? Most people won't need more than a gigabit connection or more than one USB port.
  11. You mean the 3 people that use Linux desktops? JK. I would have loved to see Linux performance on the Surface Pro X, but it is a custom CPU. For now we only have the Raspberry Pi on the consumer side of things.
  12. For clarification, I don't disagree with you regarding the fact that the decoder is impressive. I just meant it is not a fair comparison. One way or the other, the ARM transition will work out for Apple.
  13. No, based on my understanding, a chip being able to play 8K video does not mean anything on its own. I don't disagree that the chip itself is impressive, but using 8K video processing capability as a benchmark for this architecture does not mean anything when comparing it to x86. It is not a trivial achievement, but it is an unfair comparison. You can design pipelines on a chip specifically for hardware decoding, and that is what Apple is doing, similar to how software RAID and hardware RAID are two very different things, or Intel Quick Sync. This is hardware decoding of video and playback, not how general-purpose processors do it (see the decode sketch after this list). With that said, my disappointment is based on the interfaces, and I agree with you that the chip performs impressively well. The GPU specifically is substantially better than I expected.
  14. Yeah, Apple is never going to share its CPUs, but a huge part of the performance-per-watt uplift Apple is seeing comes down to the chip being ARM. The reason people have been afraid of going to RISC architectures is the amount of software work it takes and the fragmentation it will introduce. But this will likely force the industry to start looking into switching architectures... or so I am hoping... Microsoft has tried it for years and they fall on their face every time because they can't optimize Windows for it. There is already enough money going into companies developing ARM-based solutions for industrial applications. Or maybe my hoping this happens for notebooks is messing with my brain... who knows lol. (I agree with everything you said, just adding that this impacts the industry broadly.)
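
To illustrate the hardware vs. software decode point in item 13: a minimal sketch, assuming ffmpeg is on the PATH and using a hypothetical clip name sample_8k.mp4. It times a decode-only run on the general-purpose CPU against one routed through the platform's fixed-function decoder (VideoToolbox on a Mac), which is the kind of dedicated pipeline being described; the exact hwaccel name and the speedup will vary by machine.

    # Rough timing sketch, not a rigorous benchmark.
    # Assumes ffmpeg is installed and "sample_8k.mp4" is a hypothetical test clip.
    import subprocess
    import time

    def time_decode(extra_args):
        """Decode the clip to a null sink and return wall-clock seconds."""
        cmd = ["ffmpeg", "-v", "error", *extra_args,
               "-i", "sample_8k.mp4", "-f", "null", "-"]
        start = time.monotonic()
        subprocess.run(cmd, check=True)
        return time.monotonic() - start

    # List the hardware decode paths this ffmpeg build supports.
    print(subprocess.run(["ffmpeg", "-hwaccels"],
                         capture_output=True, text=True).stdout)

    sw = time_decode([])                            # general-purpose CPU decode
    hw = time_decode(["-hwaccel", "videotoolbox"])  # fixed-function decoder path (macOS)
    print(f"software: {sw:.1f}s, hardware: {hw:.1f}s")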