BlakSpectre

Member
  • Content Count

    37
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About BlakSpectre

  • Title
    Member

Profile Information

  • Gender
    Not Telling
  • Interests
    Spending too much time on tools.
  • Biography
    I have loved all tech long enough to conclude that I hate all tech equally.
  • Occupation
    Software Developer

System

  • CPU
    Ryzen 3800X (gaming and VM servers)
    4770K (don't overclock; Dell paired it with a non-overclocking motherboard for some reason. I upgraded everything else, so I ended up with two systems; this one is dedicated to games with kernel-level anti-cheat, due to privacy concerns)
  • Motherboard
    MSI X570 Gaming Pro Carbon
    Dell custom board from my old prebuilt
  • RAM
    32 GB DDR4-3600
    16 GB DDR3
  • GPU
    RTX 2080
    GTX 960
  • Case
    Riotoro Morpheus / SilverStone GD09 HTPC (for future use)
    Dell prebuilt chassis
  • Storage
    Samsung EVO NVMe 500 GB, Sabrent NVMe 1 TB, 6 TB WD 7200 RPM, 8 TB Seagate 5400 RPM
    256 GB Samsung SSD, 2 TB HDD
  • PSU
    Some EVGA 1000 W
    Some EVGA 650 W
  • Display(s)
    LG 4K (43US79 or something), LG 21.5-inch 4K UltraFine, LG 5K UltraFine, Dell 27-inch 4K (U2718Q iirc), Dell 24-inch 1440p UltraSharp, and some other random monitors
  • Cooling
    Whatever came with the CPU; I don't overclock, so it doesn't matter
  • Keyboard
    Currently using: Massdrop CTRL

    Logitech Pro (tenkeyless), Leopold FC660M, Microsoft Arc, Logitech trackpad keyboard. Others: Das Keyboard Professional, Microsoft Wave, K811, K380, K480, Apple wired keyboard, etc.
  • Mouse
    Currently using: MX Master, G602, MX Anywhere. Others: M65 Pro RGB, G502, G303, DeathAdder, and others I have acquired over time
  • Sound
    Sennheiser HD 58X Jubilee, HD 6XX, Bose QC 35, AirPods, Sound Blaster Katana (and other headphones)
  • Operating System
    Windows for gaming, Mac and Linux for work. The hardware info is for my gaming machines.
  • Laptop
    MBP 15 (2017), MBP 13 (2012), XPS 15 (2019, running Linux), XPS 13 2-in-1 (2016? Bought it used, don't really remember the model)
  • Phone
    iPhone XS, Note 10+, Pixel 3 XL

Recent Profile Visitors

424 profile views
  1. Mate, can you re-read my comment? Because you are confusing me. My comment was in response to someone saying 8K decoding is stupid because there is no 8K content. 8K is useful to video editors now, and to general consumers within 2-4 years. Therefore, if catering to a small number of users (by supporting 8K decoding) is stupid, then more than one USB port on a laptop is also stupid; most people just use one (two if they need one for charging), and thus my disappointment with the paltry I/O bandwidth is stupid too. Don't make this about Mac vs non-Apple machines if possible.
  2. I don't entirely understand your comment. If you want to do a Mac vs PC thing, that does not add anything to the conversation in this thread. Comparing a 10-watt SoC to a dedicated graphics card is petty. And I agree more options are better, including 8K decoding.
  3. You are also on the LTT forum; you are a small percentage of the population. Most people don't need more than one USB port. By your logic, since most people don't need 8K playback (limited to the ultra rich and video editors) and adding it is of no use, adding more than one USB port for a small percentage of the population is also of no use. I strongly disagree with you regarding 8K playback being unnecessary.
  4. True. I forgot, even though the reviews of the device have been excellent (for the price and ignoring QA issues).
  5. What is the disadvantage of being future-proof? Or of designing pipelines for video editors using the machine? Thinking along those lines, my disappointment is completely baseless... what is the advantage of having multi-monitor support when 90% of users will never use it, or of having two Thunderbolt controllers or better I/O bandwidth? Most people won't need more than a gigabit connection or more than one USB port.
  6. You mean the 3 people that use Linux desktops. JK. I would have loved to see Linux performance on the Surface Pro X, but it is a custom CPU. For now we only have the Raspberry Pi on the consumer side of things.
  7. For clarification, I don't disagree with you that the decoder is impressive. I just meant it is not a fair comparison. One way or the other, the ARM transition will work out for Apple.
  8. No, based on my understanding, a chip being able to play 8K video does not mean anything. I don't disagree that the chip itself is impressive, but using 8K video processing capability as a benchmark for this architecture does not mean anything when comparing to x86. It is not a trivial achievement, but it is an unfair comparison. You can design pipelines on a chip specifically for hardware decoding, and that is what Apple is doing; similar to how software RAID and hardware RAID are two very different things, or Intel Quick Sync (see the sketch after this list). This is hardware decoding of video for playback, not how ge
  9. Yeah, Apple is not ever going to share its CPUs, but a huge part of the performance-per-watt uplift Apple is seeing is down to the chip being ARM. The reason people have been afraid of moving to RISC architectures is the amount of software work it takes and the fragmentation it will introduce. But this will likely force the industry to start looking into switching architectures... or so I am hoping... Microsoft has tried it for years and they fall on their face every time because they can't optimize Windows for it. There is already enough money going into companies developing ARM-based solutions for indu
  10. I expected the first gen to be bumpy, but the whole I/O limitation is what is disappointing me; it seems to be lagging behind the rest of the SoC. Guess I am also mildly concerned that they are going to leave that as the status quo for future generations. With that said, they also shipped the first MacBook Air with a 1.8-inch HDD and a mini-DVI (iirc) port.
  11. I was looking forward to the M1 chip, expecting it to massively boost consumer-side movement towards RISC CPUs. While I credit the Raspberry Pi for starting it, once Apple does something, it becomes mainstream. (RIP Pebble, I loved you.) I think the boost will still happen, but I was expecting the M1 chips to do better, so I am disappointed in the chip. Don't get me wrong, the performance is good and it is essentially a beta product like the first MacBook Air, but that was expected given the performance of the iPad and the "accelerators" it has for heavy tasks like vid
  12. Mate, I never said you demeaned people for buying what they wanted. It gets annoying when people keep telling me how I overpaid for a Mac like I am stupid, but as long as they are not insulting, I don't think they belong to the anti-Apple cult. I did the math: I have like 11 computers collected over time, 4 of them are Macs, and unless I am gaming they get used the most. I know what I am buying into. The $800 a year for my Mac primary is well worth having a managed Unix env, for me. It is similar to the $700 a year I end up paying for the XPS (my Linux primary) and have to deal with i
  13. I used to think that the Apple cult was much better than the anti-Apple cult because they did not actively insult, degrade, and demean people on "the INTERNET" (except blog writers) like the anti-Apple cultists do. This whole thing proved that I was wrong... the gap between them is relatively minor. On an unrelated note: it is a bloody RISC chip, built on TSMC 5nm. It had better be more power-optimized for the things it is designed for; there is a reason every Intel attempt at mobile x86 failed and every phone uses ARM chips. That does not take away from the engineering Apple did, but I think k
  14. I mean, he wasn't wrong... he said the event was a dumpster fire, that he was cautiously optimistic about the chips and that in the long term it would work out, and he cracked a couple of jokes like he does about AMD and Intel. I used to think Apple haters were the only people that get touchy about things and whine all over the place, and that Apple users were generally calmer on social media except for a few quips here and there... this whole BS proved me wrong about it... it is like they did not watch the whole video and just took his jokes and description of the event as offensive against the perfect M
  15. I beg to differ on the "more traditional" forum members being haters. While I have had brushes with some of the Apple/console haters, I have to say many, just like me, just love tech. I am not sure why I am posting this message just to defend them, but here I am. Personally I think Linus loves tech and knows he is slightly biased against Macs, so when it comes to serious performance testing of the donglebooks he defers to Anthony almost entirely and does not interject like he would with shitdos-based systems.
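
A note on the hardware-decode point in post 8: dedicated decode blocks show up to software as a capability you query and hand compressed frames to, not as general compute. As a minimal sketch only (assuming macOS and the VideoToolbox framework, since the discussion is about Apple's chips, and picking HEVC as the codec an 8K playback path would most likely lean on), an app can ask whether a hardware decoder is present before committing to that path:

    import CoreMedia
    import VideoToolbox

    // Minimal sketch (macOS 10.13+ / iOS 11+): ask VideoToolbox whether this
    // platform has a hardware decoder for a given codec. The answer reflects
    // the SoC's fixed-function decode block, not the general-purpose cores,
    // which is the distinction the post is drawing.
    let hevcInHardware = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
    print("Hardware HEVC decode available: \(hevcInHardware)")

Whether that call returns true is a property of the decode hardware on the chip; it says nothing about how the same chip handles general-purpose workloads, which is why 8K playback makes a poor cross-architecture benchmark.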