
Jovidah

Member
  • Posts: 341
  • Joined
  • Last visited

Awards

This user doesn't have any awards


  1. I don't think any sane intelligence agency would come straight out and say they'll vouch for Huawei; that's just setting yourself up for failure. I think most simply haven't looked into it yet, or are more worried about upsetting their relations with China. But we've already seen the UK, Australia and New Zealand (all part of the Five Eyes) taking measures to keep this hardware out of their networks, and now Japan has been added to the list. While I could remotely imagine protectionist / trade-war considerations being a motivation for the US, it's harder to apply that explanation to all these other countries. The most likely agenda is simply to avoid putting a giant trojan horse into your own communication networks.
  2. This is a pretty comprehensive test done by a Dutch website fairly recently; the graphs are easy to read even without Dutch, and Google Translate goes a long way. Keep in mind that higher clock speeds (MHz) often go hand in hand with higher latencies - which is the main reason for the diminishing returns (there's a quick latency calculation after this list). That being said, it does matter at least a little bit, but you have to watch what you're buying: just getting more MHz doesn't necessarily mean better performance. https://nl.hardware.info/reviews/7976/10/28-ddr4-geheugenkits-vergelijkingstest-het-beste-geheugen-voor-coffee-lake-en-ryzen-benchmarks-games
  3. My guess is this isn't about their phones but about their network hardware (the stuff your phone connects to). And considering several other countries already took measures against Huawei network hardware there might actually be some fire to this smoke...
  4. I'm inclined to agree with you there... and if it's 2019, it'll be at the tail end of it. But I wouldn't dare rule it out entirely. On older roadmaps you can find their 7 nm process slated as early as 2017-2018, and while the whole 10 nm debacle might have pushed everything back (even if only because they want to get their money's worth out of 10 nm first before moving on to 7 nm), they can't keep pushing things back indefinitely. If 10 nm doesn't work and their 7 nm does, I just don't see them keeping it on the shelf for another 2 years while AMD is happily pushing 7 nm product to market. Up until now there just hasn't been much need to swallow that pill, as long as their Skylake+++ manages to keep up with AMD well enough for them to be supply-constrained rather than demand-constrained.
  5. Although you won't hear me claiming we will see Intel's 7 nm next year (the odds of that happening are very small), there has already been some rumor-milling about Intel tossing their 10 nm in the bin and going straight to 7 nm (for example over at SemiAccurate). And throughout the whole 10 nm saga, the problems were always accompanied by statements that 7 nm progress wasn't affected. Considering their 7 nm process goes for EUV instead of trying to stretch conventional lithography with complicated multipatterning - which is the root of all their problems with 10 nm - this isn't altogether unlikely. But that leaves the question: when is their 7 nm ready for production? I'd put my money on 2020 for Intel's 7 nm too, but if anything the problems with 10 nm won't delay it: they will only bring it forward. I think the whole point of the new chiplet architecture is to blur these lines between iGPU and CPU/GPU on a substrate. I agree that the lack of GDDR will make a big difference in performance, but on the semantics side it's really splitting hairs.
  6. Who knows, maybe Nvidia at 7 nm? While everyone's talking about AMD's 7 nm CPUs and GPUs, it's not like Nvidia has suddenly lost TSMC's phone number. I can imagine one of the reasons the Turing timeline feels somewhat sped up is that 7 nm is already in the works and not too far out. I doubt they want to find themselves in the same situation as Intel... They may be greedy, but they're not idiots. If AMD gets a proper, competitive 7 nm GPU out of the door fast enough, that might result in a rather short-lived Turing generation. I always thought Turing was a bad generation to upgrade on, since it's usually the node shrinks (for example Maxwell to Pascal) that bring the most extra performance for your buck. When it comes to Intel... who knows whether they'll finally get 10 nm to work or go straight to 7 nm, though I doubt we'll see their 7 nm before 2020. Who knows; if the iGPUs were the problem with 10 nm, maybe they'll try their hand at GPU-less high-performance 10 nm CPUs? The announced KF SKUs might be a prelude to that. I don't really see Optane breaking into the consumer space next year either, especially with the huge price drops on high-performance NVMe drives; the price gap is only getting bigger. I think monitors might be interesting next year, with HDR, 4K and high refresh rate monitors (and/or combinations thereof) all becoming more affordable and breaking into the mainstream. Although, to be honest, I am looking forward to some kind of standardization on... some standard. There are just too many different choices and competing standards and resolutions right now. I'm also quite interested in what Samsung is going to come up with to succeed the 860s and 970s. For years they've been dominant in the SSD market largely because of superior performance... but now that most competitors are closing the gap, the question is whether they'll be able to one-up the competition once again. But who knows, maybe someone else will take up the torch? In general I think it's another bright year. The node shrink allows AMD to come out with actually exciting stuff, and if nothing else it's going to require a response from their competitors (whether in products or in price). Add the projected price drops in SSDs and RAM, and you're arguably looking at one of the best years to upgrade in ages.
  7. I never understood all the hate for Vista. I ran it on my i7 920 for something like 8 years and never had any issues with it. I went to it from Windows XP, and apart from disliking some of the cosmetic changes (whenever they move settings around or change control panels I get cranky, and I always choose a legacy view mode if possible), it always worked just fine for me. I don't know why people hated on it so much. IMO Windows ME (an unstable mess that was worse than 98 SE in every way) and Windows 8 (a bullshit, counterintuitive tablet interface brought to the PC) were far more deserving of hate. Sure, Vista wasn't one of those monumental steps forward like Windows 95 or Windows XP, but it never felt like a step backwards to me either; Windows ME and Windows 8 certainly did. Interestingly enough, a lot of people mention that faulty hardware might be the cause. An interesting example of this is how my parents completely hated Windows 7 and wanted to go back to Vista, because their new Windows 7 laptop kept freezing due to a driver-related issue. Maybe I just got lucky in that whole lottery?
  8. The 'running at 72 fps when you don't hit 144' only happens with V-Sync on (see the sketch after this list). You don't play with V-Sync - or any of the adaptive sync technologies - enabled in competitive multiplayer games, because all of them add a degree of delay / input lag, thereby defeating the point of getting a high refresh rate screen. Sure, G-Sync and the other fancy syncs definitely suck less than V-Sync and add less input lag, but they're still all worse than V-Sync off. If you're playing online games with V-Sync on... you're simply doing it wrong. Playing at under 144 frames on a 144 Hz monitor can lead to some tearing, but I found it a non-issue... the tearing is a LOT less noticeable (if noticeable at all) than on a 60 Hz screen. If you're ever bothered by tearing, it makes more sense to me to dump more money into a better graphics card than into G-Sync. Also keep in mind that a high refresh rate does not necessarily mean low input lag; those are different things. You can have a screen that refreshes 144 times per second but has a consistent 50 ms delay (a hypothetical example). There are definite differences there, so make sure to read up on reviews of anything you're considering buying.
  9. While I understand that... I think most of the people who would consider 'Noctua fans' a positive attribute would prefer a 'better fan' over 'black color'. Getting a loud Noctua is about as pointless as getting a slow budget sports car. Either go big or go home I say...
  10. I like the idea of selling it with Noctua fans, but why the industrial fans? Why not the brand-spanking new Sterrox fans? You buy Noctua for silence, and then they put the loudest Noctua fan on their product...
  11. Using a VPN in such countries is like painting a giant bullseye on yourself for the intelligence services and marking yourself as a person of interest. Thinking a VPN will protect you is at best naive. VPNs are great against things like copyright trolls (due to their limited investigative powers), but I wouldn't rely on one for the safety of either yourself or your sources, especially in an undemocratic country. It might protect the contents of your internet traffic, but it will only increase the likelihood of old-fashioned eavesdropping, government-sanctioned burglaries, and unplanned stays at a government-run 'bed & breakfast'.
  12. Installing Windows Millennium Edition. I wasn't even a noob anymore back then, but it was still by far my biggest mistake.
  13. Have you tried taking the gerbil out? They get quite noisy when their tail gets stuck in the fan...
  14. An important thing to keep in mind about this 'difference' is that at lower power levels, even the cheaper RMx will shut off its fan entirely. So depending on workload / usage, there's a good chance the fan will be off 90% of the time, rendering any differences in fan performance / longevity rather irrelevant. YMMV of course, but I can barely find any dust on the filters of my 750 W RMx, simply because the fan is rarely ever running (which is exactly why I went for that PSU and the relatively high wattage for my build).
  15. Hello, over the last couple of years I've been using old hand-me-down amplifiers to run my old hand-me-down speakers and my headphones (Beyerdynamic DT 770 Pro) on my PC. After my last amplifier died and with me struggling to find a new cheapo one, I've started looking at buying something new. What I need it to do: allow me to drive my old, pretty huge speakers (they're nothing fancy, but no tin cans either - good enough for me), and allow me to run my headphones (I really like having the volume control straight in front of me). And... that's about it. I'm running it off onboard sound (which is actually surprisingly good on my Z170 motherboard), and frankly I mostly use it for things like games, YouTube and the like. High-end audio gear is complete overkill. So I've been looking at the fancier small amp/DAC stuff... but so far all I'm seeing is gear made to run headphones. That's a no-go for me; I mostly want it for my speakers - if it were just my headphones I could plug them straight into my PC. But when I look at amplifiers, all I find is amps with a billion different audio inputs and all kinds of features for connecting to fancy hi-fi gear that I simply don't have and never will. All I really want is an amp that can run both my headphones AND my speakers from my PC, and that's it. Price preferably as low as possible without getting into gamerish-tin-can-toy territory - I do have DT 770s, after all... So... do these things exist? Can anyone point me in the right direction? Thanks in advance!
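On the memory question in item 2: the clock-speed-versus-latency trade-off comes down to simple arithmetic. A kit's first-word latency in nanoseconds is roughly CAS latency × 2000 / data rate, so a faster kit that ships with proportionally higher CL gains little in absolute latency. Below is a minimal Python sketch of that calculation; the speed/CL pairings are made-up but typical examples, not figures from the linked hardware.info review.

# Rough illustration of DDR first-word latency. DDR transfers twice per
# clock, so latency_ns = CAS latency * 2000 / data rate (MT/s).

def true_latency_ns(data_rate_mts: int, cas_latency: int) -> float:
    """First-word latency in nanoseconds for a DDR memory kit."""
    return cas_latency * 2000 / data_rate_mts

# Made-up but typical speed/CL pairings, purely to show the trend
# (not taken from the linked review).
example_kits = [
    ("DDR4-2666 CL16", 2666, 16),
    ("DDR4-3200 CL16", 3200, 16),
    ("DDR4-3600 CL18", 3600, 18),
    ("DDR4-4000 CL19", 4000, 19),
]

for name, rate, cl in example_kits:
    print(f"{name}: {true_latency_ns(rate, cl):.1f} ns")

# Prints 12.0, 10.0, 10.0 and 9.5 ns: faster kits usually ship with higher
# CL, so absolute latency barely improves, hence the diminishing returns.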
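And on the 72 fps behaviour from item 8: it is a quantization effect. With classic double-buffered V-Sync, a finished frame waits for the next vblank, so the frame time rounds up to a whole number of refresh intervals and the displayed rate drops to a divisor of 144 (72, 48, ...). The Python sketch below is purely illustrative and assumes that simple double-buffered model; real drivers, triple buffering and adaptive sync behave differently.

# Illustrative model of double-buffered V-Sync on a 144 Hz panel: a frame
# is held until the next vblank, so its display time rounds up to a whole
# number of refresh intervals.
import math

REFRESH_HZ = 144

def vsync_fps(render_fps: float, refresh_hz: int = REFRESH_HZ) -> float:
    """Displayed fps when the GPU renders at render_fps with V-Sync on."""
    frame_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    # Each frame occupies at least one refresh; slower frames wait for the next vblank.
    refreshes_per_frame = math.ceil(frame_time / refresh_interval)
    return refresh_hz / refreshes_per_frame

for fps in (150, 144, 140, 100, 73, 72, 71):
    print(f"GPU renders {fps:>3} fps -> displayed {vsync_fps(fps):.0f} fps")

# 140 fps rendered shows as 72 fps; with V-Sync off you would simply see
# ~140 fps with some tearing instead.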