
Centurius

Member
  • Content Count

    1,404
  • Joined

  • Last visited

Everything posted by Centurius

  1. Again, even those don't have access to that kind of scale. Easily one of the largest datacenter operators in the world is Amazon, and at most internet exchanges they have 100 Gbps uplinks, with a few key IXs having 400 Gbps uplinks. Most government departments are connected to these locations with 10 Gbps ports, and often just 1 Gbps. The largest hub of exchanges, the Equinix Exchange, has a maximum throughput of 18 Tbps. That's every single one of their datacenters and every single peering connection combined — a large share of the world's internet traffic — and they barely reach one tenth of the single connection demonstrated by these scientists. So I stand by my timeline: Mars before this hits any kind of mainstream.
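The "barely one tenth" claim above can be sanity-checked with simple arithmetic; the figures below are the ones quoted in the post, not independently verified measurements.

```python
# Back-of-the-envelope check of the scale comparison above.
equinix_peak_tbps = 18.0    # quoted peak throughput of the Equinix Exchange
research_link_tbps = 178.0  # the experimental single-link record being discussed
ratio = equinix_peak_tbps / research_link_tbps
print(f"{ratio:.1%}")  # ~10.1%, i.e. barely one tenth
```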
  2. Thank you, I've made the call and ordered the LG. It is only for my own projects, yeah. I just thought that because LTT always stresses colour accuracy for any content production, it'd apply to my use case as well.
  3. Well, my recommendation leans toward content creation because that's what I do, so it will likely be overkill for your web conferencing use. I've been using the Audio-Technica AT2020 for a few months now and it's an absolutely amazing microphone; my voice sounds really smooth through it. It's a cardioid microphone, so if you set the gain right it won't pick up much background noise either. https://www.amazon.com/Audio-Technica-AT2020-Cardioid-Condenser-Microphone/dp/B0006H92QK As for options I've seen recommended but haven't confirmed myself: https://www.amazon.com/TONOR-Professional-Microphone-Podcasting-Broadcasting/dp/B01LEEWO7C https://www.amazon.com/gp/product/B00XBQ8UGG
  4. Can't answer number 1 but for number 2, as long as it's an XLR mic it will be compatible.
  5. Hey all, To go with my new rig I'm looking for a monitor that will actually do it justice. Right now I'm on an LG 27UD58 and it's starting to show its weaknesses. I'm looking for something slightly bigger that has DisplayHDR 600 or higher and proper coverage of the different colour spaces for content creation. At the same time I'm somewhat on a budget, and gaming features like FreeSync are a pro, as is of course being able to watch 4K HDR content. This brought me to three options. LG 32UL750 at €581 https://www.lg.com/uk/monitors/lg-32UL750 Asus CG32UQ at €865 https://www.asus.com/Monitors/CG32UQ/ Asus ProArt PA329C at €1214 https://www.asus.com/Monitors/ProArt-PA329C/ Obviously, if money weren't an issue I'd jump on the ProArt, because looking over the specs it really does look like the best of the three. My main question is whether the LG and/or the CG32UQ are close enough to it in things like colour accuracy and quality that the price difference and losing Adaptive Sync aren't worth it. I am aware of LG's UltraGear series of 4K 144 Hz panels; unfortunately those aren't really available here. I'd really appreciate your advice. The content creation is 4K btw, for YouTube (and Twitch, but there's no editing there).
  6. Microsoft almost certainly was aware; however, companies are wary of something called the Streisand effect. If they had taken action sooner, they would have needed to either file a suit or issue a cease and desist. After doing that, it would have reached the press, who would have loved to jump on 'Big company ruining small independent harmless software tweaker', generating bad press and making more people aware the project even exists, possibly compelling them to seek it out. So leaving it in its own extremely small niche would draw less attention than stepping in. The moment one of the biggest technology YouTubers in the world makes a video on it, that audience of a couple of hundred, maybe a thousand, people becomes potentially millions. At that point the damage is already done and you may as well proceed to take it down. That's why they likely had a cease and desist letter ready at Microsoft Legal, just waiting for a large enough influencer to discover it or for it to grow larger naturally. You clearly weren't here back when Windows 10 initially released. People were happily giving up DX12 and every other creature comfort the new OS offered by sticking to 7.
  7. Not really, current pricing is based on current market trends. The prices in the promotion would be the MSRP, so for an apples-to-apples comparison you need to find launch MSRPs for the Xbox Ones (PlayStations can't be compared the same way because the logistics are different for Sony).
  8. 178 Tbps works out to roughly 22.3 TB transferred per second, and there is no way the Netflix library is that small. Even the Open Connect appliances they install in ISP datacenters for faster speeds to end users contain ten times that, and those don't hold the entire library. This isn't even remotely intended for actual widespread use. These speeds are possible only under incredibly specific conditions and at a price no consumer or business can afford. At best you'll see this used for site-to-site connections between research institutes and the like, where they can actually utilize that data rate. Most of the world is still on 10 Mbps or less; the high end for consumers has only in the last year or two started moving into >1 Gbps speeds for home networks, and less than that for internet. Even the largest companies often don't have more than a 400 Gbps uplink, and getting even that costs millions. By the time a regular consumer can download at 178 terabits per second, we'll likely have colonies on Mars.
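The bits-to-bytes conversion behind the headline figure is just a division by 8:

```python
# Converting the headline link speed from bits to bytes (8 bits per byte).
link_terabits_per_s = 178
terabytes_per_s = link_terabits_per_s / 8
print(terabytes_per_s)  # 22.25, i.e. ~22.3 TB transferred every second
```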
  9. Keep an eye on https://evedevices.com/pages/spectrum, slightly smaller but 4K 144 Hz and DisplayHDR 600 certified
  10. Eh due to NVENC I'm basically defaulting to Nvidia anyway, but even if that were not the case I wouldn't expect AMD to bring anything to the table that comes close to the 3080 or 3090.
  11. Before possibly doing something to your CPU that could permanently destroy it if done wrong (and as you mentioned, you don't have the money for a new part if it breaks), ask the person who gave it to you how it ran in their system. If the CPU ran at normal temperatures recently, there is no reason to assume the TIM is why it's overheating now.
  12. Yes, it does support XMP, up to its rated maximum speed of 2666 MHz. The BIOS can identify faster RAM, but because the chipset is not rated for it, it won't be able to run it at that speed. The B-series chipsets were never intended for overclocking (which running RAM at higher speeds technically is).
  13. The hype is because, based on the information available so far, it is one of the largest generation-to-generation improvements seen in the last two decades, with the x70 set to outperform the highest-end GPU of the previous generation. Combined with some of the other features, it's set to be an absolutely stunning release. That doesn't mean older GPUs won't still do fine, but if the charts pan out, developers will have a lot more breathing room to go crazy on textures, effects, etc., which will likely make older cards age much faster. The 10 series specifically is in a rough spot because, with consoles also implementing raytracing, GPUs that can't do raytracing will find themselves providing a noticeably worse experience even at lower resolutions versus the 20 and 30 series as well as Big Navi.
  14. 3600 MHz RAM will definitely be an upgrade as far as performance goes; to what extent it's worth it when you already have 3000 MHz depends on whether you can sell your current sticks for close to retail and/or whether you have enough budget to eat the loss. Steve did a pretty solid video on this.
  15. That's true, but the 32 GB kits are roughly twice the price, and these cost me €118 each, so the difference would be about €220. Thanks, that puts me at ease a lot more. And yeah, I got an Aorus 1 TB Gen 4 NVMe for OS/scratch disk use. As for getting these kits in the future, yeah, they'll likely be more expensive, but based on the current growth of my channels I do think I'll be upgrading to a Threadripper system with DDR5 in about 2023-2024, so hopefully 32 GB will last me until then. I did not know about the stability issues at higher RAM amounts. Thanks a ton for the clarification! --- In general, thank you all for your very quick responses; you've put my mind at rest and made me comfortable sticking with the current RAM amount.
  16. Hey everyone, For my first upgrade in 7 years I decided to go all in and ordered an R9 3950X, an Asus Crosshair VIII Formula, and what I assumed was 64 GB of 3600 MHz DDR4 (and of course a bunch of other parts, but those don't really matter for this topic). When the parts arrived I realized the label on the website for the RAM had been misleading (a "16 GB Dual-Kit", which I assumed meant two 16 GB sticks but was in fact two 8 GB sticks for 16 GB total). With two of these kits I don't have 64 GB of RAM but 32 GB. The parts already took a long time to arrive (G.Skill Royal sticks are hard to get here), and if I send these back and order new ones, not only will I be out about 220 euros extra but I'll also likely need to wait a month for them to arrive (I know other RAM is fine too, but the Royals really fit the aesthetic I'm going for). My workloads are gaming, streaming and video production (1080p and 4K currently). I know 32 GB is way more than you need for gaming, but my other workloads leave me with the following questions. Is 32 GB of 3600 MHz RAM enough to stream using OBS at 1080p 6000 Kbps while simultaneously running games? Is 32 GB enough for 4K workloads in Premiere Pro, as well as in rarer cases After Effects? If I were to move to 8K workloads in the future (the CPU can more than handle it and I'm planning to get an RTX 3090 when they come out), will 32 GB still be enough, assuming the answer to the above is yes? I'd really like to avoid having to wait for new memory because I've been dying to get this build started, but if I can't properly run my workloads on this I'll have to. Thanks in advance. Edit: I forgot to mention, as it's relevant for app usage: I primarily make content for YouTube (and Twitch, but that's live only), so the codec will likely be H.264.
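As a very rough way to reason about the 4K question above, here is a hypothetical sizing sketch; the 8-bit RGBA pixel format is an assumption, and real editors like Premiere cache compressed frames and previews, so actual memory behaviour is far more complex than this.

```python
# Hypothetical sizing exercise: what a single uncompressed 4K frame costs in RAM.
# Assumes 8-bit RGBA (4 bytes per pixel); real editing pipelines differ.
width, height = 3840, 2160
bytes_per_pixel = 4
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 1e6:.1f} MB per frame")  # ~33.2 MB
```

Even a generous preview cache of a few hundred such frames stays around 10 GB, which is one rough reason 32 GB tends to be considered workable for 4K timelines.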
  17. The 3000 series is not going to be bottlenecked by a 9600K in any gaming workload except the most extremely CPU-intensive games (and honestly, even then only unoptimized ones). There is likely no need to upgrade unless you have additional workloads that can benefit from more cores, PCIe Gen 4, or any of the other features newer/different platforms offer. Edit: Just to put this in perspective: I've been using a 4770K for about 7 years now and last year paired it with an RTX 2080. This CPU has not been a bottleneck in any gaming scenario so far, with only one or two exceptions out of the dozens of titles I've played. It's even done decently well while simultaneously running OBS with a bunch of animations and effects and streaming at 1080p at 6000 Kbps. This CPU is a lot further behind the 2080 than yours is behind the 3000 series. The only reason I'm even moving to the 3950X next week is that I wanted PCIe Gen 4 NVMe storage and would like a faster CPU for working on 4K (and 8K) footage in Premiere Pro.
  18. How old is your CPU cooler? You wrote that you used the included thermal paste. The stuff that comes with a cooler is already not the highest quality, and it does go bad after a while. If you use expired thermal paste, it's possible your cooler can't properly draw heat away from the CPU and thus can't cool as well as it should. How did it perform in your friend's system, and what temps did they get? If they didn't have the overheating, I'd try ordering some new thermal paste.
  19. Are you being serious? As far as user privacy is concerned, Apple is doing pretty great among the tech companies. It's one of the reasons Siri is so bad.
  20. This is probably going to be an expensive potato.
  21. You're right, I hadn't even thought of that. I guess then the earlier-mentioned divide is what I'd envision if both systems were maxed out. Good to hear passthrough has been getting better; how large would you say the FPS hit is right now, in percentage terms, in a GPU-dependent game? My workflow primarily uses Premiere Pro, After Effects and Photoshop. From what I've seen, TR is about even compared with the next Intel offering up, so I imagine the same holds for TR2, if they haven't closed the gap even further. Thanks, yeah, that's what I figured. I was just worried about an overhead hit from the GPU passthrough.
  22. Hey all, So for the longest time I've wanted to make Linux my daily driver, but as video editing and gaming are part of my daily activities I've been forced to stick with Windows (also, I really prefer Office 365 over the open source alternatives). Virtualization with direct I/O has been on my radar, but because the 4770K has VT-d disabled it was something I couldn't do. Thanks to a new job, however, I will be able to get a massive system upgrade in a few months, and I'm looking for some advice. While this would perhaps better fit the CPU section, I feel this is enough of a niche that the people here can probably give more advice based on experience. For my new system I'm looking at getting either a HEDT AMD or Intel setup. Money isn't a real concern, but I do prefer spending as little as possible, so I've found the sweet spots to be either the Threadripper 2970WX (24c/48t) or the Intel i9-7980XE (18c/36t), with my preference being the AMD. In the AMD case I'd likely dedicate 8 cores / 16 threads to Linux Mint, with the remaining 16 cores / 32 threads committed to a Windows VM for gaming, Adobe and Office; in the Intel case, 4c/8t for Linux and 14c/28t for Windows. For the GPU I'm getting whatever the highest-end card is that Nvidia is supposed to release in a few weeks. AMD is obviously my preference; I like the higher core count at a lower price and in general want to go with them. There is one thing holding me back though, and that's where my question comes in. Back when the original Ryzen was released, hardware passthrough seemed to not work well, especially compared with VT-d. Has this since been fixed? If I'd take a real performance hit on the GPU when going AMD, I'll sadly have to settle for Intel, so I hope you guys know more. Thanks for any help.
  23. There is a difference between going "SJW" and holding your partners to a higher standard. So no, I'm not going to boycott it.
  24. This is wrong on so many levels I'm not even going to bother with a full explanation but no. Your brain doesn't work like that.
  25. It seems you are the one who doesn't understand what this can do to the tech industry as a whole. The same thing happened with Internet Explorer and Windows a few years back. The only thing they needed to do to comply was offer the option to pick IE or a different browser after installing the OS. Google can simply do the same, offer the ability to use the default apps or pick your own during initial set up.