Everything posted by marldorthegreat

  1. Summary

     The UK-based JET laboratory has smashed its own world record for the amount of energy it can extract by squeezing together two forms of hydrogen. If nuclear fusion can be successfully recreated on Earth, it holds out the potential of virtually unlimited supplies of low-carbon, low-radiation energy. The experiments produced 59 megajoules of energy over five seconds (11 megawatts of power). This is more than double what was achieved in similar tests back in 1997.

     My thoughts

     The fusion announcement is great news, but sadly it won't help in our battle to lessen the effects of climate change. There's huge uncertainty about when fusion power will be ready for commercialisation. One estimate suggests maybe 20 years. Then fusion would need to scale up, which would mean a delay of perhaps another few decades. And here's the problem: the need for carbon-free energy is urgent, and the government has pledged that all electricity in the UK must be zero emissions by 2035. That means nuclear, renewables and energy storage. In the words of my colleague Jon Amos: "Fusion is not a solution to get us to 2050 net zero. This is a solution to power society in the second half of this century."

     Sources

     Major breakthrough on nuclear fusion energy - BBC News
  2. Yeah, completely agree, the price is way too high. But from AMD's perspective it's a card that people will buy anyway: if AMD can sell a card in high volumes at $479, why sell it for less?
  3. One could argue that they don't have to; they are making money whoever buys them. But then one could also argue that long term, when the market crashes (again), the second-hand market will be flooded with GPUs and prices will crash, meaning Nvidia (and AMD) will lose future revenue because of the massive number of second-hand GPUs hanging around. Also, I'm pretty sure the reason they can't just mass-produce them is that there is a huge silicon shortage, not just at Samsung but across the entire industry, leading to less supply — to the extent that President Biden is acting on it, because it is now affecting things like car production.
  4. I recently tried to purchase an RTX 3060. I waited until 5 pm UK time on the launch date, and they all sold out pretty much straight away. I then gave up, having looked at the scalper prices on eBay. Then I remembered there was a local (independent) PC hardware store, so I checked their website: they were "Out of stock" but also had a note saying "Call for availability". I decided to give them a call, fully expecting to find they too did not have any in stock, and then something really quite amazing happened.

     The website showed out of stock because they were saving cards for people who actually lived in the area, and were actively refusing people who live far away until stocks return to a normal(ish) level. I now have a confirmed order for an RTX 3060. I paid a little more than I should have (£399 = $555), which is more than MSRP, but UK prices are always higher because of a much higher VAT (sales tax).

     That may be a solution: find a local hardware store, somewhere not as well known nationwide, and they could be doing a similar thing. This is a retailer that is at least trying to fight bots and make sure real people are buying these cards, because you have to physically call, be put on a list, and receive a return call to make a purchase. This is how things should be done. Obviously larger e-tailers can't do that, but if you can find a local hardware store, they probably don't have the same demand, they can contact you when stock arrives, and they will give you a more personal experience that a company like Amazon can't.

     So if you are looking for a new GPU, try to find a local store, because they might be able to help and might desperately need customers. A second bonus is supporting jobs in your local area, and potentially supporting a local business at a time when they will probably be struggling.
  5. It was probably to save on power usage; also, the bandwidth loss might be made up for by the new cache.
  6. By the looks of things, this outperforms the 3070 by a significant amount
  7. That's better than the 320W and 350W TDP the 3080 and 3090 have
  8. Wow, if this is true, this just looks almost unbelievably good, because the product speaks for itself.
  9. Assuming that Nvidia was exaggerating the performance of the 3080, it could be safe to assume that would also be true for the 3070, but I guess we have to wait and see...
  10. It's not just price, though. AMD's Zen chiplets are small, meaning they can be mass-produced more easily. Intel's dies are much larger, so you get far fewer per wafer. Yes, Intel is on a (very) mature node, but AMD's chiplet design means that, pretty much (I know it's a lot more complicated), they only need one 8-core chiplet design that scales from top to bottom, which means costs are spread across almost every CPU.
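     The small-die advantage above can be made concrete with back-of-the-envelope arithmetic. This is a minimal sketch using the common dies-per-wafer approximation and a simple Poisson defect model; the die sizes and defect density below are illustrative placeholders, not actual Intel or AMD figures.

```python
import math

def dies_per_wafer(die_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Common approximation: area term minus an edge-loss correction."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_mm2 - math.pi * d / math.sqrt(2 * die_mm2))

def yield_rate(die_mm2: float, defects_per_mm2: float = 0.001) -> float:
    """Poisson model: probability a die has zero defects."""
    return math.exp(-defects_per_mm2 * die_mm2)

# Placeholder sizes: a small chiplet vs a larger monolithic die.
for die_mm2 in (74.0, 200.0):
    candidates = dies_per_wafer(die_mm2)
    good = candidates * yield_rate(die_mm2)
    print(f"{die_mm2:.0f} mm^2 die: {candidates} candidates, ~{good:.0f} good dies per wafer")
```

     The point of the sketch: the smaller die not only fits more candidates on the wafer, it also yields a higher fraction of defect-free dies, so the cost gap compounds.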
  11. It's a start; AMD haven't got anything really better than a 2070 Super. Even if it's just close to the 3080 Ti at a much cheaper price, that will be enough to force price changes from Nvidia.
  12. Depends what you mean by "crushed". Performance-wise, yeah, probably. But in market share, and in which product ends up in most laptops, far from it. Intel will probably maintain its dominant position simply because most non-tech people don't know AMD as well as Intel (due to many illegal, and legal, practices). Intel sells; people are willing to pay a higher price because they see i7 and think it's the best CPU in the world, while they see AMD and either think it's a generic brand or associate it with low-end products. AMD isn't going to win this battle through performance alone, but through features, partners, and trying to change consumer opinions. I think that is proven by the fact that Dell, Apple, and many others still only use Intel in their highest-end and most of their mid-range products. It's just a shame, because AMD has a really good product that unfortunately most people won't even consider, maybe just because they don't know any better. I honestly think the only hope for AMD is to get a CPU into an Apple product, because quite frankly Apple does have a certain leadership position and others will almost certainly follow.
  13. We always hear about AMD's "fine wine" technology, so maybe it would be beneficial if this were actually tested. What you could do is take competing cards over a few generations: 1. HD 7970 vs GTX 680, 2. R9 290X vs GTX 780 Ti, 3. Fury X vs 980 Ti, 4. Vega 64 vs 1080 Ti. Then run benchmarks across games starting from the launch of the 7970 and 680, using a few flagship titles from each year until the present day, and compare how, and if, AMD's GPUs really do begin to outperform Nvidia's over a longer period of time. Although I do believe it might be pointless to do, since it might show nothing, it would be interesting to see whether the performance gap today reflects the one found at launch.
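     The "fine wine" test proposed above could be reduced to one number per year: the AMD/Nvidia average-fps ratio for a given matchup. A tiny sketch, with made-up placeholder fps values (not real benchmark results) just to show the shape of the analysis:

```python
# Hypothetical per-year average fps for one matchup, e.g. HD 7970 vs GTX 680.
# These numbers are placeholders, not measured data.
fps_by_year = {
    2012: (58.0, 60.0),   # (amd_fps, nvidia_fps)
    2014: (55.0, 54.0),
    2016: (47.0, 43.0),
}

for year, (amd, nvidia) in sorted(fps_by_year.items()):
    ratio = amd / nvidia
    # A ratio rising above 1.0 over time would support the "fine wine" claim.
    print(f"{year}: AMD at {ratio:.0%} of Nvidia")
```

     Repeating this for each generation pairing would show whether the launch-day gap actually widens in AMD's favour over time.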
  14. Erm, pretty sure he is from Taiwan, which is not (well, depending who you ask) part of China. Edit: Not part of the "People's Republic of China", but recognised by the West as the "Republic of China"; both claim to be the legitimate government of China.
  15. It depends if they get used by the likes of CyberPower; people see Intel i9 and think they are getting the best performance possible. So I think people who build it themselves, or are invested a bit in the CPU market, will care, but I can think of lots of people who simply won't know or care. The fact that it's rated at a 125 W TDP will almost certainly confuse people and make them think it consumes less power than equivalent AMD CPUs. I agree, this is not going to be a good CPU. Yeah, the performance will be there for gaming, but it will literally consume more power than my GPU! (And I run a 980 Ti I got second-hand for like £100 (~$120).)
  16. Yeah! It's very strange, although to Intel's credit at least people will actually buy this (well, maybe if it's a cold winter this year).
  17. And when you consider that the most popular monitor type is (I think) a 1080p 60 Hz panel, it makes no difference, since you don't really benefit from anything over 60 fps.
  18. I understand what you are trying to say, and I get it. But why do people like Linus, Gamers Nexus, JayzTwoCents, etc. all say that if you are just gaming, Intel is better, but if you are doing anything else, AMD is the way to go? I'm not exactly an expert, I just enjoy hardware. I have owned both AMD and Intel CPUs (I was an unlucky f*****r and bought an FX 8350). But surely, for example, Nvidia are using Epyc CPUs in the big GPU thing (can't remember what it was called). (Also it's 5 am in the UK, so I'm pretty tired.)
  19. I think you can now. Clock for clock, Zen 2 has better IPC, but Intel can push clock speeds on a (very) mature node. If Ryzen 4th gen gets 20% better IPC and 10% better clock speeds, Intel may actually be stuck. Also, as much as I want AMD to stick it to Intel, I also want Intel to do the same. Competition is always good; I want to be a little excited when new CPUs launch and genuinely have to compare which CPU makes sense for me. Think about it: the highest-end non-HEDT CPU from Intel was 4 cores / 8 threads before Ryzen, and now it's 10 cores, within 4 years.
  20. But it also consumes more power (a lot more), and literally anything else is faster on AMD. Not to mention that AMD has a better platform, with newer features. Yeah, gaming performance is better, but a 3% lead misses that the CPU is more expensive, with more expensive motherboards, on a socket that will be dead by the end of next year.
  21. Oh yeah! That's an interesting point. It amazes me that these things are going to have mobile variants coming; they might actually start a fire. And good luck putting this into anything small — you're going to need a very beefy cooler, or you're just going to get a 9900K on a new motherboard!