
ZetZet

Member
  • Posts

    17,382
  • Joined

  • Last visited

Everything posted by ZetZet

  1. Your phones must have been really old. The last two Android phones I had would lose charge sitting on a charger and then trickle back up again, and laptops stop charging completely when full. Actually, I still have my Nokia 7 plus, which I charged every night, leaving it plugged in the whole time. The battery is still fine after more than two years.
  2. But that's the key part: they are never actually full. You, the user, see 100%, but the safety margin is already built in on top of that. Same with 0%: that's why the phone still shows you something on the screen after it hits 0%, it isn't really at zero.
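A tiny Python sketch of that idea. The 5%/95% margins here are made-up illustrative numbers, not real firmware values — every manufacturer picks its own hidden buffer:

```python
def displayed_to_actual(displayed_pct, floor=5.0, ceiling=95.0):
    """Map the battery percentage shown to the user onto the cell's
    actual state of charge, assuming a hidden safety margin at both
    ends (floor/ceiling are illustrative, not real firmware values)."""
    return floor + (ceiling - floor) * displayed_pct / 100.0

# "0%" on screen still leaves real charge in the cell, and
# "100%" on screen never pushes the cell to a true full charge:
print(displayed_to_actual(0))    # 5.0
print(displayed_to_actual(100))  # 95.0
```

That hidden buffer is why the screen stays on for a while after the meter hits 0.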
  3. I expected you to talk about electric cars, because they do have battery management (on the car). Normal cars with gel and lead-acid batteries are by far the dumbest: the alternator just spins and puts out ~14 V the whole time. It's just that those batteries can't be overcharged. Nickel-cadmium chargers are usually also extremely dumb; they just trickle charge with a current limit, and those batteries can sit forever on a trickle charger, which is why charging takes so long on them. My guess is that the "bad" chargers you experienced just pumped in more current to charge faster, which in turn killed the batteries faster. The only chargers that do have battery management inside them are the hobby Li-Po chargers, but that's because those batteries usually have ZERO protection beyond a thermal fuse.
  4. Seems about what they cost. Germans get slightly better deals, but they have a large market advantage so it makes sense.
  5. Car chargers are dumb too; the battery management/charger is in the car. The only thing car chargers do is tell the car how much power IT can pull without killing the charging network, whether it's a DC charger or one at home.
  6. The charger can't do it; there is no protocol for it anywhere. The charger is dumb: it just supplies voltage and that's it. Android phones can set a maximum charge, and some Windows laptops can do it too. It's not that important, because battery management got good enough not to cook the battery by overcharging, and the battery chemistry itself has improved to handle more and harsher charging cycles.
  7. It's definitely the CPU hitting a wall. Generally in games you want to see your graphics card sitting at 97-99% usage and the CPU not doing too much. Your situation is backwards: when the CPU chokes, your graphics card usage goes down a lot.
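The rule of thumb above can be written down as a toy classifier. The 97% threshold comes straight from the post; everything else (function name, sampling) is just illustration:

```python
def classify_bottleneck(avg_gpu_util, gpu_bound_threshold=0.97):
    """Guess the bottleneck from GPU utilization averaged over a
    gameplay session (0.0-1.0). Threshold per the rule of thumb:
    a GPU pinned at 97-99% is the healthy, GPU-bound case."""
    if avg_gpu_util >= gpu_bound_threshold:
        return "GPU-bound"   # what you want to see in games
    return "CPU-bound"       # GPU sitting idle, starved by the CPU

print(classify_bottleneck(0.98))  # GPU-bound
print(classify_bottleneck(0.60))  # CPU-bound
```

In practice you'd feed this from a monitoring overlay (MSI Afterburner, etc.) rather than guessing.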
  8. You can set that up on the device itself. But it's not really important anymore these days.
  9. 6800XT is the fastest one from your choices so sure why not. Drivers are pretty comparable on both sides these days.
  10. You can get the new 240Hz one for only a little more. https://pcpartpicker.com/product/VNt9TW/lg-32gq850-b-320-2560x1440-240-hz-monitor-32gq850-b or a much cheaper 165Hz version https://pcpartpicker.com/product/8snypg/lg-32gp850-b-320-2560x1440-165-hz-monitor-32gp850-b They will all be similarly good. I personally have this one https://pcpartpicker.com/product/MXfnTW/gigabyte-m32q-315-2560x1440-170-hz-monitor-m32q https://www.rtings.com/monitor/reviews/gigabyte/m32q
  11. No, it will not explode; it will be fine. The phones/laptops/tablets and the charger do a simple handshake every time, and they ask the charger for however many volts/amps they need. Just make sure, if you plan to charge your laptop through USB-C, that you buy a cable that actually supports 100 W charging; many of them do not.
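A simplified model of that handshake: in USB-C Power Delivery the charger advertises fixed (voltage, max current) profiles and the device requests one it supports. The profile list and the negotiation logic below are a rough illustration, not the actual PD protocol:

```python
# Example charger capability list (voltage V, max current A) -
# typical values for a 100 W USB-C PD charger, used here as an assumption.
CHARGER_PROFILES = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]

def negotiate(device_max_voltage, device_max_current):
    """Pick the highest-wattage charger profile the device can accept.
    Returns (volts, amps) or None if nothing is compatible."""
    usable = [(v, a) for v, a in CHARGER_PROFILES if v <= device_max_voltage]
    if not usable:
        return None
    v, a = max(usable, key=lambda p: p[0] * p[1])
    return v, min(a, device_max_current)

# A laptop that accepts up to 20 V / 5 A negotiates the 100 W profile;
# a phone limited to 9 V just gets the 9 V profile from the same charger:
print(negotiate(20.0, 5.0))  # (20.0, 5.0)
print(negotiate(9.0, 2.0))   # (9.0, 2.0)
```

That's why the same charger is safe for both: the device never gets more than it asked for.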
  12. A 13900K and RTX 4090 can't run that game without stuttering, so that's completely out of bounds. I think you can do 1440p 30. IGN ran it with a 2070, which is faster than a 5700 XT, and it didn't reach 30 at 4K. But at 1440p it should be doable.
  13. If you believe in magic then you might notice a difference.
  14. Yeah, but my build was a Xeon 1230v2 + 1060 3GB and I could still play recent games at 1080p. Only in 2021-2022 did it start to feel CPU limited.
  15. Is Sandy Bridge that old? I used to have an Ivy Bridge Xeon build; I just started buying parts for my next build two months ago. Ugly low effort pic
  16. I used to play CS:GO at a pretty high level, and I think the only thing that matters is consistency. If the sensor is set to 800 but actually tracks at 750, yet keeps it at 750 the whole time, you can just adjust the sensitivity. Everyone goes by feel anyway.
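The adjustment is just a ratio, since effective sensitivity is DPI times in-game sensitivity. A quick sketch (the 2.0 starting sensitivity is an arbitrary example):

```python
def compensated_sens(in_game_sens, set_dpi, actual_dpi):
    """Keep effective sensitivity (DPI x in-game sens) constant when
    the sensor's real CPI differs from its configured value."""
    return in_game_sens * set_dpi / actual_dpi

# Sensor set to 800 but actually tracking at 750: scale the in-game
# sensitivity up by 800/750 to get identical cursor travel per inch.
print(round(compensated_sens(2.0, 800, 750), 3))  # 2.133
```

As long as the error is consistent, one number fixes it for good.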
  17. But that's not even expensive. When I was a kid a basic desktop cost 2,000-3,000 dollars, it wasn't nearly as capable, AND we've had inflation since then. PC gamers just got really used to cheap hardware during the golden times of die shrinks. Now that die shrinks alone can't bring more performance, the prices will go up again. Oh, and don't forget just how small and weak PC hardware used to be. For example, this was the top-of-the-line card from ATi in 2007 https://www.techpowerup.com/gpu-specs/radeon-hd-3870.c203 look at the die size. It's a lot smaller than a 3050's, a card most gamers these days think is too weak to even care about. And it's built on a much simpler process node.
  18. Voting with your wallet works perfectly well; it's just that most people think the prices are reasonable enough and buy anyway. It's like complaining about iPhone prices. That's not what I'm saying, though: popularity is what is making it expensive. The hobby is growing, which means there is huge demand, for new hardware especially. We don't even have to look at PC; look at consoles. How long have they not been selling at MSRP now? You can reasonably expect next-generation consoles to be even more expensive. Maybe the PS5 Pro launches for 1,000 dollars once the current PS5 can't run titles at 60 fps anymore (which is coming soon).
  19. I mean, you kind of started reading it and then gave up and claimed victory. The Apple M1 doesn't have HT, so an Apple M1 core is a lot fucking bigger than a Zen 3 core when it comes to single-threaded workloads. That's purely a design decision and doesn't have much to do with x86 or ARM. You have to look at overall performance when the CPU starts to hit power limits. Like I said, if they were both on 5nm it would be splitting hairs; Zen 4 most likely wins in multi-threaded applications. There are no big victories for anyone in terms of architecture.
  20. ATX 3.0 did not make ATX 2.xx obsolete. AMD can use the older standard for as long as they want.
  21. For necessities, often after a disaster. You do not seem to understand what price gouging regulation actually means. No, it isn't; it depends on what portion of society you are talking about. Vegans think eating animals is immoral, for example.
  22. Morality is not a well-defined concept. And it's legal to sell whatever you want for whatever amount you want, as long as you find buyers. PC hardware is getting more expensive because it's becoming a very popular hobby. Hobbies are expensive; popular hobbies even more so. No one actually NEEDS a high-end or even a mid-range computer. You can get a computer that does basic things for like 50 euros.
  23. What's the point of ARM on Windows? People kept saying laptops, for "efficiency", but the largest power draw in laptops has been the display for a good while now. https://www.notebookcheck.net/AMD-Ryzen-7-6800U-Efficiency-Review-Zen3-beats-Intel-Alder-Lake.623763.0.html x86 is matching or beating Apple's magic-sauce M1, and that's on an older/cheaper process node. When AMD starts shipping Zen 4 on 5nm to laptops, they will be faster and more efficient than the Apple M2.