About danwat1234

  1. Equipped with a good Level 2 autonomous driving system in its own right.
  2. 11:05 The captioning reads "RAM chips" but she said "RAM SIMs".
  3. Oh look, the iPhone 12 has an OLED 1080p screen. Cool. My 2017 Motorola Z2 Force has a 1440p OLED screen with a shatter-proof ShatterShield display, so... Apple definitely is not the first much of the time LOL
  4. Surely the reason CPU usage is only 20% when playing a 4K YouTube video is that the GPU is doing most of the decoding work? I really enjoyed this video. My main rig still uses a similar architecture, a 3.3 GHz X9100 Penryn laptop. Runs dang hot but reasonably fast for what I use it for. But it can barely play a 1080p 60 FPS YouTube video without maxing out a core and stuttering. In VLC it is much more efficient. That DangerDen system is awesome.
  5. Perhaps the reason the game didn't get significantly better FPS when water-cooled, despite some CPU throttling, is that the CPU was not the bottleneck. The GPU temperature wasn't high enough to throttle with air cooling, so it would not have been affected. It might have been useful to run Prime95 or another CPU cruncher to compare stock air cooling against this.
  6. Agreed. I'm not sure how hot the VRMs in that laptop run, but it's the original motherboard.
  7. 100 C is fine!!! I have an old Asus G50VT laptop with a Core 2 Duo X9100. It redlines at 104 C. For almost its entire life it has lived near that level, running distributed computing in the background at a slight overclock of 3.3 GHz and a slight over-volt. It hasn't failed yet. So unless 10 nm silicon is really delicate with regard to temperature, it's no big deal at all.
  8. Why no mention of the single-core performance differences between Tiger Lake and AMD? Not allowed to show the IPC difference?
  9. Check out a cool app Brian Batista developed for the Chevy Volt. It lets you control many parts of the car with your phone: roll the windows up/down, control all the interior/exterior lights individually, wipers, horn. https://photos.app.goo.gl/mxRb4CWvSWED4dmD7 You can find the .apk file in the Chevy Volt DIY and Modding Group on FB, or DM me. It will even roll up all the windows with the touch of one button on the phone's/tablet's touch screen, something Linus complained about in his video, having to hold up every switch except the driver's to roll up all the windows.
  10. It requires lifting the car and dropping the traction battery, same as most electric vehicles. When travelling on trips and hitting the quick charger repeatedly, charging can slow down significantly as the pack heats up. But of course the Volt can't quick-charge at all, aside from using the engine. Thankfully Nissan's newest EV, the Ariya, does have active thermal management. For the sake of less waste and energy use, replacing the pack is an option. It's good that one can replace the original Leaf 24 kWh pack with an aftermarket pack of up to 60 kWh.
  11. But the traction battery is not liquid-cooled and heated, so a lot more degradation is the result.
  12. 7:20 I don't believe it's correct that the battery SOC window (buffer) increases as the pack degrades. It is absolutely wrong for a 1st-generation Volt (2011-2015), and it is unlikely for the 2016-2019 model years. You can read the voltage of the battery pack with an OBD2 dongle and the MyGreenVolt app for Android: 335 V when discharged (when the engine will start to come on), 385 V when done charging, which equates to roughly a 20% to 85% SOC window. For 2016+ Volts it is a bit different. My 2013 Volt has the exact same voltage ranges at 200K miles as it did at 133K miles on the odometer.
  13. ExpressCard slot (the successor to the PCMCIA slot). Does this have Thunderbolt to support eGPUs? If not, get an IO module for that port! Max out the batteries. I love Toughbooks. I remember my CF-28, a Pentium 3 fully rugged model. Awesome design. My first modular laptop was a Dell Latitude LM, a Pentium 1. It had 1 or 2 modular bays, so I could swap in a CD-ROM drive, a floppy drive, or a second battery. And it had a nifty infrared port, so I could print wirelessly for free as a high school student.
  14. The Linus server only shows 110 public jobs right now. Did something go wrong? https://apps.foldingathome.org/serverstats Does the similar Rosetta@home project have any meaningful impact, or is it wasted computing power, and should one just run Folding@home only?
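The voltage-to-SOC figures quoted in comment 12 (335 V at the bottom of the usable window, 385 V at the top, roughly 20% to 85% SOC) can be turned into a quick sanity-check script. This is a minimal sketch assuming a purely linear mapping between pack voltage and SOC, which real lithium-ion packs don't follow exactly (the curve is nonlinear and load-dependent); the endpoint numbers are taken from the comment, not from any GM documentation.

```python
# Approximate a Chevy Volt pack's SOC from its measured voltage, using
# the endpoints quoted in the comment above: 335 V ~ 20% (engine kicks
# in) and 385 V ~ 85% (done charging). Linear interpolation only; a
# real pack's voltage-to-SOC curve is nonlinear, so this is a rough
# illustration, not a calibration.

V_LOW, SOC_LOW = 335.0, 20.0    # discharged end of the usable window
V_HIGH, SOC_HIGH = 385.0, 85.0  # fully charged end of the window

def approx_soc(pack_volts: float) -> float:
    """Linearly interpolate SOC (%) from pack voltage, clamped to the window."""
    frac = (pack_volts - V_LOW) / (V_HIGH - V_LOW)
    frac = min(max(frac, 0.0), 1.0)  # clamp readings outside the window
    return SOC_LOW + frac * (SOC_HIGH - SOC_LOW)

print(approx_soc(335.0))  # 20.0
print(approx_soc(385.0))  # 85.0
print(approx_soc(360.0))  # 52.5 (midpoint of the window)
```

If the SOC window really stayed fixed as the pack degraded, a dongle reading run through a mapping like this would report the same endpoints at 133K and 200K miles, which matches what the comment describes.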