
NicoV
Member · 33 posts

Everything posted by NicoV

  1. To everyone saying that hibernate is a usable workaround: sometimes it isn't. On both of my Windows laptops, when waking from hibernation, my programs do reopen, but all browser tabs get reloaded. For something like YouTube or this forum, that's fine. However, I usually have many tabs open that can't just be reloaded, since those websites need to reauthenticate; I can't just jump back to the link I had open. Signing back into all of those websites and clicking through their convoluted, slow menus can easily take 20 minutes. This is also why shutting down doesn't work, even though, yes, both laptops take barely 10 seconds to boot. Additionally, I've had machines that were supposed to be in hibernation shut off for no apparent reason, in which case nothing automatically reopens when I boot back up. Ironic side note: sleep works perfectly on the Linux install on my main laptop. It has literally never been a problem.
  2. Read the first paragraph of the article carefully: they were the ones who created the fake scores. https://chipsandcheese.com/2022/10/27/why-you-cant-trust-cpuid/ They don't say this outright, but the low single-core scores are probably because the test is being run in a virtual machine, which incurs a slight penalty. (A tiny demo below shows how spoofable the reported CPU name is.)
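    To illustrate the article's point: the CPU name a benchmark displays is just a string the processor (or whatever is emulating it) reports via CPUID, with nothing verifying it. A minimal sketch of how easy that string is to read, using only Python's standard library:

        import platform

        # The "processor" string comes from the OS (the registry on Windows,
        # uname on Linux), which in turn just repeats whatever CPUID reported.
        # A VM or a tampered BIOS can put any text it likes in here.
        print(platform.processor())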
  3. I'm pretty sure that's where this display came from: it seems to use the same panel as the Duo's secondary screen. Same size, aspect ratio (31.41818:9 to be exact; quick math below), refresh rate, advertised response time, and resolution. That said, you can buy kits with these Duo panels and a controller board for about $80 apiece (or both separately for about the same). In fact, most of those kits take 12 V DC from a power brick, so they might essentially be selling those with a not-so-fancy enclosure.
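    For the curious, that oddly specific 31.41818:9 figure falls straight out of the panel resolution. A quick check, assuming the Duo's secondary panel is the 3840x1100 unit (which is exactly what that ratio implies):

        # Aspect ratio of a 3840x1100 panel, expressed against a height of 9 units
        w, h = 3840, 1100
        print(w / h * 9)   # 31.41818..., i.e. "31.41818:9"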
  4. Blade Runner 2049, highly recommended, especially if you've seen and liked the original Blade Runner.
  5. Lenovo L27q-30

    I'm looking for a cheap monitor to get more screen real estate for video editing than just my 15.6" laptop, and more space for my too many tabs. My budget limit is around 175 USD. So far the best thing I've found is the Lenovo L27q-30, a 200 USD 1440p monitor (there's also a 24-inch version that's 165 USD). There aren't many reviews of it, but the one article I could find seemed mostly positive, with the color accuracy rated "good". The two main complaints in the Amazon and Newegg customer reviews are the non-adjustable included stand and the wider-than-advertised bezels, neither of which matters much to me: I plan on getting a monitor mount in the near-ish future, and I don't care about bezels. However, I thought I should ask if anyone has recommendations for anything comparable or better in the same price range, since 200 USD for the 27" model is at the far upper end of my budget. Here are the links in case anyone is curious or wants to read the specs in more detail: 27": https://www.newegg.com/lenovo-65fcgcc1us-27/p/0JC-0006-00UN2 24": https://www.newegg.com/lenovo-24/p/0JC-0006-00UR2
  6. The 7702: a lower-power version of the 7742, which is itself a lower-power version of the 7H12. Not to be confused with the 7702P, which is exactly the same but can't run on dual-socket motherboards and is 463 dollars cheaper on Amazon. If you set the video to 4K, it becomes possible to read the text in Task Manager thanks to the lower compression loss, even if you only have a 1080p monitor.
  7. No, they only work on Intel Xeons, and even then compatibility is... messy.
  8. Probably depends on the workload. For the single core you run into the problem of scaling with frequency: the higher the speed of a single core, the harder it is to keep that core fed with data. For the 80-core you don't need to feed each core as much data, which makes the pipeline much easier, but it is hard to write something that scales well across 80 cores (see the sketch below).
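    To put a number on that last point, Amdahl's law gives the ceiling on multi-core scaling. A minimal sketch (the 95% parallel fraction is just an assumed example, not a measured workload):

        # Amdahl's law: speedup on n cores when a fraction p of the work is parallel
        def speedup(n_cores: int, p_parallel: float) -> float:
            return 1.0 / ((1.0 - p_parallel) + p_parallel / n_cores)

        # Even if 95% of a program parallelizes perfectly, 80 cores only buy ~16x,
        # while doubling one core's speed gives a clean 2x on everything.
        print(round(speedup(80, 0.95), 1))   # 16.2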
  9. Nitpicks:
    "Pushing the limits of science" No sh**, they shot for the moon with the original 10nm and totally missed. We just don't have the level of chemical control they needed to get that cobalt trick to work. Maybe try something a little more reasonable this time.
    "Give each Intel chip limitless potential" With enough volts and LN2, sure.
    "It's one of mankind's most complex feats" No argument there.
    "Intel has essentially doubled transistor density every new generation" -literally shows a graph of incremental yearly improvements- You haven't doubled transistor density in a single generation in a long time, neither has anyone else at that transistor size for that matter, and that isn't even what your graph shows.
    "Intel has devised several innovations to overcome fundamental barriers to continue transistor density scaling" How many years ago was 10nm supposed to be? Again, Intel pulls off a ton of amazing technological tricks; they just went down the completely wrong road 5 years ago. Also, that sentence is worded in the past tense, so I'll give them that.
    "Innovation in processor packaging has become a critical feature of advanced computing architecture" What was that about glued-together chips?
    "2D and 3D packaging technologies are enabling new device form factors and additional boosts in performance and energy efficiency" Sooooo, EPYC?.. Admittedly, AMD did mess up a bit with HBM before they figured out a more useful version of the multiple-dies idea.
    "Adding more performance and features to each new processor generation" Thanks for the finally-not-quad-cores, I guess? I don't know, for some reason it feels like that was more reactionary than actual innovation. I wonder why... Oh right, 1700Xs.
    "Intel's integrated design and manufacturing capabilities have enabled humanity to innovate game-changing technologies that impact nearly every facet of modern life" OK, yes, computers are very important for almost everything we do now, and you did make most of the consumer and datacenter chips for almost a whole decade, but that is some serious self-congratulation. Also, you're only in this ridiculous situation because of your insistence on not releasing what is now a several-year-old architecture until you had a new process to make it on.
    -The next 30 seconds are copy-pasted marketing speak about being powered by creators- ZZZZZZzzzzzzzzzzzzz
    Whoever is in charge of Intel's presumably enormous marketing team: please stop making stuff like this, and please start figuring out a new naming scheme that actually makes sense. Who knows? The engineers who design the chips might be able to give you some tips on the actual differences. Now time to find an AMD equivalent of this video and go on a similar nitpicking run.
  10. Infrared and microwave radiation. Just like a lightbulb emits visible light (photons) at a few thousand degrees, the surface of a satellite emits lower-energy photons at its lower temperature. While it is true that "space", or rather the dust in space, is very, very cold, there's so little of it that it can be completely ignored, at least for heat transfer. The computers in spacecraft are also purpose-built, which makes them very power efficient. Interestingly, the main problem for spacecraft isn't cooling off, it's staying warm: most of them have a large enough surface area that they radiate away more energy than they produce through normal operation (rough numbers below). The power source on New Horizons (a small piece of plutonium) outputs far more power than the spacecraft's electronics need to run; the extra is used to keep it warm enough for those electronics not to "freeze" to death.
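    A back-of-the-envelope sketch of that radiation, using the Stefan-Boltzmann law (the area, emissivity, and temperature are assumed round numbers, not any particular spacecraft):

        SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W / (m^2 * K^4)

        def radiated_watts(area_m2: float, emissivity: float, temp_k: float) -> float:
            # Power a surface radiates away as photons: P = e * sigma * A * T^4
            return emissivity * SIGMA * area_m2 * temp_k ** 4

        # One square meter at room temperature (300 K) with emissivity 0.9
        # sheds ~413 W, far more than a spacecraft computer's few tens of watts.
        print(round(radiated_watts(1.0, 0.9, 300)))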
  11. Not sure about the motherboard costs, but some of the first- and even second-gen Threadripper CPUs can go for really cheap. As in, $300 for a 16-core CPU cheap.
  12. SCROLL TO THE BOTTOM IF YOU DON'T WANT TO READ MY LONG STORY
    I have a Lenovo Legion 5 with a 4800H that I use for video and photo editing. I was exporting a long video today (software render, BTW; my 4GB 1650 would have run out of VRAM) and happened to have Task Manager open. The laptop was running in quiet mode, which limits the CPU to 25W, so the fans were only barely spinning, and only under heavy load. Despite this, the 4800H was still boosting to 4.05 GHz, all cores, and it sustained this for the 35 minutes the export took. I was amazed; I'd never heard of these laptops being able to do this, and it wasn't Task Manager bugging out: I downloaded several other apps to verify, and they all gave the exact same answer.
    I then exported the same video in the laptop's other two modes: normal (45W) and performance (65W(+?)). In normal it went from being quieter than the air conditioning to partially audible from across the room, but it was now boosting to 4.2 GHz, which is the listed maximum boost clock for the 4800H. In performance mode it became almost loud when sitting next to it (not enough to impede normal-volume conversation, though), and audible but not bothersome from across the room. As for the clock? 4.5 GHz... That shouldn't really be possible. It's nearing the clock speeds found in Intel laptops, and according to AMD's website (and Lenovo's), the 4800H should only boost to 4.2 GHz, and specifically only in short, single-core workloads, which this is neither.
    My guess is that Lenovo managed to build a cooling system more powerful than a 4800H normally needs, so they decided to keep it and mess with the power limits in the BIOS, effectively overclocking a CPU that isn't overclockable by an end user. And as far as I could tell, the CPU wasn't being cooked: even at the not-actually-loud fan speed, neither the laptop nor the air coming out of the vents seemed anything more than warm. I don't live in a cold area; the AC was set to 25C, and it's nearly 33C outside. (If anyone's wondering, the 4.05 to 4.5 GHz jump reduced the render time from 35 to 31 minutes, about what you would expect; quick check below.)
    The laptop does falter a bit when the GPU gets involved, though. In performance mode it drops to 4.4 GHz, and the fans do go into "loud" territory; I would not want to sit next to someone trying to focus with that much noise. In quiet mode, the fans ramp up to how they sound in normal mode under a CPU-only task, and the clocks drop to 3.9-3.8 GHz. AMD has suitably impressed me this time around.
    SHORT VERSION: My Lenovo Legion 5's 4800H can boost to 4.5 GHz all-core in CPU-intensive tasks. I'm curious what clock speeds users with Legions and other similar laptops (XMG, HP Omen, Dell G5, etc.) are getting. Or completely different laptops; I know some of the 4800U and 4700U laptops are configured with the same 25W TDP as the Legion's quiet mode.
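    The quick check on that render-time claim: if the export is CPU-bound, time should scale roughly inversely with clock speed.

        # Expected render time after the 4.05 -> 4.5 GHz jump, assuming CPU-bound scaling
        baseline_min, old_ghz, new_ghz = 35, 4.05, 4.5
        print(round(baseline_min * old_ghz / new_ghz, 1))   # 31.5 min, close to the 31 observed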
  13. I was actually thinking about super-high-bandwidth PCIe storage devices for data crunching and AI accelerators (sort of like a PCIe version of the DGX). Something like Liqid's composable infrastructure (I think that's what it's called) that Linus showed in the "This is 50x faster than your PC" video. I said no one would have a use for it because the massive size needed for a motherboard that can use the 6 or 7 possible x16 slots of a single EPYC CPU would make it impractical.
  14. I can't imagine that anyone would actually have a use for this, but how many PCIe x16 slots can you get in one system? I know there are servers that have 8 Intel Xeon CPUs in one system; however, Xeons have a lot fewer lanes than EPYC, and the EPYC platform has some dual-socket motherboards. (Rough lane math below.)
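    Rough lane math, assuming the 128 PCIe lanes of a single-socket EPYC and some lanes held back for storage and networking (the 16 reserved lanes are an assumed example, not any specific board):

        total_lanes = 128   # PCIe lanes on a single-socket EPYC
        reserved = 16       # assumed: lanes kept for NVMe, NIC, BMC, etc.
        slot_width = 16
        print((total_lanes - reserved) // slot_width)   # 7 full x16 slots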
  15. I really need help; this is going to drive me insane. One week ago I got a new laptop with a GTX 1650 and a 4800H. I quickly realized that Minecraft was running on the 4800H's Vega 7 and wanted to change that to the GTX 1650. I eventually found the Nvidia Control Panel and was able to set it to the 1650, and my FPS tripled. A few days later I decided to see if GeForce Experience had anything interesting, so I made an account and logged in. It then asked, "Do you want to optimize your games?" I clicked yes, started going through the various pages in the app, and then decided there was nothing interesting. I resumed playing Minecraft, but the FPS had dropped. I, again, quickly realized that it was back to running on the Vega 7. So I opened the Nvidia Control Panel again and saw that it had reverted to "Auto Select: Integrated". I tried to change it, but clicking on the grey bar did nothing, when previously there was a drop-down list that let me select between a few options, the dedicated 1650 being the important one. Two days later and I still can't change any settings. I've tried DDUing the drivers and reinstalling them, but it didn't work. I was really excited for what was a massive upgrade, finally getting a dGPU, but now it's just sitting there, slowly eating through my battery.
  16. I see that the Lenovo Legion 5 15.6" has been mentioned, so I thought I might poke my head in. The screen is fine; I don't want to call it great, since I haven't had that much experience with really high-end monitors, but it's good enough for video editing and color correction. The RAM, SSDs/HDDs, and battery are all replaceable. The keyboard doesn't flex while typing, normally or quickly; you have to press really hard to get it to flex (harder than I would ever actually press, even while playing a game), but it is there. As for the GPU, I've found that if I set everything to APU-only in the control panel, it basically turns off. With this, the battery life is somewhere around 7-8 hours. Just thought I might respond to a few things said, since I have one of these slightly weird computers. They really shouldn't be selling it as a gaming laptop; it's more of a "you either need to render something but don't have much money, or you need a large amount of CPU power and ASUS keeps messing up their cooling solutions" laptop.
  17. I know about that. It seems to limit you to moving application windows one at a time, though, rather than letting you select multiple and move them all at once (something like moving multiple files from one folder to another). A scripted workaround sketch is below.
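    A possible scripted workaround, assuming the third-party pyvda package (pip install pyvda); I'm going from my reading of its docs, so the exact API may differ between versions:

        # Bulk-move every window on the current virtual desktop to another one.
        from pyvda import get_apps_by_z_order, VirtualDesktop

        target = VirtualDesktop(2)           # destination desktop (1-indexed)
        for app in get_apps_by_z_order():    # by default, the current desktop's windows
            app.move(target)                 # no more dragging them one at a time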
  18. This might be a long shot, but is there a way to move multiple windows from one virtual desktop to another?
  19. WHY?! This is the one thing that I think is undeniably better about Windows or OS X.
  20. I switched to Windows recently and have been getting used to some of its quirks and improvements compared to OS X, but there's one thing that's making me go crazy. Both Windows and OS X have virtual desktops, but on OS X you can move their positions relative to each other. Example: say I have three virtual desktops arranged like this:
    [Desktop with Task Manager, Chrome, and some screenshots] [Desktop with Premiere and File Explorer] [Desktop with another Chrome window and a game]
    Is there any way to move the entire virtual desktop with Premiere and File Explorer so that they're arranged like this?
    [Desktop with Premiere and File Explorer] [Desktop with Task Manager, Chrome, and some screenshots] [Desktop with another Chrome window and a game]
    I really want there to be a way to do this, because it's driving me insane to have to move a bunch of windows whenever I want to change the order of anything.
  21. How much worse? If it's warm and makes some noise, that's fine, but if I can't hear someone talking 15 feet away, then it isn't... How is the 144Hz one?
  22. I'm looking for a laptop for video editing (Adobe Premiere) and some medium gaming (Kerbal Space Program with too many mods), and the Asus TUF A17 is super tempting, specifically this model (https://www.newegg.com/fortress-gray-asus-tuf-gaming-tuf706iu-as76-gaming-entertainment/p/N82E16834235407?Description=asus%20tuf%20a17&cm_re=asus_tuf_a17-_-34-235-407-_-Product&quicklink=true), but I'm wary of all the reviews of it and the A15 that say it has thermal problems. My main consideration is that the Lenovo Legion 5 is a similar price, and while it has a 1650 rather than a 1660 Ti, it does have more memory and more storage. I'm willing to make that trade-off if the A17 is good, but I'm not sure it is after seeing so many varying opinions. SO, my questions are:
    What temperatures and noise levels are to be expected from the A17 (4800H, 1660 Ti), both under a heavy all-core load during video editing and under a heavy single-core and GPU load while playing games?
    In single-core loads, how much does it thermally throttle down from the 4.2 GHz max boost clock?
    For the reports of 95C, what effect will that temperature have on CPU and GPU lifespan?
    How is the color on the screen? I've seen Amazon and Newegg reviews that say it's either pretty good or complete shit, and website reviews say it's mediocre or good for the price. How bright is the screen? Is the response time annoying or problematic at all?
    Side note: The Asus G14 seems to be a nearly perfect laptop for my purposes, but despite supposedly starting at ~$1,200 at the lowest end, I can't find anything except a $1,500 Best Buy link and the same model on Amazon for the same price. Is there any other place that sells this thing, or is it just those two?
  23. There is an A17 on Newegg: https://www.newegg.com/fortress-gray-asus-tuf-gaming-tuf706iu-as76-gaming-entertainment/p/N82E16834235407 It's $1,100 and has a better graphics card, but it would require a RAM upgrade, which would make it $50 more. I'm pretty wary at this point, though, of all the reviews about its mediocre thermal performance.
  24. I couldn't find anything with the amount of memory I need/want for less than $1,400.
  25. I'm looking for a laptop with a 4800H (or an equivalent Intel model, but nothing seems to exist in the price range) and 32GB of memory. The best I have found so far is a Lenovo Legion 5 for 1,199 USD (4800H, 1650, 32GB 3200MHz, 512GB SSD, 1TB HDD). But before buying it, I want to know if anyone has recommendations for something at the same price (I can't go over at all).