Darkseth

Everything posted by Darkseth

  1. Which ones do you use, and how often? Is it like once per week? Do you use multiple applications daily? If you don't need them that often, a Mac could work. Forget dual boot, inconvenient af, even if it were possible. A Windows 11 ARM VM with Parallels or VMware, for example, lets you run most Windows applications, even though they run inside a VM on Windows for ARM, which emulates x86 applications on ARM. I did this for a bunch of Business Intelligence applications with Parallels (I used standard settings with 4 cores and 6 GB RAM, no issues; felt like native Windows). It's rare for any application to refuse to install or run, mostly some very niche special things. Even badly programmed small tools, like a ROM patcher for GBA ROMs, worked perfectly fine, and emulators ran. Parallels also has a "Coherence mode", which means the applications inside the Windows VM can run in their own separate windows. You can start Windows application 1 in its own window within macOS, and it looks like a macOS application; it isn't "stuck" inside the Windows VM window. It needs some testing whether your applications run inside the VM, but it's a great option. If you use more Windows applications than macOS itself, then a Windows laptop probably makes more sense, unless you benefit otherwise from a MacBook (iPhone user, maybe). Good Windows laptops: I recommend JustJosh on YouTube, he has some of the best laptop reviews out there. Solid choices are the EliteBook 845 (14") and 865 series with Ryzen Zen 4, for example, the ThinkPad Z16 maybe, or the XPS lineup. It really depends on the applications you want to run, what your workflow looks like, which specs are important, etc.
  2. It's automatic. However, it's completely random whether the data lands in the dual-channel area or the single-channel area. For basic everyday tasks, dual channel vs. single channel is not noticeable, so don't worry. Any demanding task where it would be noticeable would be bottlenecked by the old Ryzen 2500U anyway. Even today, I'm using a ThinkPad T16 G1 for work with a single 16 GB stick in single channel, and everything I do runs fine.
  3. Nope, it's correct as it is, but this thought can go in both directions. The sensor is 4:3, so that's the full image it can "see". 16:9 only works by cropping away rows at the top and bottom. Imagine you have an iPad and you stream a 16:9 video: it will show with black bars, giving you less than your iPad's full display area. There won't be any additional pixels outside of the iPad.
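A quick sketch of that crop arithmetic (the 4032x3024 sensor resolution is a made-up example, not from the post):

```python
# Crop a 4:3 sensor image to 16:9: keep the full width, drop rows top/bottom.
def crop_to_16_9(width, height):
    target_h = width * 9 // 16  # height that gives 16:9 at full width
    return width, target_h

w, h = crop_to_16_9(4032, 3024)
print(w, h)  # -> 4032 2268
lost = 4032 * 3024 - w * h
print(lost)  # pixels thrown away by the crop
```

You never gain pixels by switching to 16:9; you only discard part of what the 4:3 sensor already captured.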
  4. I'd say the M2 Pro and M3 Pro will last a similar amount of time. I would take the M3 Pro: - new sexy color - 18 GB RAM over 16. Not much, but very welcome!! - Not really much more multicore CPU power, but not really less either (faster single core!!), plus a faster GPU with faster ray tracing. But keep in mind, Apple reduced the memory interface: 150 GB/s on the M3 Pro vs. 200 GB/s on the M2 Pro. I wonder how much difference that makes, but I personally would choose 18 GB RAM over 16.
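For scale, the memory-interface cut mentioned above works out like this (numbers from the post):

```python
# Percent reduction of the M3 Pro memory bandwidth vs. the M2 Pro.
def percent_cut(old_gbps, new_gbps):
    return (old_gbps - new_gbps) / old_gbps * 100

print(percent_cut(200, 150))  # -> 25.0
```

A 25% bandwidth cut sounds big on paper, but how much it matters depends entirely on the workload.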
  5. I do. A lot. Like many others do, as they should. Why the hell would I not care about that? It's basic common sense, as it has only benefits, even if you ignore the cost difference. Just because you don't, don't project that onto the whole world. As for RDNA 1: like I said, price decides whether it's good or not. If it's significantly cheaper than anything else in a similar performance range, then yes. If not, I don't see any reason why anyone should go specifically for RDNA 1. Nothing more, nothing less is what I said.
  6. No one ever claimed it has. But it managed to stay relevant for a long time thanks to the massive VRAM it had at the time. While RDNA 1 is outdated just a few years later (it came out in 2019), it doesn't support features that can be used today. True, but I would choose the RTX 2000 series over the RX 5000 series. That being said, I would never have considered RDNA 1 due to the stability issues those chips had. Turing has more features than RDNA 1 and "aged better". RDNA 1 did not age well, in no aspect at all: not features, not efficiency. The only thing it has going for it might be when you can find it cheap enough.
  7. Nope, RDNA 1 isn't good and honestly shouldn't be considered anymore. Sadly the video is German, but maybe there are English subtitles? Let me break it down: the video is about the GTX Titan from 2013, how it keeps up today, and how it aged really well. At the end (minute 32) he talks about the negative example, what didn't age well: RDNA 1. It comes without ray tracing capability and is missing a lot of features like Variable Rate Shading. RDNA 1 doesn't fulfill the DX12 Ultimate feature set. Well, 150 bucks today (dollar or euro doesn't matter) is really low, and the used market is a different story. Whether it's Pascal, Turing, or RDNA 1 most likely depends on how much pure fps power per money you can get. But it's not a good buy if you want anything futureproof; it's more like "I want to game, I don't have a lot of money, what's the least worst choice". However, there's another point reviewers never talked about: RDNA 1 had horrible instability problems for a LOT of users. Many tech forums were full of that problem. I knew someone who tried like 3 or 4 different models (RX 5700 and 5700 XT), but all were unstable in most games. Nvidia was without issues. I also chose RDNA 1 back in 2019 for my GF's PC; she had horrible black screens and freezes all the time. As soon as I had the idea to put my GTX 1080 in, no more issues. I personally wouldn't risk RDNA 1 anymore today; there are enough options available. Seeing that benchmark screenshot above, I would choose the RX 6600 over the RX 5700 XT anytime, unless the price difference is very noticeable. 5% more or less is not something anybody will notice, especially if you use VRR (FreeSync/Adaptive Sync). But it consumes like 80-100 W less power and has a better feature set.
  8. The M1 Air throttled like 15-20% max when doing Cinebench runs nonstop. League is perfectly fine even on the M1 Air, as every YouTube video showed. Cap fps at 60 and the problem is solved; the chip won't run at 100% load. Edit: There it is. A 30-minute Cinebench R23 run on the M2 Air decreases the multicore score from 7700 to 6700. That's about 13%. Literally every single Windows laptop, even with fans, doesn't throttle less when comparing short burst performance vs. long-term sustained load (PL1, PL2, thermals, etc.). No big issue there. If that throttling means an fps decrease from 150 to 120, so be it. Or just watch any LoL video on YouTube: fps barely drops to 2-digit numbers in a small ARAM match. The whole M2 package consumes barely 9 watts of power with uncapped fps and maxed-out settings. The M2 can hit 20 W under full benchmark load.
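Sketch of the throttle math, using the Cinebench scores from the post above:

```python
# Percent drop from burst to sustained Cinebench R23 multicore score.
def throttle_percent(burst, sustained):
    return (burst - sustained) / burst * 100

drop = throttle_percent(7700, 6700)
print(f"{drop:.1f}%")  # -> 13.0%
```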
  9. Idk about the US, but something like this here in Germany, if it's not a scam, is just a price error. Even if you purchase it, they will cancel the order (sometimes you can get lucky, though). If this purchase is "protected" through Newegg and you have nothing to lose, you can try.
  10. Lol true, just checked it. They refreshed the 3050: the memory bus went down from 128-bit to 96-bit, so 6 GB became possible. Anyway, it seems it's either: - 4 P-cores with 32 GB RAM - 6 P-cores with 16 GB RAM. Since a VM is also running, I would lean towards 32 GB RAM tbh.
  11. Check the second configuration again. Afaik the RTX 3050 is a 4 GB model, but the 3060 is 6 GB. Is it a 3060 or a 3050? You mentioned work, day-to-day use, workstation. I don't see anything here that uses the GPU. Are you doing gaming or any 3D work (video editing, Blender, etc.)? If not, the GPU is irrelevant. GPU performance is probably not much different, but the RTX 4000 series comes with Frame Generation, which could help a bit. The i7 and i5 here are almost the same, but the i7 brings 6 P-cores over only 4 P-cores in the i5. If you do heavy work, the i7 might be more useful. However, if you do more work and a lot of multitasking, 16 GB RAM could become a problem earlier. Since the RAM is soldered, I believe the i5 model with 32 GB RAM is the better choice. 2 fewer P-cores will probably mean 10-20% less multicore performance at worst (if it's not 100% load, you won't feel a difference), but 32 GB RAM is more useful if you multitask a lot. With 2 fewer cores, the worst that happens is a task takes a tiny bit longer to complete. But if the RAM isn't enough, the application could even close.
  12. M1 Pro > M2. Better display, almost the same battery life, better ports, better speakers. A performance difference can show up if you do very heavy work in 100% load situations (rendering a video); then the fan helps keep performance up. During everyday stuff (aka NOT 100% load), the battery will seem to last forever. In that situation you will probably struggle to empty the battery in less than 8-10 hours. When it comes to "best battery life while doing work", there's no way around the MacBooks. Whether the Air is enough or you should go for the 14" Pro, idk; depends on your workload. If you can afford the M1 Pro / M2 Pro, I don't think it's a bad choice. If it's the same price as the M2 Air, get the 14" Pro. Only get the Air (same SSD/RAM) if it's significantly cheaper, like 100-200 bucks maybe.
  13. Let me try to clear up that misunderstanding that plagues too many people for no reason. Sorry if it's too long; I'll try to explain it as simply as I can, because it will be so much better once you understand this^^ - Bottlenecks cannot harm you, and your computer won't break. Don't worry. It is okay to have a bottleneck. Because... - You ALWAYS have a bottleneck. Without one, you would have unlimited fps. Ever seen a PC with unlimited fps? Me neither. - Which part bottlenecks depends NOT on the hardware (which is why bottleneck calculators are the biggest waste of storage space on the internet; they are made by people who lack even a basic understanding of what a bottleneck is and how it works), but on which software (game) runs and which settings you use. Example with your 5900X + 3070 combination (numbers made up for explanation): Cyberpunk at 4K Ultra with ray tracing: your GPU bottlenecks. Because that resolution is so many pixels, your GPU will sweat, struggle, and cry itself to sleep. Your 5900X will be bored, because it could push more fps, but the 3070 can't. Cyberpunk down at 720p medium settings without ray tracing: so few pixels that the 3070 could probably push 300 fps, maybe? Idk. However: the GPU can only render 300 fps if the CPU is fast enough to deliver the data for it. The CPU doesn't care whether you play at 720p, 4K, or 16K; it's the same data. But the CPU can only prepare enough data for 150 fps. --> The CPU bottlenecks, because it holds the GPU back. The GPU could go to 300 fps, but delivers 150, because that's what the CPU delivers. The GPU sits at only 50% usage in this example and is bottlenecked in this one single game at these settings. But is that a problem? Your GPU is bottlenecked, but you have 150 fps. Is that not enough? See it this way: 150 fps is way more than enough, and your GPU doesn't need to waste power, stays cool and quiet. Win-win. As you see: exact same hardware, same game, but different settings, and completely different bottlenecks.
You have to know how this pipeline works: first the CPU calculates all the data needed for a frame, then hands it to the GPU, which renders the image. Imagine you and me on a production line. I'm folding paper planes (CPU), you color them afterwards (GPU). I can fold a maximum of 100 per hour. You could color 300 per hour. You are bottlenecked by me. But if the goal from the boss (how many fps you expect) is only 80 paper planes, there's no issue, right? TL;DR: do not think "does x bottleneck y". Think about which settings you want to play with and how many fps you want at a minimum, and whether your system delivers those fps. If the CPU is fast enough for your desired fps, it's all good. To find out whether your CPU can: change the resolution to the lowest possible and disable anti-aliasing to force a CPU bottleneck. The fps you get there is what your CPU can manage. Is that enough? If yes, your CPU is perfectly fine. If not, you need a new CPU. Simple as that. The GPU is irrelevant for this question. If your CPU delivers enough fps for you, but your desired GPU could do even way more than that in this particular game: be happy your GPU doesn't need to go all out. Or give your GPU more to do: increase settings, use DSR to render at a higher resolution, use ray tracing, etc. There are many options. As for FF14: my GTX 1080 had no issues with 1440p maxed out. Idk whether I had 140 fps at all times, probably a bit lower in raids or cities with many players. At some point, with enough players, the CPU can bottleneck fast, or even the game engine itself. I doubt you could hit a clean 240 fps in every single situation, but does it matter? I'd probably just use RivaTuner, cap the fps at 100, 120, 140 or something like that, and enjoy the game with mostly perfect, consistent frametimes. A clean, steady 120 fps at all times feels much better than fps jumping up and down between 140 and 200. Just buy the GPU model you can and want to afford, or find a great deal, and be happy.
You'll probably play a different game at some point. Or change your monitor to 1440p or 4K (trust me, edge flickering is insane on a 1080p monitor; even 1440p at 24" doesn't look clean. FF14 looks great at 4K). Maybe consider the 4000 series to have access to Frame Generation for future games. Maybe even for FF14, as they are planning a technical graphics update with 7.0.
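The production-line picture above can be sketched in a few lines; the fps numbers are the made-up ones from the example:

```python
# Effective fps is whichever stage is slower: the CPU preparing frame data
# or the GPU rendering it. The slower stage is the bottleneck.
def effective_fps(cpu_fps, gpu_fps):
    bottleneck = "CPU" if cpu_fps < gpu_fps else "GPU"
    return min(cpu_fps, gpu_fps), bottleneck

# 720p medium: the GPU could do 300, but the CPU caps it at 150
print(effective_fps(150, 300))  # -> (150, 'CPU')
# 4K Ultra + RT: GPU drops way down, CPU side unchanged
print(effective_fps(150, 60))   # -> (60, 'GPU')
```

Same hardware in both calls; only the GPU-side number (i.e., the settings) changed, and with it the bottleneck.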
  14. It's extremely rare for a 750 W PSU to have only a single 6+2 pin. It's probably a very cheap no-name firecracker PSU. It's really not recommended to use them. They might have issues delivering even the power stated on paper, and might also be missing some of the safety mechanisms. Which means: even though it says 40 A on 12 V, which is 480 W, I kinda doubt it can safely deliver that amount without any issues. If you're unlucky, your hardware can be destroyed. If you're VERY unlucky, a fire could start. No matter how much you want to save money, invest at least in a halfway decent PSU model like, idk, a be quiet! System Power 500 W (which is MUCH superior to a 750 W firecracker).
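The 12 V rail figure quoted above is just P = V * I:

```python
# Nominal power a rail can deliver: watts = volts * amps.
def rail_watts(volts, amps):
    return volts * amps

print(rail_watts(12, 40))  # -> 480
```

That 480 W is the label rating; whether a cheap unit can actually sustain it is exactly the doubt raised above.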
  15. SSD. It doesn't matter how old your hardware is (unless it's like 30 years old, maybe) or which OS you want to run: if you use an old HDD, that is your bottleneck. I had a 2012 laptop once, i5 3rd gen I think. It was unusable with the HDD; the HDD was nonstop at 99% usage and everything was so slow. An SSD fixed it. Any basic SATA SSD does the job. Second, see which CPUs are available. A Pentium is meh, but if you can find an i7 2600 (over an i5 2400/2500, because the i7 has Hyper-Threading), that thing can get you far. Yes, you could get Windows 11 running with some hacks, but it won't be official. You might get security updates, but maybe only until the next major update, and then you have to reinstall again. Don't do that. Put Linux on it; it's not difficult to use for basic stuff. Unless you need special Windows applications, give it a chance: official updates, sometimes long-term support for many years before you have to upgrade, very efficient and quick. I recommend Linux Mint with Cinnamon. Simple, long-term support, based on Ubuntu, popular enough to find a lot of troubleshooting info on the internet, kinda lightweight. Cinnamon is not the most lightweight, but its design is heavily oriented towards Windows XP/Windows 7, so you have a good chance of feeling at home.
  16. So just to be sure: you're going to take the motherboard out and build your own "laptop/handheld" around it? Then it's not very relevant which laptop is the best, because that's defined by many aspects. That makes it simpler for you. If it's the iGPU you're after, you want a Ryzen 7 6800 (preferably a -U model because of efficiency) or even Zen 4 (7840U), so you get the Radeon 680M in a 6000 series model, or the 780M in a 7040 series Zen 4 model. The 780M is maybe 10-15% faster.
  17. Short: Zen 4 > Intel 13th gen, because it's more efficient at lower TDPs and low load. Intel 13th gen might be stronger at 100% benchmark load. Long: the question can't be answered, because a laptop is more than its CPU. Not only are there display, RAM, build quality, size, etc., but CPUs can also be configured differently. A Ryzen 7 7840HS can perform differently in 2 different laptops. If it's the exact same model, just AMD vs. Intel configuration, I'd take AMD, because I prefer better efficiency, lower fan noise/heat, and longer battery life over the last x% of max performance.
  18. If it's the same base M1/M2 chip, it also supports only 1 external monitor (2 total, one being the internal). It has to be a 14" M* Pro chip to support 2 additional external displays. Or DisplayLink; that works with the M1 too. Let's go back to the first post. Data analysis with Excel on a Mac? If it's Power Pivot you need, you're out of luck; that doesn't exist there. You would need a Win 11 ARM VM, install Windows Excel, and use Power Pivot there. Which hurts the efficiency a little bit (not much; it still lasts forever with a Windows VM). But I do know of a Windows laptop that comes close to the M1 Air: https://www.notebookcheck.net/Lenovo-ThinkPad-Z13-laptop-review-AMD-s-premium-ThinkPad-with-long-battery-life.639685.0.html#toc-7 The ThinkPad Z13 series, but the base model with the 6650U 6-core CPU (NOT the 8-core) and the 1200p panel (NOT the 3.5K OLED). Also, it has a haptic trackpad like MacBooks do. Notebookcheck measures 14+ hours of Wi-Fi web browsing, which is a lot for Windows. Even assuming a few hours less, there's still a lot of buffer. They did test the Ryzen 7 + 3.5K OLED version; it managed around 7.5 hours in the Wi-Fi test, compared to over 14 hours with the base model. And that 14-hour battery life comes with solid multicore benchmark performance. Compared to the Dell XPS 13 series: - the XPS 13 Plus 9320 has similar performance, but half the battery life. - the XPS 13 9315 (-U series) has similar battery life, but half the performance; similar performance to an M1 in Low Power Mode, which caps at 4 watts.
  19. Yup, and even when Windows laptops do throttle (the CPU often not that much, but the GPU can throttle to well below 50% even), they still don't last as long as a MacBook during work. @ Thread: I'll vote for the M3 too. 2 generations newer; the M1 Max might have more graphics cores, but the cores got stronger and better. Apple says the M3 Max 40-core GPU is 50% faster than the M1 Max 32-core GPU, but that's only 25% more cores. Also, the M3 has better GPU utilization with Dynamic Caching, which could do something. But for video editing, you'll probably be carried by the media engine more than by the GPU cores. Also, the M3 family supports AV1 now too, plus hardware ray tracing and a much stronger Neural Engine. I would take the M3 Pro over the M1 Max for sure. Your biggest difference might be 14" vs. 16", however.
  20. Tell me you don't have/use a MacBook without telling me. Not a single word you just said is true. As I said, Chrome on my MacBook consumes less CPU% than Safari does, despite Chrome having more tabs open. It is optimized for laptops just like every other browser, and it is absolutely 100% optimized for Apple Silicon, because it runs natively on ARM. There is no issue there. You probably don't even know what "optimized for laptops" even means, lol. Chrome on x86 Windows can even freeze unused tabs to save energy. The times of "Chrome = battery hog" are long over; that was true on Intel MacBooks, but as soon as Chrome became available as an ARM-native build, there were no issues anymore. It doesn't consume more power than Safari, at least nothing you could notice. I have it here. I see it with my own eyes, and in Activity Monitor. Nope, the battery doesn't even drain twice as fast when I run Windows 11 ARM inside Parallels and work inside Windows. Maybe it's 1/3 less battery life, or 1/4 less, but not even half; I can use the VM for over 10 hours straight without anything. No fans, no heat, nothing. Even back in the Intel days, when you could use Boot Camp to dual-boot Windows on a Mac, battery life wasn't that far off. Not even close to half. Why are you lying and making up random numbers?
  21. What does that even mean? Chrome is Apple Silicon ARM-native. Of course it's optimized. I just started Chrome on my M1 MacBook Pro; after my 8 tabs loaded (a few Stack Overflow pages, etc.), it consumes 2.8% CPU according to Activity Monitor. I have to scroll down to find Chrome. Safari with 1 tab is at around 5% right now. There's absolutely nothing wrong with Chrome on Apple Silicon, and 200% is absolutely NOT right. Something is wrong there, like something is sucking up CPU power. Maybe an active tab or something; this is NOT normal.
  22. As shown in the video, the "issue" with x86 chips is that they boost to a high clock speed to get the task done quickly. But that means high power spikes for basic things like a page refresh. And it does depend on how you use it; it's very possible for an XPS 13 Plus to last only 4 hours, or 8+ hours with a different workload. I think the XPS 13 9315's multicore performance should be similar to your M2 MacBook Air when it hits Low Power Mode. But my work laptop is a ThinkPad T16 G1 with an i5 1235U, and my previous one was a Dell Latitude 5430 with an i7 1255U, so I'm familiar with those 2 P-core + 8 E-core chips. They do a perfectly okay job for any basic tasks; you really don't need much performance for stuff like that. Once you start video editing, it might be different. Still, the AMD chips are better than Intel's U-series and P-series (usually similar or better performance while being more efficient). The T-series is always a very solid workhorse. But again, I'd prefer an AMD model over Intel; if the Intel model is much cheaper, that's fine too. In the same class as a T ThinkPad (higher-range business laptop) is the HP EliteBook lineup. The HP EliteBook 845 G9 (Ryzen 6000) and 845 G10 (Ryzen 7000) with the 400-nit panel (avoid the 250-nit and 1000-nit ones, they are both crap) do a GREAT job. Solid battery life and performance, and they still offer 2 RAM slots, so you could even put 64 GB of RAM inside if you want. And they feature USB4 with 40 Gbit/s. The latest T ThinkPad models like to solder the RAM to the board. Also: nope, something is absolutely off, unless something is actually running in there (like a browser game). If you don't actively do anything and nothing is going on, it should be very low. Maybe a reboot fixes it, or updating the application if it's not up to date.
  23. Have you tried the brand-new Teams, which uses a new codebase and technology? It might solve the issue. Also, depending on which applications you need, might Parallels be a solution? But let's get to business! I'd ignore the Samsung Galaxy Book: midrange laptop, and you should avoid 8 GB RAM on Windows devices. The Dell XPS 13 Plus (9320~) is slightly around/above MacBook Air M2 performance, but with MUCH lower battery life. You can expect maybe 35-50% of the battery life, and 50% is a good case for the XPS. The XPS 13 9315 (Intel -U series) can reach MacBook Air battery life, or at least come somewhat close, but at the cost of half the performance. The Lenovo ThinkPad Z13 (Ryzen 5 Pro 6650U 6-core (NOT 8-core) and the 1200p IPS panel (NOT the OLED panel)) is a very good MacBook Air contender. It reaches almost similar battery life, like the XPS 13 9315, but it doesn't suffer from lower performance like the Dell. But... there's absolutely NOTHING out there that delivers MacBook Apple Silicon battery life in reality (not scripted stuff like video playback). You will have to make sacrifices there. 15.6" max: why is 16" not OK? 15.6" is used for 16:9 laptops; that's the old format. Most models got upgraded to 16", which is the same size, just with a few additional pixel rows at the bottom. Just the switch from 1920x1080 to 1920x1200 increased the size to 16", but in reality there are just more pixels at the bottom and a thinner plastic border. If you really take 15.6" as your maximum, you're filtering out all the modern 16" models, which are "the new 15.6"", so keep that in mind. Oh yeah, Lenovo is a good choice; look at the Lenovo Yoga 7 lineup with AMD Ryzen 6000 or 7000, they come as 14" and 16" convertibles with touch screen and pen support.
  24. The RX 580 is not much faster than a GTX 1060, maximum 10% or so, since the 480 could never really pull past the 1060 over the years. Can be ignored. 1070 > 580 in every aspect if both are similarly priced. Regarding the GTX 1080: if it doesn't have the best cooler, consider a big undervolt. I have the MSI Gaming X model, and mine ran at 0.800 V / 1823 MHz (1963 was stock) / +500 MHz memory OC for years. More than 10°C cooler, a LOT quieter (almost inaudible under load), and thanks to G-Sync I never noticed a performance difference, which was around -5% in benchmarks or so. With 0.800 V and 1823 MHz I got it down to 140 W power consumption, which is similar to an overclocked 1060 (but this undervolted 1080 is still 70-80% faster), and 1/3 less power consumption than an RX 580. I personally would not recommend an RX 5700 (XT); these models had severe stability issues, forums were full of it, but most fanboys never believed it. My GF had one once, because I was stupid enough to choose the 5700 for her instead of a slightly more expensive Nvidia card, because money. It took too long to find out it was a faulty GPU, and by then prices were down. I also knew someone who tried like 3 or 4 different 5700 (XT) GPUs; all had problems and freezes. One Nvidia card, and everything was great. The issue was apparently fixed with the 6000 series.
  25. If you want to stay on AM4, go straight to the 5800X3D; it's the best AM4 CPU for gaming. Check whether your motherboard needs a BIOS update first in order to support it. Maybe you can find one used for a good price from people upgrading to AM5 or Intel. That 5800X3D can increase max fps by over 50% in CPU-intense games; that large cache is really helpful if you stick to a lower resolution but want super high fps. If you want a completely new platform with DDR5, then maybe ask again in a year with your budget over in the upgrade section, since idk what will be a good deal then. AM5 won't be a bad choice, I guess. If you get a new monitor, 2560x1440, 3440x1440, or even 4K, then you'd probably have to upgrade your GPU first.