Darkseth

Member
  • Content Count

    3,132
  • Joined

  • Last visited

Reputation Activity

  1. Like
    Darkseth got a reaction from im_diegod in XPS 13 (9370) best cooling option   
    Just spread it thin with a spatula, an old credit card, or something like that, so the whole area is covered with a thin layer.
    That should do it just fine, and it won't be too little or too much.
    Any excess gets pushed out to the sides anyway with enough mounting pressure, which is why a regular paste is much safer than liquid metal. So my advice as well: get the regular Kryonaut, and NOT the liquid metal Conductonaut.
     
    From my experience, it really doesn't matter whether you spread it thin, put a drop in the middle, draw an X, or any other pattern.
    The 1 or 2 °C you might gain or lose is more often measurement tolerance than an actual improvement. I never saw any difference, so I went with the cleanest method.
  2. Agree
    Darkseth reacted to Hans Christian | Teri in Intel macs OS life expectancy?   
    How long? My guess would be a couple of years. Historically speaking, Apple supported PowerPC Macs for 4 years after announcing the switch to Intel. I don't think Intel MacBooks will get any cheaper because of the switch to Apple silicon; they'd rather just stop selling them. It isn't really "Apple" to sell "second-rank" products.
     
    As in picking up an Intel MacBook to run Linux? I'd say overall a bad idea. For many years now MacBooks have been fitted with Broadcom wifi adapters, and these are notoriously horrible with Linux. Some people have managed to get them to work, but most (including myself) have found them absolutely impossible to live with. Their Linux drivers are proprietary (if they even exist for your specific adapter) and break at the slightest provocation. I spent a month trying to get it to work properly, on LTS, standard, and rolling releases, and eventually got so fed up that I went back to OS X. Cannot recommend.
     
    Wanna run Linux? Pick up a refurbished Thinkpad. Cheap and incredibly well supported.
  3. Agree
    Darkseth got a reaction from whm1974 in How to switch Android's default download location from system storage to MicroSD?   
    Why would you want to download apps onto the SD card anyway?
    The SD card is FAR slower than your internal storage. That would be like using an SSD for your system, but installing all applications/games on an old HDD.
     
     
    Use the SD card only for stuff like pictures, videos, and music. 
  4. Agree
    Darkseth reacted to Tom_nerd in Heard Samsung collects a ton of data...   
    Whenever it asks "would you like to send optional data to Samsung?", just always say no. Then none of that optional data will be sent to Samsung.
  5. Like
    Darkseth got a reaction from D3strukt0r in How exactly does G-Sync work?   
    G-Sync is a variable refresh rate (VRR) technology.
    It's similar to FreeSync, aka Adaptive Sync (from VESA), and it's pretty much the opposite of V-Sync (vertical sync).
     
    First of all, V-Sync:
    The monitor has 60 Hz, and the GPU is locked to the monitor's refresh rate. That means a 60 fps cap; everything above it stays at 60 fps. This is the optimum here: perfectly smooth 60 fps, with every single image displayed for 16.67 ms.
    If you fall BELOW 60 fps, THEN you get the problem: stuttering.
    Think of it as a simple math problem. The monitor has 60 Hz, so it refreshes 60 times per second.
    V-Sync forces the GPU to deliver only complete frames (no half-images, like you get with tearing). As long as the GPU renders in time, everything is fine (except maybe a bit of input lag). If the GPU is NOT fast enough, the monitor displays the previous image again --> it feels like that one frame "freezes". It doesn't stay on screen for 16.67 ms, but for 33.33 ms (twice as long).
    This is stuttering.
    50 fps on 60 Hz: how do you fit 50 images into 60 refreshes? By showing 10 of the 50 twice, so you end up at 60.
     
    Now G-Sync: exactly the opposite. The GPU pushes out frames, and the monitor matches the GPU. If the GPU can only render 52 fps, the monitor runs at 52 Hz.
    If the GPU can render 38 fps, the monitor runs at 38 Hz. This changes in real time, without delay.
    This way you get no stuttering (no doubled frames; the monitor refreshes as soon as the GPU has the next frame ready) and no tearing (because it's synced). Input lag also doesn't increase noticeably.
     
    Your UHD monitor with 60 Hz and G-Sync will have G-Sync active up to 60 Hz / 60 fps.
    If you disable all sync and your game runs at 80 fps, G-Sync is NOT active; you simply get 80 fps on a 60 Hz panel, with tearing.
    If you have G-Sync enabled and your fps are BELOW 60, you get the perfectly smooth G-Sync experience.
    With V-Sync on: if your fps reach 60, it switches from G-Sync behaviour over to V-Sync behaviour (since you have the full 60+ fps), which MIGHT cause stuttering.
    So you usually set an fps limiter at around 59.
     
    And THIS is how it's supposed to be.
    Since your monitor only has 60 Hz, anything above 60 fps is a complete waste; you won't see it. 120 fps on a 60 Hz screen looks exactly as smooth as 60 fps; the monitor just shows every 2nd frame and skips the rest.
     
    So the ideal for you: tune your settings so you always stay around 50-60 fps, or 45 fps, or whatever your GPU can handle in your game at your preferred settings.
    An fps limiter is a good way to keep the fps constant ^^
    A perfectly even 45 fps lock feels better and more consistent than fps fluctuating up and down between 45 and 60. And G-Sync works at any fps range below 60 (even below 30!).
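    To make the 50-fps-into-60-Hz arithmetic above concrete, here is a minimal Python sketch of my own (an illustration, not anything from NVIDIA's drivers); the refresh rate, frame counts, and function names are just assumptions for the example:
    ```python
    # Minimal sketch (my own illustration, not from the post or any driver):
    # how a steady fps below the refresh rate maps onto a fixed 60 Hz panel
    # with V-Sync, versus a VRR panel that simply follows the GPU.
    REFRESH_HZ = 60
    REFRESH_MS = 1000 / REFRESH_HZ  # 16.67 ms per refresh

    def vsync_frame_mix(fps: int) -> dict:
        """Assumes 30 <= fps <= 60: the missing refreshes are filled by
        showing some frames twice, which is the stutter described above."""
        doubled = REFRESH_HZ - fps   # frames held for 2 refreshes (33.33 ms)
        single = fps - doubled       # frames shown for 1 refresh (16.67 ms)
        return {"shown_once": single, "shown_twice": doubled}

    def vrr_frame_time_ms(fps: int) -> float:
        """With G-Sync/FreeSync the panel matches the GPU, so every frame
        gets the same display time: simply 1000 / fps milliseconds."""
        return 1000 / fps

    if __name__ == "__main__":
        print(vsync_frame_mix(50))              # {'shown_once': 40, 'shown_twice': 10}
        print(round(vrr_frame_time_ms(50), 2))  # 20.0 ms for every single frame
    ```
    The 40 + 2 x 10 = 60 refreshes are exactly the "doubling 10 of the 50" from above; with VRR there is nothing to double, every frame just lasts 20 ms.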
  6. Agree
    Darkseth got a reaction from Another_Blood in 6 or 8gb ram Mobile   
    That's a very wrong conclusion. That's not how RAM works.
     
    You can't run out of memory under Android. Even if you had 3 GB of RAM, you could NOT run out. It will NEVER be "full" with 0 MB free.
    As soon as a certain threshold is hit, Android closes an older application so your current one can run.
    If you open that closed app again, it will reload instead of already being loaded and ready to go.
     
    Checking free RAM on Android to answer this question is like opening your fridge over and over just to check whether the light really goes off when you close the door. That's simply not how RAM works, not in the slightest.
     
    I have a 12 GB RAM phone, and 8.4 GB of RAM is "in use" right now. I'm not playing any game; it's just sitting on the desk.
    If I had 8 GB of RAM (like my last phone), RAM usage wouldn't go past something like 7.2 GB, and older stuff would get kicked out.
    --> If more RAM is there, more RAM will be used. That doesn't mean this used RAM is "needed"; but it's there, so it just makes sense to use it to cache things. Unused RAM = useless RAM.
     
     
    8 GB of RAM instead of 6 simply lets you keep more apps open at the same time without reloading.
    If you have 8 GB of RAM instead of 6, the OS and your apps will simply use more, because there is more.
     
    Do you absolutely HATE it whenever an app reloads when you open it again? Then you might want more RAM.
    Don't care about that half-second reload as long as the app you want to use runs? Then you can stick with 6 GB of RAM. There is no app that won't work; older apps will just be closed earlier.
     
    Not visual enough? Imagine your desk. Now imagine a larger desk, where you can put more stuff on it at the same time before you have to put older stuff back in the drawer.
     
    Or let me put it this way: with 8 GB of RAM you could probably switch from Black Desert Mobile to CoD and back again, and both games would still be fully loaded and ready to go. With 6 GB of RAM the older game COULD get kicked out, and if you switch back it will have to reload, depending on your other apps and how much RAM those games actually use.
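    To illustrate that eviction behaviour, here is a minimal Python sketch of my own (not Android's actual low-memory killer; the RAM budget, app names, and sizes are made-up assumptions): apps stay cached until a threshold is hit, then the least-recently-used one gets closed.
    ```python
    # Minimal sketch (my own illustration, not Android's memory manager):
    # apps are kept cached in RAM until a threshold is hit, then the
    # least-recently-used one gets evicted and must reload next time.
    from collections import OrderedDict

    RAM_BUDGET_MB = 6000  # hypothetical budget for cached apps on a "6 GB" phone

    class AppCache:
        def __init__(self, budget_mb: int):
            self.budget_mb = budget_mb
            self.apps = OrderedDict()  # app name -> RAM it uses, ordered by last use

        def open(self, name: str, size_mb: int):
            # Re-opening a cached app is "instant": just mark it most recently used.
            if name in self.apps:
                self.apps.move_to_end(name)
                print(f"{name}: already in RAM, no reload")
                return
            # Evict least-recently-used apps until the new one fits the budget.
            while sum(self.apps.values()) + size_mb > self.budget_mb and self.apps:
                evicted, freed = self.apps.popitem(last=False)
                print(f"evicting {evicted} (frees {freed} MB)")
            self.apps[name] = size_mb
            print(f"{name}: cold start / reload")

    if __name__ == "__main__":
        cache = AppCache(RAM_BUDGET_MB)
        cache.open("Black Desert Mobile", 2500)
        cache.open("CoD Mobile", 2500)
        cache.open("Browser", 1500)              # pushes the oldest game out on 6 GB...
        cache.open("Black Desert Mobile", 2500)  # ...so it has to reload
    ```
    With an 8 GB budget in the same sketch, nothing gets evicted and both games reopen instantly.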
     
    That deal with the scooter is quite good, since that thing costs like 300+ on its own.
    So yeah, get the bundle if you'd use the scooter, or simply sell the scooter for 200-250.
    If you don't care about the scooter and don't want to waste your time selling it, get the 8 GB version.
  7. Agree
    Darkseth reacted to huilun02 in 6 or 8gb ram Mobile   
    The bundle, unless you have no use for the scooter
  8. Like
    Darkseth got a reaction from Sid1298 in Laptop Battery while gaming   
    The thing is, most gaming laptops have abysmally low performance when they're not plugged in.
    I've seen some where the GPU performance drops to <20% on battery.
     
    And I really do not mean "by 20%". 
  9. Informative
    Darkseth got a reaction from Delicieuxz in Looking for Macbook feedback and recommendation   
    I think the price is way too high, since it's still Intel's 8th gen.
     
    If you don't need Windows or Windows applications, you can go straight to the M1 Macs.
    The MacBook Air M1 already destroys every Intel MacBook Pro 13", and it starts at 999 USD.
     
    However, you should first check whether the application your friend uses runs on M1, and the plugins too.
    Also, these M1 Macs are macOS 11 Big Sur only. No downgrade possible.
  10. Like
    Darkseth got a reaction from Den-Fi in AirPods Pro: Too Convenient to Ignore   
    I have to agree with the Title.
    I only have experience with the Libratone Trackair+, coming from the Logitech UE900 (which were pretty high-quality 399€ IEMs with quad BA drivers per side). I got them for 199€ many years ago, and even today they are extremely high quality.
     
    In a direct comparison, the Libratone Trackair+ can't compete with the UE900 in terms of sound. Neither can the AirPods Pro, nor the Sonys or Sennheisers. I can say that without having to try them out. Good audio doesn't age.
    But the thing is... your ears and your brain will "get used to" it. In a direct comparison you will notice a difference, maybe in quality, maybe mainly in tonality (like a bassy V-shaped sound vs. a neutral, natural sound). But after a few minutes, hours, or maybe days, your ears/brain adapt, and the difference in sound quality becomes much smaller than it was in the direct comparison.
    Also, when you're on the go and the music is in the background, you're not focusing on it 100% and trying to analyze every little tone. That makes the differences in sound quality shrink even further.
    Well, at least once you pass a minimum quality level.
     
    However... having no cables to take care of (carefully storing them in a box, etc.) and no "setup" (putting them in, arranging the cable so it doesn't just dangle around, etc.) actually is really nice. I never would have believed that before, especially before the "true wireless in-ear" era.
    And on top of that: instant pairing just from opening the box and putting them in your ears - voilà, paired and ready to go. Put them back, close the box, and the sound switches from Bluetooth back to the normal speakers.

    In addition to that convenience, I get:
    - ANC that works really well, can automatically change its strength depending on where I am, and even pass-through for when I need it
    - Controls for when I can't decide on a song and keep pressing "next" all the time. Before, I had to grab my phone every single time
     
     
    All of that as an overall package is worth so much... that convenience isn't just difficult to ignore, it's kind of a gamechanger.
    Even considering you might have to buy new ones every 2-5 years because the battery will be worn out, while cabled IEMs never have that problem.
     
    Oh yeah, in my entire life I've never managed to break the cable on my IEMs. Not on my expensive ones (the cable is the original, and it still looks like new after 6-7 years), and not 15 years ago when I used cheap 20€ earbuds/IEMs with very thin, flimsy cables.
    So broken cables have never been an issue for me, though for many people it's another argument for the true wireless side.
  11. Like
    Darkseth got a reaction from TwilightLink in How good is the macbook air 2020? (not the m1 model)   
    Fair enough, I think an Ideapad is a solid choice, especially if it's a Ryzen 4000 series.
  12. Like
    Darkseth got a reaction from TwilightLink in How good is the macbook air 2020? (not the m1 model)   
    Compared to the M1 model, it's utter garbage.
    The chip is so bad that it gets hot even on basic tasks, and the fan becomes audible - Zoom calls, for example. It's slow, gets loud fast, and runs hot when you do anything beyond the most basic low-end things.
     
    The difference between the Intel Air and the M1 Air is much bigger than the difference between the Intel MacBook Pro and the M1 MacBook Pro.
     
    M1 not available? Wait for it. It's THAT good. Please do NOT waste money on the Intel Air.
    MaxTech compared both base models:
     
    Consider Windows notebooks if you actually need Windows for something, or for specific applications.
    If you can live 100% on macOS alone and can afford the MacBook Air, there is nothing better at that price or below in the Windows world.
     
    That doesn't mean you can't get your stuff done with a 30-40% cheaper HP ProBook / ThinkPad E14 Gen 2 with Ryzen 4000.
    An IdeaPad with Ryzen 4000 is also a good, solid choice in the Windows world at around 800 bucks or below.
     
     
    The MacBook Air has some big strengths for its price (and below): a VERY good display with 400+ nits of brightness and good color calibration, plus very, very good speakers, which make watching videos really enjoyable. The overall speed and software optimization are also just next level.
    So maybe it's worth it for you. Maybe not.
  13. Agree
    Darkseth got a reaction from uncreativespace in Should I Pull The Trigger? ThinkPad P14s (AMD)   
    Well, the difference here is that the ThinkPad is much better built than the Huawei, has a much better warranty, and is easier to repair. There are also better service options, like 3 years of warranty, and you can even buy on-site service where a technician comes to you the next day and repairs it.
    That's aside from other features like a smart card reader, a dedicated docking connector, a mechanical webcam shutter, infrared support for Windows Hello, not just MIL-STD-810G testing but also a keyboard that can handle liquid splashes, and the typical ThinkPad TrackPoint that some people prefer over the trackpad.
    Oh yeah, you can replace the RAM and SSD yourself.
    Not to mention probably THE best keyboard you can find in a notebook.

    Don't confuse business-class notebooks with consumer toys here. These are two entirely different products for two different target groups. But that's fine, since the MateBooks come with a much lower price tag.
    If you can spend that much, there's no reason not to get this one, tbh. You will not find better build quality, repairability, and durability for that price.
     
    Of course, ThinkPads aren't perfect; you kind of "have to like them". The design isn't the prettiest, for example, and you can see fingerprints on that soft-touch surface. But they are durable and reliable.
  14. Agree
    Darkseth reacted to Commodus in M1, Ryzen or Nvidia for video editing   
    Gotcha, although Apple does have its News app and there's the web stuff.
     
    I was mainly thinking of the M1 chip's advantages for performance, battery life, noise and temperature (this won't heat up your lap as a general rule). Beyond that, though, it mainly comes down to things like the better balanced display and the great overall keyboard/trackpad combo. I still scratch my head at how laptop screen choices in the Windows world tend to be split between either a no-frills 1080p panel or a 4K touchscreen of death that cuts battery life in half. I'd say Apple strikes a nice middle ground between resolution and longevity, but... well, it has a good resolution and longevity.
  15. Like
    Darkseth reacted to huilun02 in iPhone SE vs iPhone 12 Mini: Help me chose my Chistmas gift...   
    The 12 mini
    Because don't be an idiot
  16. Informative
    Darkseth reacted to Carstenpxi in M1 Macs Reviewed   
    I'm a 71-year-old old fart who has been around this industry since the late 1960s. A few observations:
    • High-performance design requires short electrical paths and systems that are as small as possible. For these reasons, attempts to boost performance by using discrete units connected with buses, irrespective of the type of bus, will ultimately be limited by the speed of light. Keep it small, keep it close. Keep things on chip, and integrate chips in tight 3D packaging (SoC). Only loosely coupled units can be at a distance.
    • High-performance design is best achieved through power efficiency. The smaller the system, the less power is wasted on communication between units. High-speed communication is a power hog.
    • High-performance design requires a total system view. Software and hardware co-design is vital for good performance. Aside from low-level drivers and signal-processing algorithms, much software is written to be easy on the developer, not to be efficient. Look at the processor that flew Apollo to the moon. And in the mid-1980s we were 200 engineers sharing a cluster of three VAX-11/785s with a total MIPS rating of 4.5.
    • Systems must be designed with specialized co-workers performing dedicated tasks that would be inefficient to run on general-purpose CPUs. Dedicated encryption, video encoding/decoding, and machine learning are examples of increasing speed while reducing power.
    • Administering those dedicated co-workers is most efficiently done with very long instruction words, both on-chip and off. Interrupt-driven architectures have expensive context-switch overhead; game consoles use polling architectures.
    • Cache memory, in principle, slows things down. Caches are primarily built as buffers between systems with different bandwidths. They are a necessary and useful evil, but they waste power and chip real estate and add energy, time, and overhead with every cache miss. The closer you can move your memory to the computing cores, the less cache you will need. Well-designed memory pools on chip, or very close to the CPU, reduce latency.
    • Speculative execution and look-ahead branching are wasteful workarounds for mismatched resource availability and bandwidths. In addition, they are inherently complex and potentially dangerous.
    • Object-oriented programming often hurts performance significantly. A friend who is the lead programmer at a major game provider told me: "The first thing we teach kids straight out of school is how to write efficient, re-usable code without using object-oriented languages." (A small data-layout sketch follows after this post.)
    As I have closely followed the evolution of the system designs leading up to the launch of the M1, I can see that many of these principles have wisely been adhered to. The M1 laptop I just bought bore this out by being the most "boring" personal computer I have ever used. Boring in the sense that its responsiveness and smoothness make it so non-intrusive. For me, that is the ultimate benchmark.
     
    Carsten Thomsen
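    To make the cache-locality and OOP points above a bit more concrete, here is a small Python sketch of my own (nothing from Carsten's post; the Particle class, field names, and sizes are made-up assumptions). It contrasts an object-per-item layout with a flat, data-oriented layout; Python hides most of the memory details, but the structural idea - keep the data the hot loop touches contiguous - is the same one he describes.
    ```python
    # Small sketch (my own illustration): object-oriented layout vs. a flat,
    # data-oriented layout for the same computation.
    import array
    import time

    N = 1_000_000

    # Object-oriented layout: each particle is its own heap object, so a loop
    # over positions keeps chasing references and drags whole objects along.
    class Particle:
        __slots__ = ("x", "y", "vx", "vy")
        def __init__(self, x, y, vx, vy):
            self.x, self.y, self.vx, self.vy = x, y, vx, vy

    particles = [Particle(i * 0.5, i * 0.25, 1.0, 1.0) for i in range(N)]

    # Data-oriented layout: one contiguous array per field. The loop below only
    # needs x, so only x values stream through the cache.
    xs = array.array("d", (i * 0.5 for i in range(N)))

    t0 = time.perf_counter()
    total_oop = sum(p.x for p in particles)
    t1 = time.perf_counter()
    total_flat = sum(xs)
    t2 = time.perf_counter()

    print(f"object layout: {total_oop:.1f} in {t1 - t0:.3f} s")
    print(f"flat layout  : {total_flat:.1f} in {t2 - t1:.3f} s")
    ```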
  17. Funny
    Darkseth reacted to LAwLz in M1 Macs Reviewed   
    Hey everyone! Look at how bad Apple is! 
    Their quad core running at 10 watts can't even keep up with this 54 watt octa core! What a fail, am I right? 
     
    This is totally evidence of geekbench being a bad benchmark as well since clearly this other benchmark gives a different result (except they don't, I just don't know the difference between single and multi core scores)! Everyone knows that if two benchmarks give different results then only the one that shows the result I want is valid! 
  18. Agree
    Darkseth reacted to Spindel in Leaked MacBook Air GB5 benchmark shows score higher than 16-inch MacBook Pro; SC higher than 5950X   
    Willfully ignoring that the M1 still beats a lot of those CPUs in multi-core too (in both Geekbench and the Affinity benchmark). 
  19. Agree
    Darkseth reacted to RedRound2 in Leaked MacBook Air GB5 benchmark shows score higher than 16-inch MacBook Pro; SC higher than 5950X   
    I think most people are so hyper-focused on the performance that everyone forgot about the battery life gains. These MacBooks also last 15-20 hours on a charge.
     
    Should be really interesting. The LTT video covering the M1 chip was quite pathetic, tbh, so I hope he'll at least admit in his review that he underestimated the "iPad chip". Jonathan Morrison made a response video to it, which I hope @LinusTech will at the very least acknowledge.
     
    For those who don't know what I'm talking about
     
  20. Like
    Darkseth got a reaction from Aidanlockett1 in Is paying extra for more options on a macbook pro worth it?   
    Apple claimed 2.9x the performance in "Rise of the Tomb Raider" (probably native, not emulated) compared to the previous Intel iGPU.
    But the big problem: no more Windows (no Boot Camp, no VM), and no eGPU support.
    As long as a game is available for macOS, it should run.

    Also interesting, the Geekbench leak: https://www.macrumors.com/2020/11/11/m1-macbook-air-first-benchmark/
    I know it's "just" Geekbench, not the most realistic benchmark, but it matches or beats the 8-core/16-thread notebook chips from Intel AND AMD.
     
    I hope Cinebench comes soon.
  21. Agree
    Darkseth got a reaction from ne0tic in Leaked MacBook Air GB5 benchmark shows score higher than 16-inch MacBook Pro; SC higher than 5950X   
    They optimized R23 for the M1 chip.
     
    R20 will work, but probably only through emulation, so it's still a good test to see Rosetta performance.
    But for native results, R23 is here now, and it seems to scale better with multithreading.
  22. Informative
    Darkseth got a reaction from Aidanlockett1 in Is paying extra for more options on a macbook pro worth it?   
    Well, macOS (Big Sur) will be native.

    However, software like Adobe, Microsoft Office, Affinity Photo, and whatever else you use will run MUCH better if the developer has optimized it for the ARM architecture (instead of x86_64 as on the Intel chips).
    For software that isn't optimized yet, a translation layer called "Rosetta 2" is used, which comes with a performance hit. That said, it wouldn't be surprising if the new M1 chips are so much faster than the last generation that they run translated software better than the Intel models run it natively.
     
    For some things this is already the case, like graphics workloads (where the new chips can hit 5-6x the performance).

    Exact numbers will be tested next week, but yeah... you can pretty much expect more performance, less power consumption, more battery life, and a cheaper price.
    Oh yeah, and you can run iPad and iPhone apps natively too, which doesn't work on the Intel chips.
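    As a side note on Rosetta 2: here is a minimal sketch of my own (an illustration, not from the post) showing how you could check whether a process is running natively on Apple silicon or being translated. It relies on the sysctl.proc_translated flag that macOS exposes on Apple silicon Macs; the function name is just something I picked for the example.
    ```python
    # Minimal sketch (my own illustration): detect whether this process runs
    # natively on Apple silicon or under Rosetta 2 translation, by reading the
    # sysctl.proc_translated flag macOS exposes on Apple silicon machines.
    import platform
    import subprocess

    def rosetta_status() -> str:
        if platform.system() != "Darwin":
            return "not macOS"
        try:
            out = subprocess.run(
                ["sysctl", "-n", "sysctl.proc_translated"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
        except subprocess.CalledProcessError:
            # The key doesn't exist on Intel Macs.
            return "native (Intel Mac)"
        return "translated by Rosetta 2" if out == "1" else "native (Apple silicon)"

    if __name__ == "__main__":
        print(platform.machine(), "->", rosetta_status())
    ```
    An x86_64 Python build running under Rosetta would report "x86_64 -> translated by Rosetta 2", while an arm64 build reports native.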
  23. Agree
    Darkseth reacted to Fasauceome in what AMD gen is the laptop at now?   
    the 4000 series laptop CPUs are Zen 2, the same as the 3000 desktop CPUs
  24. Agree
    Darkseth got a reaction from Vitamanic in Laptop For Graphic Designer   
    Graphic design mostly leans towards a MacBook (maybe wait for the MacBook Pro 13" with Apple silicon, Nov. 17th? That would align well with Black Friday), because it has an excellent display and most of the industry works with Macs. Also, some applications are Mac-only (Sketch, for example).
     
    If Windows, don't bother with anything that doesn't offer a 100% sRGB panel (or at least close to 100%).
    There are many decent sub-800 notebooks, but they often come with 250-nit panels and an sRGB coverage of about 55%. Really not good for any graphics work.
     
    I personally wouldn't buy a 5-year-old laptop at that price. If a second-hand model is an option, a MacBook Pro 15" from 2014-2015 could also work.
    But
     
    In the Windows world, maybe <3-year-old HP EliteBook / ZBook models?
    Again, focus on ~100% sRGB coverage.
  25. Informative
    Darkseth reacted to jaslion in Laptop For Graphic Designer   
    80%+ with decent calibration is good enough, really. Most people will be looking at everything on far worse screens anyway. If it's for print it matters more, but really, you can easily get by with it.