
ConfusedJoaker

Member
  • Posts: 6
  • Joined

  • Last visited


ConfusedJoaker's Achievements

  1. I'm currently embarking on a full-stack project that (apparently) requires macOS. My question is: would it suffice to simply dual-boot Ubuntu on my Windows laptop, or do I really need a macOS machine (either through some painful Hackintoshing or borrowing a much slower dual-core MacBook)? Mods, please feel free to close this thread. Got more details: the requirement was because Docker apparently runs better on macOS than on Windows, and natively on Linux, so I'll be proceeding with Ubuntu (setup sketch below).
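
     For anyone landing here with the same question, a minimal sketch of a native Docker setup on Ubuntu, assuming the stock docker.io package in Ubuntu's repositories is recent enough for the project:

        # Install Docker from Ubuntu's own repos
        sudo apt update
        sudo apt install docker.io
        # Start the daemon now and on every boot
        sudo systemctl enable --now docker
        # Let your user run docker without sudo (takes effect after re-login)
        sudo usermod -aG docker $USER
        # Smoke test: pulls and runs a tiny container
        docker run hello-world
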
  2. Thanks for the really informative reply! Guess I'll try out 19.04 then. I chose 18.04 primarily because everyone online was saying the older 18.04 was more stable, but what you said kinda makes sense. As to this: "The first thing you should do is installing the Nvidia proprietary one through the driver manager right after installing your system. (which idk how you get it installed if you say it doesn't support nvme??)" This was more an issue with the storage mode of my SSD in the laptop, which by default was set to RST Premium. Apparently with 18.04 and older, only AHCI storage mode works (RST hides the NVMe drive from Linux), or the installer wouldn't even recognise my SSD. And: "Also, don't even try to use Nouveau (the open source driver installed by default)" Could you advise me on how I could go about installing the Nvidia proprietary driver on Ubuntu? Thanks! I've been scouring Google for hours and this is the first time I'm seeing this as a suggestion.
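
     For later readers: on Ubuntu the usual route is either the GUI (Software & Updates → Additional Drivers) or the ubuntu-drivers command-line tool. A minimal sketch, assuming a standard desktop install; the exact driver version offered depends on your GPU and Ubuntu release:

        # Show the drivers Ubuntu recommends for the detected hardware
        ubuntu-drivers devices
        # Install the recommended proprietary Nvidia driver
        sudo ubuntu-drivers autoinstall
        # ...or pin a specific version (the number here is illustrative)
        sudo apt install nvidia-driver-430
        sudo reboot
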
  3. Damn... any news on when the developers at Ubuntu will get around to fixing this? I'm seeing a crapton of threads online about how disastrous 18.04/19 is for users, but no end in sight for any of us. So is Mint better supported? I chose Ubuntu because (foolishly, with 18.04) it seemed like the most stable Linux distro out there. And as I type this I'm losing 1% battery per minute, fml.
  4. Hi all! I've decided to make the switch to Linux via an Ubuntu 18.04 dual-boot due to the nature of my development work (apparently Docker works better on Unix-based systems) and BOY, has it been pure HELL.

     For starters, here's my system: Acer Triton 500 (2019), i7-9750H six-core processor, 16 GB RAM, RTX 2060 (not that this matters). Sounds pretty damn sweet, right? Yeah, it runs Windows 10 perfectly, never breaking the 50°C mark at idle or when compiling my code.

     Then in came this horrendous monstrosity we call Ubuntu 18.04. I mean, OK, I get that it's pretty much open source with no hardware manufacturers' backing, but... no support for my NVMe SSD in RST mode? Non-intuitive undervolting and overclocking? Seriously? Back on Windows 10 everything is literally plug-and-play; here I had to wade deep into my laptop's BIOS to switch over to AHCI just to install.

     Now the main issue: the damn OS causes my laptop to overheat. Badly. Frequent jumps to 90°C while doing NOTHING, the whole keyboard deck feels like I'm typing on an iron, and the fans are constantly at half throttle. The battery has gone to trash too: on Windows 10 I easily get 6-7 hours, but on Ubuntu I'm lucky to even get 4.

     Can ANYONE please give me some advice on optimising this cranky-ass system? I can't even imagine compiling code and running two servers in conjunction while testing a full-stack project. If it already hits 90°C at IDLE, guess what's gonna happen then? LOL. (A common power-management starting point is sketched below.)
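
     A minimal sketch of the usual first-aid kit for heat and battery life on Linux laptops with an Nvidia dGPU: TLP for power management, powertop's tunables, and making sure the 2060 isn't powered on full-time. Package names are the stock Ubuntu ones; prime-select assumes the proprietary Nvidia driver is already installed:

        # Power-management daemon with sane laptop defaults
        sudo apt install tlp
        sudo tlp start
        # Apply powertop's suggested tunables (one-off; doesn't persist across reboots)
        sudo apt install powertop
        sudo powertop --auto-tune
        # With the proprietary driver in place, check which GPU is active...
        prime-select query
        # ...and fall back to the Intel iGPU when the 2060 isn't needed (re-login after)
        sudo prime-select intel
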
  5. Hi! Thanks for the reply. Funny you mentioned it, I just did a full reinstall of Windows 10 (preserving my data) and reinstalled the OEM drivers; scores remained the same as before the reinstall. I should point out, though, that the OS I'm using (Windows 10 Pro, grey-market key) is very buggy. Key macros and brightness controls stopped working, which I fortunately (for now) solved by reinstalling the OS and installing the oldest OEM Intel graphics drivers. I also had to turn off automatic driver updates, because for some weird reason Windows 10 kept updating my graphics drivers to the latest ones, which overwrote the OEM drivers and broke brightness control and its key macro.
  6. Hi guys, looking for some opinions here. I got myself a Sager NP8950 laptop last year. Its specs: 7700HQ, full GTX 1060 6 GB, 16 GB 2400 MHz RAM. It's a thin-and-light laptop, so temps definitely ran a little hot under load. On stock, with no undervolting or repasting, I was getting GPU temps of up to 87°C when playing Witcher 3 or basically any heavy game. Undervolting and repasting with liquid metal dropped temps to about 83°C, and a laptop cooler further improved them to about 78-80°C.

     I noticed, though, that fps remained roughly the same; repasting only improved fps by about 2-3 on average. I also saw occasional fps drops in games like D2, MHW, and Witcher 3, where fps would dip by 5-10 for no reason, sometimes even 10 fps drops in intense scenes. This is despite running settings that a 1060 should clearly be able to handle well at over 50 fps.

     Then it tanked completely in benchmarking. Fire Strike spat out a crappy 9497, while a search online found that similar laptops like the Razer Blade 14 running a 1060 scored an easy 10,500, with the highest percentile scoring over 11,000. What gives? The Fire Strike stress test was even more startling: I got a horrendous 94.7% stability score, which fails the test, despite GPU temps staying stable at 78°C and CPU temps staying cool at 51°C. I know there's a 'silicon lottery' with laptop GPUs too, but I can't possibly have gotten such a lousy draw, right? These scores hint that my 1060 is basically a bottom-of-the-barrel 1060.

     Running Cinebench didn't improve the mood either. My 7700HQ scored at the bottom of the barrel (again) with an average score of 701, well below the internet average, despite Task Manager showing a steady clock speed of 3.4 GHz throughout the entire run. What gives? Did Sager just decide to buy the crappiest CPU+GPU combo and stick it in my laptop? It can't be thermal throttling since the temps are decently low, right?