
Rankhole

Member
  • Posts

    34
  • Joined

  • Last visited

Everything posted by Rankhole

  1. I decided to get a used X1 Carbon 4th Gen. Specs: i7 6600U, 16GB RAM, 256GB SSD. I think I'll definitely be happy, since the price was 549 euros total :)! On a side note... do any of you know where to get a fitting case for this specific X1 Carbon 4th Gen? I could only find generic cases which probably won't fit ideally...
  2. I just checked and the E595 also has an IPS display. It's just that it only has 250 cd/m² brightness, as well as being a matte display, which makes it look less colorful.
  3. I literally just saw a similar one. I've found one for 600 euros, with the best i7 (6600) as well as 16GB RAM and a 256GB SSD. That looks like a damn good deal! Do you know if there are any significant differences between the X1 4th Gen and the latest gen? Other than performance, I mean?
  4. Hi there, I'm currently looking for a new laptop, and I'm kind of torn. On the one hand, I was looking at the MacBook Pro 13", because it has some of the best build quality - while at the same time sucking because of the shitty keyboard and being extremely overpriced for the specs (duh, Apple...). On the other hand, I was looking at some Lenovo laptops that caught my interest. One of them was a used, refurbished X1 Carbon 4th Gen on Amazon. With this one, I thought I'd get "X1 Carbon" levels of build quality and light weight. However, it's actually still more expensive, even though it's way worse in terms of specs AND refurbished. The other one is the relatively new Lenovo ThinkPad E595, which has just insane specs for the price: 16GB of RAM and a Ryzen 7 3700U processor which, from what I've seen, seems to perform about as well as comparable 8th gen Intel i5/i7 processors, at least in multicore results. On top of that, it comes with a 512GB SSD... pretty damn impressive for the price of 760 euros (including taxes here in Germany). The thing is... I'd theoretically like the MacBook Pro OR the newest X1 Carbon the most. However, the X1 Carbon actually ends up costing me way more than the cheapest MacBook Pro, since they don't even offer "entry level specs" with that machine. That's why I was looking at refurbished (or possibly even used) X1 Carbons. These two would offer me the best build quality... However, they obviously get crushed in price-to-performance, and honestly even in raw performance, by the new Ryzen E595. This one, however, weighs way more, about 2.1kg. Which is alright for a 15.6" laptop, but still... What do you guys recommend here? I'll be using it mainly for university, where I do software development. Also, I'd obviously be glad for other recommendations that I may have missed! Best regards, Rankhole
  5. I want to go to 240Hz because I really, really want as much responsiveness as possible. In competitive games where I can easily reach that refresh rate, I'd benefit from it. I've also now looked at the HP Omen X 27, which seems to be a 27 inch, 1440p, 240Hz monitor. @SolarNova do you think the X 27 would be as good as the X 25f in terms of the criteria you mentioned?
  6. I've checked that review site for other monitors, and it claims that my current monitor is "great" in terms of motion blur, which is absolutely not true. It's complete garbage because of ghosting. They mention this and say that "there is noticeable under- and overshoot", but still give it a 9.3/10? I don't know about that. Is the Omen X 25f really that good? Also, even the 2540 got a better rating in the gaming category than the X 25f.
  7. Thanks for the opinions guys! What about the lack of G-Sync support though?
  8. Hi there, I am currently looking for a new 240Hz display. What's really important for me: at least 1080p at 240Hz with *minimal* ghosting. I currently have a 144Hz MSI Optix MAG271CQR and am not quite as happy as I should be. It's a 1440p panel with 144Hz, but it somehow feels unsmooth because of the amount of ghosting that's going on. Now, there are two monitors that might interest me, for different reasons: One is the BenQ XL2546, which is a 1080p 24" 240Hz display with the - supposedly great - DyAc anti-motion-blur, which would be great for CS:GO and other games in general, as I'd love to have flicker-free blur reduction that doesn't cut brightness. Then there is the BenQ XL2740, which is a 1080p 27" 240Hz display. This one lacks the DyAc technology, but is 27" (which is great, since I'd be rocking a dual monitor setup with my current, also 27", monitor), and also has G-Sync, which the other monitor lacks. So... I'm really in a dilemma here, and I'd like to get a couple of other opinions from you guys. Which one is the better buy? In terms of price, the XL2546 retails for around 490 euros in Germany, while I can get the XL2740 for 520 euros. So the price isn't the issue here. What's the better "have": DyAc technology, or 27" with G-Sync? Best regards, Rankhole
  9. Yeah, well, that would be the ideal scenario. If that's not possible, I want to know (2): whether I can at least have both of them plugged into the system at all times and just manually select which GPU to prefer in the Nvidia per-application profile settings. This is my "main" goal: at least being able to run CS with the GTX 580, while being able to run other games with my main GPU.
  10. Unfortunately, the i7 6800k doesn't have an iGPU. Also, the reason I'd rather use the power of the 1080 is obviously framerate. While I'm not planning to play at too high a resolution, the 580 is still going to perform worse in terms of FPS than my GTX 1080. Plus, this would allow me to play any game I want on my CRT, even modern titles, without having to worry that the GTX 580 might explode.
  11. Hello there! I have found myself in a bit of a weird spot, as there is not much specific information about something like this. I have a GTX 1080, which I'd normally use for everything. However, I'm going to switch to a CRT monitor pretty soon, and I'd rather not go through digital-to-analog conversion but output analog straight away. You might know, though, that starting with the 10 series GPUs, Nvidia dropped DVI-I support, meaning those cards can no longer directly output analog (shame!). The CRT monitor obviously only works over VGA. I also have a GTX 580 lying around, which does happen to have DVI-I. This means that if I were to only use the GTX 580, it could drive the CRT with no input-lag-adding adapter needed. Two main questions: (1) I want to know if I can somehow use my GTX 1080 in-game to actually perform the rendering, and then somehow direct that output to my second GPU, so it can, without additional delay, transmit in analog to my CRT. (2) If not, can I at least keep the GTX 580 running alongside my GTX 1080, plug the CRT into the GTX 580, and use it as the GPU for only certain games (like CS:GO), while being able to play any other game with my GTX 1080 on my main monitor? The idea is: the Nvidia Control Panel does allow you to "select a GPU" for certain applications, so I could set up a manual application profile for CS:GO and select my GTX 580 there (a small sanity-check sketch for listing both GPUs is appended after this post list). Is that somehow possible? I'm on Windows 10, and for completeness' sake, here is my rig: i7 6800k, MSI X99A SLI Plus, 24GB 3000MHz DDR4 RAM, MSI GTX 1080, Gainward GTX 580 (not plugged in currently). I'd be interested in *any* insight you guys might have! Maybe this could spark an interesting discussion, or maybe even make it into an LTT vid? :) Best regards, Rankhole
  12. I don't want to do that; I'm happy with the way it is now. I'm just reporting this to get other people to try it out and see whether it's a rare bug, or actually a "feature" and me just being too dumb to find Bixby.
  13. Hello there, first I want to say that I live in Germany. Important, since the update might not be available straight away in every country. For a couple of weeks now, it has been known that you can deactivate the Bixby button's mapping to open Bixby Home. However, lots of people were saying that holding the button would still trigger Bixby listening to your voice, and would still be annoying. I can confirm this; I tried it, and holding the button would still trigger the voice assistant. Less annoying, but still annoying. Today, an update rolled in for Bixby. I installed it on my S8 and, when I went into Bixby Home's settings, I saw that right on the first page there was an option to disable the Bixby button's functionality. I do not remember exactly what it said, but in a nutshell it described that Bixby would no longer open via the button once the setting was enabled. Now, here is the fun part: I am literally unable to even FIND or OPEN Bixby at this point. I can press the button as much as I want, it doesn't open (which is good, it's supposed to be like that), but Bixby Home has kind of "disappeared". When I go into Galaxy Apps, it shows up as installed, but I can't open it and it's grayed out. Bixby Vision still works, and I was able to access its features and settings via the camera app (not via Galaxy Apps, though). Even if I search for Bixby on my phone, in the settings or in Galaxy Apps, I cannot find it anywhere anymore. Bixby is officially dead for me... which I'm honestly kind of glad about, but the bug with it completely disappearing shouldn't be a thing. Can anyone else confirm this by testing it out? (I can't replicate it since I don't have Bixby Home anymore.) Best regards, Rankhole
  14. It is. Some games only utilize 2 cores. Turning down my clock speed to 2GHz, for example, will obviously result in lower FPS, since then there is a severe bottleneck. That goes the other way around as well: when I up it to 4.2GHz, I get better performance. IF the games were to use all cores, then of course what you say makes sense. But they don't.
  15. Certain games only use, let's say, 20% of my CPU. Now SpeedStep decides that the CPU therefore does not need to run at full speed. THAT means the game runs slower, since with a higher frequency per core it would run faster. This is why I want to have 4.2GHz whenever I'm under any real load.
  16. Hey guys, I have an i7 6800k overclocked to 4.2GHz. I have Intel SpeedStep enabled, since I do not want my processor to run at full speed even though I'm just idling. Turbo Boost is also enabled. Now my question: when I play games, my CPU never reaches 4.2GHz, but only around 3.8-4GHz, since games definitely do not load a six-core CPU enough. How can I make it so that whenever CPU usage reaches a certain minimum (e.g. 20%), the CPU automatically goes up to the full 4.2GHz? In other words, SpeedStep should stay enabled, but even at fairly low loads it should go up to 4.2GHz (a rough script sketch for this is appended after this post list). Best regards, Rankhole!
  17. I achieved a max overclock of 3.7GHz. It was running at that speed while I was getting the bottleneck results.
  18. But it's not like the CPU would have to work "less" for the game at higher resolutions? How come the CPU would not be a bottleneck anymore at 4K? I don't quite understand that aspect. I thought having a much better CPU would always give me higher and more stable framerates?
  19. I think I would not be able to wait a whole 3 months just for a "potential" 10% increase - also, keep in mind that my GPU is bottlenecked hard right now: 60-80% usage in Mafia 3, 70-90% in Overwatch. Also, since I play CS:GO, getting that faster CPU as soon as possible would be better for me. What do you think about the following rig: NZXT S340 Elite case, i7 6700k boxed (paired with my current Noctua NH-D14), MSI Z170 M7 mainboard, G.Skill Ripjaws 16GB DDR4-3200MHz CL14 RAM. This is what I can afford this month. Next month would be a Kraken X62, just for the looks. What do you guys think? Best regards, Rankhole.
  20. Alright guys, I just did the same with Overwatch. Even with that game, I got only about 70-90% GPU usage at best. Would you recommend the i7 6700k or the i5 6600k? Best regards and thanks for the answers!
  21. I am playing at 1080p on purpose. I'm a 144Hz gamer and prefer FPS > resolution, so the 1080 was still the better card for me.
  22. Okay, just for clarification: if I were to buy, let's say, an i7 6700k and overclock it, would that mean the GPU usage would "always" be 100%, yielding optimal performance?
  23. Hello LTT members. I got my MSI Gaming X 1080 today and instantly went to GTA V. I am using the Redux mod as well. Now, the FPS was surprisingly low in GTA V. It was always around 35-50 FPS, NO matter what settings I picked. I would play on Ultra with 8x MSAA (8x reflection MSAA as well), and it would yield (almost) the same result as playing on High with 2x MSAA. Now, here is what I observed in Afterburner: http://prntscr.com/d357v1 This is the graph of GPU usage while playing. As you can see, for the most part it's not even close to 100%, sitting at around 50-60% most of the time. At its lowest GPU usage points, I get the frame drops to 30-40. Now: does this mean my CPU is bottlenecking my graphics card? Or is it some kind of weird problem with the card itself? (A GPU-usage sampling sketch is appended after this post list.) Hope you guys can help! Best regards, Rankhole!
  24. It does run at 16x. Btw, do you think it is worth upgrading to an i5 6600k? I cannot overclock this CPU (lost the silicon lottery). I feel like my mainboard is causing so much trouble, I might as well upgrade... what do you think? Thanks for the help!
  25. Put it in the second slot. Now it works like a charm... Again, I don't even think I needed to do that. Probably just unplugging it and putting it back in the same slot as before would have fixed it. I don't know why it did that though, very weird.
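Sketch for post 11: before relying on a per-game profile in the Nvidia Control Panel, it can help to confirm that the driver actually enumerates both cards while they are installed together. This is a minimal, illustrative Python sketch, not a tested recipe for this rig: it only assumes that the driver's nvidia-smi tool is on the PATH, and it says nothing about whether a GTX 580 and a GTX 1080 can share a single driver installation.

```python
# Minimal sanity check (illustrative only): list the GPUs the Nvidia driver
# reports, so both the GTX 1080 and the GTX 580 can be confirmed visible
# before assigning per-application profiles. Assumes nvidia-smi is on PATH.
import subprocess


def list_nvidia_gpus() -> list[str]:
    """Return one 'index, name' line per GPU known to the driver."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,name", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return [line.strip() for line in result.stdout.splitlines() if line.strip()]


if __name__ == "__main__":
    for gpu in list_nvidia_gpus():
        print(gpu)
```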
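Sketch for post 16: one way to get the "ramp to full clocks as soon as there is any real load" behaviour without disabling SpeedStep is to flip the active power plan's minimum processor state from a small script. This is a rough Windows-only sketch under stated assumptions, not a tested solution: it assumes the third-party psutil package is installed, that powercfg runs with administrator rights, and the 20% threshold and the 5%/100% minimum states are just example values.

```python
# Rough sketch: poll CPU load and raise the Windows power plan's minimum
# processor state so the cores ramp to full speed under light game loads,
# while still letting SpeedStep downclock at idle. Requires the third-party
# psutil package and admin rights for powercfg. Values are illustrative.
import subprocess

import psutil

LOAD_THRESHOLD = 20  # percent CPU usage that should count as "in a load"
POLL_SECONDS = 2     # sampling interval


def set_min_processor_state(percent: int) -> None:
    """Set the active power plan's minimum processor state (on AC power)."""
    subprocess.run(
        ["powercfg", "/setacvalueindex", "scheme_current",
         "sub_processor", "PROCTHROTTLEMIN", str(percent)],
        check=True,
    )
    # Re-apply the current scheme so the new value takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)


def main() -> None:
    boosted = False
    while True:
        load = psutil.cpu_percent(interval=POLL_SECONDS)
        if load >= LOAD_THRESHOLD and not boosted:
            set_min_processor_state(100)  # pin cores at the full multiplier
            boosted = True
        elif load < LOAD_THRESHOLD and boosted:
            set_min_processor_state(5)    # let SpeedStep downclock again
            boosted = False


if __name__ == "__main__":
    main()
```

The same two powercfg commands can also be run by hand first, to check whether pinning the minimum state actually makes the chip hold 4.2GHz in-game before bothering with a script.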
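Sketch for post 23: the Afterburner observation (GPU usage sitting around 50-60% while the framerate drops) can also be reproduced from the command line by sampling utilization with nvidia-smi while the game runs. A quick, illustrative Python sketch follows; it assumes nvidia-smi is on the PATH and that GPU 0 is the card in question, and the one-minute window and 95% "fully loaded" threshold are arbitrary choices.

```python
# Quick-and-dirty bottleneck check: sample GPU utilization once per second
# while a game is running and report how often the card sits well below
# full load. Persistently low GPU usage at low resolution usually points
# at a CPU limit rather than a faulty card. Assumes nvidia-smi is on PATH.
import subprocess
import time

SAMPLES = 60         # about one minute of in-game sampling
BUSY_THRESHOLD = 95  # GPU% below this counts as "not fully loaded"


def gpu_utilization(gpu_index: int = 0) -> int:
    """Return the current utilization of the given GPU in percent."""
    result = subprocess.run(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(result.stdout.strip().splitlines()[0])


def main() -> None:
    readings = []
    for _ in range(SAMPLES):
        readings.append(gpu_utilization())
        time.sleep(1)
    low = sum(1 for r in readings if r < BUSY_THRESHOLD)
    avg = sum(readings) / len(readings)
    print(f"GPU usage min/avg/max: {min(readings)}%/{avg:.0f}%/{max(readings)}%")
    print(f"{low}/{SAMPLES} samples below {BUSY_THRESHOLD}%")


if __name__ == "__main__":
    main()
```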