GR8-Ride

Member
  • Posts

    49
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About GR8-Ride

  • Birthday June 22

Profile Information

  • Gender
    Male
  • Location
    Toronto, Canada

Recent Profile Visitors

615 profile views
  1. It's iStat Menus 6. I just installed it on my M1 MBP, and it can show the current wattage draw in the macOS menu bar.
  2. He's asking for a 13-14" display though, while the X270 is only 12.5". The X1 Carbon or the T470s both meet his requirements, though both are out of the desired price range. Patrick
  3. I want to like Windows Phone....but the overall ecosystem is just too small. Might not stop me from trying it, however....just not sure if I can commit to Windows Phone as my sole platform. My big problem with Samsung is that they effectively abandon their phone models within a year of releasing them. So updates from KitKat to Lollipop to Marshmallow to Nougat are basically non-existent (either carrier-dependent, or so far behind that you're either vulnerable security-wise, or simply missing compelling new OS features). It's not like the hardware is incapable of running 5.x or 6.x or 7.x....just that Samsung virtually abandons the phone after 12 months (or sooner). At least with the iPhone, you know it will get updates for the next 2-4 years. In theory, Windows Phone has been doing the same. Google's Nexus phones are good this way, but again, you lose some common Android features (i.e., expandable memory via micro-SDXC cards, for one). I have an iPad Air...that's my standard bed / couch surfing device...works pretty well for that. Haven't found an Android tablet that works as well as the iPad overall, though I had a Surface Pro 4 that was pretty good. Wasn't great for ebooks, but for surfing and YouTube it worked at least as well as the iPad (although a bit heavier). I've gone back to Windows 10 laptops as well, though I still have a MacBook Air for work. Personally, I'll probably switch to a Windows 10 laptop whenever my HW refresh date comes through for work. It's not that OS X has let me down in any way....just that Windows 10 has become very, very good, compared to the Win XP / Win 7 / 8 experiences of old. I was always a pretty big Apple fan, but actually typing all of this out, I have come to realize that I seem to be moving away from Apple. And honestly, I'm okay with that. Patrick
  4. Can't think of much worse damage than a hole in the side of the block, but yes, exactly true. I know a lot of guys who used to run big-block V8s (396, 427, 454) with long strokes and try to run them out to 7500 RPM (or even 6500 RPM)....and ended up blowing chunks of con-rods, crankshafts and sides of the block all over the drag strip or road course. It's a lot of mass to move, stop and reverse in a short period (time and space). You need really, really, really good con-rods and bearings. Not to mention banged valves and bent push-rods because parts were just moving too fast for the rest of the engine. Speaking of the S2000, I saw a few of them on the road just yesterday. I thought most of them were toast by now....too bad Honda built it so underpowered....otherwise a great little car. Patrick
  5. Well, not exactly that simple, but yes, more cylinders generally does = more torque (and more parasitic drag as well). The larger the bore, the greater the ability for the engine to breathe, as well as the ability to use larger valves (again, better breathing....thus more power). The longer the stroke, the greater the mechanical advantage for the rod pushing down on the crankshaft (in theory, more torque). However, like all things when it comes to building an engine, there are tradeoffs. A longer stroke means greater parasitic losses (ring drag) as well, and at some point the combustion is no longer producing significant force on the piston / rod / crankshaft. So there is a limit to how long the stroke can (or should) be. For street use, a square engine (bore roughly equal to stroke) is generally the ideal, though with modern forced induction systems that changes the design criteria quite a bit. For a race engine, I'd probably take a bigger bore over a longer stroke any day (bigger bore + shorter stroke will likely give me a higher and broader RPM range to operate in; see the displacement / piston-speed sketch after this list). Patrick
  6. I had an S7 Edge for a while that I liked quite a bit...and was potentially going to use that as my replacement for my (now) 2 year old iPhone 6 Plus, but I thought I'd hold out for the Note 7 instead (I like the pen, and I wanted USB-C for some reason). Obviously the Note 7 is a no-go at the moment, and because I have high-end in-ear headphones (Klipsch X11i), I'm staying away from the iPhone 7 Plus. Just not a fan of having a dongle to plug in.... I might pick up another S7 Edge and toss my iPhone 6 Plus, or I might just keep my 6 Plus until it actually dies on me (never done that before!). I'm seriously contemplating trying out a Windows phone as a change (950 XL dual-SIM or HP Elite X3). At the end of the day, email, text, maps and web browsing are what I use my phone for 95% of the time. Patrick
  7. They're guessing that it has a slightly lower TDP, which may or may not be accurate (it DOES have fewer transistors, but also a much, much higher clock speed). It is, however, a smaller die, which means higher temperatures, even if they both generate the same amount of heat. This is especially true if the die size is significantly smaller, which I suspect it is, given the process shrink (28nm to 16nm) and the drop in transistor count from 5.2B to 4.4B (GTX 970M/980M vs GTX 1060 laptop; see the power-density sketch after this list). Patrick
  8. The big question is, what exactly IS the bottleneck for an external GPU. The logical answer IS the TB3 interface, but at 40 Gbps, it should have more than enough bandwidth for a full PCIe 3.0 x4 connection (which TB3 provides); see the bandwidth sketch after this list. And based on some of the testing I've seen, running a M/B at PCIe 3.0 x4 (vs x16) doesn't seem to have a huge impact on graphics performance (at least, not to the tune of 25%, anyway). Obviously some latency will be introduced, as running across 18 inches of copper wire is different from running across millimeters of silicon circuitry, so that would have to account for some of it. Perhaps the length of the cable and the requirement for PCIe retimers is what's impacting the performance (if I had to guess, it's the presence of retimers that would impact it, but we're getting beyond my knowledge level on that one). There's no translation element required, as TB3 carries PCIe 3.0 natively, so it's just a protocol dropped onto the wire. So processing overhead should be minimal, if any. As to what I'll do next....I think I'll wait for PCIe 4.0 and TB4 to get released (2017 and beyond, which is not really that far away....). I suspect TB4 will be a move to fully optical interconnects, or an optical / electrical hybrid (electrical for power, optical for data transfer). The move to optical should eliminate any bandwidth constraints (maybe even allow TB4 to move beyond 4 PCIe lanes to 8 or 16). I like the concept of what Asus is doing with their Surface Pro 4 competitor with a TB3 interface and the ability to run the XG2 external graphics dock. It will be hamstrung by the dual-core, ULV CPU in it though. Personally, I'd be okay if Intel came out with a quad-core, non Turbo Boost mobile CPU (i.e., a 6700HQ that ran at 2.6 GHz ONLY), and stuffed that into a Surface Pro 4 form factor (with TB3). Could even go as big as 14 or 15 inches for it, as 12.5 was a little small for my tastes, to allow for a bigger battery and better cooling. I find myself using laptops less and less, and my tablets more and more. Even when my laptop is docked, I use an external keyboard / mouse / monitor. I rarely use my laptops as a pure laptop (except when travelling). If I'm couch surfing, I grab a tablet (the SP4 beats the iPad Air 2 for this as well, though the iPad Pro 12.9 is pretty good). Anyhow, I digress. Patrick
  9. Note, I did say "could gain full access"....not that it would necessarily work. Trust me, I tried it....and was never able to get it to work. Just running Windows 10 via UEFI vs Bootcamp / CSM came with enough hassles (function keys didn't work right, etc.), never mind spending hours trying to get the Intel iGPU to work. And I'm generally pretty good with some of this stuff....I gave up, and went back to Win 10 on Bootcamp. And then I just bailed, as I figured spending all of that effort just to run an optimized version of Windows on a Mac was pointless, so I switched to the SP4 and sold my rMBP instead. I got 7-8 hours of battery life out of my SP4, vs 2-3 out of my rMBP in Windows.
  10. This is the biggest question here. Thus far, Apple has not made drivers available for Bootcamp which allow Windows 10 to switch from NVidia (or AMD) discrete GPUs to the on-board Intel Iris graphics. The biggest reason for this is that Apple builds in their own "BIOS abstraction layer" on top of UEFI, called CSM (Compatibility Support Module). CSM effectively blocks the Intel integrated graphics and shuttles all GPU commands to the dGPU. So even installing Optimus or AMD's Dynamic Switchable Graphics won't matter, because to Windows 10, the Intel Iris graphics don't exist. Now if you install Windows 10 directly over UEFI and bypass Bootcamp / CSM, then you could gain full access to the hardware layer, and thus be able to install Optimus / AMD DSG, and have integrated graphics support instead. On my own late 2014 rMBP, I was never able to get it to switch to the Intel GPU; instead it ran constantly on the NVidia GT750M, which sucked power and drained the battery rapidly. Patrick
  11. I don't have a desktop rig to compare it with, but I'm well aware that there is a performance hit running over TB3. In my case, I can run Crysis 3 and Doom at 4K @ 60+ FPS. So any further increase in performance is moot (and I'm switching from my 4K monitor to a 34" curved ultrawide 3440 x 1440, so performance will be even less of an issue). What I wanted was a portable that I could game on, which 'morphed' into a desktop setup when I got home, and still be able to keep all of my main files / data in one place. When I look at what I spent on the Blade + Core + GTX 1080, I probably could have done better building a gaming rig for home. But desktop rigs aren't really my thing anymore, despite the fact that they're still the best choice for a pure gaming machine. And a desktop rig doesn't sit on my lap very well when I'm in the family room watching the game. Patrick
  12. Honestly, the notion of buying a MacBook Pro, paying the Apple tax, and then running Windows 10 on it seems pointless at best. And trust me, I've worked with some very smart engineers in the past who have done just that....picked up a retina MacBook Pro, and then run Windows 7 as their sole OS on it. I get the industrial design element....I've had several Macs in my life and still do. But a fully loaded rMBP is over $4K CAD and a top-notch Windows 10 laptop from Dell or Lenovo will easily run $1,000 less. I ran Windows 10 in Bootcamp on my rMBP (late 2014 model, fully loaded with the GT750M), and battery life in Windows was horrendous. Battery life wasn't even great in OS X, but that's because I run Pathfinder as my default desktop / Finder app, which forced the dGPU to run instead of the integrated graphics. Look, I'm generally an Apple fan, but quite honestly, the Apple tax is right up there, and unless there is a strongly compelling reason to go Apple (and there are some...), you can get a better deal on various Windows machines. In Apple's favour, the support network is fantastic. I smoked a power supply in Bangkok, Thailand on a business trip, and I walked into an authorized Apple service center and walked out with a brand new power supply 15 minutes later. Completely free of charge. That was 10+ years ago, and their service and support has been good to me ever since. Also, they generally just work, but I will say, Windows 10 has been remarkably stable for me as well. I honestly think of OS X and Windows 10 as a wash when it comes to stability these days. The only other real advantage for Apple is the *nix sub-system. If you REALLY need a Unix-like subsystem, then Apple is the way to go. That carried some "geek cred" a few years back, but these days it's not worth much. Especially since it's so easy to pick up a Windows machine and dual-boot into Ubuntu or some other flavour of Linux. As to the Surface Book, I'd stay away from it. Great concept....the execution is lousy. I'd go with the Surface Pro 4 instead. In tablet mode, you lose all of the ports, and most of the battery life, on the Surface Book. And when it's connected to the keyboard base, the screen is so wobbly that pen and touch input become a jiggly nightmare (try it for yourself at any MS store). The SP4, on the other hand, has the kickstand, which makes it incredibly stable for touch / pen based input. Patrick
  13. Hang on, we've had this argument already. Time for you to go read a physics book. Power = heat....it's a linear relationship (mathematically). Heat is a measurement of energy, measured in Joules, and the relationship is 1 Joule = 1 Watt × 1 second (a Watt is a Joule per second). As wattage goes up, so does the energy dissipated (ergo, heat); as wattage goes down, so does the heat. Temperature, on the other hand, is a measure of the average kinetic energy of the molecules in a substance. You are correct, however, that thermal density matters: a smaller object (Pascal GPU) reaches a higher TEMPERATURE than a larger object (Maxwell GPU), even if the same amount of HEAT is being generated by both. Pascal has fewer molecules in it than does Maxwell, therefore the same HEAT results in different TEMPERATURES. Pascal is also physically smaller (again, fewer molecules), so it has both less surface area to conduct heat away from it, as well as less mass to store heat (see the heat-vs-temperature sketch after this list). I get the point you're trying to make in that power consumption isn't a linear relationship to TEMPERATURE, but if you're going to tell people to read a physics book, you should understand the difference between HEAT and TEMPERATURE yourself. @Jayvin As to the Razer Blade, it's in the nature of physics that a thin and light gaming laptop is going to run hot, loud, or both. The 2016 Blade seems to be somewhat better than the 2015 Blade in terms of thermal throttling, however it's not exempt from it. I can make my 2016 Blade throttle when playing games on the internal 970M and running it bone stock (either Balanced or Performance power profiles in Windows 10). When it throttles it never drops below base clock (2.6 GHz) rates on the CPU, and I haven't seen the GPU throttle yet at all (I run mine in closed, clamshell mode connected to a 4K monitor, playing at 1080p). I can avoid throttling by disabling Turbo Boost and just running the CPU at its base clock rate of 2.6 GHz. No performance issues with Doom on the internal 970M, but it's also not pushing the CPU very hard to begin with. Again, I've never been able to get my 970M to throttle. I also run a Razer Core with a GTX 1080 in it, and it never throttles at all under that configuration. I was actually trying the other night to see how far I could push it (2200 MHz boost clock and 11010 MHz memory clock, successfully), and whether I could get the system to throttle running the external GPU (a big part of why the Blade throttles is the shared heat-pipe between CPU and GPU, in addition to it simply being thin). After several hours of running Heaven at 4K and various 3DMark benchmarks to find my maximum GTX 1080 overclock, I was never able to get my CPU to throttle. It does get into the low to mid 90s, however (peak was 93C). My external GTX 1080 barely gets into the 70s C. As a desktop card, it's pretty darn impressive. Just for giggles, I threw a cheap ($20) laptop cooler underneath it (again, still in clamshell mode), and my CPU temps then peaked at 87C running that same battery of tests. I actually ran the laptop cooler tests (with and without, several times) just to validate that it was actually doing something....over my career I've thought of most laptop coolers as being completely useless. As to it being loud, I don't notice it at all. But fan noise is highly subjective, and in my setup, it's all in a closed cabinet in my desk with monitor and keyboard up top. So fan noise isn't an issue for me. So honestly, yes, the Blade does run hot with the 970M in it.
To Dackzy's point, I can only imagine that the GTX 1060 version of the Blade would likely run hotter, given that it's physically a smaller die. It's the first laptop I've ever owned on which I've demonstrated that a laptop cooler actually made a difference. My general view has been that laptop coolers are snake oil. On the Blade it makes a difference, which is not a good testament to the Blade's cooling system. Look, the 2016 Blade isn't a bad laptop compared to other thin and light models; I happen to love mine. But I'm not a college kid, and money isn't an issue for me. With the Blade, you are paying for the looks and the all-aluminum construction. From a pure performance / dollar equation, there are better deals available (MSI GS63VR or Asus GL502VS). If you're not gaming much on the go and have a desktop rig at home, then go with something like the Asus UX501 that @don_svetlio mentioned. It's actually the laptop that I'm suggesting my wife should look at. Patrick
  14. Ohm's law still applies. V = I*R and P = V*I. Nowhere in either of those calculations is there a variable for frequency. For Fermi, the GTX 580M was a 40nm process, with a TDP of 100W as well (similar to the GTX 970M and likely similar to the 1060, though I've heard as low as 80W for that). Just like the GTX 485M prior to that, the frequency was higher, but power consumption remained the same. As the process shrinks you require less voltage to manipulate the logic gates, and due to shorter interconnects, you draw less current for each circuit. You *can* require more voltage as frequency goes up IF the logic gates are unable to respond at a higher clock rate with a lower voltage (raising the voltage can mean faster logic gate response, and thus support for higher clock rates). However, IF the logic gates respond at a higher frequency and DO NOT require greater voltage levels, then power consumption remains the same, or drops (a smaller process means the gates switch less capacitance and require less voltage to operate). Remember, there was a massive change in architecture (and a slight bump in frequency) between Fermi and Kepler. From 384 pipelines in Fermi to 1344 pipelines in Kepler, and from 1.95B transistors to 3.54B transistors (and up to 5.2B transistors in Maxwell). Energy usage is based upon work, not architecture (P = W/t). You can have one core running at 2 GHz, or 2 cores running at 1 GHz each. As long as they're both able to get the same work done in the same time, the power consumption will be identical (see the CMOS power sketch after this list). CMOS gates that are not changing state consume essentially no power (leakage aside). Remember, YOU guys are the ones who are saying that Pascal is nothing more than a superclocked Maxwell. If a 1060 were clocked to a 970M clock rate, then the 1060 would consume significantly less power than the 970M (due to lower transistor count and smaller process size). Now due to the smaller die size, if it consumes equivalent power to Maxwell, then the thermal density is higher and you require more effective cooling. Your IBM fridge computer is irrelevant. We're comparing CPU / GPU die shrinks and frequency increases....a mainframe or mini of old had far different systems running to draw significant power (multi-disk arrays, vacuum tube logic circuits, cooling systems for mechanical elements, etc.). Nowhere in any of our discussions have we been comparing the overall power draw of the entire system (NICs, MB, RAM, HDD, SSDs, USB et al). Patrick
  15. I used to run StopTechs on my E36 racecar....they were always pretty reliable brakes for me. I've been thinking about a set for my M4, but I haven't found the stock brakes to be too bad thus far. I think the stock rotors for the M4 might be a little too thin for a long lifespan. Patrick
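A rough numbers sketch for post 5 (bore vs. stroke). The bore and stroke figures below are invented purely for illustration; the point is that an oversquare and an undersquare layout can land at roughly the same 2.0 L displacement, while the shorter stroke keeps mean piston speed (and thus rod and bearing stress) lower at the same RPM, which is part of why a big-bore, short-stroke engine can rev higher. A "square" engine is simply bore roughly equal to stroke.

```python
import math

def displacement_cc(bore_mm, stroke_mm, cylinders):
    # Swept volume: bore area x stroke x number of cylinders (mm^3 -> cc)
    return math.pi / 4 * bore_mm**2 * stroke_mm * cylinders / 1000.0

def mean_piston_speed_m_s(stroke_mm, rpm):
    # The piston travels two strokes per crank revolution
    return 2 * (stroke_mm / 1000.0) * rpm / 60.0

# Two hypothetical 2.0 L four-cylinders: big bore / short stroke vs. small bore / long stroke
layouts = {"oversquare": (92.0, 75.2), "undersquare": (81.0, 97.0)}

for name, (bore, stroke) in layouts.items():
    cc = displacement_cc(bore, stroke, 4)
    mps = mean_piston_speed_m_s(stroke, 7500)
    print(f"{name}: {cc:.0f} cc, mean piston speed at 7500 rpm = {mps:.1f} m/s")
```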
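A back-of-the-envelope sketch for post 7 (die size vs. temperature). The die areas are approximate public figures for GM204 and GP106, and the wattages are the ballpark numbers from the post, so treat the output as illustrative only: roughly comparable heat spread over about half the area means roughly double the power density.

```python
def power_density(tdp_w, die_area_mm2):
    # Same heat over a smaller die = higher power density = higher temperature
    return tdp_w / die_area_mm2

# Ballpark figures only: GM204 (970M/980M) ~398 mm^2, GP106 (1060) ~200 mm^2
for name, tdp_w, area_mm2 in [("GTX 970M / GM204", 100, 398),
                              ("GTX 1060 laptop / GP106", 80, 200)]:
    print(f"{name}: {power_density(tdp_w, area_mm2):.2f} W/mm^2")
```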
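A quick sanity check for post 8 (TB3 vs. PCIe bandwidth). This ignores Thunderbolt protocol and DisplayPort overhead, which in practice leaves less than the full 40 Gbps for PCIe traffic, but it shows why a PCIe 3.0 x4 tunnel fits inside the link while still being roughly a quarter of an x16 slot.

```python
def pcie3_gb_per_s(lanes):
    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, converted to GB/s
    return lanes * 8e9 * (128 / 130) / 8 / 1e9

tb3_gb_per_s = 40e9 / 8 / 1e9  # raw 40 Gbps Thunderbolt 3 link, overhead ignored

print(f"TB3 raw link : {tb3_gb_per_s:.2f} GB/s")
print(f"PCIe 3.0 x4  : {pcie3_gb_per_s(4):.2f} GB/s")
print(f"PCIe 3.0 x16 : {pcie3_gb_per_s(16):.2f} GB/s")
```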
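A minimal sketch for post 13 (heat vs. temperature): energy is power multiplied by time (1 W sustained for 1 s is 1 J), and the same energy dumped into a smaller mass produces a larger temperature rise. The die masses below are made up purely to illustrate the point, and real dies are attached to heatsinks, so this is not a thermal model of either GPU.

```python
def energy_joules(power_w, seconds):
    # 1 W sustained for 1 s delivers 1 J of energy (1 W = 1 J/s)
    return power_w * seconds

def temp_rise_c(energy_j, mass_g, specific_heat_j_per_g_c):
    # Same heat into a smaller mass -> larger temperature rise
    return energy_j / (mass_g * specific_heat_j_per_g_c)

q = energy_joules(100, 1.0)   # 100 W dissipated for one second = 100 J
c_silicon = 0.7               # approx. specific heat of silicon, J/(g*C)

for name, grams in [("larger die (2 g)", 2.0), ("smaller die (1 g)", 1.0)]:
    print(f"{name}: +{temp_rise_c(q, grams, c_silicon):.0f} C per 100 J, ignoring cooling")
```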
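A minimal sketch for post 14 (clocks, voltage and power), using the usual CMOS dynamic-power relation, P roughly proportional to a*C*V^2*f. Every constant below is hypothetical; the point is that two cores at half the clock burn about the same dynamic power as one core at full clock, and that a shrink which lowers both C and V can allow a higher clock at lower power.

```python
def dynamic_power_w(activity, capacitance_f, voltage_v, freq_hz):
    # Classic CMOS dynamic power: P ~ a * C * V^2 * f
    return activity * capacitance_f * voltage_v**2 * freq_hz

# All constants below are hypothetical, for illustration only
one_core_2ghz  = dynamic_power_w(0.2, 1.0e-9, 1.00, 2.0e9)
two_cores_1ghz = 2 * dynamic_power_w(0.2, 1.0e-9, 1.00, 1.0e9)
shrunk_2_6ghz  = dynamic_power_w(0.2, 0.6e-9, 0.85, 2.6e9)  # lower C and V after a shrink

print(f"1 core  @ 2.0 GHz : {one_core_2ghz:.2f} W")
print(f"2 cores @ 1.0 GHz : {two_cores_1ghz:.2f} W")
print(f"shrunk  @ 2.6 GHz : {shrunk_2_6ghz:.2f} W")
```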