Search the Community
Showing results for tags 'arm'.
-
Hello LTT forum, I've been following Tomaž Zaman, a YouTuber who is planning to create a high-end router. His focus isn't solely on providing behind-the-scenes insights into the design and planning process, but also on everything involved in commercializing it: from picking components for the device to seeking investors and establishing a company. What sets this project apart is its commitment to transparency. Through his videos and posts, he intends to document every step of the process, from idea to execution. He's already uploaded several videos, ranging from the presentation of his prototype to announcing his interest in creating and selling the device. He's also discussed potential pricing based on the bill of materials he's already shared. Summary Personally, I don't know him beyond his channel, but I am very interested in this project, and I believe it deserves a little exposure for everyone to appreciate. By sharing this project with all of you, I hope to generate additional interest and foster a community that can contribute ideas and feedback. Sources His YouTube Channel: https://www.youtube.com/@tomazzaman
-
Summary On October 2nd, 2023, ARM disclosed three security vulnerabilities in the Mali GPU kernel driver. The vulnerabilities affect a significant number of devices, and there are indications that one may be under limited, targeted exploitation. Patches may not be available for all devices. Quotes On October 2nd, 2023, ARM disclosed three security vulnerabilities in their GPU kernel drivers. The vulnerabilities affect a wide range of ARM GPU drivers, spanning multiple architectures and versions. For specific information on affected drivers, please see ARM's specifications page. The vulnerabilities have been addressed in Android's latest security patches (2023-10-05 or higher). According to Google's Android Security Bulletin for October 2023, there are indications that one vulnerability (CVE-2023-4211) may be under limited, targeted exploitation. For more information regarding the vulnerabilities and security updates, check with your device manufacturer. Assuming, of course, that your device is still supported... Note: for Google Pixel owners, this vulnerability was addressed in the September security update (2023-09-01). My thoughts I'm not an expert in this area so I can't say this with confidence, but it seems like a LOT of devices are going to be affected by this, including devices that are no longer supported. Not only are Mali GPUs used by many manufacturers, the vulnerabilities also appear in many driver versions. Look at the Bifrost architecture drivers, for example: ARM says kernel driver versions r0p0 through r42p0 are affected, and going by ARM's developer page, that's every kernel driver before March 24th, 2023... and r0p0 was released in June 2016. Nearly seven years' worth of drivers are affected, and who knows how many phones can't update due to discontinued support? Also, my current phone is a Galaxy S9+, which uses ARM's Mali-G72 GPU, which is based on ARM's 2nd-generation Bifrost architecture, and Samsung doesn't offer updates for the S9 anymore. Time for a new phone... I guess? Sources ARM Security Bulletin https://developer.arm.com/Arm Security Center/Mali GPU Driver Vulnerabilities Android Security Bulletin - October 2023 https://source.android.com/docs/security/bulletin/2023-10-01 Ars Technica https://arstechnica.com/security/2023/10/vulnerable-arm-gpu-drivers-under-active-exploitation-patches-may-not-be-available/
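Since the fix status here comes down to a date comparison, a quick way to sanity-check a device is to read its security patch level (`adb shell getprop ro.build.version.security_patch` on a connected device) and compare it against the patched level from the bulletin. A minimal sketch in Python, assuming the 2023-10-05 patch level from the October bulletin (Pixels, as noted above, were patched earlier):

```python
from datetime import date

# Android security patch level that addresses the Mali vulnerabilities
# per the October 2023 Android Security Bulletin (Pixels: 2023-09-01).
PATCHED = date(2023, 10, 5)

def is_patched(patch_level: str) -> bool:
    """patch_level is the ro.build.version.security_patch string, e.g. '2023-09-01'."""
    year, month, day = map(int, patch_level.split("-"))
    return date(year, month, day) >= PATCHED

# A phone stuck on an older patch level:
print(is_patched("2023-09-01"))  # prints False
print(is_patched("2023-10-05"))  # prints True
```

This only checks the reported patch level string, of course; whether your manufacturer actually ships that level to your device is the real question.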
-
Intel, however, is not giving up on the partnership with Oracle and insists that the new line of Xeon Scalable offerings will be a strong player in the enterprise market. Quotes One thing is clear -- Intel can no longer expect its hyperscale client base to adjust their upgrade/migration cycles to the pace of its own R&D and manufacturing capacity delays. Times have changed. Source: https://www.crn.com/news/components-peripherals/oracle-bets-on-amd-ampere-cpus-ellison-says-intel-x86-architecture-is-reaching-its-limit-
- 2 replies
- Tagged with: larry ellison, x86 (and 2 more)
-
So I've been going down the RISC rabbit hole, both ARM and RISC-V, and one thing I am confused about and getting conflicting information on is the actual capabilities. My main question is: will ARM and/or RISC-V hardware out now continue to get faster (i.e. Linux on ARM, Pine64, etc.) mostly from software optimization or from hardware optimization?
-
Hi all, with all the talk about Arm, Nvidia using Arm, Apple using Arm chips, and Microsoft potentially making Arm versions of Windows hardware and software (again), plus the success of the Steam Deck: how long do you think it would be until there is an Arm-based Steam Deck? Following Apple's path with Rosetta 2 being able to run x86/x64 software on Arm, and given the Steam Deck already needs to use Proton for many Windows games to run on it, Valve is already working with emulating/virtualizing software. Another consideration is Arm being able to use AMD, Nvidia or Intel graphics for better gaming performance and compatibility with games that currently exist and work with those graphics cards, APUs or iGPUs. Arm, Intel, AMD and Nvidia are all members of UCIe (Universal Chiplet Interconnect Express). I understand that doesn't mean Valve could just take an Arm CPU and glue on graphics from Intel, AMD or Nvidia, but imagine a hypothetical device with an Arm CPU for high performance and low power draw, with a GPU glued on in a single package for much faster die-to-die communication and gaming performance. I'm talking about a device for playing games locally, not a "cloud" streaming device like the Logitech G Cloud. If it could run a Linux version to get away from Windows, IMO that would be better than having to deal with all the shit from Microsoft. Is this just wishful thinking...
-
Summary NOTE: This is a rumor, not confirmed by Apple. Everything here should be taken with a grain of salt. A new small Mac Pro with Apple Silicon will come out, and the redesigned iMac will have colors. The small Mac Pro looks like "3 to 4 Mac Minis stacked on top of each other" and apparently resembles the Power Mac G4 Cube. The iMac will have a redesign similar to an iPad Pro and the Pro Display XDR, and comes in "fun" colors. The I/O and its location are unknown. Quotes My thoughts As with all leaks, this one might not be accurate. Personally, I think this would be a pretty logical move for Apple. The small Mac Pro is interesting to me, since it would probably be cheaper than the current Mac Pro, and it seems like the same design idea as the 2013 Mac Pro: small and premium looking. With the iMac, I was genuinely surprised that Apple would add colors, but I guess it's targeted at a younger market. Anyway, any Apple news is exciting, since when Apple releases this machine we may finally see other companies get high-performance ARM processors into powerful laptops and desktops. Sources
-
Hi all, I recently got an HP Chromebook 11 G1 (Samsung Exynos ARMv7-based 32-bit CPU). I'm not sure what year it is from; I would say 2013 or earlier. Anyway, is there a way to replace ChromeOS with Linux on this? It's EoL, so I'm hoping to give it a new life, and all I can find is stuff using Crouton, which isn't really what I would like to do.
-
Hey folks! I had a quick look around for similar questions but couldn't find anything relevant. Looking for a good monitor desk mount for my Corsair Xeneon monitor. Something for a small space (not much swivel needed, and the screen doesn't need to get any closer either, see pic below). If there is a mount that can move upwards for easy access to ports, then all the better! Or maybe a mount where one can easily remove the screen without unscrewing anything (or at least not much, a bit like the Ergotron mounting mechanism). I've seen a few options, but I'd rather not faff around with returns, or worse, a broken/damaged monitor. Cheers!
-
Question as written in the title. Any info is appreciated; maybe someone works at Oracle and knows something.
- 3 replies
- Tagged with: virtualbox, arm (and 1 more)
-
Budget (including currency): sub-250 EUR for mainboard, CPU and RAM Country: Germany Games, programs or workloads that it will be used for: Steam cache, 2 Discord bots and a few small scripts Other details Main focus should be energy consumption and speed. Physical size or looks don't matter too much. Hello guys, I want to build a smallish storage and home automation server. It will have a 2TB SSD for storage and a 2.5Gbit network connection to my main PC, as well as a 1Gbit connection to the rest of my home network. Its main purpose will be as a Steam cache, as my internet connection has only 100Mbit/s download and no way to upgrade it for less than 6k EUR for a fiber connection. Other uses will mostly be some home automation stuff and 2 Discord bots, as that currently runs on a Xeon E5-2680, which draws a lot of power and heats my room up quite a bit. As an OS I'd prefer some Linux variant, as I already have some experience with it. My question is whether that will work at full speed (500MB/s) on ARM hardware, or if there would be drawbacks like speed constraints.
-
I recently built a PC for one of my friends and he has a small TV (30-32 inches I think) that he wants to mount above his two displays. Initially he was going to wall mount it but thinks a desk mount would work better. There are plenty of products like these on the market but the clamps are limited to 1-3 inches. His desk has a drawer beneath it and is about 6 inches thick. Does anyone know of a single monitor arm that would work for a desk this thick?
-
I'm searching for a new monitor and found this MSI one that really fits what I'm looking for, and I have this articulated monitor arm (ELG F80N). It specifies that the maximum weight is 6.5kg, but the MSI monitor (Optix AG321CR) weighs 6.59kg. Do you think these 90g are going to be a problem, or is it within the margin of what the arm can bear?
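For reference, the gap in question is tiny relative to the arm's rating; a quick back-of-the-envelope check (weights taken from the post above):

```python
arm_rated_max_kg = 6.5   # ELG F80N's specified maximum load
monitor_kg = 6.59        # MSI Optix AG321CR's listed weight

overload_kg = monitor_kg - arm_rated_max_kg
overload_pct = overload_kg / arm_rated_max_kg * 100
print(f"{overload_kg * 1000:.0f} g over the rating ({overload_pct:.1f}%)")
# prints: 90 g over the rating (1.4%)
```

Whether that 1.4% falls within the manufacturer's safety margin is something only ELG can answer, but it gives a sense of scale.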
-
Geekbench scores are up! Not only did the MacBook Air shred the 2019 MacBook Pro with the 9980HK, but it somehow also beat the ARM MacBook Pro with the cooling fan to "push it to the limit". Intriguing... Linus, looks like your models are arriving late. Want me to run some tests for you? https://browser.geekbench.com/v5/cpu/compare/4652718?baseline=4651916
-
Apple Silicon seems to have ruffled a lot of feathers and caused a bit of a circle jerk on Twitter. I'll start by conceding that Apple's presentation had some over-the-top claims. They do this all the time, and while it may be annoying to hardcore computer enthusiasts, I think a lot of people are getting angry and pointing to those claims instead of evaluating the processors on their own merit. I like Apple products (mostly for development) and wanted to give some reasons why, even if you are not an Apple fan, these processors should make you excited.

A major OS is transitioning to ARM. x86 is not a perfect architecture and has its own list of issues. Apple's transition will force a lot of users onto an ARM-based platform, and we will see a lot of applications start supporting ARM to meet this demand. Windows on ARM already exists but doesn't have much support from major application providers. I think the power benefits will force many other companies to look to ARM-based processors in order to compete in the long term.

Complete stack integration. Whatever you think of Apple and their design practices, they have just shed the last major constraint on their products. They now have free rein to craft their computers in their image. I think we will begin to see a crazy amount of tight integration between hardware -> OS -> application, and streamlining that can't be imagined unless you control the entire stack. I have a feeling the Apple "ecosystem" might be getting some snazzy new features that other companies will have to compete with.

Power consumption standard. I see a lot of people arguing over the raw performance and benchmarks of the M1, but I think it's very important to consider performance per watt. I don't think anyone can doubt that the power-to-performance ratio of these chips is astounding. As an engineer, I am almost shocked. I think this will be another motivating factor for more companies to transition to ARM.

When looking at these new computers, I am suggesting that you consider what this means for the industry as a whole. Apple, one of the most dominant tech companies (even though they have been severely lacking in performance for a while), just put some cards down that the rest of the industry will have to match.
-
Summary Reviews just dropped: the M1 Mac lineup of MacBook Air, MacBook Pro and Mac Mini reviewed. From AnandTech, where they say the M1 in the Mac Mini behaves like a 20-24W TDP chip at peak loads. CPU performance: GPU performance: Quotes My thoughts Damn. This is super impressive stuff. I was skeptical about the non-native ARM performance, but either Rosetta 2 is magic or these M1 chips are absolute monsters. I'm glad that the 10-hour+ laptop is back to being a thing too! Edit: GPU performance is equally interesting. It looks like Rosetta 2 performance is just below the Nvidia 1650 and above the AMD 560X. Edit 2: I'm also personally disappointed to see that each of these new Macs only supports 2 displays total, so only 1 external display for the Air and Pro. Sources https://www.theverge.com/21569603/apple-macbook-air-m1-review-price-specs-features-arm-silicon https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested
-
Please tell me no; I'm tired of Apple taking over in everything. I dislike Apple, mainly because of how the people that use their products behave. Many girls I know did this stupid challenge video where they put down a finger if they wanna date the guy described, kind of. There's just a voice saying "He is cute but he has bad teeth", "He is cute but he is short", AND "He is cute but he has an Android". I watched like 50 videos before one got their finger down about the Android guy. That's the kind of person I'm talking about, and I kind of developed a hate for Apple. I know I shouldn't, but idk, I can't really control it; I just kind of hate them. But anyway, please tell me AMD and Intel have something to show Apple; I don't want MacBooks to take over everywhere. In my country they are still rare, but with how many people use iPhones here, that's just because it hasn't really become a trend yet. No way they can just beat two companies with all the experience and years behind them, right? Right..?
-
Who will Win Windows on ARM?? QCOM, NVDA, AMD, INTEL???
Surfski posted a topic in General Discussion
General Discussion and poll for us nerds... Who will win Windows on ARM? QCOM, NVDA, AMD, INTL? I think we all agree that in 5 years x86 systems will be like AS/400s: they will be around and supported, but legacy. Windows on ARM will take over laptops, desktops and probably even next-gen server farms. The writing is on the wall and the benefits are clear. So who will be the new "Intel"? Obvious choices are below. Each has some advantages and disadvantages: 1) Qualcomm 2) Nvidia 3) AMD 4) Intel
-
- 1 reply
- Tagged with: arm, windows on arm (and 3 more)
-
I think we all agree that in 5 years x86 systems will be like AS/400s: they will be around and supported, but legacy. Windows on ARM will take over laptops, desktops and probably even next-gen server farms. The writing is on the wall and the benefits are clear. So who will be the new "Intel"? Obvious choices are below. Each has some advantages and disadvantages: 1) Qualcomm 2) Nvidia 3) AMD 4) Intel LINUS - Do a video on this please.
- 1 reply
- Tagged with: windows on arm, arm (and 4 more)
-
Hi everyone, apparently Huawei has shown this new thing they will use in PCs in China. What makes it very interesting is that it's an 8-core high-performance ARM CPU with PCIe, and the stock config of the machine comes with a Radeon card! Now, it would be cool if Linus and the group could try to obtain one and run some benchmarks on it, maybe with a high-end graphics card? The only concern currently is that I am not sure Windows 10 is even compatible with it...
-
(This is definitely a bit long, but please bear with me till the end; jump to the question in bigger font near the end of the post if you don't want to read everything.)

What inspired me to create this post are some thoughts that occurred to me recently about Nvidia's and AMD's latest GPU releases. Stock shortages aside, we can't deny that computer parts, especially graphics cards, have become increasingly power-hungry, due to Dennard scaling (related to, but not quite the same as, Moore's Law) breaking down around 2006, stemming from current leakage caused by quantum effects (e.g. quantum tunneling) at ever-shrinking nodes. One of the ways this problem has been combated so far is through the use of alternatives to x86-64, the biggest perhaps being the ARM architecture family. This has actually been pretty successful already: a quick glance at https://gs.statcounter.com/os-market-share tells us that when looking at the desktop AND mobile OS market as a whole (despite a few obvious problems with that approach, such as the difference in nature of desktop vs mobile devices), Android has already surpassed Windows in market share.

Given that electricity does not come cheap for some, especially those in developing countries, not to mention the increased environmental impact that increasingly power-hungry GPUs and even CPUs will have (see https://www.tomsguide.com/news/ps5-vs-xbox-series-x-with-great-power-comes-greater-electric-bills), this trend of desktop parts needing greater and greater power draw is pretty worrying (at least to me). Furthermore, I'm sure that no one wants to have to get an 800+ or even 1000+ watt power supply just to make sure that their computer doesn't randomly shut down in the middle of a gaming session, and even then still end up tripping their breaker (which isn't all that implausible given how common 10-amp breakers on 120V outlets are, especially in apartments, at least in the U.S.). And while "technically" a driver update can solve that issue, depending on the TDP of the GPU/CPU itself it could drastically reduce performance (one need look no further than rumors about how RTX 3070+ GPUs in laptops will have up to a 40% performance deficit due to power constraints, especially with the Max-Q variants).

Furthermore, Apple just demonstrated this past year that there's lots to be gained in both performance and power efficiency by switching over to a custom ARM architecture (although by how much is still disputable, as Apple throttled the Intel CPUs they were putting into their Mac Minis and laptops pretty hard). ARM also has at least a reputation of being much more efficient than x86-64, especially with its widespread use in high-performance smartphones such as the latest Samsung and iPhone flagships. But the deeper I dove into the debate of x86-64 vs ARM efficiency, the more confused I got. For example, this webcodr.io article (https://webcodr.io/2020/11/ryzen-vs-apple-silicon-and-why-zen-3-is-not-so-bad-as-you-may-think/) and the following post on this forum (at least the OP - https://linustechtips.com/topic/1214401-apple-and-arm-a-quasi-insiders-thoughts/?tab=comments#comment-13758213) both emphasized that ARM is more efficient than x86-64. However, here are another three articles/posts (including another post on this forum) that emphasize the specific microarchitectural design of the chips themselves, rather than simply ARM vs x86-64, when it comes to efficiency, and even outright state that beyond a certain wattage limit both x86-64 and ARM exhibit very similar levels of efficiency (even the webcodr.io article somewhat acknowledges this as well):

1. https://community.arm.com/developer/ip-products/processors/b/processors-ip-blog/posts/the-final-isa-showdown-is-arm-x86-or-mips-intrinsically-more-power-efficient
2. https://www.extremetech.com/mobile/312076-what-kind-of-performance-should-we-expect-from-arm-based-macs
3. https://linustechtips.com/topic/1157141-how-come-pcs-use-so-little-power-compared-to-other-machines/?tab=comments#comment-13310148

So, as the title states, my question really is: when it comes to maximizing performance per watt, is it much more about optimizing every layer of your "ecosystem", all the way from the hardware microarchitecture to the APIs, system applications, and even user applications themselves (kind of like how Apple has always done it, especially with iPhones), OR does using ARM in general truly have a performance-per-watt advantage over x86-64 which can be capitalized on without sacrificing too much performance?

P.S. Honestly, as a computer science student looking to do web development but also looking to graduate with a specialization in computer systems (e.g. networking, computer architecture, etc.), I'm not sure I will like the answer either way. Microsoft has seemingly been dragging their feet on Windows on ARM (which is not the same as Windows 10X; Windows 10X is more for competing against ChromeOS than anything else), and if it isn't an x86-64 vs ARM issue, then I would absolutely hate having to learn all of the quirks and features of a dozen-plus different microarchitectures for anything I develop (especially if I decide to switch to systems programming) just to get an idea of how I would go about optimizing the user experience for my program (e.g. "oh crap, I forgot that ARM Cortex XYZ can only support 2GB of RAM, better utilize storage more", or "oh no, these custom ARM instructions can't carry over to x86-64, so I'd better use a whole 'nother library", etc.). I know that modern compilers and interpreters have made this much less of an issue, but I also don't want to go to the other end of the spectrum and spend the rest of my life coding nothing but iOS apps; plus I see web dev job ads all the time calling for "full-stack" developers or knowledge of a whole slew of programming languages/ecosystems, like knowing development for both iOS and Android.

Anyway, hope that wasn't too long.
- 20 replies
-
Summary Here are two HUGE new things Arm wants to do from 2025 onwards: • Arm will end TLAs (technology license agreements) with SoC vendors and go straight to OEMs, i.e. Sony will pay for the Arm license instead of Qualcomm • Arm will ban custom GPUs, custom NPUs, and custom ISPs if the SoC uses stock cores, i.e. no more Samsung's Xclipse RDNA GPUs/AI Engine, Google's Tensor NPU/ISP, MediaTek's APU, HiSilicon's Da Vinci NPU, Unisoc's VDSP, ... if stock Arm CPU cores are used. Arm is essentially doing what regulators feared an Nvidia-owned Arm would do. Quotes My thoughts I understand that Arm doesn't make much money, unlike their customers, but this is such a bad move; it feels like threats. I feel like, having lost that Nvidia money, SoftBank just wants to squeeze as much money as it can from Arm customers. Supposedly, Nvidia already has a 20-year agreement for a special use case, so they are not impacted, but everyone else is: Samsung, Qualcomm, MediaTek, Google, etc. Sources https://www.semianalysis.com/p/arm-changes-business-model-oem-partners
-
I know a lot of people look at the SPX and compare it to Windows RT. I'm not gonna lie, RT was garbage. Like, comparable to a Celeron at best. Here's the deal though: my Nintendo Switch is functioning as a desktop right now. Does all the shit. It games (and better games than waiting around for AAA companies to port to Linux or dancing around with naff software on a naff chip), it's stupidly low power, makes no heat even cranking a game on a 4K screen, and on top of it all you have a way cooler Surface Pro x86 replacement. You know what would be cooler than that though? Taking all that effort that is now built into every ARM device, going to Microsoft, and being like: hey, I wanna slap this mofo in a dock like the AYN Odin dock and rock dual 1440p screens, or wire it up so the SPX is the computer, but you can use it as a drawing tablet or as a mousing surface. IDGAF about the stupid Mac Studio; why are you ignoring the mouse-pad computer form factor I have been slowly falling in love with? Like, look, I get it. Marketing is cute, as are 'intended use cases', but it's the same deal as Linus not understanding that he is a solo personality now, not the company personality, and as such we the audience want to hear Linus, Alex, a b c d e f g opinions on whatever, from design to concept of use to a daily driver report. I want you guys to set Alex and whoever else wants to join in loose on tech stuff and for them to use whatever they want for whatever reason. I want to see y'all do that super tek cooler, but bring that energy to everything. So, SPX: what can it do, as a Microsoft ARM device at that? Don't look at it as 'wah, I can't run Steam', because that's A) lame, B) stupid, and C) makes me as an ARM user leave your video pissed. Y'all can meme about it, but to just call it trash and bin it immediately? I could do that with the lab or a product y'all come out with, and it wouldn't do anyone any service either. Like, legit, get the hell over yourselves. Grow up.
-
Arm CPUs are taking over. Apple Silicon showed us that desktop computers need not be power hogs - Why haven't AMD, Intel, and Nvidia done the same, and would you want it?
-
I've never taken any EE classes or any computer architecture classes, so I'm coming from a newbie, beginner perspective. All I'm getting from the internet is mainly "ARM is less power-hungry and used for mobile devices, while x86 is Intel/AMD." I'm not even sure if this can be ELI5'd, but what are some of the technical differences that differentiate ARM from x86 and make each suit different devices?
-
Hi, I'm new here, but I've been trying to find a TV mount/arm to mount my 24" TV to the back of a mechanic's rolling cart.