
New RISC-V-based CPU destroys M1 efficiency with up to 18x better performance per watt

A new RISC-V CPU has been found to destroy Apple's new M1 chip in Performance Per Watt tests.

 

This is mainly attributable to the insanely low power draw of this new RISC-V CPU. At 3GHz it uses 69 milliwatts (0.069 watts), and at 4.25GHz it uses 200 milliwatts (0.2 watts).

 

 

Quote

Micro Magic Inc.—a small electronic design firm in Sunnyvale, California—has produced a prototype CPU that is several times more efficient than world-leading competitors, while retaining reasonable raw performance.

We first noticed Micro Magic's claims earlier this week, when EE Times reported on the company's new prototype CPU, which appears to be the fastest RISC-V CPU in the world. Micro Magic adviser Andy Huang claimed the CPU could produce 13,000 CoreMarks (more on that later) at 5GHz and 1.1V while also putting out 11,000 CoreMarks at 4.25GHz—the latter all while consuming only 200mW. Huang demonstrated the CPU—running on an Odroid board—to EE Times at 4.327GHz/0.8V and 5.19GHz/1.1V.

Later the same week, Micro Magic announced the same CPU could produce over 8,000 CoreMarks at 3GHz while consuming only 69mW of power.

 

Quote

All of this sounds very exciting—Micro Magic's new prototype is delivering solid smartphone-grade performance at a fraction of the power budget, using an instruction set that Linux already runs natively on. Better yet, the company itself isn't an unknown.

Micro Magic was originally founded in 1995 and was purchased by Juniper Networks for $260 million. In 2004, it was reborn under its original name by the original founders—Mark Santoro and Lee Tavrow, who originally worked at Sun and led the team that developed the 300MHz SPARC microprocessor.

Micro Magic intends to offer its new RISC-V design to customers using an IP licensing model. The simplicity of the design—RISC-V requires roughly one-tenth the opcodes that modern ARM architecture does—further simplifies manufacturing concerns, since RISC-V CPU designs can be built in shuttle runs, sharing space on a wafer with other designs.

 

 

As you might imagine, running at a higher frequency destroys the efficiency gains, and the optimal frequency for this CPU is stated to be 3GHz.
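A quick back-of-the-envelope check of that, using only the figures quoted above (a minimal sketch; no M1 numbers are assumed here):

```python
# Performance per watt from the figures quoted in the article above:
# ~8,000 CoreMarks at 3GHz/69mW and 11,000 CoreMarks at 4.25GHz/200mW.
# No power figure was published for the 13,000-CoreMark 5GHz point.

operating_points = {
    "3.00 GHz": (8_000, 0.069),   # (CoreMark score, watts)
    "4.25 GHz": (11_000, 0.200),
}

for freq, (score, watts) in operating_points.items():
    print(f"{freq}: {score / watts:,.0f} CoreMarks per watt")

# 3.00 GHz: ~116,000 CoreMarks/W
# 4.25 GHz:  ~55,000 CoreMarks/W -- less than half the efficiency
```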

 

 

Images are from Ars Technica's article:

[Image: CoreMark efficiency per watt comparison chart]

[Image: CoreMark performance comparison chart]

 

This is pretty awesome and I hope we continue to get great new RISC-V-based designs. RISC-V, for those who don't know, is an open-source CPU instruction set architecture that serves as an alternative to ARM. Until now we've mainly seen low-power chips, but I also hope we get to see higher-performance, desktop-grade RISC-V chips in the future.

 

Sources

https://arstechnica.com/gadgets/2020/12/new-risc-v-cpu-claims-recordbreaking-performance-per-watt/


Keep in mind CoreMark is made for 8-bit microcontrollers, not desktop-class CPUs, so the scores are, to put it lightly, misformed.

 

 

Edit:

 

For example, an Arduino Mega, which runs at 16MHz, gets a score of 7. It uses a CPU design that is over 10 years old and is 976 times weaker than a Snapdragon 820. To be at the same level of performance, assuming perfect scaling, it would need to run at over 15GHz.

 

The newer Arduino WiFi still runs at 16MHz but is 6 years newer and has a score of 20, making it only 341 times slower. So it would need to run at about 5.45GHz to be at the same level as an 820.
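A rough sanity check of those scaling numbers (a minimal sketch; the ~6,800-CoreMark figure for the Snapdragon 820 is inferred from the 976x ratio quoted above, not an official result):

```python
# Back-of-the-envelope CoreMark scaling, assuming performance scales
# perfectly linearly with clock speed (the same simplification as above).

def required_clock_ghz(score: float, clock_mhz: float, target_score: float) -> float:
    """Clock speed (GHz) needed to reach target_score, assuming linear scaling."""
    return (target_score / score) * clock_mhz / 1000

# Snapdragon 820 score implied by the "976x weaker" comparison above.
sd820_score = 7 * 976  # ~6,832 CoreMarks

print(required_clock_ghz(7, 16, sd820_score))   # Arduino Mega -> ~15.6 GHz
print(required_clock_ghz(20, 16, sd820_score))  # Arduino WiFi -> ~5.5 GHz
```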

 

Both are still based on pretty ancient designs.

 

Keep in mind that these are NOT at all made for performance; they handle the basic tasks they need to do and that's it.

 

Also keep in mind the 820 was not at all made to be an 8-bit microcontroller. There are ARM-based chips specialized for this, and they easily outscore the Snapdragon whilst running slower.

 

This RISC-V chip seems to be purpose-built for these instructions, so of course it's gonna do a lot better, since that's its purpose, but it really is in line with what you would expect from a high-powered microcontroller when someone spends time developing one specifically for it.

 

Also keep in mind that the default workload of the benchmark is only 2,000 bytes, so it could easily have run entirely out of cache, again giving an unfair advantage over systems that first have to load the data from somewhere else, since this RISC-V chip currently seems to be built basically like a microcontroller.
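For context, a rough comparison of that working set against a typical L1 data cache (the 32 KiB figure is a common ballpark, not a published spec for this chip):

```python
# CoreMark's default working set is 2,000 bytes, as noted above.
# Compare it against a ballpark 32 KiB L1 data cache to see why the whole
# benchmark can stay cache-resident and rarely touch main memory.

working_set = 2_000            # bytes
typical_l1d = 32 * 1024        # bytes (assumed ballpark, not this chip's spec)

print(f"Working set fills {working_set / typical_l1d:.1%} of a 32 KiB L1D")
# -> about 6.1%, so main-memory performance barely factors into the score
```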

 

Either way though, if this can be translated to desktop 32-bit and 64-bit applications then it would be pretty epic, but as it stands now it's a souped-up microcontroller at best.

 

Quote

At 3GHz it uses 69 milliwats or 0.69 watts and at 4.25GHz this CPU uses 200 milliwatts or 0.2 watts

So it uses less power when running at boost frequencies. I see, I see. :P

6 minutes ago, Windows7ge said:

So it uses less power when running at boost frequencies. I see, I see. :P

I thought the OP had gotten the wattage mixed up.

23 minutes ago, whm1974 said:

I thought the OP had gotten the wattage mixed up.

 

30 minutes ago, Windows7ge said:

So it uses less power when running at boost frequencies. I see, I see. :P

 

 

Thanks, I fixed it.

 

69 milliwatts at 3GHz and 200 milliwatts at 4.25GHz. Funnily enough, the power figure for 5GHz wasn't given.


If it offered the same performance, I wonder how it would compare then; potentially it would still win, but yeah.


I’m waiting for someone who knows the chip better to explain more about whether this is a specialized chip and what’s different about it.

 

I’ll say about the benchmarks that the M1 isn’t just a CPU so its power measure includes other components like the RAM, GPU, neural engine, etc. A fair comparison would adjust for those differences.


I have a feeling that RISC-V will go the same route ARM did, meaning it would premiere in more tightly integrated packages; stuff that needs lots of battery life out of a small battery. I'd see it maybe going into smart glasses, then working its way up to smartwatches.

 

I don’t think RISC-V will be on desktop for a while, ARM is still going to be the alternative for the time being, but at least there’s movement in this space. I wonder what’s gonna be next after RISC-V...

3 minutes ago, NotTheFirstDaniel said:

I have a feeling that RISC-V will go the same route ARM did, meaning it would premiere in more tightly integrated packages; stuff that needs lots of battery life out of a small battery. I'd see it maybe going into smart glasses, then working its way up to smartwatches.

 

I don’t think RISC-V will be on desktop for a while, ARM is still going to be the alternative for the time being, but at least there’s movement in this space. I wonder what’s gonna be next after RISC-V...

How many people actually own Smart Glasses and Smart Watches? I do think that SiFive has a good idea with releasing a RISC-V mITX board with their most recent CPU.

17 minutes ago, Jet_ski said:

I’m waiting for someone who knows the chip better to explain more about whether this is a specialized chip and what’s different about it.

 

I’ll say about the benchmarks that the M1 isn’t just a CPU so its power measure includes other components like the RAM, GPU, neural engine, etc. A fair comparison would adjust for those differences.

RISC-V is a CPU Instruction Set Architecture (ISA) like x86, ARM, and MIPS. I won't bore you with the technical details, but essentially it's an engineering specification of how you talk to the CPU. It was originally developed at UC Berkeley, and what's so good about RISC-V isn't just that its design principles are super well thought out and efficient, but that it's entirely free and open source. x86 and ARM are licensed by companies, and using their architecture requires that you pay the companies that own the design. Apple has to pay licensing fees to Arm, for instance.

 

RISC-V has been a big deal in engineering circles for a while because of how much promise it has. This news is a big deal for it, since more and more companies are becoming interested now that the design has matured (and is effectively frozen). The problem, of course, is that you need an operating system to support it, and of the big three only the Linux kernel actually does. The real advantage is that RISC-V is free, remember, so there's a lot of incentive for someone like Apple to use it. Of course, they weren't about to bet the future of their company on it, but I could totally see them not wanting to pay fees to another company for their own processors.

10 minutes ago, whm1974 said:

How many people actually own Smart Glasses and Smart Watches? I do think that SiFive has a good idea with releasing a RISC-V mITX board with their most recent CPU.

With the former, people are just waiting to rip off Samsung's and Apple's take on the idea, and sales of smartwatches have been increasing YoY since the original Apple Watches and FitBits.

 

For RISC-V to be popular in the mainstream, Apple has to transition to it. I'm sorry if that sounds fanboyish, but look at ARM. No one took it seriously on the desktop/laptop until the M1 came out. Plus, you're not going to get app support until Apple does it. Imagine telling all these software vendors that now you have to port to both ARM and RISC-V, they'll just pick the one that's more popular, which would be ARM.


This is going to be amazing for things like smartwatches and smart glasses; stuff that only has a very small battery.

5 minutes ago, NotTheFirstDaniel said:

With the former, people are just waiting to rip off Samsung's and Apple's take on the idea, and sales of smartwatches have been increasing YoY since the original Apple Watches and FitBits.

 

For RISC-V to be popular in the mainstream, Apple has to transition to it. I'm sorry if that sounds fanboyish, but look at ARM. No one took it seriously on the desktop/laptop until the M1 came out. Plus, you're not going to get app support until Apple does it. Imagine telling all these software vendors that now you have to port to both ARM and RISC-V, they'll just pick the one that's more popular, which would be ARM.

RISC-V is already supported by BSD and Linux Operating Systems along with several others.

Just now, whm1974 said:

RISC-V is already supported by BSD and Linux Operating Systems along with several others.

The Raspberry Pi was out and Linux supported ARM for how many years before WoA and Apple Silicon, yet they failed to gather any mainstream attention outside of niche enthusiast groups.

 

Until Apple does for RISC-V what they did for ARM, RISC-V on the desktop won't be as popular as ARM/x86, unless Microsoft is actually competent this time around.

21 minutes ago, NotTheFirstDaniel said:

The Raspberry Pi was out and Linux supported ARM for how many years before WoA and Apple Silicon, yet they failed to gather any mainstream attention outside of niche enthusiast groups.

 

Until Apple does for RISC-V what they did for ARM, RISC-V on the desktop won't be as popular as ARM/x86, unless Microsoft is actually competent this time around.

The Raspberry Pi is a great deal more popular than you think. It went far beyond educating schoolchildren about programming and electronics.

12 minutes ago, whm1974 said:

According to Wikipedia about ~30 Million of these things have been sold.

https://en.wikipedia.org/wiki/Raspberry_Pi#Sales

Please stay on topic. :)


Promising news, but we're still probably about 10 years from RISC-V being a real thing.

2 hours ago, jaslion said:

Keep in mind CoreMark is made for 8-bit microcontrollers, not desktop-class CPUs, so the scores are, to put it lightly, misformed.

Misformed? These benchmarks are entirely fricking useless, unless one specifically wants to compare 8-bit MCUs with no FPU, MMU, vectorized instructions, DDR RAM or practically any modern features! Sure, advertising the enormous efficiency and high clock speeds will look good in headlines and will fool people into thinking this is something far better or more meaningful than it actually is -- this thread itself contains multiple examples of people not understanding the actual implications -- but...no, just no; CoreMark results are NOT even remotely indicative of desktop-like workload performance.

2 hours ago, Jet_ski said:

I’ll say about the benchmarks that the M1 isn’t just a CPU so its power measure includes other components like the RAM, GPU, neural engine, etc. A fair comparison would adjust for those differences.

Yeah, sure, it's not just a CPU, but that benchmark is purely for the CPU.

4 minutes ago, WereCatf said:

Misformed? These benchmarks are entirely fricking useless, unless one specifically wants to compare 8-bit MCUs with no FPU, MMU, vectorized instructions, DDR RAM or practically any modern features! Sure, advertising the enormous efficiency and high clock speeds will look good in headlines and will fool people into thinking this is something far better or more meaningful than it actually is -- this thread itself contains multiple examples of people not understanding the actual implications -- but...no, just no; CoreMark results are NOT even remotely indicative of desktop-like workload performance.

I wonder when we will see a sort of Raspberry Pi replacement using a RISC-V SoC? It would need an iGPU, however.


But experts on internet forums told me "risc is dead" years ago... 

 

 

28 minutes ago, Mark Kaine said:

But experts on internet forums told me "risc is dead" years ago... 

 

 

Apple themselves already killed off their own RISC processors once when PowerPC went away in the mid-2000s. RISC processors have been around for decades; there were even Sun and other workstations using them back in the day, and yet the computing world moved to CISC (specifically x86/AMD64) hardware for a reason. The only real benefit of a RISC processor is power efficiency for tailored workloads.

