
New RISC-V based CPU destroys M1 Efficiency with Up To 18x better performance per watt

AlTech

A new RISC-V CPU has been found to destroy Apple's new M1 chip in Performance Per Watt tests.

 

This is mainly attributable to the insanely low wattage of this new RISC-V CPU. At 3GHz it uses 69 milliwatts or 0.069 watts and at 4.25GHz this CPU uses 200 milliwatts or 0.2 watts.

 

 

Quote

Micro Magic Inc.—a small electronic design firm in Sunnyvale, California—has produced a prototype CPU that is several times more efficient than world-leading competitors, while retaining reasonable raw performance.

We first noticed Micro Magic's claims earlier this week, when EE Times reported on the company's new prototype CPU, which appears to be the fastest RISC-V CPU in the world. Micro Magic adviser Andy Huang claimed the CPU could produce 13,000 CoreMarks (more on that later) at 5GHz and 1.1V while also putting out 11,000 CoreMarks at 4.25GHz—the latter all while consuming only 200mW. Huang demonstrated the CPU—running on an Odroid board—to EE Times at 4.327GHz/0.8V and 5.19GHz/1.1V.

Later the same week, Micro Magic announced the same CPU could produce over 8,000 CoreMarks at 3GHz while consuming only 69mW of power.

 

Quote

All of this sounds very exciting—Micro Magic's new prototype is delivering solid smartphone-grade performance at a fraction of the power budget, using an instruction set that Linux already runs natively on. Better yet, the company itself isn't an unknown.

Micro Magic was originally founded in 1995 and was purchased by Juniper Networks for $260 million. In 2004, it was reborn under its original name by the original founders—Mark Santoro and Lee Tavrow, who originally worked at Sun and led the team that developed the 300MHz SPARC microprocessor.

Micro Magic intends to offer its new RISC-V design to customers using an IP licensing model. The simplicity of the design—RISC-V requires roughly one-tenth the opcodes that modern ARM architecture does—further simplifies manufacturing concerns, since RISC-V CPU designs can be built in shuttle runs, sharing space on a wafer with other designs.

 

 

As you might imagine, running at a higher frequency erodes the efficiency gains, and the optimal frequency for this CPU is stated to be 3GHz.
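To put rough numbers on that, here's a quick back-of-envelope sketch using only the CoreMark scores and wattages quoted above (no power figure was given for the 5GHz run, so that point can't be computed):

```python
# Performance per watt from the quoted figures (assuming they're accurate)
points = {
    "3.00 GHz": (8_000, 0.069),   # ~8,000 CoreMarks at 69 mW
    "4.25 GHz": (11_000, 0.200),  # 11,000 CoreMarks at 200 mW
}

for freq, (score, watts) in points.items():
    print(f"{freq}: {score / watts:,.0f} CoreMarks per watt")

# 3.00 GHz: 115,942 CoreMarks per watt
# 4.25 GHz: 55,000 CoreMarks per watt
```

So by these numbers the 3GHz point is roughly twice as efficient as the 4.25GHz one.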

 

 

Images are from ArsTechnica's article:

 

[Chart: CoreMark efficiency comparison]

[Chart: CoreMark performance comparison]

 

This is pretty awesome and I hope we continue to get awesome new RISC-V based designs. RISC-V, for those who don't know, is an open instruction set architecture developed as an alternative to ARM. Until now we've mainly seen low-power chips, but I hope we also get to see higher-performance, desktop-grade RISC-V chips in the future.

 

Sources

https://arstechnica.com/gadgets/2020/12/new-risc-v-cpu-claims-recordbreaking-performance-per-watt/


Keep in mind CoreMark is made for 8-bit microcontrollers, not desktop-class CPUs, so the scores are, to put it lightly, misformed.

 

 

Edit:

 

For example, an Arduino Mega, which runs at 16MHz, gets a score of 7. It uses a CPU design that is over 10 years old and is 976 times weaker than a Snapdragon 820. To be at the same level of performance, assuming perfect scaling, it would need to run at over 15GHz.

 

The newer Arduino WiFi still runs at 16MHz but is 6 years newer and has a score of 20, making it only 341 times slower. So it would need to run at 5.45GHz to be at the same level as an 820.
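Here's a minimal sketch of that perfect-scaling arithmetic (the 976x and 341x factors are the ones quoted above; in reality CoreMark wouldn't scale linearly with clock speed):

```python
# Rough estimate: how fast would an Arduino-class core need to clock to
# match a Snapdragon 820, assuming CoreMark scaled perfectly with frequency?
base_clock_hz = 16e6  # both Arduinos run at 16 MHz

for name, slowdown in [("Arduino Mega", 976), ("Arduino WiFi", 341)]:
    needed_ghz = base_clock_hz * slowdown / 1e9
    print(f"{name}: ~{needed_ghz:.2f} GHz to match an 820")

# Arduino Mega: ~15.62 GHz  (the "over 15GHz" figure above)
# Arduino WiFi: ~5.46 GHz   (roughly the 5.45GHz figure above)
```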

 

Both still are designed on pretty ancient designs.

 

Keep in mind that these are NOT at all made for performance, just the basic tasks they need to do, and that is it.

 

Also keep in mind the 820 was not at all made to be an 8-bit microcontroller. There are ARM-based ones specialized for this, and they easily outscore the Snapdragon whilst running slower.

 

This RISC-V chip seems to be purpose-made for these instructions, so of course it's gonna do a lot better, as this is its purpose. But it really is in line with what you would expect from a high-power microcontroller when someone spends time developing one specifically for this.

 

Also keep in mind the default workload of the benchmark is 2000 bytes, so it could easily have run entirely in cache memory, again giving an unfair advantage over systems that first have to load data from somewhere else, as this RISC-V chip currently seems to basically be built like a microcontroller.

 

Either way though, if this can be translated to desktop 32-bit and 64-bit applications then it would be pretty epic, but as it stands now it's a souped-up microcontroller at best.

 


Quote

At 3GHz it uses 69 milliwats or 0.69 watts and at 4.25GHz this CPU uses 200 milliwatts or 0.2 watts

So it uses less power when running at boost frequencies. I see, I see. :P


6 minutes ago, Windows7ge said:

So it uses less power when running at boost frequencies. I see, I see. :P

I thought the OP had gotten the wattage mixed up.


23 minutes ago, whm1974 said:

I thought the OP had gotten the wattage mixed up.

 

30 minutes ago, Windows7ge said:

So it uses less power when running at boost frequencies. I see, I see. :P

 

 

Thanks, I fixed it.

 

69 milliwatts at 3GHz and 200 milliwatts at 4.25GHz. Funnily enough, the power figure for 5GHz wasn't given.


If it offered the same performance, I wonder how it would match up then; it would potentially still win, but yeah.


I’m waiting for someone who knows the chip better to explain more about whether this is a specialized chip and what’s different about it.

 

I’ll say about the benchmarks that the M1 isn’t just a CPU, so its power measurement includes other components like the RAM, GPU, neural engine, etc. A fair comparison would adjust for those differences.


I have a feeling that RISC-V will go the same route ARM did, meaning it would premiere in more tightly integrated packages; stuff that would require lots of battery life out of a small battery. I’d see it maybe going into smart glasses, then working its way up to smartwatches.

 

I don’t think RISC-V will be on the desktop for a while; ARM is still going to be the alternative for the time being, but at least there’s movement in this space. I wonder what’s gonna be next after RISC-V...


3 minutes ago, NotTheFirstDaniel said:

I have a feeling that RISC-V will go the same route ARM did, meaning it would premiere in more tightly integrated packages; stuff that would require lots of battery life out of a small battery. I’d see it maybe going into smart glasses, then working its way up to smartwatches.

 

I don’t think RISC-V will be on the desktop for a while; ARM is still going to be the alternative for the time being, but at least there’s movement in this space. I wonder what’s gonna be next after RISC-V...

How many people actually own smart glasses and smartwatches? I do think that SiFive has a good idea in releasing a RISC-V mITX board with their most recent CPU.


10 minutes ago, whm1974 said:

How many people actually own smart glasses and smartwatches? I do think that SiFive has a good idea in releasing a RISC-V mITX board with their most recent CPU.

With the former, people are just waiting to rip off Samsung's and Apple's take on the idea, and sales of smartwatches have been increasing YoY since the original Apple Watches and FitBits.

 

For RISC-V to be popular in the mainstream, Apple has to transition to it. I'm sorry if that sounds fanboyish, but look at ARM. No one took it seriously on the desktop/laptop until the M1 came out. Plus, you're not going to get app support until Apple does it. Imagine telling all these software vendors that they now have to port to both ARM and RISC-V; they'll just pick the one that's more popular, which would be ARM.


This is going to be amazing for things like smartwatches and smart glasses, stuff that only has a very small battery.


5 minutes ago, NotTheFirstDaniel said:

With the former, people are just waiting to rip off Samsung's and Apple's take on the idea, and sales of smartwatches have been increasing YoY since the original Apple Watches and FitBits.

 

For RISC-V to be popular in the mainstream, Apple has to transition to it. I'm sorry if that sounds fanboyish, but look at ARM. No one took it seriously on the desktop/laptop until the M1 came out. Plus, you're not going to get app support until Apple does it. Imagine telling all these software vendors that they now have to port to both ARM and RISC-V; they'll just pick the one that's more popular, which would be ARM.

RISC-V is already supported by BSD and Linux operating systems, along with several others.


Just now, whm1974 said:

RISC-V is already supported by BSD and Linux operating systems, along with several others.

The Raspberry Pi was out and Linux supported ARM for how many years before WoA and Apple Silicon, yet they failed to gather any mainstream attention outside of niche enthusiast groups.

 

Until Apple does for RISC-V what they did for ARM, RISC-V on the desktop won't be as popular as ARM/x86, unless Microsoft is actually competent this time around.


21 minutes ago, NotTheFirstDaniel said:

The Raspberry Pi was out and Linux supported ARM for how many years before WoA and Apple Silicon, yet they failed to gather any mainstream attention outside of niche enthusiast groups.

 

Until Apple does for RISC-V what they did for ARM, RISC-V on the desktop won't be as popular as ARM/x86, unless Microsoft is actually competent this time around.

The Raspberry Pi is a great deal more popular than you think. It went far beyond educating school children about programming and electronics.


12 minutes ago, whm1974 said:

According to Wikipedia, about 30 million of these things have been sold.

https://en.wikipedia.org/wiki/Raspberry_Pi#Sales

Please stay on topic. :)


Promising news, but we're probably still about 10 years from RISC-V being a real thing.


2 hours ago, jaslion said:

Keep in mind CoreMark is made for 8-bit microcontrollers, not desktop-class CPUs, so the scores are, to put it lightly, misformed.

Misformed? These benchmarks are entirely fricking useless, unless one specifically wants to compare 8-bit MCUs with no FPU, MMU, vectorized instructions, DDR RAM, or practically any modern features! Sure, advertising the enormous efficiency and high clock speeds will look good in headlines and will fool people into thinking this is something far better or more meaningful than it actually is -- this thread itself contains multiple examples of people not understanding the actual implications -- but... no, just no; CoreMark results are NOT even remotely indicative of desktop-like workload performance.


2 hours ago, Jet_ski said:

I’ll say about the benchmarks that the M1 isn’t just a CPU, so its power measurement includes other components like the RAM, GPU, neural engine, etc. A fair comparison would adjust for those differences.

Yeah, sure, it's not just a CPU, but that benchmark is purely for the CPU.


4 minutes ago, WereCatf said:

Misformed? These benchmarks are entirely fricking useless, unless one specifically wants to compare 8-bit MCUs with no FPU, MMU, vectorized instructions, DDR RAM, or practically any modern features! Sure, advertising the enormous efficiency and high clock speeds will look good in headlines and will fool people into thinking this is something far better or more meaningful than it actually is -- this thread itself contains multiple examples of people not understanding the actual implications -- but... no, just no; CoreMark results are NOT even remotely indicative of desktop-like workload performance.

I wonder when we will see a sort of Raspberry Pi replacement using a RISC-V SoC? It would need an iGPU, however.


But experts on internet forums told me "risc is dead" years ago... 

 

 


28 minutes ago, Mark Kaine said:

But experts on internet forums told me "risc is dead" years ago... 

 

 

Apple themselves already killed off their own RISC processors once when PowerPC went away in the mid-2000s. RISC processors have been around for decades; there were even Sun and other workstations using them back in the day, and yet the computing world moved to CISC (specifically x86/AMD64) hardware for a reason. The only real benefit of a RISC processor is power efficiency for tailored workloads.


6 hours ago, NotTheFirstDaniel said:

I have a feeling that RISC-V will go the same route ARM did, meaning it would premiere in more tightly integrated packages; stuff that would require lots of battery life out of a small battery. I’d see it maybe going into smart glasses, then working its way up to smartwatches.

 

I don’t think RISC-V will be on the desktop for a while; ARM is still going to be the alternative for the time being, but at least there’s movement in this space. I wonder what’s gonna be next after RISC-V...

I don't think RISC-V will be on the desktop. MIPS chips have already been there, and the most common MIPS chips that everyone had were the ones in the PS1 or the N64.

 

Micro Magic are the former Sun people who worked on SPARC (AFAIK).

 

Anyway, it'll be interesting if anything they develop ends up in anything more powerful than a smart TV or tablet. I don't see it becoming a desktop chip, and a lot of recent interest in RISC-V is due to Nvidia grabbing ARM, which creates a lot of uncertainty about the future of the ARM designs. Maybe an ultra-dense supercomputer design based on it might be in the cards, considering who is developing it.

 

