
Leak shows AMD "Ryzen C7" SoC for smartphones with impressive specs

Clueless_Gamer
6 hours ago, leadeater said:


But why? Do we need this in cellphones?

Be real. The majority of gamers are on mobile now.

\\ QUIET AUDIO WORKSTATION //

5960X 3.7GHz @ 0.983V / ASUS X99-A USB3.1      

32 GB G.Skill Ripjaws 4 & 2667MHz @ 1.2V

AMD R9 Fury X

256GB SM961 + 1TB Samsung 850 Evo  

Cooler Master Silencio 652S (soon Calyos NSG S0 ^^)              

Noctua NH-D15 / 3x NF-S12A                 

Seasonic PRIME Titanium 750W        

Logitech G810 Orion Spectrum / Logitech G900

2x Samsung S24E650BW 16:10  / Adam A7X / Fractal Axe Fx 2 Mark I

Windows 7 Ultimate

 

4K GAMING/EMULATION RIG

Xeon X5670 4.2GHz (200 BCLK) @ ~1.38V / Asus P6X58D Premium

12GB Corsair Vengeance 1600MHz

Gainward GTX 1080 Golden Sample

Intel 535 Series 240GB + SanDisk SSD Plus 512GB

Corsair Crystal 570X

Noctua NH-S12 

Be Quiet! Dark Power Pro 11 650W

Logitech K830

Xbox One Wireless Controller

Logitech Z623 Speakers/Subwoofer

Windows 10 Pro


6 hours ago, leadeater said:


But why? Do we need this in cellphones?

Ryzen does what Intel don't, I guess 🤣

I spent $2500 on building my PC and all I do with it is... play no games atm, watch anime at 1080p (finally), watch YT and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


46 minutes ago, Vode said:

Be real. The majority of gamers are on mobile now.

By count sure, but none of them need Ray Tracing 😉

 


Nor would hardware-assisted RT on a cellphone be of any actual use anyway: increased power and die area for lower general performance. Not a good idea.

 


4 minutes ago, leadeater said:

By count sure, but none of them need Ray Tracing 😉

 


Nor would hardware-assisted RT on a cellphone be of any actual use anyway: increased power and die area for lower general performance. Not a good idea.

 

But RT is about all that's left for Candy Crush, unless they release a Disney reskin...   again...

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.  


55 minutes ago, leadeater said:

By count sure, but none of them need Ray Tracing 😉

Wait till you see RT Clash of Clans; it will be glorious.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


This is good, AMD expanding to other fields. If anything, this might bring me back to Android. A Ryzen-powered smartphone. I'd want it just for that alone lol :D


So it seems like this is fake, but my guess is that it will be pretty similar to what we will see from the next Samsung SoC (for the next Galaxy S device, not the Note).

Cortex-X1 for the big cores, Cortex-A78 for the middle cores, and Cortex-A55 for the small cores. The clock speeds seem about right as well.

TSMC 5nm might be ready by next year.

Samsung and AMD have collaborated on building a GPU for mobile devices so the "RDNA_2 Mobile" could be real. Not sure about the other details though.

Probably won't use MediaTek's modem since Samsung has their own.

Memory seems believable.

Display support, sure.

The other stuff? Yep, seems pretty standard.

 

I'd say 90% of this will turn out to be more or less accurate, but that it's the Samsung SoC instead.

With that being said, I think this is a completely fabricated "leak". But the stuff here isn't that hard to think of, and will probably be true.

 

 

 

9 hours ago, Commodus said:

If this is accurate (it's easy to create a bogus wall of text, so be careful), it'd finally end some of the stagnation in Android phone CPUs and might give Apple some real competition performance-wise.  Raytracing probably won't get much use in mobile games for a while, but I'd rather have it than not.

CPU-wise, it seems like the X1 will be comparable to the big cores in the A13. The medium cores (A78) will be faster than the small cores in the A13, though. And then we've got 4 shitty A55 cores. So overall, if this SoC were real and released today, it would stack up really well against the A13, probably slightly faster at peak CPU performance.

But by the time this gets released, it will be up against the A14 and Apple will probably be in the lead again. But Apple's lead will be way smaller than it has been before.

 

 

8 hours ago, dizmo said:

Samsung SoCs are used in a lot of other products, it's just most people pick Qualcomm because they're more widely known. As Samsung's chips get better and better, they might be seen in more and more devices. I think Samsung's flagship SoCs aren't made in enough numbers to be used in anything but their own devices.

You sure?

I've only ever seen Samsung's SoCs (and by that I mean Exynos) used in Samsung devices and then a few Meizu devices.

I think one of the problems has been licensing for cellular connectivity (because Qualcomm are dicks), primarily CDMA.


3 hours ago, leadeater said:

By count sure, but none of them need Ray Tracing 😉

 


Nor would hardware-assisted RT on a cellphone be of any actual use anyway: increased power and die area for lower general performance. Not a good idea.

 

I don't think we have the capability, the data, or the authority to decide what others may or may not need.

 

The market usually decides that. We enthusiasts are a fraction of people.

 

On the tech aspect, I think I agree with you, although bleeding-edge tech often seems to be useless and/or wasteful at first. But without taking the first step, progress is hampered. Innovation, selection and creative destruction feed progress.


27 minutes ago, Vode said:

I don't think we have the capability, the data, or the authority to decide what others may or may not need.

Well, I think we do, because we know hardware RT requires actual die area and doesn't add performance to any other aspect. The extra transistors require power gating when not in use, but power gating is not perfect, so you're spending die area and increasing power usage in a mobile part optimized for low power. To gain what? You can only use it to do BVH calculations, so unless everything else in the SoC is capable of supporting it, it's literally useless.

 

So why increase the cost of the SoC by making it bigger, and increase idle power usage too, for a feature you can't actually use? The GPU raster pipeline is too slow, and it wouldn't be effectively utilized on smaller screens. This would only make sense on a large form factor tablet, that's it.

 

Taking the very same die area and transistors and using them for CPU or GPU would result in a hardware performance gain that people would actually use and benefit from.
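The die-area and idle-power tradeoff being argued here can be put into a back-of-envelope sketch. Every number below is a hypothetical placeholder chosen purely for illustration, not a measurement of any real SoC or process node:

```python
# Back-of-envelope sketch of the dedicated-RT-block tradeoff.
# ALL numbers are hypothetical placeholders, not real SoC figures.

rt_block_mm2 = 5.0        # die area spent on a dedicated RT block (assumed)
cost_per_mm2 = 0.10       # silicon cost in $/mm^2 at the assumed node
rt_leakage_mw = 50.0      # leakage of the RT block if left ungated (assumed)
gating_efficiency = 0.95  # fraction of leakage a power gate removes;
                          # gating is never perfect, so some leakage remains

extra_cost = rt_block_mm2 * cost_per_mm2
residual_idle_mw = rt_leakage_mw * (1.0 - gating_efficiency)

print(f"extra silicon cost per chip: ${extra_cost:.2f}")
print(f"idle leakage despite power gating: {residual_idle_mw:.1f} mW")
```

The point of the sketch is structural, not numerical: a dedicated block costs area on every die shipped, and leaks power whenever the gate is imperfect, even while the feature goes unused.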

 

Not only is this rumor likely fake, Ray Tracing is just a current buzzword, often used to peddle rumors because it's the in thing and people want to believe it's getting wider usage. This is just a rumor; as with all rumors, do a sensibility check before believing it. Hardware Ray Tracing in a mobile SoC, from a company that hasn't deployed any such technology yet, in a market they aren't currently even in, is a hard fail to me.


@leadeater

I mean, there isn't any "fake" RT. It is or it isn't. You do have some room with precision, but you have to calculate ray bounces. Precision is where you define how many rays per pixel you use and how many bounces each ray does. You can go as low as 1 ray per pixel and do 1 bounce. I guess you could do ray approximation and maybe fake it to get similar effect with little performance hit, but you'd still have to do some ray tracing.
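The two knobs described here (rays per pixel and bounces per ray) can be made concrete with a toy tracer. This is a minimal, illustrative Python sketch, assuming a single hard-coded sphere, grayscale output, and a crude diffuse bounce; it is nothing like a production renderer, but `rays_per_pixel` and `max_bounces` are exactly the precision dials being discussed:

```python
import math
import random

SPHERE_C = (0.0, 0.0, -1.0)   # hard-coded scene: one sphere
SPHERE_R = 0.5

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add3(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    l = math.sqrt(dot(a, a))
    return (a[0] / l, a[1] / l, a[2] / l)

def hit_sphere(origin, direction):
    """Nearest positive intersection distance with the sphere, or None."""
    oc = sub(origin, SPHERE_C)
    b = dot(oc, direction)
    c = dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def trace(origin, direction, bounces_left):
    """Grayscale radiance estimate (0..1) carried by one ray."""
    t = hit_sphere(origin, direction)
    if t is None:
        return 0.8                    # flat "sky" brightness
    if bounces_left == 0:
        return 0.0                    # bounce budget exhausted
    # Crude diffuse bounce: jitter the surface normal into a random direction.
    p = add3(origin, mul(direction, t))
    n = norm(sub(p, SPHERE_C))
    d = norm(add3(n, (random.uniform(-1, 1),
                      random.uniform(-1, 1),
                      random.uniform(-1, 1))))
    if dot(d, n) < 0.0:
        d = mul(d, -1.0)              # keep the bounce above the surface
    return 0.5 * trace(p, d, bounces_left - 1)   # 50% reflective surface

def render(width, height, rays_per_pixel=1, max_bounces=1):
    """More rays per pixel -> less noise; more bounces -> more indirect light."""
    cam = (0.0, 0.0, 1.0)
    img = []
    for y in range(height):
        row = []
        for x in range(width):
            acc = 0.0
            for _ in range(rays_per_pixel):
                u = (x + random.random()) / width - 0.5
                v = (y + random.random()) / height - 0.5
                acc += trace(cam, norm((u, -v, -1.0)), max_bounces)
            row.append(acc / rays_per_pixel)
        img.append(row)
    return img
```

Even at `rays_per_pixel=1, max_bounces=1` this is still genuinely tracing rays, which is the point: you can dial the precision way down, but some ray tracing remains.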


23 minutes ago, RejZoR said:

@leadeater

I mean, there isn't any "fake" RT. It is or it isn't. You do have some room with precision, but you have to calculate ray bounces. Precision is where you define how many rays per pixel you use and how many bounces each ray does. You can go as low as 1 ray per pixel and do 1 bounce. I guess you could do ray approximation and maybe fake it to get similar effect with little performance hit, but you'd still have to do some ray tracing.

Definition of terms. Totally fake ray tracing can be done; it's been a thing since the 15th century in Western art. Apparently one can use bits of this to dramatically enhance the effect of small amounts of actual ray tracing. "Faked up", maybe.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


32 minutes ago, RejZoR said:

@leadeater

I mean, there isn't any "fake" RT. It is or it isn't. You do have some room with precision, but you have to calculate ray bounces. Precision is where you define how many rays per pixel you use and how many bounces each ray does. You can go as low as 1 ray per pixel and do 1 bounce. I guess you could do ray approximation and maybe fake it to get similar effect with little performance hit, but you'd still have to do some ray tracing.

I said the rumor is fake, bouncing off the hype of RT, due to people wanting to believe anything around that. Probably needed a comma.


The thing about RT is it's simultaneously the most computationally intensive and least useful thing that can be done. It's frequently difficult to even see the effects of RT, while it uses huge sections of compute capacity. The existence of RT says "we've run out of really useful stuff, so we're looking for scraps". This might be where phones are now, I guess. Kinda doubting it.


1 hour ago, Bombastinator said:

The thing about RT is it’s simultaneously the most computationally intensive and least useful thing that can be done

Well, it's only not useful right now because we can't do an entire RT render path in real time to replace everything we do now, due to hardware performance and the lack of superior methods so far. Remember when we didn't have dynamic shadows and lighting? There was a reason for that, and it's a similar reason here. We'll likely make that transition, but I wouldn't put money on the next two hardware/architecture generations being that time.

 

If the whole RT thing isn't a flop on consoles, I suspect we'll see a lot more noticeable and useful RT usage in games. Whether PC gamers want to admit it or not, consoles dictate industry trends and technology development, flowing all the way down to game engine design and supporting tool-sets, and all the way up to what we get on the PC. People wonder why graphics sliders and settings don't do much after medium or high; consoles are your answer (diminishing returns on fidelity improvements, features limited to what consoles support, rare usage of sponsored features like RTX and GameWorks).


14 hours ago, leadeater said:


But why? Do we need this in cellphones?

[image: "yes" meme]

Details separate people.


So the actual article is apparently almost certainly garbage, but the discussion is now about processing power increases in cellphones.

Cellphones will increase in processing power as their battery power envelopes allow. One issue is RT, which has gigantic processing requirements for a return that is currently pretty questionable. Will cellphone power envelopes eventually get to a point where processing power becomes gratuitous and RT becomes something to add? Probably, eventually. Has this already happened? Maybe, but I doubt it. Will RT develop enough to be worth making space for amongst more mission-critical things? Possibly. Not yet, though.


Saw this coming eventually. Well, maybe not actual x86, though the Ryzen name reminded me of it. We knew they were working with Samsung to use an RDNA-based GPU for the new SoC, and I figured we'd see Cortex-X1 cores in it too. Those cores are finally a new high-performance tier, akin to Apple's big cores, so this will be very fun to see. The 4-core config of those will probably be for tablets and some 2-in-1s or such.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX Speed Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


18 minutes ago, Doobeedoo said:

Saw this coming eventually. Well, maybe not actual x86, though the Ryzen name reminded me of it. We knew they were working with Samsung to use an RDNA-based GPU for the new SoC, and I figured we'd see Cortex-X1 cores in it too. Those cores are finally a new high-performance tier, akin to Apple's big cores, so this will be very fun to see. The 4-core config of those will probably be for tablets and some 2-in-1s or such.

It's why the fake got as far as it did, I think. It's not specifically impossible, just bleeding edge. The 5nm claim is there because there isn't much 5nm silicon out yet, so there isn't any data, and it makes the power draw figures less implausible. Standard mountebank technique: fraudulent science lives in the grey area just outside proven things, where stuff might be possible.


This is possibly one of the most ridiculous rumors I've ever seen on here. There are soooooo many red flags. I'd give a Nigerian Prince a loan before I bet on this being true.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440 


35 minutes ago, JoostinOnline said:

This is possibly one of the most ridiculous rumors I've ever seen on here. There are soooooo many red flags. I'd give a Nigerian Prince a loan before I bet on this being true.

It can be debunked using non-science stuff, but even the science doesn't really hold up. Shows how people will believe even a really weak lie as long as it's pretty enough. There's been a whole lot of that going around for years.


7 hours ago, LAwLz said:

You sure?

I've only ever seen Samsung's SoCs (and by that I mean Exynos) used in Samsung devices and then a few Meizu devices.

I think one of the problems has been licensing for cellular connectivity (because Qualcomm are dicks), primarily CDMA.

They're used in some Motorola and Vivo phones, and some older Lenovo models (maybe still, I haven't checked), as well as quite a few DAPs.

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston HyperX 32GB 3200MHz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit


CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit


CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone


 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


The CPU performance doesn't look too impressive, considering it's still using a similar layout to what Qualcomm has to offer.

 

Curious to see how it holds up against Apple.

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition

