Why cannot a CPU be used as a GPU?

archerbob
Just now, archerbob said:

How could one custom build a GPU?

 

 

Actually, I had the same thought: why aren't the VRAM, the GPU chip, and the cooler swappable?

Any help is appreciated! Please correct me if I'm wrong!

Sorry for grammar/spelling mistakes, but English is not my native language (it's German, in case you were curious). *expand to see builds*

 

Primary PC: CPU: AMD Ryzen 5 3600 | GPU: Crossfire Radeon 6870 + 6850 | RAM: CORSAIR Vengeance 2X16 = 32GB @ 3600MHZ DDR4 | MOBO: ASUS ROG STRIX B450-F | COOLER: COOLER MASTER ML360R | CASE: DEEPCOOL Matrexx 55 V3 ADD-RGB | PSU: GIGABYTE P850GM 80+ GOLD | SSD: CRUCIAL MX500 250GB |

Everything that's not colourful I haven't bought yet.

 

Secondary PC(Currently not operational): CPU:  INTEL Q8200S @ 2.33Ghz | GPU: GTX 750 ti / 760 | RAM: 4X2 = 8GB @ 800MHZ DDR2 OCZ Platinum | MOBO: ASUS P5E-VM SE | COOLER: Be Quiet! Silent Loop 280* | CASE: DEEPCOOL Matrexx 55 V3 ADD-RGB* | PSU: CORSAIR RM850 2019 80+ GOLD* | SSD: CRUCIAL MX500 250GB* 

Everything marked with * is what I bought for the Primary PC and I'm just using it until I get all the parts.


6 minutes ago, Enzo1001 said:

Actually, I had the same thought: why aren't the VRAM, the GPU chip, and the cooler swappable?

 

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


A GPU chip has a lot of very simple cores, hundreds or thousands of them, with just a few specialized functions, and all these "cores" can work in parallel on parts of each frame drawn on the screen to produce lots of frames per second.

A CPU has far fewer, much more complex cores which can handle a much wider range of operations.

 

The VRAM on a video card is needed to store, as close as possible to the GPU cores, the information about what's in the frame and the textures that get applied over all the items in the frame, e.g. a tree, a door, the wall of a building, etc. Everything is made out of thousands of polygons (triangles), and textures are applied over those triangles to make the picture you see.
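To make the "thousands of simple, independent jobs" idea concrete, here is a toy pure-Python sketch (nothing like a real driver, just an illustration with made-up coordinates): it rasterizes one triangle the way a renderer conceptually does, and every pixel's inside/outside test is independent of every other pixel, which is exactly the kind of work a GPU spreads across thousands of small cores.

```python
# Toy software rasterizer: shade every pixel covered by one triangle.
# Each pixel's coverage test is independent of every other pixel's,
# so a GPU can hand them to thousands of simple cores at once;
# a CPU has to walk through them a few at a time.

WIDTH, HEIGHT = 16, 16

def edge(ax, ay, bx, by, px, py):
    """Signed-area test: which side of the edge A->B the point P lies on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri):
    (x0, y0), (x1, y1), (x2, y2) = tri
    frame = [[0] * WIDTH for _ in range(HEIGHT)]
    for y in range(HEIGHT):          # every (x, y) below is independent work
        for x in range(WIDTH):
            inside = (edge(x0, y0, x1, y1, x, y) >= 0 and
                      edge(x1, y1, x2, y2, x, y) >= 0 and
                      edge(x2, y2, x0, y0, x, y) >= 0)
            frame[y][x] = 1 if inside else 0
    return frame

frame = rasterize([(1, 1), (14, 1), (1, 14)])
print(sum(map(sum, frame)), "pixels covered")   # → 105 pixels covered
```

On a CPU these 256 coverage tests run one after another; a GPU would evaluate them in large batches simultaneously.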

 

You can't custom-build a video card because it's just much harder... you're dealing with memory chips that must be much closer to the GPU chip. You can't have them on sticks, because the extra distance and the wire resistance of the slots would cause signal issues; those memory chips run at higher frequencies.

It's also pointless to have a GPU chip on a socket. For example, you could have a GTX 1050 Ti chip that consumes under 60 W, or you could have an RTX 3070 chip which consumes 250 W... it would be silly to make the buyer spend $100 or so on a base video card circuit board with a VRM (DC-DC converter) capable of supplying 300 W to a GPU chip, if that GPU chip only consumes 60 W.

 

 


6 minutes ago, mariushm said:

it would be silly to make the buyer spend $100 or so on a base video card circuit board with a VRM (DC-DC converter) capable of supplying 300 W to a GPU chip, if that GPU chip only consumes 60 W

Yet it's apparently acceptable to design motherboards that cater to a wide range of CPU power consumption; I don't think this argument makes sense lol

-sigh- feeling like I'm being too negative lately


18 minutes ago, Enzo1001 said:

Actually, I had the same thought: why aren't the VRAM, the GPU chip, and the cooler swappable?

A big issue would be power delivery. It's technically possible to design a board with the power delivery needed for different configurations of GPU dies and VRAM... but it's just not practical. Everything would have to at least support the highest-tier GPU in order to make it worthwhile. There are more complex explanations, but I'm not going down that endless rabbit hole.

 

I would recommend giving this thread a read as it was discussed prior.

 

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 |
Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580


>Why cannot a CPU be used as a GPU?

 

Actually, the CPU can be used as a GPU... it's just going to be slllooooowww for the reasons @mariushm described above.

I think the Mesa OpenGL drivers have this functionality: "A software implementation of OpenGL is useful for experimentation, such as testing new rendering techniques."

 

And I think Linus ran Crysis on a Threadripper in one video... or something similar.

ಠ_ಠ


32 minutes ago, archerbob said:

How could one custom build a GPU?

 

 

async.


You probably need some kind of factory to manufacture the chips your GPU is made of ...

 


7 hours ago, shadow_ray said:

>Why cannot a CPU be used as a GPU?

 

Actually, the CPU can be used as a GPU... it's just going to be slllooooowww for the reasons @mariushm described above.

I think the Mesa OpenGL drivers have this functionality: "A software implementation of OpenGL is useful for experimentation, such as testing new rendering techniques."

 

And I think Linus ran Crysis on a Threadripper in one video... or something similar.

In fact it often IS used, for 2D at least, when using something like VNC for a virtual desktop (not mirroring a real one, so no GPU rendering).

 

Also, on Linux, due to a lack of standards, video playback is often done on the CPU, whereas on Windows/Mac the GPU's decode blocks will do it, saving a ton of CPU resources.


On 10/3/2021 at 2:18 AM, shadow_ray said:

>Why cannot a CPU be used as a GPU?

 

Actually, the CPU can be used as a GPU... it's just going to be slllooooowww for the reasons @mariushm described above.

I think the Mesa OpenGL drivers have this functionality: "A software implementation of OpenGL is useful for experimentation, such as testing new rendering techniques."

 

And I think Linus ran Crysis on a Threadripper in one video... or something similar.

You can use Cinebench as an example

Asus ROG G531GT : i7-9750H - GTX 1650M +700mem - MSI RX6600 Armor 8G M.2 eGPU - Samsung 16+8GB PC4-2666 - Samsung 860 EVO 500G 2.5" - 1920x1080@145Hz (172Hz) IPS panel

Family PC : i5-4570 (-125mV) - cheap dual-pipe cooler - Gigabyte Z87M-HD3 Rev1.1 - Kingston HyperX Fury 4x4GB PC3-1600 - Corsair VX450W - an old Thermaltake ATX case

Test bench 1 G3260 - i5-4690K - 6-pipe cooler - Asus Z97-AR - Panram Blue Lightsaber 2x4GB PC3-2800 - Micron CT500P1SSD8 NVMe - Intel SSD320 40G SSD

iMac 21.5" (late 2011) : i5-2400S, HD 6750M 512MB - Samsung 4x4GB PC3-1333 - WT200 512G SSD (High Sierra) - 1920x1080@60 LCD

 

Test bench 2: G3260 - H81M-C - Kingston 2x4GB PC3-1600 - Winten WT200 512G

Acer Z5610 "Theatre" C2 Quad Q9550 - G45 Express - 2x2GB PC3-1333 (Samsung) - 1920x1080@60Hz Touch LCD - great internal speakers


Actually it can: my Fujitsu with a Pentium CPU can play any 3D game from around 1995 I throw at it, no problem...

I always knew I could buy a "voodoo card" or something, I just never understood why lol so I didn't...

 

Also tbh, I'm pretty sure a GPU is just a CPU, just different architecture with a lot of "cores" or whatever 🤔 

 

On 10/2/2021 at 7:59 PM, mariushm said:

you're dealing with memory chips that must be much closer to the gpu chip, can't have them on sticks because the extra distance and wire resistance of the slots would cause issues

Eh, isn't that exactly what consoles do, though? I think there are pros and cons for both solutions, and it's not "it can only be done like this" at all.

 

One argument for unified memory could be that PCs finally wouldn't be limited by the low VRAM most GPUs come with...

 

And no, there's also no reason why a PC couldn't have the fastest RAM available instead of the cheapest ... afaik.

 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


5 minutes ago, Mark Kaine said:

Also tbh, I'm pretty sure a GPU is just a CPU, just different architecture with a lot of "cores" or whatever 🤔 

They are both processing units, just designed for different purposes.

GPU - Graphics Processing Unit

CPU - Central Processing Unit

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

You can use your CPU to "software render" your frames. Thing is, the CPU core is MUCH slower at it than a dedicated GPU core.

 

That's about as simple as I can keep it.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


CPUs are dumb at solving graphics, just like me at solving math problems. GPUs are designed to be quick at graphics-related problems, which are individually easy to solve but come in huge numbers, so GPUs have a hell of a lot of smaller cores. But GPUs are dumb at the kind of large problems CPUs are good at, because CPUs have fewer but far more capable cores.


The architectures are too different; otherwise you'd have Intel and AMD CPUs being touted for their graphics capabilities.

Phone 1 (Daily Driver): Samsung Galaxy Z Fold2 5G

Phone 2 (Work): Samsung Galaxy S21 Ultra 5G 256gb

Laptop 1 (Production): 16" MBP2019, i7, 5500M, 32GB DDR4, 2TB SSD

Laptop 2 (Gaming): Toshiba Qosmio X875, i7 3630QM, GTX 670M, 16GB DDR3


Sure it can, the same way a GPU can be used as a CPU.

Of course it's not too efficient: a CPU as a GPU will be really slow (a "software renderer"), and a GPU running as a CPU will run really poorly. https://www.nvidia.com/en-us/geforce/forums/gaming-pcs/8/265725/dawn-operating-system-boots-and-runs-fully-on-gpu/

 

Just because you can doesn't mean you should.

 

 


On 10/2/2021 at 7:49 PM, archerbob said:

How could one custom build a GPU?

At a very broad and general level a CPU can do everything a GPU can, and more, but they are optimised for different use-cases. GPUs are essentially CPUs optimised for large matrix multiplications of floats. And as it turns out you can do a lot with matrix multiplications of floats (machine learning, AI, or whatever you want to call it). CUDA cores (Nvidia) and Stream processors (AMD) are essentially small accumulators optimised for floating-point arithmetic, which together with a lot of other accumulators are optimised for matrices.

 

If you are interested at a technical level, look at how the SIMD instruction sets for x86 CPUs (MMX and SSE in the late nineties, and later AVX) started doing essentially what GPUs do today.
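As a rough sketch of that idea (NumPy assumed available; this is an illustration, not a benchmark), the same matrix multiplication can be written as a scalar loop, one multiply-add at a time, or handed to a vectorized routine that exploits exactly the SIMD units mentioned above:

```python
import numpy as np

def matmul_scalar(a, b):
    """Naive triple loop: one floating-point multiply-add at a time."""
    n, m, p = len(a), len(b[0]), len(b)
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for k in range(p):
                out[i][j] += a[i][k] * b[k][j]
    return out

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]

slow = matmul_scalar(a, b)
fast = np.array(a) @ np.array(b)   # dispatched to BLAS: SIMD lanes + cache blocking

print(slow)            # → [[19.0, 22.0], [43.0, 50.0]]
print(fast.tolist())   # → [[19.0, 22.0], [43.0, 50.0]]
```

For tiny matrices the difference is negligible, but as the matrices grow the vectorized version pulls far ahead, for the same reason GPUs do.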

 

An interesting project on the topic is the Intel Larrabee project, which tried to put arrays of fully programmable microprocessors together to form a GPU, rather than the only partially programmable, more accumulator-like units of the then-current Nvidia and Radeon products.


4 hours ago, Vishera said:

They are both processing units,just designed for different purposes.

GPU - Graphics Processing Unit

CPU - Central Processing Unit

A CPU would be an example of a more generalized processor, whereas a GPU is highly specialized/optimized for the task of rendering graphics, especially when you get into the higher-end GPUs/cards.

 

43 minutes ago, flindeberg said:

An interesting project on the topic is the Intel Larrabee project, which tried to put arrays of fully programmable microprocessors together to form a GPU, rather than the only partially programmable, more accumulator-like units of the then-current Nvidia and Radeon products.

Which is kinda the opposite of what crypto miners do: use the less expensive (depending on the card) GPUs, and many of them, to work on completing the hash blocks instead of having the CPU do the work. Because, as you stated...
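A toy sketch of the shape of that mining work (pure-stdlib Python; the "00" difficulty target and the block data are made up for illustration): every candidate nonce can be hashed independently of all the others, which is exactly why it maps so well onto thousands of GPU threads.

```python
import hashlib

def find_nonce(block_data: bytes, prefix: str = "00") -> int:
    """Brute-force a nonce until the SHA-256 hex digest starts with `prefix`.

    Every candidate nonce is checked independently of all the others --
    exactly the embarrassingly parallel shape that GPUs chew through.
    """
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

nonce = find_nonce(b"example block header")
print("nonce:", nonce)
```

Real mining uses double SHA-256 over a specific block-header layout and a much harder target, but the independent-trials structure is the same.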

 

44 minutes ago, flindeberg said:

GPUs are essentially CPUs optimized for large matrix multiplications of floats.

 

23+ yrs IT experience

 

MAIN SYSTEM

Operating System: Windows 10 Pro x64 21H1

Case: Antec Three Hundred Two Gaming

CPU: AMD Ryzen 9 3900X 3.8GHz 12-Core 24-Thread

Motherboard: Asus ROG Strix X570-E Gaming

RAM: G.Skill Trident Z RGB Series 32GB (2 x 16GB) DDR4 3200 (PC4-25600)

Graphics Card: Asus Nvidia Geforce RTX 2060 Overclocked (Factory) 6GB GDDR6 Dual-Fan EVO Edition

Storage: 2 × Samsung 970 EVO Plus NVMe (M.2 2280) SSD 1TB; 2 × Samsung 860 QVO SATA III 6.0Gb/s SSD 1TB (RAID1 Array 1); 2 × Hitachi UltraStar HDS721010CLA330 7200RPM SATA III 3.0Gb/s 1TB (RAID1 Array 2)

PSU: Thermaltake Toughpower GF1 850W 80+ Gold

Optical Drive: LG WH16NS40 Super Multi Blue Internal SATA 16x Blu-ray Disc/DVD/CD Rewriter

Displays: HP w2408 widescreen 16:10 1920x1200 @60Hz; HP w2207 widescreen 16:10 1680x1050 @60Hz

Keyboard/Mouse: Logitech MK200 Wired Keyboard/Mouse Combo Kit


What about making a card using the BGA chip from an old laptop (for example a 2060)? I've seen old broken laptops that have decent graphics chips in them.

 

Is it in the realm of possibility for the home user to create the PCB needed to hot-air the GPU BGA chip onto a PCIe card?


2 hours ago, flindeberg said:

An interesting project on the topic is the Intel Larrabee project,

Another one was the IBM Cell processor, with the advantage that it actually worked and sold millions...

 

It would have been super interesting if Sony had gone ahead with their initial plan to use two of them, with one dedicated to graphics rendering. I believe they didn't mostly due to heat; the original PS3s ran hot as hell. But the concept was way ahead of its time: true "multi-purpose" processors... I think Alder Lake is a similar concept, and even general multi-threaded CPUs were kinda the evolution of that IBM/Sony chip.

 

In hindsight it was probably not super ideal for gaming consoles (way too many issues), but that doesn't make it any less forward-thinking.

 

5 hours ago, Sauron said:

 

Haha, this is so cool. That's actually kinda what these "voodoo" cards were? They even look similar albeit more complex obviously.

 

Sidenote: dude sounds like vwwestlife,  wonder if there's a connection...even the video style is very similar...


6 minutes ago, archerbob said:

Is it in the realm of possibility

yes

6 minutes ago, archerbob said:

for the home user to create the PCB needed to hot-air the GPU BGA chip onto a PCIe card?

no

 

A home user (depending on what you meant by that) isn't going to bother doing all of the stuff required to make that mobile GPU work as a normal PCIe card GPU. I mean, it sounds fairly easy at first, right? Just take it out of the laptop and resolder it to the card...

until you realize you need to design the PCB (obviously), and hope someone on the internet has uploaded a schematic of that specific mobile GPU, not to mention all the trial and error along the way.

01110100 01101000 01100001 01110100 00100000 01110111 01100001 01110011 00100000 00110111 00110000 00100000 01101001 01101110 01100011 01101000 00100000 01110000 01101100 01100001 01110011 01101101 01100001 00100000 01110011 01100011 01110010 01100101 01100101 01101110 00100000 01110100 01110110


Audio Interface I/O LIST v2

 

 

