
Anyone know much about Intel's acceleration card "Mustang-F100-A10"?

Is this what Raja Koduri and Jim Keller have moved over to Intel to develop? I hadn't heard anything about this, since it looks like a server acceleration card, but could this be the sort of angle Intel is taking on its way into graphics?

https://www.intel.com/content/www/us/en/programmable/products/boards_and_kits/dev-kits/altera/acceleration-card-arria-10-gx.html

https://www.ieiworld.com/en/product/model.php?II=614

https://www.youtube.com/watch?time_continue=41&v=5Pu4cHySSZ0

It all makes my brain whirl, but I found it interesting because it looks deceptively like a GPU. I know it's an FPGA (field-programmable gate array), and light research suggests it's for doing ASIC-like work that you can reprogram on the go (could be wrong). I thought that's sort of what Nvidia's Tensor cores were doing, although this is more of a jack-of-all-trades kind of thing.

Maybe worth a video on Techquickie or a news spot on TechLinked, seeing as this stuff only came out a few months ago.
Damn rabbit hole of questions I've dug myself into... I still don't know how to work my new NAS properly.


Sorry, but to post in the Tech News section your OP must meet the specific criteria described here: https://linustechtips.com/main/topic/11724-posting-guidelines-read-before-posting/

This is the reason it has been moved to General Discussion. Correct the OP and it can be moved back; otherwise it will reside here.


That first link is an FPGA card, which is completely different from a video card (at least as far as purpose and silicon layout go). Think of FPGAs as "programmable hardware." They're hardware, but much slower than an ASIC (which is what graphics cards are). You can program an FPGA for certain problems/data sets, and depending on the workload it will be faster (or slower) than a GPU, but that's because you're programming the hardware to do exactly what you need and nothing more, versus GPGPU code, where you're writing software to run on a GPU (with its own fixed architecture/hardware layout).
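
Rough sketch of what I mean by the GPGPU side, assuming a plain vector-add kernel in OpenCL C (the kernel name and workload are just my example, not anything from the Mustang docs). On a GPU this gets scheduled onto the card's fixed shader cores; as far as I know, Intel's FPGA SDK for OpenCL can take a kernel like this and instead synthesize a dedicated pipeline on the FPGA fabric, which is the "programmable hardware" idea:

/* Minimal GPGPU-style kernel: one work-item adds one pair of elements. */
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    size_t i = get_global_id(0);   /* index of this work-item */
    out[i] = a[i] + b[i];          /* same source, very different hardware underneath */
}

Same source code either way; the difference is whether it runs as instructions on fixed GPU hardware or becomes the hardware on the FPGA.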

 

Sorry if this doesn't make too much sense; I've been drinking a little...

 

Edit: ooooooooooookkkkkkkkkkkkkkkk, just ignore this post

"Anger, which, far sweeter than trickling drops of honey, rises in the bosom of a man like smoke."
