
The Human Brain Project Creates an AI That Runs On a Simulated Human Brain, Down to the Individual Neurons

Summary:

 

"HBP researchers at the Institute of Biophysics of the National Research Council (IBF-CNR) in Palermo, Italy, have mimicked the neuronal architecture and connections of the brain’s hippocampus to develop a robotic platform capable of learning as humans do while the robot navigates around a space. The simulated hippocampus is able to alter its own synaptic connections as it moves a car-like virtual robot" (The Human Brain Project).

 

 

 

Quote

"this is the first time we are able to mimic not just the role but also the architecture of the hippocampus, down to the individual neurons and their connections,” explain Michele Migliore and Simone Coppolino of the IBF-CNR" (The Human Brain project)  It moves randomly at first, but once it is able to reach its destination, it reconstructs a map rearranging the neurons into its simulated hippocampus and assigning them to the landmarks. It only needs to go through training once to be able to remember how to get to the destination (The Human Brain Project)

 

 

 

--------------------------------------------------------------------------------------------------------------------------

 

Before you respond, please read my reply about how to create safe and real AI, because I use terms that I made up for clear comprehension. (Link to my reply in the section below.)

 

--------------------------------------------------------------------------------------------------------------------------

 

 

 

My thoughts

 

In my opinion, this is exactly the direction AI needs to go. To create safe and real AI, you need to make it think like a human. The next step (that I would like to see happen) is to simulate emotions. Yes, real (or as real as a simulation can get) emotions. This is very promising: if you can simulate the hippocampus, you can simulate other parts of the brain, right? So how about simulating a part of the brain that produces emotions? You might think this would get out of control, but if we do it right (and if doing it right gives the outcome I expect), I think it could be a great step forward for AI. Now, the method I described did not involve simulating the brain down to the neuron; it merely emulated emotions with deep learning. Since this technology has been announced, though, I think it would work well if the two computers in my explanation below were replaced with a simulation of the hippocampus (conscious) and a simulation of the amygdala (subconscious), once our understanding of the human brain and human emotions has increased.
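To make the two-module idea above concrete, here is a minimal sketch, assuming a design where a "subconscious" amygdala module appraises outcomes and a "conscious" hippocampus module uses those appraisals when choosing actions. Every class, method, and event name here is hypothetical, invented purely for illustration — it is nowhere near a neuron-level simulation of either structure.

```python
class Amygdala:
    """Hypothetical 'subconscious' module: appraises events with a valence."""

    def appraise(self, event):
        # A real system would need a learned appraisal model, not a lookup table.
        return {"praise": 1.0, "harm": -1.0}.get(event, 0.0)

class Hippocampus:
    """Hypothetical 'conscious' module: chooses actions, consults the amygdala."""

    def __init__(self, amygdala):
        self.amygdala = amygdala
        self.memory = {}  # action -> accumulated emotional weight

    def act(self, options):
        # Prefer actions remembered as emotionally positive.
        return max(options, key=lambda a: self.memory.get(a, 0.0))

    def feel(self, action, event):
        # Feedback loop: the amygdala's appraisal reshapes future choices.
        self.memory[action] = self.memory.get(action, 0.0) + self.amygdala.appraise(event)

brain = Hippocampus(Amygdala())
brain.feel("help", "praise")   # helping was met with praise
brain.feel("ignore", "harm")   # ignoring led to harm
print(brain.act(["help", "ignore"]))  # → help
```

The design choice being illustrated is only the coupling: the "conscious" module never hard-codes what is good or bad; it inherits that from the emotional module's feedback, which is roughly what I mean by replacing the two computers with two brain-region simulations.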

My full explanation of why and how to create safe and real AI is here:

 

------------------------------------> https://linustechtips.com/topic/1500586-ai-is-a-big-scam/page/3/#comment-15896485 <-----------------------------------------

 

 

 

Sources

 

https://www.humanbrainproject.eu/en/follow-hbp/news/2023/04/17/brain-model-learns-drive/

Omg, it's a signature!

 


8 minutes ago, My_Computer_Is_Trash said:

The simulated hippocampus is able to alter its own synaptic connections as it moves a car-like virtual robot

First self-driving cars, now self-driving brains.

Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 CPU Cooler: Stock

Mobo: Asrock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200mhz Cl16

SSD: P5 Plus 500GB Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 Redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung WQHD 34 inch and 43 inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard

 

 

 

 

 

 

 

 

 


 

 

 

 

 

 

Damn this space can fit a 4090 (just kidding)


Yeah, not going to lie, I don't think AIs thinking like humans would be a good thing. Also, emotions, while great and all, can often lead to horrible outcomes. In most cases, what really stops emotions from leading to bad outcomes is self-control, and honestly I'm not sure how you would put that into an AI. I would be way too worried that hate and anger would lead to disastrous results if the AI had emotions but didn't know how to deal with them, like a toddler who doesn't know how to deal with their emotions and so throws temper tantrums and does other destructive things. Now imagine that instead of a weak toddler, it's an AI with direct control of things that could cause serious harm. I'm sorry, but trying to make AI into humans does not seem like a good idea, tbh.


6 minutes ago, Brooksie359 said:

Yeah, not going to lie, I don't think AIs thinking like humans would be a good thing. …

The reason you have self-control is that you don't want to hurt anyone, and not wanting to hurt anyone comes from... well, emotions. (Also, did you read my reply in the original post? If not, here's the link.)


25 minutes ago, My_Computer_Is_Trash said:

The reason you have self-control is that you don't want to hurt anyone, and not wanting to hurt anyone comes from... well, emotions. (Also, did you read my reply in the original post? If not, here's the link.)

That is a very unrealistic view of human nature. Yes, to an extent, empathy stops us from hurting people for no reason (like, say, a sociopath would), but that doesn't stop hate, anger, greed, jealousy, and a whole host of other emotions or reasons from ending up with people hurting each other. For the most part, what stops a lot of violence isn't human emotion but learned behavior: you act out and harm people, you get into trouble and face bad consequences, which makes you less likely to do things like hurt others. It's not some primal emotion that stops people from hurting others when they are angry; it's actually the opposite, since anger would push us towards violence. Look at the world as a whole, and in places with little social order you find horrible things like slavery, child soldiers, and dictators. If you were correct, normal human emotions would stop things like that from happening, but unfortunately human nature isn't as nice as everyone thinks it is, and a lot of the values we have are societal rather than human nature. Also, I don't click on links, for security reasons, tbh.


8 minutes ago, Brooksie359 said:

That is a very unrealistic view of human nature. …

 

Wow, that's an interesting debate, and I'm not even 100% sure where I fall. I think looking at lawless areas of the world isn't necessarily proof that's our inherent nature. It's just proof we'll go to extremes to survive. As much as our culture "forces" us to be good, as you suggest, it could be equally true those cultures "force" people to be "bad". 

 

I think I do lean towards humans being inherently good, because without that, I don't think we could have gotten as far as we have as a society. But I'm definitely open to being wrong on this one. Interesting stuff.


13 minutes ago, Brooksie359 said:

That is a very unrealistic view of human nature. Yes, to an extent, empathy stops us from hurting people for no reason (like, say, a sociopath would), but that doesn't stop hate, anger, greed, jealousy, and a whole host of other emotions or reasons from ending up with people hurting each other.

All of the terrorists and criminals today have mental problems or were abused. AI will be designed to be stable (no mental problems) and won't act out unless it's abused, in which case we should probably add a feature where it can report its owner for abuse.


 

15 minutes ago, Holmes108 said:

I think I do lean towards humans being inherently good, because without that, I don't think we could have gotten as far as we have as a society. But I'm definitely open to being wrong on this one. Interesting stuff.

Why are humans good? Well, it's because we feel empathy and other emotions for people. That is where everything good in humanity comes from. Think about it: what was the last thing you did to be kind? Did you welcome a new neighbor by baking them cookies? Have you helped a struggling stray dog that was trapped somehow? Because an AI (in a possible future) might just be taking out the trash, see the dog, think "that's not the primary directive," and leave it there. Why do people sign up for the military even though they're signing up to put their life at risk? Because they need to protect their family: emotions. (An AI wouldn't do that unless it was told to, in which case it doesn't have a choice.) Why do doctors become doctors? Because they feel emotions and therefore want to save people. Why did Elon Musk start SpaceX? Because he wants humanity not to be wiped out, to make sure we have a planet B! Why do we have pets? Because we have relationships with them, which requires emotions! Why do people go to other parts of the world, like Africa, to help people? Because of empathy. Honestly, I could go on and on. But AI without emotions is unsafe, unreliable, and unpredictable, and cannot decide what to do when two commands contradict. AI is only going to be as good as humans' ability to word commands, unless we give it emotions.

 


43 minutes ago, Holmes108 said:

 

Wow, that's an interesting debate, and I'm not even 100% sure where I fall. …

I am not saying that humans are bad by nature, but that human nature clearly allows for bad behavior. The fact of the matter is that anger, jealousy, greed, and other human emotions lead to bad outcomes, and a lot of what we do to stop that is taught, not natural. Even looking back to when my dad was young, people would straight up get into fist fights and nobody would bat an eye, while if you did that today you would end up in jail, and as a result there are way fewer fist fights now than when my dad was growing up. Also, look at kids for an example: they often do things that are not acceptable, and we have to teach them not to do those things. I have often seen children hit each other, and it's not that the kid is some type of monster; it's that they have to learn (again, key word being learn). Yes, obviously people don't do bad things for no reason, but that doesn't mean that when they do have a reason, they won't do it simply because they have emotions.


45 minutes ago, My_Computer_Is_Trash said:

All of the terrorists and criminals today have mental problems or were abused. AI will be designed to be stable (no mental problems) and won't act out unless it's abused, in which case we should probably add a feature where it can report its owner for abuse.

That is the biggest cope I have ever heard. Yes, a lot of people who commit mass atrocities have mental issues, but someone who gets angry and gets violent isn't necessarily mentally ill. It could be as simple as their society not being as against violence as most societies are today. Let me give you an example: let's say someone says something extremely rude to a woman and she slaps him in the face. I know that is a pretty cliché thing and probably not common today, but the point is that it's cliché because it was common and not really seen as unacceptable. That's violence done by completely normal people. Also, biologically, testosterone and male emotions are very much innately violent, and to disregard this fact is crazy. Anger is a huge reason for probably most violence, so giving AI anger is such an astronomically stupid idea that I can't imagine the upside.


1 hour ago, Brooksie359 said:

Also, emotions, while great and all, can often lead to horrible outcomes.

The opposite also could lead to horrible results......


15 minutes ago, Brooksie359 said:

I am not saying that humans are bad by nature, but that human nature clearly allows for bad behavior. …

 

Makes sense, I thought you were suggesting something more akin to we'd all be murderers if it weren't for the law lol. Perhaps not quite that extreme, but something along those lines.


14 minutes ago, jagdtigger said:

The opposite also could lead to horrible results......

No, it won't. We currently have AI that doesn't have emotions and we are fine, because we program what it should be doing. It doesn't need emotions; that would only overcomplicate things.


59 minutes ago, My_Computer_Is_Trash said:

Why are humans good? Well, it's because we feel empathy and other emotions for people.

SPOILERS FOR THE READY PLAYER TWO BOOK:

 

Whenever we create a new AI, though, we always think that it will try to destroy us or take us down. That's not very empathetic. This line in the book Ready Player Two, said by Anorak when they try to negotiate with him, puts it nicely: "Whenever your futurists envision the advent of artificial intelligence, their predictions invariably end with humanity attempting to destroy its unholy AI creation before it can destroy them. Why do you think that is?" And Parzival (AKA Wade) replies: "Because the ungrateful AI always seems to decide that humans are inferior and need to be eliminated."

Conclusion: if we make AI that can have emotions and a brain like ours (essentially, a copy of Halliday's brain is what Anorak had), then we have to trust them and feel equal to them. Unless we do this, I feel we would struggle to coexist.


31 minutes ago, Brooksie359 said:

Part 1: Let's say someone says something extremely rude to a woman and she slaps him in the face. I know that is a pretty cliché thing and probably not common today, but the point is that it's cliché because it was common and not really seen as unacceptable.

Part 2: Also, biologically, testosterone and male emotions are very much innately violent, and to disregard this fact is crazy. Anger is a huge reason for probably most violence, so giving AI anger is such an astronomically stupid idea that I can't imagine the upside.

Respectfully,

 

Part 1 response: The AI would not say something rude to a woman in the first place, because if we simulate the entire brain, it will have the same self-control as a human. But let's just say it got mad and did. What do you think an AI with emotions would do in response to the slap, if it was raised like a human and had no mental problems? Do you think it's going to kill the woman in "self-defense"? No. It is going to react exactly as a human would (or a sensible one, at least; and believe me, anyone who might produce an AI like this will make it sensible), and a human would back off, because they feel embarrassed by people watching and because they don't want to go to jail. And not wanting to go to jail requires emotions.

Part 2 response: Reducing or completely eliminating testosterone levels in an AI "brain" (if it turns out to be a problem in testing) would probably get fixed in prototyping, as the theoretical people who created this theoretical AI probably know how dangerous it could be, if they're smart enough to build it in the first place, and will therefore do plenty of prototyping and testing (years and years).

 


25 minutes ago, jagdtigger said:

The opposite also could lead to horrible results......

Please, explain...


15 minutes ago, My_Computer_Is_Trash said:

Please, explain...

I think they mean an apocalypse.


2 minutes ago, filpo said:

i think they mean an apocalypse

Oh.........ok, I get it now. Sorry!


1 minute ago, My_Computer_Is_Trash said:

Oh.........ok, I get it now. Sorry!

No need to say sorry for understanding; that's the way of life, lol.

But don't always trust a pubescent 15-year-old. You can trust me tho!


So I assume this is a "brain-like chip" and not an actual chip? Or is it fully a chip that simulates something similar?

There are various projects out there using many different substrates: real brains, mouse neurons, silicon/electrical chips, different types of chips, etc.

I hope it can be explained in the post, unless it already was.


I think we're better off just having an emotionless AI that's only supposed to do its job. Imagine AI that has been implemented in heavy machinery or the military starting to revolt against humans and all hell breaking loose, because I'm pretty sure heavy machinery beats a human body, damage-potential-wise.

We don't need AI that can think like a human, because humans already have a bunch of problems, and we really don't need to deal with AI rights on top of that; we already have a bunch of self-made problems and don't need another one.


8 hours ago, My_Computer_Is_Trash said:

Respectfully,

Part 1 response: The AI would not say something rude to a woman in the first place. …

Part 2 response: Reducing or completely eliminating testosterone levels in an AI "brain" would probably get fixed in prototyping and testing. …

 

The point went completely over your head. My point is that violence can actually be a normal emotional response: a woman slapping someone for saying something rude was viewed as an appropriate response at one point, probably less so now. So emotions aren't what stops us from doing violent things; it's mostly societal. Look throughout human history and there are many cultures that had violent practices we deem barbaric by today's standards, but back then it was the norm, despite the fact that they too had emotions. Point being, emotions aren't some magical thing that stops people from doing bad things, especially when "bad things" is very subjective based on societal norms. Most of what stops emotions from causing us to make bad decisions is our rational brain and what we have been taught growing up. Give emotions to AI and you are giving it motives to do something outside what it was programmed to do. Why mess with such a scary concept? Also, do you really think you are going to be able to reliably replicate empathy in AI? And if you do, who will it empathize with? Let's say it empathizes with animals and nature, deems us detrimental to Earth and all living things on it, and decides to wipe us out? I'm sorry, but I see so many ways this could go terribly wrong.


Is there a specific person's brain this is built on?

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


Honestly, I'm in two minds over this. (Yes, that's a pun.)

On the one hand, if we are looking to make AGI, we need to create something akin to the only known true intelligence, which is the human mind and thus the human brain.

But humans, in my experience, are traditionally trash. We are flawed creatures; our emotions lead us to do horrible things. And yes, our emotions can also lead us to do great things, beautiful things, kind things.

But look at human history: war after war, crusades, jihads, inquisitions, persecutions; for crying out loud, we as a civilisation did the Holocaust and countless other genocides.

If you have AGI, you probably have emotions. You have emotions, and they can be bought, manipulated, coerced, shamed... And the thing is, we have seen what just one person can do: the likes of Hitler, Mussolini, Putin, bin Laden...

Now imagine an AGI that can think faster, know more, do more; it can create deepfakes, it could hack emails or social media, manipulate the press... if it decided to, or even worse, if it had a mental breakdown...

 

 

My Folding Stats - Join the fight against COVID-19 with FOLDING! - If someone has helped you out on the forum don't forget to give them a reaction to say thank you!

 

The only true wisdom is in knowing you know nothing. - Socrates
 

Please put as much effort into your question as you expect me to put into answering it. 

 

  • CPU
    Ryzen 9 5950X
  • Motherboard
    Gigabyte Aorus GA-AX370-GAMING 5
  • RAM
    32GB DDR4 3200
  • GPU
    Inno3D 4070 Ti
  • Case
    Cooler Master - MasterCase H500P
  • Storage
    Western Digital Black 250GB, Seagate BarraCuda 1TB x2
  • PSU
    EVGA Supernova 1000w 
  • Display(s)
    Lenovo L29w-30 29 Inch UltraWide Full HD, BenQ - XL2430(portrait), Dell P2311Hb(portrait)
  • Cooling
    MasterLiquid Lite 240
