A Hypothetical Question of Morality as it Equates to Technology

Skipple

So I have a question that I bounce around in my head a lot. 

 

Let's start with the following assumptions: 

  • The pace of technology is growing ever faster 
  • Because of that, it's reasonable to assume that in the future we will be able to simulate a human brain to the utmost degree of accuracy
    • This can be done with computers in the traditional sense, or technology that we cannot conceive of yet. 
  • This simulation of the human brain has receptors that simulate the human experience and senses (as we understand them)
  • There are no differentiating factors other than biological flesh
    • It feels, has emotion, interpersonal connection, love, hate, consciousness. It has all the wonderful and detrimental characteristics of a human. 

The questions that this thought experiment invokes in me are: 

  • Would this 'thing' be human?
    • If no, what's the determining factor?
  • Should it have the rights of a human?
    • Right to life?
    • Protection from the law?
    • Property rights?
    • If no, to any of these, why?

ask me about my homelab

on a personal quest convincing the general public to return to the glory that is 12" laptops.

cheap and easy cable management is my fetish.


Questioning the humanity of a computer is nothing new. There was an anime called .hack//Sign about this. But, in my opinion: no, it would not be human. It would have to be built, not born. Engineered, not evolved. And right now, not even all humans are getting human rights, let alone machines.

It must be true, I read it on the internet...


I'd say no, because it's just a simulation. It's no more the real thing than a fluid simulation of water is water.


taking this the other way a bit; once we can upload our consciousness, would destruction of that data be considered murder?

topics i need help on:

Spoiler

 

 

my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling
OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions
Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

basically everything was bought used off of reddit or here; the only new component was the case. absolutely nutty deals on some of these parts, i'll have to tally it all up once it's "done" :D 


3 minutes ago, VeganJoy said:

taking this the other way a bit; once we can upload our consciousness, would destruction of that data be considered murder?

Imagine the asking price to decrypt grandma if she falls victim to ransomware!

It must be true, I read it on the internet...


Also, you presume that a computer CAN mimic humans, but the nature of a computer makes it very hard to mimic humans. Consider CAPTCHAs, for example: simple for us, nearly impossible for an AI. There are lots of hurdles to clear before computers reach the point of questioning humanity, let alone sentience. I'd rather we focus on the humans who aren't getting equal rights already.

Insanity is not the absence of sanity, but the willingness to ignore it for a purpose. Chaos is the result of this choice. I relish in both.


Note: all my answers here are subjective.

 

1. No, they aren't human, since they lack flesh like we have. You wouldn't call software (since that's what it is) a living creature, right?

 

2. Does it suffer from not having rights? We evolved to treat pain as something negative because it stems from bad things like being tired, carrying wounds, or sustaining loss of property, all of which lowered our chances of survival (at least in prehistoric days). We made rights because we disagree that anyone should suffer for others' benefit and not their own. 

 

Can we say the same of advanced software? Does it have anything to suffer from? Can we harm it? If ending its existence is what harms it, then is deleting other software today something as serious as committing homicide?

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


It would not be human. That doesn't mean it wouldn't be sentient and entitled to rights. Go watch Star Trek: The Next Generation; they did a great episode on the concept.

desktop

Spoiler

r5 3600,3450@0.9v (0.875v get) 4.2ghz@1.25v (1.212 get) | custom loop cpu&gpu 1260mm nexxos xt45 | MSI b450i gaming ac | crucial ballistix 2x8 3000c15->3733c15@1.39v(1.376v get) |Zotac 2060 amp | 256GB Samsung 950 pro nvme | 1TB Adata su800 | 4TB HGST drive | Silverstone SX500-LG

HTPC

Spoiler

HTPC i3 7300 | Gigabyte GA-B250M-DS3H | 16GB G Skill | Adata XPG SX8000 128GB M.2 | Many HDDs | Rosewill FBM-01 | Corsair CXM 450W

 

 


5 minutes ago, Cyracus said:

they did a great episode on the concept

Multiple great episodes

 

"The Measure of a Man" being the most notable, but every season has an "is it life" episode, more or less. It definitely tries to touch on things like "is it still a lifeform if you can't prove it has a soul," which I think would be the most forefront part of the conversation if we dropped an AI into our current generation.

I WILL find your ITX build thread, and I WILL recommend the SIlverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec | PSU misconceptions, protections explained | group reg is bad


I would say yes, it is. The most reasonable view we have of consciousness/sentience is that it arises from being able to process abstract/meta versus concrete concepts, such as this very question. It may be that an extremely elaborate and intelligent computer program could be created that is not sentient, or that is sentient but not human (this is the primary issue in the topic of AI safety, an AI that is sentient but does not share human instincts/values), but if it was a biological simulation of the human brain then there isn't any fundamental difference from a human.

 

Here's how I would think about it. Let's say that in this scenario, we can also upload an existing person's brain into this simulation where they can live forever. Would we call that person, who is now a computer simulation, no longer human? I feel that we would, instead, treat them the same way as when they were in a physical body. If you agree with this, then there's no reason why a simulated brain that never actually existed would be any different from a moral standpoint.


Grow a brain?

Bio-chemical computing?

Will it need 5 senses to be closer to a human?

Will this brain have emotions?

Imagination? Can it believe in the god or goddess you believe in?

 

How imperfect of a first try is that going to be?

 

x86 processing will never be bio-chemical computing. So the tech in your pocket probably won't all of a sudden have feelings of its own free will and imagination.

 

Technology is only an extension of humans. We would only design it in our imaginational ways. Yes I made that word up. Lol...

 


To quote the Star Trek: The Next Generation episode "The Measure of a Man":

If it was a box with wheels we wouldn't be having this conversation. 

In that episode, the debate was whether or not Data, a human-shaped android, was human enough to refuse to be dismantled so they could figure out how to make more of him. In the strict physical sense, no, it would not be a human. However, it would be a sentient, intelligent consciousness, which as such should have the same basic rights as we do.

 

The truth of this question probably lies in that physical and emotional reality. If this conscious brain is enclosed in a human-like body with hair and eyes and fingernails that grow and ... fully functional, then we will likely regard it as a human being in all but name. If, however, this brain is housed in a computer the size of a building, then we will probably enslave it to ... learn how to better serve ads to our cell phones. 


1 hour ago, Skipple said:

 I agree that it would not be classified as ‘human’ but if we actually had completely sentient computers, I think we should treat them as their own race or society. If there were a nation made up of completely sentient computers, I would respect that nation as sovereign and its people as real. If they were citizens of another country, I think laws and regulations would have to adapt to apply to these machines as I’m sure there are plenty of things that wouldn’t match up properly. For instance, healthcare for these machines wouldn’t be a doctor but rather a maintenance specialist. There would be many repercussions due to things such as that so there would definitely need to be a lot of debate and legislation changes to make them have the same rights as us.


Absolutely it needs to have rights. If it has feelings, emotions, anger, then of course it does. Just because it's not biological doesn't mean it's disposable; it has feelings too. 

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 aorus pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 

 


10 hours ago, Skipple said:

Would this 'thing' be human?

No.

 

10 hours ago, Skipple said:

If no, what's the determining factor?

If we go by the dictionary definition of human, which is:

Quote

a bipedal primate mammal

then the 'thing' is not a mammal, so it does not qualify as a human.

 

10 hours ago, Skipple said:

Right to life?

Yes.

 

10 hours ago, Skipple said:

Protection from the law?

Yes.

 

10 hours ago, Skipple said:

Property rights?

No.

 

10 hours ago, Skipple said:

If no, to any of these, why?

I personally do not believe a machine that does not have a physical body has any need of property.


16 hours ago, shoutingsteve said:

Questioning the humanity of a computer is nothing new. There was an anime called .hack//Sign about this. But, in my opinion: no, it would not be human. It would have to be built, not born. Engineered, not evolved. And right now, not even all humans are getting human rights, let alone machines.

Data from Star Trek would like to have a word with you

