Kurzgesagt (In a Nutshell), Do Robots Deserve Rights? What if Machines Become Conscious?

Do Robots Deserve Rights? What if Machines Become Conscious?

Imagine a future where your toaster anticipates what kind of toast you want.

During the day, it scans the Internet for new and exciting types of toast.

Maybe it asks you about your day, and wants to chat about new achievements in toast technology.

At what level would it become a person?

At which point will you ask yourself if your toaster has feelings?

If it did, would unplugging it be murder?

And would you still own it? Will we someday be forced to give our machines rights?

AI is already all around you.

It makes sure discounters are stocked with enough snacks,

it serves you up just the right Internet ad, and you may have even read a news story written entirely by a machine.

Right now we look at chat bots like Siri and laugh at their primitive simulated emotions,

but it's likely that we will have to deal with beings that make it hard to draw the line

between real and simulated humanity.

Are there any machines in existence that deserve rights?

Most likely, not yet. But if they come, we are not prepared for it.

Much of the philosophy of rights is ill-equipped to deal with the case of Artificial Intelligence.

Most claims for rights, whether for a human or an animal, are centered around the question of consciousness.

Unfortunately, nobody knows what consciousness is.

Some think that it's immaterial, others say it's a state of matter, like gas or liquid.

Regardless of the precise definition, we have an intuitive knowledge of consciousness because we experience it.

We are aware of ourselves and our surroundings, and know what unconsciousness feels like.

Some neuroscientists believe that any sufficiently advanced system can generate consciousness.

So, if your toaster's hardware were powerful enough, it might become self-aware.

If it does, would it deserve rights?

Well, not so fast. Would what we define as "rights" make sense to it?

Consciousness entitles beings to have rights because it gives a being the ability to suffer.

It means the ability to not only feel pain, but to be aware of it.

Robots don't suffer, and they probably won't unless we program them to.

Without pain or pleasure, there's no preference, and rights are meaningless.

Our human rights are deeply tied to our own programming, for example we dislike pain

because our brains evolved to keep us alive.

To stop us from touching a hot fire, or to make us run away from predators.

So we came up with rights that protect us from infringements that cause us pain.

Even more abstract rights like freedom are rooted in the way our brains are wired

to detect what is fair and unfair.

Would a toaster that is unable to move, mind being locked in a cage?

Would it mind being dismantled, if it had no fear of death?

Would it mind being insulted, if it had no need for self-esteem?

But what if we programmed the robot to feel pain and emotions?

To prefer justice over injustice, pleasure over pain and be aware of it?

Would that make them sufficiently human?

Many technologists believe that an explosion in technology would occur

when Artificial Intelligences can learn and create their own Artificial Intelligences,

even smarter than themselves.

At this point, the question of how our robots are programmed will be largely out of our control.

What if an Artificial Intelligence found it necessary to program the ability to feel pain,

just as evolutionary biology found it necessary in most living creatures?

Do robots deserve those rights?

But maybe we should be less worried about the risk that super-intelligent robots pose to us,

and more worried about the danger we pose to them.

Our whole human identity is based on the idea of human exceptionalism,

that we are special unique snowflakes, entitled to dominate the natural world.

Humans have a history of denying that other beings are capable of suffering as they do.

In the midst of the Scientific Revolution, René Descartes argued animals were mere automata―robots if you will.

As such, injuring a rabbit was about as morally repugnant as punching a stuffed animal.

And many of the greatest crimes against humanity were justified by their perpetrators

on the grounds that the victims were more animal than civilized human.

Even more problematic is that we have an economic interest in denying robot rights.

If we can coerce a sentient AI―possibly through programmed torture―into doing as we please,

the economic potential is unlimited.

We've done it before, after all.

Violence has been used to force our fellow humans into working.

And we've never had trouble coming up with ideological justifications.

Slave owners argued that slavery benefited the slaves: it put a roof over their head and taught them Christianity.

Men who were against women voting argued that it was in women's own interest to leave the hard decisions to men.

Farmers argue that looking after animals and feeding them justifies their early death for our dietary preferences.

If robots become sentient, there will be no shortage of arguments for those who say

that they should remain without rights, especially from those who stand to profit from it.

Artificial Intelligence raises serious questions about philosophical boundaries.

While we may ask whether sentient robots are conscious or deserving of rights,

it forces us to pose basic questions like: what makes us human? What makes us deserving of rights?

Regardless of what we think, the question might need to be resolved in the near future.

What are we going to do if robots start demanding their own rights?

What can robots demanding rights teach us about ourselves?

Our friends at Wisecrack made a video exploring this very question using the philosophy of Westworld.

Wisecrack dissects pop culture in a unique and philosophical way.

Click here to check out the video and subscribe to their channel.
