TED, Rana el Kaliouby: This app knows how you feel — from the look on your face

0:11 Our emotions influence every aspect of our lives, from our health and how we learn, to how we do business and make decisions, big ones and small. Our emotions also influence how we connect with one another. We've evolved to live in a world like this, but instead, we're living more and more of our lives like this -- this is the text message from my daughter last night -- in a world that's devoid of emotion. So I'm on a mission to change that. I want to bring emotions back into our digital experiences.

0:47 I started on this path 15 years ago. I was a computer scientist in Egypt, and I had just gotten accepted to a Ph.D. program at Cambridge University. So I did something quite unusual for a young newlywed Muslim Egyptian wife: With the support of my husband, who had to stay in Egypt, I packed my bags and I moved to England. At Cambridge, thousands of miles away from home, I realized I was spending more hours with my laptop than I did with any other human. Yet despite this intimacy, my laptop had absolutely no idea how I was feeling. It had no idea if I was happy, having a bad day, or stressed, confused, and so that got frustrating. Even worse, as I communicated online with my family back home, I felt that all my emotions disappeared in cyberspace. I was homesick, I was lonely, and on some days I was actually crying, but all I had to communicate these emotions was this. (Laughter) Today's technology has lots of I.Q., but no E.Q.; lots of cognitive intelligence, but no emotional intelligence. So that got me thinking, what if our technology could sense our emotions? What if our devices could sense how we felt and reacted accordingly, just the way an emotionally intelligent friend would? Those questions led me and my team to create technologies that can read and respond to our emotions, and our starting point was the human face.

2:29 So our human face happens to be one of the most powerful channels that we all use to communicate social and emotional states, everything from enjoyment, surprise, empathy and curiosity. In emotion science, we call each facial muscle movement an action unit. So for example, action unit 12, it's not a Hollywood blockbuster, it is actually a lip corner pull, which is the main component of a smile. Try it everybody. Let's get some smiles going on. Another example is action unit 4. It's the brow furrow. It's when you draw your eyebrows together and you create all these textures and wrinkles. We don't like them, but it's a strong indicator of a negative emotion. So we have about 45 of these action units, and they combine to express hundreds of emotions.
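
A toy Python sketch of the idea that a few action units combine into coarser expression scores. The AU numbers follow the talk (AU12 lip corner pull, AU4 brow furrow), but the 0-1 intensity scale, the thresholds, and the combinations are invented here for illustration and are not the actual system described:

```python
# Toy sketch: turning per-frame action-unit (AU) intensities into coarse
# expression scores. AU12 = lip corner pull, AU4 = brow furrow (per the talk);
# the mapping itself is made up for illustration.

def score_expressions(au_intensities: dict[int, float]) -> dict[str, float]:
    """Map per-frame AU intensities (0.0-1.0) to coarse expression scores."""
    au12 = au_intensities.get(12, 0.0)  # lip corner pull: main component of a smile
    au4 = au_intensities.get(4, 0.0)    # brow furrow: common negative indicator
    return {
        "smile": au12,
        "brow_furrow": au4,
        # a crude "confusion" guess: furrowed brow with little smiling
        "confusion": max(0.0, au4 - au12),
    }

if __name__ == "__main__":
    frame = {12: 0.8, 4: 0.1}        # hypothetical detector output for one frame
    print(score_expressions(frame))  # {'smile': 0.8, 'brow_furrow': 0.1, 'confusion': 0.0}
```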

3:17 Teaching a computer to read these facial emotions is hard, because these action units, they can be fast, they're subtle, and they combine in many different ways. So take, for example, the smile and the smirk. They look somewhat similar, but they mean very different things. (Laughter) So the smile is positive, a smirk is often negative. Sometimes a smirk can make you become famous. But seriously, it's important for a computer to be able to tell the difference between the two expressions.

3:49 So how do we do that? We give our algorithms tens of thousands of examples of people we know to be smiling, from different ethnicities, ages, genders, and we do the same for smirks. And then, using deep learning, the algorithm looks for all these textures and wrinkles and shape changes on our face, and basically learns that all smiles have common characteristics, all smirks have subtly different characteristics. And the next time it sees a new face, it essentially learns that this face has the same characteristics of a smile, and it says, "Aha, I recognize this. This is a smile expression."
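
A minimal sketch of the kind of supervised training just described: show a small network many labeled smile and smirk images and let it learn the distinguishing texture and shape cues. It assumes PyTorch, uses a stand-in batch of random tensors in place of real labeled face crops, and the network name is invented; this is not the speaker's actual model:

```python
# Illustrative smile-vs-smirk classifier: a tiny CNN trained on labeled face
# crops. Real systems use far larger models and tens of thousands of examples.
import torch
import torch.nn as nn

class SmileVsSmirkNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # 2 classes: smile, smirk

    def forward(self, x):  # x: (batch, 1, 64, 64) grayscale face crops
        h = self.features(x)
        return self.classifier(h.flatten(1))

def train_step(model, optimizer, images, labels):
    """One gradient step on a batch of labeled face crops."""
    logits = model(images)
    loss = nn.functional.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = SmileVsSmirkNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Stand-in batch: real training would iterate over a large labeled dataset.
    images = torch.randn(8, 1, 64, 64)
    labels = torch.randint(0, 2, (8,))
    print(train_step(model, opt, images, labels))
```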

4:29 So the best way to demonstrate how this technology works is to try a live demo, so I need a volunteer, preferably somebody with a face. (Laughter) Chloe's going to be our volunteer today.

4:44 So over the past five years, we've moved from being a research project at MIT to a company, where my team has worked really hard to make this technology work, as we like to say, in the wild. And we've also shrunk it so that the core emotion engine works on any mobile device with a camera, like this iPad. So let's give this a try.

5:05 As you can see, the algorithm has essentially found Chloe's face, so it's this white bounding box, and it's tracking the main feature points on her face, so her eyebrows, her eyes, her mouth and her nose. The question is, can it recognize her expression? So we're going to test the machine. So first of all, give me your poker face. Yep, awesome. (Laughter) And then as she smiles, this is a genuine smile, it's great. So you can see the green bar go up as she smiles. Now that was a big smile. Can you try a subtle smile to see if the computer can recognize? It does recognize subtle smiles as well. We've worked really hard to make that happen. And then eyebrow raised, indicator of surprise. Brow furrow, which is an indicator of confusion. Frown. Yes, perfect. So these are all the different action units. There's many more of them. This is just a slimmed-down demo. But we call each reading an emotion data point, and then they can fire together to portray different emotions. So on the right side of the demo -- look like you're happy. So that's joy. Joy fires up. And then give me a disgust face. Try to remember what it was like when Zayn left One Direction. (Laughter) Yeah, wrinkle your nose. Awesome. And the valence is actually quite negative, so you must have been a big fan. So valence is how positive or negative an experience is, and engagement is how expressive she is as well. So imagine if Chloe had access to this real-time emotion stream, and she could share it with anybody she wanted to. Thank you.
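
One hypothetical way to represent the per-frame "emotion data point" the demo describes: a tracked bounding box, a few action-unit readings, and the derived valence and engagement values. The field names, ranges, and helper function are invented for illustration and are not the real SDK's API:

```python
# Illustrative per-frame emotion data point and a summary over a stream.
from dataclasses import dataclass, field

@dataclass
class EmotionDataPoint:
    timestamp_ms: int
    face_box: tuple[int, int, int, int]  # white bounding box: x, y, w, h
    action_units: dict[str, float] = field(default_factory=dict)  # e.g. {"smile": 0.9}
    valence: float = 0.0      # -1.0 (negative) .. +1.0 (positive)
    engagement: float = 0.0   # 0.0 (flat) .. 1.0 (very expressive)

def summarize(stream: list[EmotionDataPoint]) -> dict[str, float]:
    """Average valence/engagement over a real-time stream of data points."""
    if not stream:
        return {"valence": 0.0, "engagement": 0.0}
    n = len(stream)
    return {
        "valence": sum(p.valence for p in stream) / n,
        "engagement": sum(p.engagement for p in stream) / n,
    }

if __name__ == "__main__":
    stream = [
        EmotionDataPoint(0, (120, 80, 200, 200), {"smile": 0.9}, valence=0.7, engagement=0.8),
        EmotionDataPoint(33, (121, 80, 200, 200), {"brow_furrow": 0.6}, valence=-0.4, engagement=0.5),
    ]
    print(summarize(stream))
```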

(Applause)

6:44 So, so far, we have amassed 12 billion of these emotion data points. It's the largest emotion database in the world. We've collected it from 2.9 million face videos, people who have agreed to share their emotions with us, and from 75 countries around the world. It's growing every day. It blows my mind away that we can now quantify something as personal as our emotions, and we can do it at this scale.

7:11 So what have we learned to date? Gender. Our data confirms something that you might suspect. Women are more expressive than men. Not only do they smile more, their smiles last longer, and we can now really quantify what it is that men and women respond to differently. Let's do culture: So in the United States, women are 40 percent more expressive than men, but curiously, we don't see any difference in the U.K. between men and women. (Laughter) Age: People who are 50 years and older are 25 percent more emotive than younger people. Women in their 20s smile a lot more than men the same age, perhaps a necessity for dating. But perhaps what surprised us the most about this data is that we happen to be expressive all the time, even when we are sitting in front of our devices alone, and it's not just when we're watching cat videos on Facebook. We are expressive when we're emailing, texting, shopping online, or even doing our taxes.

8:16 Where is this data used today? In understanding how we engage with media, so understanding virality and voting behavior; and also empowering or emotion-enabling technology, and I want to share some examples that are especially close to my heart. Emotion-enabled wearable glasses can help individuals who are visually impaired read the faces of others, and it can help individuals on the autism spectrum interpret emotion, something that they really struggle with. In education, imagine if your learning apps sense that you're confused and slow down, or that you're bored and speed up, just like a great teacher would in a classroom. What if your wristwatch tracked your mood, or your car sensed that you're tired, or perhaps your fridge knows that you're stressed, so it auto-locks to prevent you from binge eating. (Laughter) I would like that, yeah. What if, when I was in Cambridge, I had access to my real-time emotion stream, and I could share that with my family back home in a very natural way, just like I would've if we were all in the same room together?
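
A hedged sketch of the adaptive-learning idea above: slow a lesson down when an (imaginary) emotion stream reports confusion, and speed it up when it reports boredom. The function name, thresholds, and 0-1 scores are hypothetical placeholders, not a real product's behavior:

```python
# Illustrative pacing loop for an emotion-aware learning app.
def adjust_lesson_speed(speed: float, confusion: float, boredom: float) -> float:
    """Return a new playback speed given confusion/boredom scores in 0..1."""
    if confusion > 0.6:
        speed *= 0.8   # learner looks confused: slow down
    elif boredom > 0.6:
        speed *= 1.2   # learner looks bored: speed up
    return min(max(speed, 0.5), 2.0)  # keep speed within sensible bounds

if __name__ == "__main__":
    print(adjust_lesson_speed(1.0, confusion=0.7, boredom=0.1))  # -> 0.8
    print(adjust_lesson_speed(1.0, confusion=0.1, boredom=0.8))  # -> 1.2
```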

9:26 I think five years down the line, all our devices are going to have an emotion chip, and we won't remember what it was like when we couldn't just frown at our device and our device would say, "Hmm, you didn't like that, did you?" Our biggest challenge is that there are so many applications of this technology, my team and I realize that we can't build them all ourselves, so we've made this technology available so that other developers can get building and get creative. We recognize that there are potential risks and potential for abuse, but personally, having spent many years doing this, I believe that the benefits to humanity from having emotionally intelligent technology far outweigh the potential for misuse. And I invite you all to be part of the conversation. The more people who know about this technology, the more we can all have a voice in how it's being used. So as more and more of our lives become digital, we are fighting a losing battle trying to curb our usage of devices in order to reclaim our emotions. So what I'm trying to do instead is to bring emotions into our technology and make our technologies more responsive. So I want those devices that have separated us to bring us back together. And by humanizing technology, we have this golden opportunity to reimagine how we connect with machines, and therefore, how we, as human beings, connect with one another.

10:57 Thank you.

10:59 (Applause)
