
Future Literacies: Crash Course Media Literacy #12

Facebook first launched in 2004.

YouTube came out in 2005.

Twitter and Spotify: 2006.

The first iPhone was released in 2007.

Snapchat launched in 2011 and so did Siri.

Tinder was founded in 2012.

Google Glass was first released in 2013.

Amazon's Echo smart speaker came out in 2015.

From social media to smartphones to augmented reality devices to smart speakers.

All of these inventions have changed how we interact with each other, and especially with media.

And yet, they're all just getting started – Well, maybe not Google Glass.

It just wasn't your time, man.

Virtual reality and artificial intelligence are already starting to work their way into our media diets little by little.

You thought internet ads were annoying?

Wait until they can follow you around from billboard to billboard thanks to facial recognition.

You thought it was hard to decipher fabricated news from truth?

Wait until even videos can be Photoshopped. Er...Videoshopped?

Media literacy doesn't just mean learning how to navigate today's media landscape.

It means preparing yourself for tomorrow's, too.

Today we're talking all about literacies of the future.

[Theme Music]

This is our last episode of Crash Course Media Literacy, and I want to thank you all for joining us this far.

During this series we've made a lot of references to new media – computers, the internet, social media –

and how they change or challenge our traditional relationships to media and media literacy.

Today we're going to really dig into that, and talk about two new forms of literacy that promise to shape the future: Data Literacy and Algorithmic Literacy.

OK. First: Data literacy, as you might have guessed from the name, deals with understanding and analyzing data.

And there's a lot of data out there.

You may have heard of “raw data” or “big data” or, if you have a Fitbit, “personal fitness tracking data.”

So what is data?

For our purposes, data is: information about the world that is stored in a specific format.

And this is a pretty broad definition.

With every passing year, online companies think up more and more ways to track us.

Links, cookies, IP addresses.

Quite simply, every time you “step out” onto the web, you leave a huge path of data behind you.
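To make “information stored in a specific format” concrete, here is a minimal, made-up sketch of the kind of record an analytics script might log when you follow a link; every field name and value is invented for illustration and isn't taken from any real tracker.

```python
# A hypothetical page-visit record, the kind of thing an analytics script
# might store each time you follow a link. All fields here are invented
# purely for illustration.
visit_record = {
    "ip_address": "203.0.113.42",            # where the request came from (documentation-range IP)
    "cookie_id": "a1b2c3d4",                 # identifier the site stores in your browser between visits
    "referrer": "https://example.com/feed",  # the link that brought you here
    "page": "/articles/future-literacies",
    "timestamp": "2018-04-10T15:04:00Z",
    "device": "smartphone",
}

# Stored in one consistent format, millions of records like this can be
# aggregated, sorted, and fed into the algorithms discussed later on.
print(visit_record["cookie_id"], visit_record["page"])
```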

If you go back and look at the coverage of the 2016 U.S. election, you'll find countless articles about data collected from people:

their preferences, their values, their political party.

When data gets used this way, it's often to give an argument a sense of being scientific.

But just because data exists doesn't mean it's accurate or helpful.

Humans are flawed. We have biases. We have agendas.

And humans make the data.

It is not neutral, because we are not neutral.

Say you see in a magazine that 30% of Americans love chocolate ice cream the most.

Ok, well what did everyone else like the most?

Were they undecided?

Did they prefer vanilla or Chunky Monkey or cotton candy?

Maybe 30% isn't even the majority, and 65% of people like peanut butter ice cream the best.

Data only matters in context.
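As a toy illustration of why that single 30% figure needs context, here is a short sketch; the flavor shares are made up so the arithmetic works out, and they aren't survey results.

```python
# Made-up flavor shares for the ice cream example above; the point is that
# a headline figure of 30% tells you little without the whole distribution.
flavor_shares = {
    "chocolate": 0.30,       # the headline number from the magazine
    "peanut butter": 0.45,   # hypothetical larger share the headline ignores
    "vanilla": 0.15,
    "undecided": 0.10,
}

most_popular = max(flavor_shares, key=flavor_shares.get)
has_majority = flavor_shares[most_popular] > 0.5

print(f"Most popular: {most_popular} ({flavor_shares[most_popular]:.0%})")
print(f"True majority (>50%)? {has_majority}")
# Here chocolate's 30% is accurate but isn't even the plurality,
# let alone a majority.
```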

And it can be helpful in many ways.

It helps us track everything from personal fitness goals to citywide poverty levels.

It's just very easy to misconstrue, because humans are susceptible to nice, wholesome, easy to believe numbers.

I said we'd come back to how your personal data can be used, and that brings us to: Algorithmic Literacy.

Algorithms are basically sets of instructions or calculations for a computer to run.

Websites and apps take the data they collect about us and send it through an algorithm, and out pops some result.

Like Facebook. It takes all your personal data and what you've liked and shared and – serves you ads, yes.

But it also sticks info into an algorithm to decide what appears in your news feed.

Its goal is to keep you on Facebook, so it shows you things it thinks you will really like.

That's why there's no “dislike” button.
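The shape described above (personal data goes in, a set of instructions runs, a ranked result comes out) can be sketched in a few lines; the scoring rule, field names, and weights below are invented for illustration and have nothing to do with Facebook's actual system.

```python
# A deliberately oversimplified feed-ranking sketch: user data in,
# instructions run, a ranked feed out. The rule and weights are invented
# and are not how any real platform works.

user_data = {"liked_topics": {"music", "basketball"}, "friends": {"ana", "ben"}}

posts = [
    {"id": 1, "author": "ana",  "topic": "music",      "likes": 120},
    {"id": 2, "author": "cara", "topic": "politics",   "likes": 300},
    {"id": 3, "author": "ben",  "topic": "basketball", "likes": 45},
]

def score(post, user):
    """Higher score = more likely to keep this user scrolling (toy rule)."""
    s = post["likes"] / 100                    # raw popularity signal
    if post["topic"] in user["liked_topics"]:
        s += 2                                 # boost topics the user already likes
    if post["author"] in user["friends"]:
        s += 1                                 # boost posts from friends
    return s                                   # note: only positive signals, no "dislike"

feed = sorted(posts, key=lambda p: score(p, user_data), reverse=True)
print([p["id"] for p in feed])                 # toy ranking: [1, 3, 2]
```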

The way algorithms personalize stuff for us can be fun and useful.

Plus it makes us feel a little special, like those Christmas ornaments in the store that have your name on them.

It also makes us feel comfortable, like we're in our own little world of happy things.

That's because often, we are.

Eli Pariser calls this the filter bubble, the media world we create where we only see and interact with things and people we already like.

Sometimes we do this to ourselves by curating feeds with our own interests in mind.

But this also happens behind the scenes, algorithmically, without our knowledge.

Facebook might serve up posts from the half of your friends that like the same things you do.

Or Google News might show you a mix of articles they know you're likely to click on.

This might be convenient, sure, but it can also mean seeing a very different version of the world than the people around you.

Algorithmic literacy is knowing that any information you see online is only one slice of the pie, and one that's been cut specifically for you.
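To see “one slice of the pie, cut specifically for you” in miniature, here is a toy sketch of two readers being filtered from the same pool of stories; the headlines and interest labels are invented.

```python
# Toy filter-bubble sketch: one shared pool of stories, two readers,
# two very different "slices" once each feed is filtered by interest.
# The headlines and topic labels are invented for illustration.

stories = [
    ("City budget passes",           "politics"),
    ("Local band releases album",    "music"),
    ("New basketball arena planned", "sports"),
    ("Election poll results",        "politics"),
    ("Festival lineup announced",    "music"),
]

def personalized_feed(interests):
    """Keep only stories tagged with topics this reader already follows."""
    return [title for title, topic in stories if topic in interests]

print(personalized_feed({"music", "sports"}))   # one reader's slice
print(personalized_feed({"politics"}))          # a neighbor's entirely different slice
# Both feeds are accurate, but neither reader ever sees the whole pie.
```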

One of the most important things about Data and Algorithmic literacy is that they always go hand in hand.

And when you know about them, you can start to ask classical media literacy questions, even of the newest of media.

It's impossible to know what new media technology we're going to be dealing with next year, or next decade, or next century.

And there will always be new complications to learn about.

But no matter what form of future literacy you develop, it's sure to rest on the same basic principle: Skepticism.

So as we wrap up this series, and leave you staring into the uncertain future, let us leave you with this:

Being skeptical means approaching everything by questioning its truth.

Every ad, every song, every book, every article – everything.

Being skeptical doesn't mean taking the fun out of all media;

it just means that instead of blindly accepting whatever's thrown your way, a little voice in your head says, “But what about…”

Our brains love to play little games with the media. They love the familiar.

They love things with easy explanations. They love taking shortcuts.

They even love believing things we've heard already, even if they're not true.

Skepticism, adding in a dash of logic and context to our media interactions, helps fight our brains' annoying habits.

You know that saying, “follow your gut”?

That is the opposite of what you should do with media.

“Follow your perception of bias and textual analysis skills” should be the saying; it's just not as catchy.

Media consumers fall into traps all the time because they like when things are comfortable, certain, and easy.

But the world is not always comfortable.

Nothing is certain but the fact that Prince is the greatest musician of all time, and that life is rarely easy.

Once you come to terms with the fact that every bit of media isn't as simple as it looks, you're that much closer to understanding it.

Over the past dozen lessons, we've learned the history of media literacy, how to read advertisements and their darker cousin, propaganda.

We have looked at our minds on media and our media on money.

We've explored media law and how to break it, and even looked into the future of literacy.

We shape the media as much as it shapes us.

Our reactions to media are just as important as what's thrown our way.

We're all media creators now, like it or not, and we're certainly all consumers.

Everyone has a role to play, and a responsibility to share, and we all need to do it together.

I'm glad you've joined us on this journey.

Until next time, I'm Jay Smooth and this has been Crash Course: Media Literacy.

Crash Course Media Literacy is filmed in the Dr. Cheryl C. Kinney Studio in Missoula, MT.

It's made with the help of all of these nice people, and our animation team is Thought Cafe.

Crash Course is a Complexly production.

If you wanna keep imagining the world complexly with us, check out some of our other channels,

like The Financial Diet, SciShow Space, and Mental Floss.

If you'd like to keep Crash Course free for everyone, forever, you can support the series at Patreon,

a crowdfunding platform that allows you to support the content you love.

Thank you to all of our patrons for making Crash Course possible with their continued support.
