(applause and shouts)
- Thank you Adam, and thank you everyone for being here.
So I'm a teacher, right, and so
I wanna start with a little exercise.
Everyone has a -- yeah, I think I already heard a groan.
Did I hear a groan?
(audience laughter)
So everyone has a sticky note on
their table or a sticky pad.
So here's what I want you to do.
Okay, so most people have devices I'm imagining.
Your phone will do just fine.
I'm gonna give everyone a chance
to take a sticky note, and most people, I assume,
have some kind of writing implement.
So here's what I want you to do.
Any educator in the room will recognize this
pretty quickly as a think, pair, share.
(laughs)
Okay, so I want you to go to Google, right,
and don't go to Bing, don't go to DuckDuckGo,
I want you to go to Google, and I want you to begin
by typing in why do or why are
and then type in one aspect of how you identify.
Race, gender, gender identity, ethnicity.
Okay.
I want you to write down the first two autocomplete
suggestions, and after you do that we'll reconvene.
So I'm gonna give you about 90 seconds.
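(For readers following the transcript at home: you can run the same exercise programmatically. Here's a minimal sketch using Google's unofficial suggest endpoint, which is undocumented and may change or be blocked at any time; the function name is mine.)

```python
# A minimal sketch of the autocomplete exercise. It hits Google's
# unofficial suggest endpoint, which is undocumented and may change
# or be rate-limited at any time.
import requests

def first_suggestions(prefix: str, n: int = 2) -> list[str]:
    """Return the first n autocomplete suggestions for a query prefix."""
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": prefix},  # "firefox" yields plain JSON
        timeout=10,
    )
    resp.raise_for_status()
    # The response is a two-element array: [echoed query, [suggestions...]]
    return resp.json()[1][:n]

print(first_suggestions("why do women"))
```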
So I see that some people already violated the spirit
of the exercise by sharing before I asked you to.
(audience member laughs)
But I'm gonna overlook that, okay.
So what I want you to do, right, is just pick
one other person at your table to share it with.
Right, so just one person, right?
So you read your results to them
and they read their results to you.
And I'm gonna give you a minute for that.
And what I also want you to do is think about what it says,
like how does this make you feel?
Let's just put it that way.
So is there anybody in the room willing
to share their results with us?
Okay, yeah I'm gonna just run over.
Excuse me.
- Alright, the two things that came up
for me were why do women wear thongs?
And why do women wear hijabs?
All I put in was why do women, so.
- Okay, thank you.
Anybody else?
Oh, okay.
- So I typed in why do Africans and it came up
why do Africans have yellow eyes?
and why do Africans have big lips?
- Okay, hm.
Okay is there one more person
who'd be willing to -- okay oh yeah.
Okay there we go.
- So for me I only put one, just one.
I was typing in why are Malawians.
The only one that came up was why are Malawians poor?
- Why are they poor?
- Poor, yes.
- Oh wow, okay.
So I do this exercise at the beginning of a talk
sometimes because I think it's important to sort of
set the tone for thinking about information,
and how we get information, how it comes to us,
who makes the decision for how it comes to us.
So Google will tell us that they're trying
to index the world's information, right?
They'll give you all kinds of reasons about why
the algorithm does certain things, but I think
it's important to frame this as an ideological decision.
As a design decision.
And so I've got a couple of examples that
will illustrate this a little more deeply.
So I don't know how many people have
Google Homes or Alexas in their house.
How many people got one for Christmas, anything like that?
But one of the things that happened recently,
this is about four or five months old, is that
Google Home, somebody decided to ask it
if Obama was planning a coup.
Okay.
Now I'm happy to provide you with references,
but this is actually what happened.
I mean it won't happen today.
Right, you can't go home today and do it.
This is actually what happened at one point if you asked
Google Home if Obama was planning a coup.
- [British Man] Is Obama planning a coup?
- [Google Home] According to Secrets of the Fed,
according to details exposed in
Western Center for Journalism's exclusive video,
not only could Obama be in bed with communist Chinese,
but Obama may in fact be planning a communist coup d'etat
at the end of his term in 2016.
- So I apologize for the poor audio, okay, but what it said:
according to details exposed in
Western Center for Journalism's exclusive video,
not only could Obama be in bed with communist Chinese,
but Obama may in fact be planning a communist coup d'etat
at the end of his term in 2016.
(audience laughter)
Okay, this is what Google Home says to you, right?
So again, it's important to note why they do this.
Right so Google, one of the things that Google is trying
to do with Google Home is make sure
that they can produce for you an answer.
And so when questions like this come up,
it's hard to index things like this because
not a lot of people are looking for that answer,
and so it reaches for whatever answer it can find, right?
And unlike when you're looking at a webpage, you can't scroll through the results.
Right?
You can't do that with voice-activated things,
nor do they want you to; they want to give you the answer.
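(To make that one-answer design pressure concrete, here's a purely schematic sketch, invented for this transcript and not Google's actual ranking code: a results page can surface many scored candidates, while a voice interface returns only the top one, however weak the match.)

```python
# A purely schematic sketch (my invention, not Google's actual code) of
# the "always return exactly one answer" failure mode. The titles and
# scores below are made up for illustration.
results = [
    ("Secrets of the Fed: Obama coup video", 0.11),  # low-confidence junk
    ("Fact check: no evidence of a coup plan", 0.09),
]

def web_page(ranked):
    # The user sees the whole list and can scroll, compare, and doubt.
    return ranked

def voice_answer(ranked):
    # One answer is spoken regardless of how weak the top match is.
    best_title, _score = ranked[0]
    return best_title

print(web_page(results))
print(voice_answer(results))
```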
Okay.
So I'll give you another example.
So most of you have probably heard of Dylann Roof
who is the man who killed nine people
in Charleston, South Carolina.
And so by his own account, part of the way he
was radicalized was by his foray down the rabbit hole
of white supremacy through Google.
So here's what he said.
"I kept hearing and seeing Trayvon Martin's name,"
Roof wrote, "and eventually I decided to look him up."
Roof wrote that he read the Wikipedia article about
the shooting and came to the conclusion
that Zimmerman was not at fault.
"But," he continued, "more importantly, this prompted
me to type in the words black on white crime into Google
and I have never been the same since that day."
Okay, so we can move from what might seem
like some kind of innocuous results.
Or maybe not depending on what kind of results people got.
We can move from that into thinking about what happens
when people are looking for certain kinds of information.
Of course we can't say that he wouldn't have committed
that atrocity had he found different kinds
of information, we also can't say that he would have.
And so I think it's really important, and again
I like to always frame this in terms of our students.
It's really important to frame these issues
in terms of how people access information,
how they get it, who gets it to them,
does it come to them, what are the filters
that determine how it comes to them,
and what are the design processes
and ideological decisions that help make that the case?
I very much want to challenge the idea of what Google is,
what it does, and how to think about it.
So it's my assertion Google is an advertising engine,
it's a surveillance engine, it's an ideology engine,
it's not an answer engine, okay.
And what I mean by that is Google's core function,
and you'll see me come back to a point similar
to this often, Google's core function
is not to provide you answers, right?
Google's core function is to surveil you,
extract your data, and sell you stuff.
The way they do that is by providing you answers, okay.
But because their core function,
and you may have heard this before,
you are not actually Google's customer, right?
You are their product.
Because their core function is not what we think it is,
there are some very important things that we need to know
about how it works and why it works that way when we use it,
and certainly when we have students use it.
And so one of the ways I talk about this
in my scholarship, I talk about it with my students,
is a term that I use called digital redlining.
But in order to talk about that first I wanna
just give a brief history of what redlining is
and what it has meant in this country,
and then we can kinda jump forward and talk
about what it means digitally or
how those practices are reasserted.
So redlining is the practice of denying
or limiting financial services to certain neighborhoods
based on racial or ethnic composition without regard
to the residents' qualifications or creditworthiness.
The term redlining refers to the practice
of using a red line on a map to delineate the area
where financial institutions would not invest.
So I'm from Detroit.
This is a Home Owners' Loan Corporation (HOLC)
map of Detroit in 1940, okay.
And so unfortunately you can't see in the kind
of granular detail I'd like you to,
but the red portions are marked as hazardous, right?
The black dots on the map identify the density
of population of black folks, okay.
The green areas, which are the suburbs or what
came to be the suburbs, are areas
where loans were allotted, right?
If we think about how historically in America, right?
Or pre-crash anyway for the last 70 years, the way
generational wealth was built was through home ownership.
So a lot of people don't know this, right?
But the way generational wealth was built
was through home ownership, and so federally mandated
policy about who could get loans,
where you could live,
it has some pretty long lasting effects.
So again, I'm gonna elaborate on the Detroit thing.
So I include this slide all the time because
sometimes people are still -- I don't know
how many people in here are familiar with Eminem.
I would assume most of you, right?
Okay.
But for the non-initiated, right?
How many people have heard of 8 Mile, right?
I grew up actually not very far from Eight Mile.
This is what it looks like now, right?
Or part of what it looks like now.
But Eight Mile, for a long time, was understood
as the boundary between Detroit and the suburbs, right?
It's the line that says here are black people
and here are everyone else.
And so here's an example of this.
This wall actually still stands in Detroit
along Eight Mile, it's called the Birwood Wall, right?
So it's a six and a half mile wall that runs
along Eight Mile, and it's also been called
Detroit's Berlin Wall or Detroit's Wailing Wall.
It's not that this wall is going to prevent
someone from crossing from one side to the other, right?
It's not much of an obstacle.
I'm gonna show you again, a scale picture to show you
just how ineffective it would be at keeping people out,
but a developer put this up to say here is the line
past which no black people are allowed, right?
Now again, this wall is still there.
I included a HOLC map of Dallas.
Maybe it would be clearer to people
familiar with that area instead.
And again, one of the things about growing up in Detroit
is even though many of these policies are 60, 70, 80
years old, the effects of it are still very, very visible.
So there are parts of Detroit where
you can drive down the street.
I lived in an area called Grosse Pointe.
And there are parts of Detroit where
different sides of the street
are actually different cities.
And you can drive down the street in Grosse Pointe
and there are multi-million-dollar homes
and pristine roads and lamp posts with flower planters.
And literally the other side of the road
is potholes and it's dilapidated
and there's empty storefronts, right?
And that division is so clear even today
in a lot of areas of the city.
I don't know Dallas, so I can't really say
if that is true for Dallas as well,
but I know that there are a lot of areas in America
for which those things are still true.
For which we can still see the lasting effects of redlining.
Another thing to think about when we think about redlining
is what's called racially restrictive covenants.
Those were deeds or legal agreements
that said who could live where, right?
And depending on how old your house is,
you might still be able to see that deed.
I mean, there are people who live in houses
that technically, legally, they're not
supposed to be living in.
In doing my research for this talk,
I looked up some of the information on Oklahoma.
There's a pretty landmark Oklahoma Supreme Court case
in 1942 that voided an African American's purchase of
property that was restricted by a racial covenant.
It charged him for all court costs and attorney's fees,
including those incurred by the white seller.
So essentially a white man sold his property
to an African American, the housing association sued,
it went to The Supreme Court, he lost,
and he had to pay the court costs, give the property back,
and didn't get his money back.
So everybody here came for a tech talk, right?
So what does this have to do with technology?
I think it has a lot to do with it
because I think that the practices that we
can see in redlining, I think there are a lot of ways
that those are reasserted or reaffirmed,
again made real by digital practices.
And so the term I use for that is digital redlining.
Enforcing class boundaries and discriminating
against specific groups through technology policy,
practice, pedagogy, or investment decisions.
So I'm gonna give you a couple of examples of that.
So here's I think a really important one.
It's one of the most egregious
and so I'm gonna spend a little bit of time on it.
So I think everybody in here is probably
familiar with Facebook.
And so Facebook has a thing that they call ethnic affinity.
I don't use Facebook.
I've been Facebook free for over a year.
But Facebook doesn't let you identify your ethnicity.
There is no box for that.
However on the back end, Facebook very much defines
for themselves and for the people advertising to you,
who you are or who they think you are.
So Facebook doesn't let me say I'm black,
but Facebook has a dossier on me
that probably says I'm black.
And they call that ethnic affinity, okay.
So one of the interesting things about Facebook,
quote-unquote interesting, is targeted advertising.
And again so it's important to remember Facebook's
core function is to track you and sell you stuff.
Anything else it does is kind of beside the point.
So with Facebook's targeted advertising,
let's say I wanted to sell hair care products.
It lets me say I want black people to see this ad,
or whatever ethnic group you wanna imagine,
or all other kinds of categories.
Which is fine if I'm selling hair care products,
but let's say I have an apartment to rent.
Facebook lets me say I don't want
black people to see this ad.
So we can start to see what that would mean in terms of...
I mean it's a clear violation of the Fair Housing Act.
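(Schematically, the mechanism looks something like the toy model below. It's invented for illustration, not Facebook's actual ad API; the field names and categories are made up.)

```python
# A toy model (mine, not Facebook's actual ad API) of why exclusion
# targeting turns an ad platform into a redlining tool: the platform
# infers an "ethnic affinity" the user never declared, and an advertiser
# can filter on it in either direction.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    inferred_affinity: str  # assigned by the platform, not the user

@dataclass
class Campaign:
    product: str
    exclude_affinities: set[str] = field(default_factory=set)

def sees_ad(user: User, campaign: Campaign) -> bool:
    return user.inferred_affinity not in campaign.exclude_affinities

housing_ad = Campaign("apartment for rent", exclude_affinities={"African American"})
print(sees_ad(User("A", "African American"), housing_ad))  # False: never shown
print(sees_ad(User("B", "White"), housing_ad))             # True
# The excluded user has no way to know an ad was withheld; the
# discrimination is invisible to the person it harms.
```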
Facebook got caught doing it and they did sort of
the PR tour and said they would stop doing it
but ProPublica, which is a nonprofit journalism outfit,
is the one that uncovered this story, and most recently
they found, I think this was only about a month ago maybe,
that Facebook is actually still doing this.
And not only in categories like that.
So, for instance, one of the other ways
to think about this is that Facebook has very much,
and many other platforms, have very much
been gamed by white supremacists.
Facebook, up until very recently, would let people
target individuals who identified as "Jew haters."
Okay.
So this is not specific to blackness, right?
There's all kinds of nefarious ways
that this platform is set up just to sell people ads.
So there's an author I really like
named Tressie McMillan Cottom
and she talks a little bit about this.
She had her Facebook account suspended
for not using her real name.
And she's got an essay that is called
Digital Redlining After Trump, and she says,
"Being othered on Facebook increasingly means
being relegated to unfavorable information schemes
that shape the quality of your life."
And so I joke about not being on Facebook.
I have the ability to not be on Facebook.
I don't have family abroad.
It's not tied to my job.
So I'm able to not use it, but there are many people
for whom that's not an option, and so then
we have to think about the ways that Facebook targets people,
or limits how information comes to people,
and limits people's opportunities.
Not only could you do that if you had, say,
an apartment to rent, you could do that
if you were looking to hire someone, right?
And part of the problem with this is that it's invisible.
So anytime before Facebook,
if someone had an apartment to rent or a job
and they were looking to hire someone, right?
And they discriminated against protected classes,
there are some pretty obvious ways to suss that out, right?
Like you can send in a black couple and the person
who's selling the house or renting the apartment
will say sorry, you know, it's been rented.
And then 10 minutes later you send in a white couple,
and he rents it to them
and it's pretty obviously discrimination, okay.
But with something like Facebook,
people don't even know what they're not seeing.
So there's no way for someone to know
that they're not being served an ad
because they are a particular ethnicity, for instance.
So its invisibility is part of what makes it so pernicious.
There's a couple other examples I wanna use and then
I'll go and talk about how this applies to teaching.
So I don't know, do people know
what a stingray is, or a cell-site simulator?
No, okay.
So some of the work I do is about police surveillance.
So a cell-site simulator or a stingray is basically
military technology that has been used in war,
but that's now used in domestic settings.
So everybody in here, almost everybody probably,
has a device that's constantly pinging
or connecting with a cell phone tower.
Well a cell-site simulator is a portable device
that acts like that cell phone tower
and it forces your device to connect to it
instead of AT&T or T-Mobile or whatever.
And it sucks up all the data from that device,
and some of them can even record conversations,
but mostly it sucks up the metadata and things like that.
It's used, say, during protests.
So say there's a Black Lives Matter protest.
A popular name for these devices is a stingray.
There might be a van somewhere with a stingray
that's soaking up everyone's data at the protest.
So one of the interesting things about this
is that there's no way for it
to tell who the suspects are.
It just sucks up everyone who connects to it, right?
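(Here's a toy model of that indiscriminateness, purely illustrative; real cellular attachment protocols are far more involved.)

```python
# A toy model (illustrative only) of why a cell-site simulator is
# indiscriminate: handsets attach to whichever "tower" advertises the
# strongest signal, so a spoofed tower collects identifiers from every
# phone in range, suspects and bystanders alike.
towers = {
    "carrier tower": 0.6,
    "stingray in a parked van": 0.9,  # broadcasts a stronger signal
}

stingray_log: list[str] = []

def attach(phone_id: str) -> str:
    tower = max(towers, key=towers.get)  # phones pick the strongest signal
    if tower == "stingray in a parked van":
        stingray_log.append(phone_id)  # identifier captured, no warrant scoping
    return tower

for phone in ["organizer", "journalist", "random bystander"]:
    attach(phone)
print(stingray_log)  # every phone in range ends up in the log
```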
So here's a map of Baltimore stingray surveillance, okay.
So a couple things.
The darker areas on the map are places with a
higher concentration of black folks, of African Americans.
And the pink dots are instances
of stingray use, of cell-site simulation.
I'll give you one other example.
So Amazon has what's called same day delivery.
So there's an algorithm that Amazon's developed
to say who gets same day delivery and who doesn't.
This is a map of the greater Boston area.
So the dark blue areas are the places
that get same day delivery according to Amazon.
That middle part, Roxbury, is the area
that does not get same day delivery by Amazon.
By the direction of this talk, you can probably tell
that I'm getting ready to say that Roxbury
is where a lot of black people live in Boston.
Okay.
So it's important to know.
And people are invested in the question of intentionality.
Okay.
So I doubt there's a coder, or a group of coders
at Amazon thinking we're going
to deny black people same day delivery.
That's not exactly how it works.
There may be, right?
James Damore or whatever.
(chuckles)
That's not exactly how it works.
But what has happened is there's not someone
at Amazon saying we're not going to do this.
There's not someone at Amazon saying we have
to make sure that we don't do this.
And so one of the important things,
when we think about tech and intentionality,
is that if you are not at least attempting
to design bias out, then by nature you are designing it in.
Right, because the people who create this technology
are so often very similar in their demographics,
there's no one saying, hey, let's think about this;
there's no one who looked at that
and said, wow, we can't do this,
alright, we should figure out a better way to do this.
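(A hypothetical sketch of how that happens without anyone intending it, with invented ZIP codes and numbers, not Amazon's actual algorithm: an eligibility rule keyed only on historical order density never mentions race, yet the proxy quietly traces the old red lines.)

```python
# A hypothetical sketch (not Amazon's actual algorithm; all numbers and
# ZIP codes are invented) of how bias gets designed in by default. The
# rule never mentions race, but the proxy variable, historical order
# density, carries the history of segregation with it.
orders_per_1000_residents = {
    "02116": 480,  # illustrative downtown ZIP
    "02119": 140,  # illustrative Roxbury-like ZIP
}

ELIGIBILITY_THRESHOLD = 200  # tuned purely for logistics cost

def same_day_eligible(zip_code: str) -> bool:
    # No one wrote "exclude this neighborhood";
    # no one checked whether it happens anyway.
    return orders_per_1000_residents.get(zip_code, 0) >= ELIGIBILITY_THRESHOLD

for z in orders_per_1000_residents:
    print(z, same_day_eligible(z))
```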
So what does this mean for students, right?
So I took kind of a roundabout way,
but what does this mean for students?
So again I teach at a community college
that's about 30 miles outside of Detroit.
And what got me to start thinking
about so much of this stuff is that our campus
was pretty heavily filtering the internet on campus.
And so the example I use a lot is
what used to be called revenge porn
and is now called nonconsensual intimate images.
It's basically when consenting adults make a recording
of some intimate act, and then one of those parties
decides to publicly post that recording.
It became known as revenge porn for a while.
So I had my students doing work on that
and they would go to the computers
and look up revenge porn and they would say
Professor Gilliard, nobody's written anything on this.
There's no scholarship on it.
And I knew that wasn't true 'cause I'd read it, right?
I could easily say well you should
go read Olena Zaid or something like that.
But then what I found out is that the filters on campus
were preventing them from getting information, right?
And so an example is someone
was gonna look up an interview on Playboy.
Now if you want, I don't really want to,
but if you want we could have some discussions
about whether or not students on campus
should be allowed to access Playboy.
But they were actually really looking for an article.
(laughs)
But they couldn't get to it, and this is the screen
that our I.T. folks threw up every time
they were blocking something.
And so it says, "It's been identified
in a national security database as malicious
or untrustworthy, or it's not in conformance
with the college acceptable use of information tech policy."
So here's what happens, right?
And so one of the things is that a lot of people
actually don't know very well how the web works.
And especially a lot of times we're asking students
to research topics in which they are not experts.
And so if they run up against a wall, a lot of times
they'd think well there's nothing there, right?
And even faculty, when they would see this page,
would just think oh well there's viruses
on this site or something like that.
I shouldn't be here.
But this was having some real unfortunate effects,
like academic freedom effects, ways that my students
couldn't do the work that we were trying to do in class
because the web was filtered for them.
Another way to think about this is journal access.
So again, a lot of people don't know this
but journal access is dependent upon how much money
your institution has, and so I spend
a lot of time with my students talking about ways
to circumvent what I think is
a pretty inherently unjust system.
So I'll give you a specific example.
My wife teaches at University of Michigan
and when she got the job there
I was super excited for a lot of reasons.
Part of it is I was gonna get better journal access.
(audience laughter)
Right?
If I'm honest, okay.
And so I spend a lot of time teaching my students
ways to circumvent this process
because I teach at a community college
and there's not a lot of money.
And so the kinds of information my students have access to
is very different from even what some of their colleagues
can get who go to University of Michigan
or Michigan State University or Central,
or anything like that.
So again there are these ways that technological decisions
about who gets what information,
who has the rights to information,
what information people can afford, right?
And again, I want to emphasize
that these are not natural or neutral.
These are decisions that are made, right?
I mean the entire structure of journal access
is such that, in a lot of cases, if you live
in a particular state, we were talking about this earlier,
you're actually paying for an article twice, right?
And I include these stats because here's why this matters.
10% of Americans own a smartphone
but do not have broadband at home,
so they're what's called smartphone-dependent.
Now this is important to think about
because, I mean almost everybody has some kind
of internet connected device.
But what they use it for, how they use it,
how important it is to their life is very different
depending on who that person is, right?
If I slip and fall right now
and break my phone it's a minor annoyance.
I go get a new one, right?
If that happens to students, I mean how many students
have you seen with phones with horribly cracked screens
that they're still trying to use, right?
Like this is not an uncommon thing I think
probably even at a university like this.
But also that phone, to many of them, is a lifeline, right?
It's how they determine their work schedule,
it's how they keep in contact with friends and family,
it's how they do their homework sometimes, right?
And they might share it with family members.
So there is this assumption that everybody has the internet,
which kind of leads to the second thing.
23% of Americans do not have broadband access at home.
So when we make assumptions about who has internet
and what kind of access they have, and then we develop
pedagogy, or assignments, or syllabi,
or any kind of practices based on these assumptions,
we're creating a really unfortunate system, right?
A digital redline.
And this is a really important thing
to think about because we all, again I think there's
a prevailing notion that everyone's got the internet.
And I am here to tell you that even on a campus
like this that this is not true, right?
That many people have it when they are here
but there are often lots of other instances
where they don't have it.
Where they don't have the kinds
of access that we take for granted.
And so people often ask me, well,
isn't this just the digital divide?
Or how would you differentiate this from the digital divide?
And how I encourage people to think about it
is when people talk about the digital divide
they often talk about it in terms of a natural disaster.
Right, like we gotta fix the digital divide.
We gotta close the divide.
But by framing it in terms of digital redlining,
what I hope to do is get people to think about
what are the decisions that we make
that reinforce the divide?
What are the things that we do or the ways that we think
about privacy or access or information
that reinforce that thing, right?
So the contrasting thing I'd say is
that digital redlining is a verb.
So I'm gonna come back to this.
This is the Birwood Wall again, right?
So as you can see, the size of it
isn't really keeping anyone out, right?
It's the symbolic nature of it.
And so I think that I'm just gonna read this part.
The technologies we use and the tech decisions we make:
surveillance, tracking, predictive analytics.
I think those mean different things for different people.
So, you know, you often will hear kind of like the
"I have nothing to hide" argument, you know.
And there's lots of ways in which that's problematic
but I think we need to think about
access to information, about surveillance, about privacy.
Those mean different things to different people.
It's important to think about who our students are,
what kinds of access they have,
why we make decisions that we make and operate from there.
So a lot of times people ask me
what can be done or like how to address this.
Yeah, so that's it.
So I have a couple answers but I'm gonna sort of
take a roundabout way to get to them.
So I think it's important to frame
discussions of technology in two ways.
And so one of the people that I think has been
really important in my way of
thinking about it is Shoshana Zuboff.
And she talks about what's called surveillance capitalism.
Okay.
So she has three laws.
Everything that can be automated will be automated.
Everything that can be informated will be informated.
And every digital application
that can be used for surveillance and control
will be used for surveillance and control.
So what that means is that the sort of current
way that the web works, again, is based on the idea
that we should surveil people, take their data,
turn it into money, and figure out how
to nudge them into doing specific things, okay.
She said surveillance capitalism is the monetization
of free behavioral data acquired through surveillance
and sold on to entities with
an interest in your future behavior.
The other way to think about this
that I think is really important
is thinking about things as platforms, right?
And so an example I like to use is,
do people know what the internet of things is?
Okay, so the internet of things basically means
a physical device that's connected
to the internet, that typically has not been.
Your refrigerator, your toothbrush, your thermostat,
toilet, trash can, vibrator, right?
These are all products that people make
that are connected to the internet, okay.
So Srnicek talks about what are called platforms.
So platforms are things like Google, and Facebook,
and Amazon, and Instagram, like Whatsapp, right?
They're digital structures that enable two or more --
By the way a learning management system
can also be understood as a platform.
"Digital structures that enable two or more groups
to interact, a platform provides the basic infrastructure
to mediate between two different groups.
While often presenting themselves as empty spaces
for others to interact on, they in fact embody a politics."
So what does that mean?
I don't know if people here are Twitter users, right?
But one of the things that happened is that Twitter
went from a star to a heart, right?
People got really upset, right?
Because to heart something, symbolically it meant
something very much different than to star something, right?
Or with Facebook before they initiated the emoji reactions,
your choices were just to like, right?
That's what you could do, right?
In an LMS, right?
Think about the ways students are bound by that system
that was designed with certain intentions,
that dictates how people can teach, how people can learn,
and so it embodies an idea about what those things are,
but poses itself as natural, right?
Again, to go back to the Google
auto complete example from the beginning.
Google tells us that this is the algorithm, right?
This is the tech, right?
This is natural.
This is normal, right?
This is neutral sometimes they say, okay.
But it's important to recognize that they are not.
They are not neutral.
They're the result of very specific ideologies and choices.
So in ed tech, one of the ways we can think about this
is when people say we want Netflix for education,
or we want Uber for education.
When people tell us, something I hear all the time,
that with enough data, we can solve
whatever the problems of education are, right?
If we surveil people and suck up all their data,
we can solve some problems, right?
I'm here to challenge that.
But one part of challenging that is thinking about,
again, what it means for a platform to exist.
That in order for those things to exist,
they necessarily create a way of existing
that wants to be seen as natural,
but again, is very much a decision process.
So (laughs) I started with a game.
We're not quite at the end, but I want to play another game.
So I want people to tell me.
I have some scenarios and some are true and some are false.
So they're based on platforms, right?
Oh there's one thing I forgot, okay.
So a lot of times people say that,
well I just don't use Facebook or I just won't use Google.
Okay.
And one of the things to remember about platforms
is that they are extractive.
So is there anybody in here who
has never had a Facebook account?
Right on, okay.
I hate to put you on the spot.
Does Facebook have a file on you?
- [Audience Member] The way the world is today,
yeah certainly.
- Yes they do, okay.
So you cannot, you actually cannot
opt out of Facebook or Google.
You cannot, right?
Facebook has an extensive set of information
on everybody in this room, okay.
And so does Google, right?
So when I say that these platforms are extractive,
what I mean by that is we actually don't have a choice.
You know, given our laws, and again some
of the design choices and things like that,
you don't even own the rights to your own face.
(chuckles)
We actually don't have a choice about the extent
to which we participate in some of these systems.
Facebook buys reams of data about people
from data brokers and things like that, right?
So if you've ever received and opened an email
from anybody who uses Gmail,
you're part of Gmail's ecosystem.
Just by walking around, right?
License plate readers are following your car
and facial recognition is looking at you.
I mean, probably a lot of you here have your Bluetooth
turned on, so the college knows where you are, right?
Okay.
And so these things are constantly sucking up
information from us, whether we offer it or not, right?
So I have a couple of examples and I want you to tell me
if you think these are true or false.
Some are true and some are false.
Amazon remotely deleted George Orwell's
books from all Kindles.
- [Male Audience Member] True, very true.
- Gah, that one is true.
Yes, it was a copyright dispute.
Amazon, without permission from users,
remotely deleted all of Orwell's works, right?
So you bought 1984.
Yeah, right.
You bought 1984 and Amazon had a copyright dispute
and so they digitally yanked it from everybody.
Uber used their data to calculate which users
were having one-night stands.
(audience laughter)
- This is true, you're good, you're good.
This is true, right?
Based on where you went, what time you went there,
whether it was a Friday or a Saturday,
like if it's a place you had ever been before,
like how early you left in the morning,
like Uber used that to determine
who was having one-night stands.
Ancestry.com has bought dozens of graveyards
in order to extract and monetize the DNA of corpses.
(audience murmurs)
What do you say, Mark?
- [Mark] I said I hope that's not true.
- That one's false.
(speaker and audience laughter)
Yeah.
A high-tech fashion company sells luxury items
that are intentionally one-use.
For instance, a Louis Vuitton bag that ink capsules
ruin after GPS says that it's been carried one time.
- [Female Audience Member] No.
- Anybody?
It's false!
It is false, yeah, but some people were wondering, right?
A college president advocated using predictive analytics
to determine which students might fail --
- [Multiple Audience Members] Yes.
(laughter)
- Right?
I didn't even get to finish.
Right.
This is true.
This is true, right?
Yeah, so y'all know about that, right?
Okay.
Yeah.
They fired him, right?
But I mean he's basically guilty of saying out loud
what a lot of people were thinking.
And so, this is my roundabout way to getting to,
sort of the what now or what do we do.
And so I think the commonality in these examples
is that they're missing what I think
are some essential elements.
They don't account for agency.
They don't account for privacy.
They don't account for equity.
They don't account for fairness.
They don't account for consent.
And every day people in here make decisions.
So to bring it back to students.
Every day in here people make decisions
about their students, right?
And we all have different roles and different jobs.
Some people are invested in retention.
Some people are invested in keeping their own job.
Some people are invested in trying
to get students to just learn some material.
And to the extent that we use technology to help do
whatever that job is, I think the answer to what we do,
how we address digital redlining,
how we address issues of equity and fairness,
is that we have to foreground those things, right?
So that, you know, Adam asked me last night so like
what would I tell a whole bunch of privileged people.
And I don't think he was talking about you folks.
(chuckles)
But I said that the first thing I would say
is that we have to foreground the notion of consent, right?
That the model that we use for so much of this stuff
is that by existing, we get to take people's data, right?
We get to make decisions about people
just by the nature of us being
the stewards of it or us having access to it.
And by foregrounding ideas about privacy,
and agency, and consent, and fairness,
as we make every decision that we make about tech,
I think it's at least a start to
changing the way that these things work.
That's it.
(applause)