Incite Presents Laura Boykin and Malkia Devich-Cyril in Conversation with Peter Rubin
Released on 11/09/2019
I am thrilled to be here
with our guests for the next conversation.
I'm gonna do a little bit of inadequate introduction
for each of you and then tease out some more details
about you both and your work as we go.
So across the stage from me, Laura Boykin,
who is holding the little boxes
we're gonna be talking about, is a computational biologist
who uses genomics and supercomputing
to help farmers in sub-Saharan Africa control whiteflies,
which spread viruses that have caused devastation
of local cassava crops.
She's currently a senior research fellow
in the School of Molecular Sciences
at the University of Western Australia
and is also a Senior TED Fellow.
Malkia Devich-Cyril, directly to my right,
is the co-founder and co-director of MediaJustice,
an organization dedicated to advancing racial justice,
rights, and dignity in a digital age.
For more than two decades,
they have organized communities of color
and other underrepresented groups against media bias
and advocated for an open and affordable internet,
the abolition of discriminatory high-tech law enforcement,
and accountable tech platforms and companies,
among other human rights safeguards,
all of which, I think,
fall under our purview here at Wired25.
So, Laura, I'd like to start with you,
in part because I'm never gonna be able
to explain your project as well as you can,
and in part because I'm hoping that the story
will have us feeling good about the power of data
before we come crashing back down to earth.
All right, so personally,
I'm interested in two things.
One is equity and inclusion in science
because truthfully, it's a white man's game,
and we need to change that.
Secondly, I care about smallholder farmers in East Africa,
and we're using these portable devices.
One's a molecular lab,
and one's a little mini supercomputer
that has an Nvidia GPU chip in it,
and we take all of this to the farm,
and we diagnose sick plants in real time
so that the farmers can take action right away.
And a lot of people told us we couldn't do it,
and then we did it, and now we're here
talking about how the data
can empower local communities to make decisions,
and I think the most powerful thing about this
is the data stays with the team in East Africa,
and they have complete control over what they do.
So we're working for smallholder farmers
and doing really high-tech things,
but everything we do is wrapped in social justice.
So that's what we're doing.
Now, Malkia, you were part of
our inaugural Wired25 slate last year,
and you were nominated for that
by someone named Edward Snowden, which leads me to think
that while Laura's work is undoubtedly beneficial,
I suspect you might feel it's the exception to the rule
with regards to how data's being used.
I mean, there's a reason why, for the last 20 years,
I've been working for a future
that makes communities of color
connected, represented, and free,
and one of those big reasons is because we understand
that technology is, not in and of itself,
but in the context of our current political
and economic environment, driving discrimination,
and so working with organizations across the country
and also with scientists and technologists
and folks working on media and tech policy,
but also and primarily working with communities of color
and racial economic justice groups,
we've been working to change some of the rules
and also create some of the rules that will restrain,
dare I say it, will restrain capitalism
and how this technology is driving inequality.
So we've undoubtedly seen, for the past few years,
this conversation happen, or begin,
it might be long overdue, about the way that tech companies
are treating the data of their users.
And to look at science, I think we all thought, Laura,
that science was free of this
because science is empirical truth.
It's numbers, right?
It's the scientific method.
What's wrong about that presumption?
I mean, just science, at its core, is systemically racist
and sexist and everything else -ist.
We don't have the right voices in the game, right?
But how does that affect the data,
I guess, is my question?
So for example, last week, there was a study
that came out in the journal Nature that said
the origin of modern humans is Botswana.
A group of scientists flew there, collected the data,
flew back to Australia, published their Nature paper.
Yippee, and the local communities have no access
to their own data, and the scientists in the region
have no access to that information.
How is that okay?
Scientists do that all the time.
We're asleep at the wheel.
We are asleep at the wheel.
We are doing things that are not equitable.
It's what we're doing.
So this goes back, then, to ownership of the data.
Is that right?
Funders should not fund that.
Nature should not publish that.
What's a corrective?
How do you rectify that?
From the beginning, the local community
should have been included in the design of the study,
and that would have never happened.
And this happened with your work.
Not mine.
No, no, no, but the way that your studies
and actions in East Africa were set up
got around that by building itself in a way.
Tell us how you constructed that work
to alleviate some of the problems.
So it's interesting because I was invited onto the team
in this fight to save cassava
by a group of East African scientists
who saw that I could do computational work,
and they said, Hey, would you like to come
and learn more about cassava?
So I think I was invited.
I didn't invite myself, you know what I mean?
I'm an option, I'm not a requirement.
So from the very beginning,
our studies are inclusive in a true partnership.
We make sure all of the data is owned by the communities.
It's empowerment of the communities.
That's what we care the most about.
And that other study I mentioned,
they could have done that, as well, but it's harder.
It's harder to do it the right way.
Malkia, another one of the Wired25 cohort from last year,
Glen Weyl, espouses a number of ideas
under what he terms radical markets.
One of them is that data is labor,
meaning your activity on social media,
whether it's Tracy on Instagram or anybody on Facebook,
if you are tagging people in photos
or conveying your emotional state,
that becomes data to help train the AI at those companies,
and that if you are doing that,
then you should be compensated in some regard.
Does that idea, if we start treating data as labor,
does that get to the root of the problem?
I would say no.
To me, I look at data as more like
a natural resource, right?
And I think my skin, my DNA,
where I've been, my location,
all of those things that make me, not who I am,
but comprise what I'm doing and how I live my life,
it's me, right?
It's not just my labor.
That is too small a container to think about the data in.
What we're talking about is,
we're talking about whether people are allowed
to remain free or are incarcerated.
We're talking about how people are paid,
whether they have jobs, right?
This is not just about a one-to-one compensation issue.
You feeling me?
This is about core freedom issues.
This is about companies who know every single thing
about us, and we know nothing about them.
This is about governments who use data to be authoritarian,
to make laws that oppress people,
and people, citizens and otherwise,
who have no power to resist that use of data.
This is ultimately and fundamentally about control,
and not only who controls your democracy,
but who controls your daily life.
So, no, I wouldn't make it a one-to-one equation like that.
So given that these dangers are coming
from the inside out and from the top down,
is that solution necessarily top-down, as well?
Is a regulatory measure the only thing
that is gonna get this in check?
I think there are a few things.
I think projects like Laura's are important
because one of the things that happens
is when communities identify the purpose for the data,
they identify the scope of its use,
how the data's gonna be stored,
they identify the parameters around how it's going to be used,
just everything about it, then it can be used to empower.
It can be used, actually, not only to empower,
to democratize whatever the situation is, right?
Whatever we're talking about, education, food, whatever.
But also, it can be used to resist, right?
That's one way, but the challenge
with that process is scale.
I mean, that's always the challenge, right?
So that's why you also need laws.
And the problem that we're dealing with today,
and you already know this, right?
This is not something new. We're not new to this, right?
We know that regulatory frameworks
for technology right now are weak.
We know that.
That's weak globally, that's weak nationally,
and so we have technology that is completely unrestrained.
We have few laws, and then we have very little power
to enforce the laws that we have.
So in my opinion, number one,
we need laws at the local and state level
because I don't have much faith right now, at least,
in anything federal, and I don't know that that will change
no matter who's in office.
I think that local, municipal governments
and state governments need to step forward.
I think at the international level,
we need some rules that constrain.
These are international companies, you know.
They don't simply affect one place, you know.
When I think about incarceration and the carceral state
being a landscape for technology,
that's not only here in the United States.
We're talking about borrowing technologies
from militaries all over the world, right?
So this is a global problem.
It needs a global solution,
and I think rules and laws are an important place to start.
Is there anything that's happening,
either at the municipal or state level
or, I can only imagine the answer to this one,
anything happening cooperatively between nations,
that gives you a sense that this is possible?
Is anybody doing this in a way
that gives you any kind of confidence?
Here are some of the challenges, right?
And you wanna talk about solutions.
Okay, I'm gonna do that.
But also, I'm gonna throw in some challenges, you know.
So at the local level,
you know there are municipal ordinances
happening all over the country
that allow communities to say they don't want
certain types of surveillance technologies to be used.
That said, what if you're in a community
that does want it?
So just simply giving the power over to a local community
to determine whether or not they want it,
it's a step, but it's not a sufficient step.
There are rules in California, the CCPA,
the state privacy law, and companies are finding
all kinds of ways to subvert that law, right,
and to sidestep it.
So there are things that are happening
that are very important,
but each one of those things has a limit.
You know, when we thought,
I won't say we 'cause I didn't think that,
but when people thought that body cameras
were gonna be this incredible saving grace
for police brutality, some among us,
including myself, were saying, Wait a minute.
You're not considering all that data
that's gonna be collected.
Where's it gonna be stored?
That camera's facing me.
It's not facing the police officer.
So how is that gonna create accountability?
My point is that some of the things
we think will save us actually hurt us.
Now, Laura, you've been in Australia, New Zealand,
Africa, for the better part of a decade.
You are coming back here for a little while
and then relocating again to East Africa
to continue with your work there.
What are you hearing in those locales,
whether in Africa or Australia,
from colleagues and the farmers that you're working with?
Is there any sense of the conversation
that is happening in the states
that's percolating to them,
or are there just other concerns that they have?
So I think that it is a privilege to worry about this.
We're sitting in this room worrying about it,
but when we're out on the farms,
farmers, these amazing women, have taught me
that there are two things that matter.
Your health and your food, and everything else is a bonus.
That's what they're worrying about.
A very large proportion of the population on this planet
is not thinking about this.
And not to say it shouldn't be at the forefront, right.
So for me, I think that all of the work that Malkia's doing,
all of that can be transferred over into East Africa
because Africa is the past, the present, and the future.
It really is, in everything, and so these laws
and this governance you're talking about,
will need to be applied there, as well.
But at the current state of affairs,
we're just trying to make sure that farmers aren't hungry.
And we're trying to use the latest technology
to make that happen by getting the data closer.
So they're not worried about a lot of these things
that we're talking about.
It's a privilege to worry about those things.
Malkia, from 2016 to now, and then 2019 to 2022,
do you see this as being a curve or a line?
Meaning, is there significant demonstrable change
to be affected in these next three years,
or is it a longer, more gradual battle?
Look, that's a great question.
I wish I had a clear answer, but I will say this.
Groups like MediaJustice,
groups like Media Mobilizing Project,
and a whole bunch of civil rights
and human rights organizations across the country,
came together, pushed Facebook to adopt
this civil rights audit.
You know, we won that civil rights audit.
We thought it was gonna be this amazing step forward.
They have to monitor themselves.
They have to stop making these kinds of violations
around censorship, around surveillance, whatever.
And here we are, you know,
and we are in a presidential election cycle,
and they announced that they're gonna let politicians lie.
It's five steps forward, three steps back, right?
So we have the means to do
corporate accountability campaigns like that one,
and I do believe that it makes a difference
because even though they announced that rule,
I don't know if you saw today or yesterday or something,
they now announced a modification to that rule.
That is because human rights and civil rights organizations
are in the background all the time
pressuring, pressuring, pressuring, right?
This comes out as if this is a decision,
a Facebook decision, but it's not a Facebook decision.
There are people who are representing
hundreds of millions of other people
who are in those rooms arguing, arguing.
I know, I've been one of those people.
And then you win something, and then you go back,
and you win some more.
So in the corporate accountability front,
I think that's happening, right?
Color of Change is behind that.
Leadership Conference is behind that.
MediaJustice is behind that.
There are groups, you know what I'm saying,
that are working to ensure that voter suppression stuff
is dealt with on these platforms.
There are a million different arenas in which we are moving.
There are, at the same time,
folks working on electronic monitoring, right,
and making sure that that doesn't become
the new form of incarceration,
so that just because now you are incarcerated in your home,
you know, you're still in jail.
I mean, you just now have to pay for it.
So there are groups all over this country
organizing around use of location data, use of StingRay.
I could go on and on and on, right?
The movement is there.
The question isn't whether or not there are people fighting.
The question is whether or not it's going to be enough.
We are dealing with an incredibly corrupt administration,
but in addition to that incredibly corrupt administration,
we're dealing with a corrupt economic system
that advantages large technology companies
over poor people, and until we deal with that,
we're gonna keep going backwards instead of forwards.
Well, maybe by next year, sorry,
capitalism will be toppled, and we can address it again.
Mic drop, mic drop, mic drop.
Malkia and Laura, thank you so, so much for being here.
Before we get off the stage,
I do wanna say that we'd like to thank Incite
for partnering with us and making
this particularly provocative conversation possible.
They have the space downstairs.
It's kind of incredible.
It's a little bit West World, really.
But check it out, and they've actually got a video
they wanted to show, so after you both.
Thank you so much.
[applause]