[LOU] I think we have time
for a few more. [MAN] Hey, how’s it going? I’m going to censor my name,
but I worked at YouTube on the
Trust and Safety team. And basically a lot of these
technology companies, they inherited the position of
being arbiters of truth, and they really
don’t want to do that. When companies like YouTube were first formed, it was the wild west: anything goes, any kind of comment goes. Say whatever you want. And then over the
past couple of years, we’ve had public pressure,
government pressure,
advertiser pressure. So they’re in a tough position
where it’s like: How do you please everyone?
How do you do the right thing?
Do you become the arbiter of truth? Companies like Facebook have taken
the approach where they want to
create a public panel, like a democratic panel,
that goes over issues like
hate speech and what is hate, what is graphic violence, what is,
you know, how will we determine this? And then instead of having it be
an internal process, they want to make it more
open to inquiry and insight
from different people. So, do you think that’s the
right approach? Outsourcing that to experts in the field versus keeping everything internally and making those decisions
yourself as a company? [NADINE] I agree,
it’s a very, very tough position, because for all of us who are saying,
“You’re doing too much censorship,” I know there’s a huge amount
of pressure saying,
“You’re not doing enough!” And in fairness, the critics that were quoted in the USA Today piece are making both kinds of criticisms. You know, “You’re not taking down enough of the harassment and trolling and doxxing that’s aimed at us, but you are taking down
too much of our speech.” So first of all, I just want to
make the general point that we should not think that
there is a panacea in any enforcer. I don’t care whether it’s internal
to Facebook or external, or what level of government or
what the politics of the government are. We have to acknowledge that these are inherently subjective determinations, and then make a decision. Do we trust somebody else
to make those determinations, or would we rather trust ourselves? In some ways,
it goes back to your point. You’re not forced to read everything, you can just simply say no. Now I realize that
that may be difficult; some people have to be on
social media for business purposes. So I can’t give you a
straightforward answer,
but let me tell you where… and there are many, many people
who are thinking about this and working on it, in government
and out of government.
some procedural protections
which don’t exist enough now. We need to know
what the standards are. Facebook didn’t even begin to tell us what standards they were using until a year ago
because of some leaks. And when those standards change,
we have to know what those are. We should be given some notice
when we are removed or when
our posts are blocked. That’s not done now.
What was the reason?
What are the criteria? There has to be an
opportunity to appeal. And the appeal, I think,
should definitely be to
an independent decision maker. I would also hope for
more of a range of choice, right? So that we would not all be driven
to the same few giant platforms, but some of us might choose
a platform that does advertise: “We are going to heavily err in favor
of taking down more hateful
or pornographic or, pick whatever,
unpopular messages.” But you wouldn’t have
to go to that site, you could go to another one that
had more of an open philosophy. Hey guys, thanks for watching. If you want to support us, check out our website, wetheinternet.tv