From: right-back@invalid.biz (Exclusive Originals)
Subject: What Really Happened When Google Racists Ousted Timnit Gebru
Message-ID: <6d6afed58ed4ff8be099cd298c12a7ed@dizum.com>
Date: Sat, 12 Jun 2021 10:53:37 +0200 (CEST)
Newsgroups: alt.society.liberalism, comp.internet.services.google,
alt.privacy.anon-server, alt.discrimination
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!sewer!news.dizum.net!not-for-mail
Organization: dizum.com - The Internet Problem Provider
X-Abuse: abuse@dizum.com
Injection-Info: sewer.dizum.com - 2001::1/128
 by: Exclusive Originals - Sat, 12 Jun 2021 08:53 UTC

ONE AFTERNOON IN late November of last year, Timnit Gebru was
sitting on the couch in her San Francisco Bay Area home, crying.

Gebru, a researcher at Google, had just clicked out of a last-
minute video meeting with an executive named Megan Kacholia, who
had issued a jarring command. Gebru was the coleader of a group
at the company that studies the social and ethical ramifications
of artificial intelligence, and Kacholia had ordered Gebru to
retract her latest research paper -- or else remove her name from
its list of authors, along with those of several other members
of her team.

The paper in question was, in Gebru's mind, pretty
unobjectionable. It surveyed the known pitfalls of so-called
large language models, a type of AI software -- most famously
exemplified by a system called GPT-3 -- that was stoking excitement
in the tech industry. Google's own version of the technology was
now helping to power the company's search engine. Jeff Dean,
Google's revered head of research, had encouraged Gebru to think
about the approach's possible downsides. The paper had sailed
through the company�s internal review process and had been
submitted to a prominent conference. But Kacholia now said that
a group of product leaders and others inside the company had
deemed the work unacceptable, Gebru recalls. Kacholia was vague
about their objections but gave Gebru a week to act. Her firm
deadline was the day after Thanksgiving.

Gebru's distress turned to anger as that date drew closer and
the situation turned weirder. Kacholia gave Gebru's manager,
Samy Bengio, a document listing the paper�s supposed flaws, but
told him not to send it to Gebru, only to read it to her. On
Thanksgiving Day, Gebru skipped some festivities with her family
to hear Bengio's recital. According to Gebru's recollection and
contemporaneous notes, the document didn't offer specific edits
but complained that the paper handled topics "casually" and
painted too bleak a picture of the new technology. It also
claimed that all of Google's uses of large language models were
"engineered to avoid" the pitfalls that the paper described.

Gebru spent Thanksgiving writing a six-page response, explaining
her perspective on the paper and asking for guidance on how it
might be revised instead of quashed. She titled her reply
"Addressing Feedback from the Ether at Google," because she
still didn't know who had set her Kafkaesque ordeal in motion,
and sent it to Kacholia the next day.

On Saturday, Gebru set out on a preplanned cross-country road
trip. She had reached New Mexico by Monday, when Kacholia
emailed to ask for confirmation that the paper would either be
withdrawn or cleansed of its Google affiliations. Gebru tweeted
a cryptic reproach of "censorship and intimidation" against AI
ethics researchers. Then, on Tuesday, she fired off two emails:
one that sought to end the dispute, and another that escalated
it beyond her wildest imaginings.

The first was addressed to Kacholia and offered her a deal:
Gebru would remove herself from the paper if Google provided an
account of who had reviewed the work and how, and established a
more transparent review process for future research. If those
conditions weren't met, Gebru wrote, she would leave Google once
she'd had time to make sure her team wouldn't be too
destabilized. The second email showed less corporate diplomacy.
Addressed to a listserv for women who worked in Google Brain,
the company's most prominent AI lab and home to Gebru's Ethical
AI team, it accused the company of "silencing marginalized
voices" and dismissed Google's internal diversity programs as a
waste of time.

Relaxing in an Airbnb in Austin, Texas, the following night,
Gebru received a message with a ?? from one of her direct
reports: "You resigned??" In her personal inbox she then found
an email from Kacholia, rejecting Gebru's offer and casting her
out of Google. "We cannot agree as you are requesting," Kacholia
wrote. "The end of your employment should happen faster than
your email reflects." Parts of Gebru's email to the listserv,
she went on, had shown "behavior inconsistent with the
expectations of a Google manager." Gebru tweeted that she had
been fired. Google maintained -- and still does -- that she resigned.


Gebru's tweet lit the fuse on a controversy that quickly
inflamed Google. The company has been dogged in recent years by
accusations from employees that it mistreats women and people of
color, and from lawmakers that it wields unhealthy technological
and economic power. Now Google had expelled a Black woman who
was a prominent advocate for more diversity in tech, and who was
seen as an important internal voice for greater restraint in the
helter-skelter race to develop and deploy AI. One Google
machine-learning researcher who had followed Gebru's writing and
work on diversity felt the news of her departure like a punch to
the gut. "It was like, oh, maybe things aren't going to change
so easily," says the employee, who asked to remain anonymous
because they were not authorized to speak by Google management.

Dean sent out a message urging Googlers to ignore Gebru's call
to disengage from corporate diversity exercises; Gebru's paper
had been subpar, he said, and she and her collaborators had not
followed the proper approval process. In turn, Gebru claimed in
tweets and interviews that she'd been felled by a toxic cocktail
of racism, sexism, and censorship. Sympathy for Gebru's account
grew as the disputed paper circulated like samizdat among AI
researchers, many of whom found it neither controversial nor
particularly remarkable. Thousands of Googlers and outside AI
experts signed a public letter castigating the company.

But Google seemed to double down. Margaret Mitchell, the other
coleader of the Ethical AI team and a prominent researcher in
her own right, was among the hardest hit by Gebru's ouster. The
two had been a professional and emotional tag team, building up
their group -- which was one of several that worked on what Google
called "responsible AI" -- while parrying the sexist and racist
tendencies they saw at large in the company's culture. Confident
that those same forces had played a role in Gebru's downfall,
Mitchell wrote an automated script to retrieve notes she'd kept
in her corporate Gmail account that documented allegedly
discriminatory incidents, according to sources inside Google. On
January 20, Google said Mitchell had triggered an internal
security system and had been suspended. On February 19, she was
fired, with Google stating that it had found "multiple
violations of our code of conduct, as well as of our security
policies, which included exfiltration of confidential,
business-sensitive documents."

Google had now fully decapitated its own Ethical AI research
group. The long, spectacular fallout from that Thanksgiving
ultimatum to Gebru left countless bystanders wondering: Had one
paper really precipitated all of these events?

The story of what actually happened in the lead-up to Gebru's
exit from Google reveals a more tortured and complex backdrop.
It's the tale of a gifted engineer who was swept up in the AI
revolution before she became one of its biggest critics, a
refugee who worked her way to the center of the tech industry
and became determined to reform it. It's also about a
company -- the world's fifth largest -- trying to regain its
equilibrium after four years of scandals, controversies, and
mutinies, but doing so in ways that unbalanced the ship even
further.

Beyond Google, the fate of Timnit Gebru lays bare something even
larger: the tensions inherent in an industry's efforts to
research the downsides of its favorite technology. In
traditional sectors such as chemicals or mining, researchers who
study toxicity or pollution on the corporate dime are viewed
skeptically by independent experts. But in the young realm of
people studying the potential harms of AI, corporate researchers
are central.

Gebru's career mirrored the rapid rise of AI fairness research,
and also some of its paradoxes. Almost as soon as the field
sprang up, it quickly attracted eager support from giants like
Google, which sponsored conferences, handed out grants, and
hired the domain's most prominent experts. Now Gebru's sudden
ejection made her and others wonder if this research, in its
domesticated form, had always been doomed to a short leash. To
researchers, it sent a dangerous message: AI is largely
unregulated and only getting more powerful and ubiquitous, and
insiders who are forthright in studying its social harms do so
at the risk of exile.

