Timnit Gebru, a prominent artificial-intelligence (AI) researcher at Google, posted on Twitter on Wednesday that she had been abruptly fired by the company. Gebru was a technical co-lead of Google's Ethical AI Team, working on algorithmic bias and data mining. She is a well-known advocate for diversity in tech and co-founded a community of Black researchers called Black in AI.
So her research found that facial recognition has a much higher error rate on dark-skinned women than on white men. Can anyone knowledgeable explain why? Are algorithms racist too? "She is well-known for her work on a landmark study in 2018 that showed how facial recognition software misidentified dark-skinned women as much as 35% of the time, whereas the technology worked with near precision on white men."
There's this passage in the WSJ report. Think it over: would you want a colleague like this? If my workplace had someone like this, I'd get out fast. "In August, Gebru told Bloomberg News that Black Google employees who speak out are criticized even as the company holds them up as examples of its commitment to diversity. She recounted how co-workers and managers tried to police her tone, make excuses for harassing or racist behavior, or ignore her concerns."
The paper called out the dangers of using large language models to train algorithms that could, for example, write tweets, answer trivia and translate poetry, according to a copy of the document. The models are essentially trained by analyzing language from the internet, which doesn’t reflect large swaths of the global population not yet online, according to the paper. Gebru highlights the risk that the models will only reflect the worldview of people who have been privileged enough to be a part of the training data.
🔥 Latest replies
Which media outlet would dare criticize this researcher?
If you were the editor-in-chief, would you be that foolish?
Why was it Jeff Dean who dismissed her? She said it was her manager's manager, so that should be their director. Her level isn't high; if she weren't making such a scene, it would never have reached Jeff Dean. They're giving her far too much credit.
Isn't email in writing?
What company out there is as foolish as Google and willing to take her on?
🛋️ First replies
https://www.bloomberg.com/news/articles/2020-12-03/google-s-co-head-of-ethical-ai-says-she-was-fired-over-email
Double standard?
"She is well-known for her work on a landmark study in 2018 that showed how facial recognition software misidentified dark-skinned women as much as 35% of the time, whereas the technology worked with near precision on white men."
A dark blob makes the contours hard to make out.
Anyone who behaved like this would be fired at any company.
It needs higher exposure.
Haha, you're terrible.
From her boss's reply, it sounds like one of her papers failed Google's internal review, she threw a tantrum and threatened to resign unless her boss caved, so her boss let her go.
On one hand, darker skin requires more light; on the other, the training data is imbalanced: there are simply fewer dark-skinned samples to begin with.
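The imbalance point above can be sketched with a toy classifier. Everything here is invented for illustration: a hypothetical 1-D "pixel brightness" feature and made-up per-group distributions, not real face data or any real system.

```python
import random

random.seed(0)

# Hypothetical 1-D feature with made-up per-group distributions.
# Group A (90% of the data) and group B (10%) have shifted feature means.
def sample(group, label, n):
    mu = {("A", 1): 0.7, ("A", 0): 0.3, ("B", 1): 0.4, ("B", 0): 0.1}[(group, label)]
    return [(random.gauss(mu, 0.08), label, group) for _ in range(n)]

data = sample("A", 1, 450) + sample("A", 0, 450) + sample("B", 1, 50) + sample("B", 0, 50)

# Pick the single threshold that minimizes OVERALL error: with a 90/10 mix,
# the optimum is driven almost entirely by group A.
best_t = min((t / 200 for t in range(200)),
             key=lambda t: sum((x >= t) != bool(y) for x, y, _ in data))

# Per-group error: the shared threshold serves A well but misclassifies most
# positives in the under-represented group B.
errs = {}
for g in "AB":
    pts = [(x, y) for x, y, gg in data if gg == g]
    errs[g] = sum((x >= best_t) != bool(y) for x, y in pts) / len(pts)

print(best_t, errs)  # group B's error rate comes out far higher than group A's
```

No malicious intent is needed anywhere in this sketch; minimizing average error over imbalanced data is enough to produce the skewed per-group error rates the study measured.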
At night you can only see a Black friend's white teeth; in poor lighting there isn't enough color contrast. It's a technical problem, really not deliberate discrimination against anyone.
She's just asking for it.
Internal company clearance is mandatory.
She demanded that the big boss hand over the names of the people who reviewed her paper, or else she'd resign. The big boss was scared witless and could only let her resign.
You clearly haven't read the original. An internal company review isn't peer review; saying the "relevant work" coverage is insufficient is obviously blowing things out of proportion, an excuse not to let it be published.
Yes, I remember face recognition works differently for Black faces than for white faces; actually every ethnicity is different. The best algorithm for Black faces was supposedly from Xiaomi, or maybe Huawei, using the positions of the eyes and teeth, while recognition of white faces supposedly relies on the eyebrows??? And Indian face recognition is said to add a focus point between the eyebrows, because many Indians wear a bindi there.
In any case, technology does vary by race.
There are so many mentally unstable people on Twitter.
Mentally stable researchers are a minority on Twitter.
Even Trump still has this many people backing him.
“In August, Gebru told Bloomberg News that Black Google employees who speak out are criticized even as the company holds them up as examples of its commitment to diversity. She recounted how co-workers and managers tried to police her tone, make excuses for harassing or racist behavior, or ignore her concerns.”
Famous? I'd never heard of her......
Hahahaha, this is hilarious.
I actually think Google is being foolish. Why not just wait and see whether she'd really resign? If she sues and Google has to pay another settlement, what a bad deal.
A wig.
Why do you say her research is bad?
Many organizations require internal clearance before employees may submit a paper.
Jeff Dean's reply in the linked article is quite clear: employees must pass internal review before publishing a paper, and she submitted hers only one day before the deadline. Since there wasn't enough time to review it, they agreed to let her submit first. The later internal review found a lot of problems, in particular that some relevant research wasn't cited, so they asked her to withdraw the paper. She refused, demanded to know who had objected, and threatened her boss that she'd leave if it couldn't be published. Then her boss's boss sent a letter accepting her departure. You could say she got exactly what she asked for.
Before leaving, this person angrily emailed everyone in the group with a long explanation, dragging in minorities, inequality, and so on.
Of course this is all one side of the story; only the people involved know the details.
“inconsistent with the expectations of a Google manager.”
Currently a red-hot topic.
We've also found that training data really is quite biased.
Serves her right. Having a Black face doesn't mean you can do whatever you want. She even dragged in my idol Jeff Dean. Unbelievable.
Why don't they let her revise it before making a final decision to retract it?
Honestly, this isn't about discriminating against Black people; I've run into Black colleagues like this too, with no real ability, always playing the race card. The squeaky wheel gets the grease: a Chinese employee in her shoes would probably have quietly packed up and left, but look how much this person can stir up. Fighting inside the company, mass emails, making a scene on Twitter, instantly going viral; next she might take it to the media... or sue...
And this is a woman. Sigh.
The more they act like this, the more others treat them as the pink elephant in the room: if you can't afford to offend them, keep your distance.
The people who reviewed her paper are probably trembling right now. If the company had caved to her demands, they would likely have been dragged out, denounced, and cancelled.
That's just nonsense.
If that relevant work was done by Google itself, and her paper claims nobody has solved the problem, wouldn't publishing it be a slap in the company's face?
She's been an internet celebrity for a while, famous for stirring things up.
She's not stupid, okay? She got famous in the AI world precisely by playing the race card, and she acted this recklessly because she was sure Google wouldn't dare touch her. She just didn't expect Google to grow a backbone this time.
Automatic faucets often fail to dispense water for me too. Then I discovered I'd been aiming at the wrong spot for the sensor all along......
That's because the camera's SNR is too low in dim light. This is basic common sense. One example: nighttime photos (or the dark regions of daytime photos) show much more noise speckle.
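The SNR point can be illustrated with a shot-noise simulation. This is a sketch under a standard simplifying assumption, not a real sensor model: photon counts fluctuate with variance roughly equal to the mean (Poisson shot noise), approximated here as Gaussian.

```python
import math
import random

random.seed(1)

def snr(mean_photons, trials=20000):
    # Shot noise: photon counts vary with variance ≈ mean, so each pixel
    # reading is approximated as Gaussian(mean, sqrt(mean)).
    xs = [random.gauss(mean_photons, math.sqrt(mean_photons)) for _ in range(trials)]
    mu = sum(xs) / trials
    var = sum((x - mu) ** 2 for x in xs) / trials
    return mu / math.sqrt(var)

bright = snr(10_000)  # well-lit pixel
dark = snr(100)       # dim pixel collects 100x fewer photons
print(round(bright), round(dark))  # SNR scales like sqrt(photons): roughly 100 vs 10
```

This is why dark regions of an image look grainy: a pixel collecting 100x fewer photons has roughly 10x worse SNR, which in turn degrades the features a recognition model can extract.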
I have a feeling Google won't come out of this well... They'll probably back down later... A second of sympathy for Jeff Dean...
Impossible. They absolutely will not back down.
I've run into people like this at work. Luckily my boss was firm: if you want to leave, leave, even if it costs money to hire more new people to cover. Short pain beats long pain.
Truth.
Is Minions also in CS?
Her point here is actually reasonable.
Curious, I looked it up. It's real: dark skin doesn't reflect enough light, so the sensor doesn't work. Which means darker skin tones weren't considered in the original design.
"Is this soap dispenser racist?" was the question that became an internet sensation. In a video at a Marriott hotel, an automatic soap dispenser is shown unable to detect a black customer's hand. The dispenser used near-infrared technology to detect hand motions, an article on Mic read. The invisible light is reflected back from the skin which triggers the sensor. Darker skin tones absorb more light, thus enough light isn't reflected back to the sensor to activate the soap dispenser.
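The failure mode described above can be modeled in a few lines. The reflectance and threshold numbers here are invented for illustration, not measured values from any real dispenser:

```python
def sensor_triggers(emitter_power, reflectance, threshold):
    # The sensor fires when enough emitted IR bounces back off the hand.
    return emitter_power * reflectance >= threshold

EMITTER = 1.0
THRESHOLD = 0.4  # hypothetically calibrated against lighter skin only

light_skin = sensor_triggers(EMITTER, 0.60, THRESHOLD)  # True: clears threshold
dark_skin = sensor_triggers(EMITTER, 0.25, THRESHOLD)   # False: absorbs more IR

# The fix is calibration, not new physics: lower the threshold (or boost the
# emitter) so the weakest expected reflection still clears it.
dark_skin_fixed = sensor_triggers(EMITTER, 0.25, 0.2)   # True
print(light_skin, dark_skin, dark_skin_fixed)
```

The point of the sketch: the sensor works or fails purely on reflected intensity, so a trigger threshold tested against only one range of skin tones silently excludes the rest.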
This research was done years ago, and mitigations already exist. What did she do that's new?
She can tie anything to privilege. What a vivid imagination.
Last time, didn't she get into a fight with an AI giant, who got so fed up he said he'd quit Twitter?
I agree we can't accept racial or gender bias, but you also can't turn race and gender into your own privilege. That's reverse bias!
Her Black ancestry must be pretty diluted by now, right? Apart from the hair, she doesn't show many typical Black features.
Fair enough. Google is in the right here, but it wouldn't hurt for them to learn a lesson. Her research record is unremarkable; Google probably only kept her as a mascot, a monument to its own diversity. They just didn't expect that once the mascot started acting up, it would have far more clout than an ordinary engineer.
If it goes to court, I don't think Google would lose. This woman is an expert at picking fights, so her boss must have anticipated the consequences of firing her and consulted lawyers. The PR side, though, will take some work.
She really is a nasty piece of work: not senior, just a troublemaker. I hope Google learns its lesson and stops taking in people like this.
Her hole card is the racism accusation, the unbeatable trump.
Aren't faucets heat-activated?
I hope Google doesn't fold this time. Society now needs people with common sense and courage; we all need to do the right thing and be brave.
Aren't their palms lighter? You rinse both the palm and the back of the hand, but don't you usually catch soap with your palm?
That statement of yours is very politically incorrect. Watch out.
Google may have recently studied the comedy sketch "Selling Crutches" (《卖拐》) and finally grasped the meaning of "good enough is good enough."
Agreed. Think of Black history in America: it's a history of struggle, so Black people are exceptionally good at fighting, and anyone who made it into Google is probably among the Black elite, well versed in that struggle. I hope Google stays firm to the end, learns its lesson, and drops face-saving projects like BLM.
You can't really say they failed to consider the insufficient reflected light; more likely the technology just hadn't reached that level yet.
I can't see Black people at night either, and expecting today's technology to see them clearly is a bit much. Let technology advance a while longer~~~ I really dislike how Black people like to tie everything to discrimination (which is subjective) when much of it is actually objective.
Old Trump is putting on the same show right now, and it doesn't seem to be working.
Their palms are darker than other ethnicities' palms, even though lighter than the backs of their own hands. https://reporter.rit.edu/tech/bigotry-encoded-racial-bias-technology
I'd never paid attention to which part of my hand catches the soap. I just tried: palm up, cupped, so the backs of my fingers face the machine.
If they could really fight, how did they get sold off by the Arabs and shipped across the ocean to eat dirt?
I can't follow this twisted logic. Do you people like BLM or oppose it? Why is it drama no matter what she does? And when did Google ever say it would abandon its principles for the sake of race? So do you want her fired or not?
This is a technical difficulty. What does it have to do with discrimination? My lab's eye tracker often can't lock onto my eyes; should I publish a paper claiming it discriminates against Asians?
Whether she gets fired hardly matters; the only difference is whether she makes trouble inside Google or rallies the mainstream media to bombard Google from outside. The poster above means that a BLM-supporting big company getting bitten by BLM itself is pretty entertaining, like the Seattle mayor who tolerated the looting this summer and then had the young revolutionaries carry the revolution to her own front door.
What about public facilities? If only some people can use the hand-washing station, the rest have every right to complain, and procurement shouldn't keep buying a product only some people can use. For a technical problem, you can switch to another technology, such as motion detection.
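The motion-detection alternative suggested above can be sketched as simple frame differencing. The frame format (a flat list of brightness values) and both thresholds are made up for illustration:

```python
def motion_detected(prev_frame, frame, pixel_delta=25, min_changed=4):
    # Frame differencing: count pixels whose brightness changed noticeably
    # between two frames. It keys on change over time rather than absolute
    # reflected intensity, so it is far less sensitive to skin tone.
    changed = sum(abs(a - b) > pixel_delta for a, b in zip(prev_frame, frame))
    return changed >= min_changed

still = [50] * 16             # 16-pixel toy frame, nothing moving
hand = [50] * 10 + [120] * 6  # a hand enters: 6 pixels jump in brightness

print(motion_detected(still, still), motion_detected(still, hand))  # False True
```

The design point: a dark hand moving into frame still changes pixel values relative to the background, so the trigger condition no longer depends on how much light the skin itself reflects.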
Black people started selling Black people first; the Arabs followed later, supposedly.
That logic again. Every technology fails for some group of people, so should we just stop using all technology?
They surely chose this technology because it works better than the alternatives. Do you think everyone is stupid and would insist on using something worse?
You can raise the brightness while lowering the contrast at the same time.