Why we desperately need women to design AI

by Kate Brodock
At the moment, only about 12–15% of the engineers who are building the internet and its software are women.
Here are a couple of examples that illustrate why this is such a big problem:

  • Do you remember when Apple released its health app a few years ago? Its purpose was to offer a ‘comprehensive’ access point to health information and data. But it left out a large health issue that almost all women deal with, and then took a year to fix that hole.
  • Then there was that frustrated middle school-aged girl who enjoyed gaming, but couldn’t find an avatar she related to. So she analyzed 50 popular games and found that 98% of them had male avatars (mostly free!), and only 46% of them had female avatars (mostly available for a charge!). Even more askew when you consider that almost half of gamers are women.
We don’t want a repeat of these kinds of situations. And we’ve been working to address this at Women 2.0 for over a decade. We think a lot about diversity, or the lack of it, and about how it has affected, and is going to affect, the technology outputs that enter our lives. These technologies engage with us. They determine our behaviors, thought processes, buying patterns, world views… you name it. This is part of the reason we recently launched Lane, a recruitment platform for female technologists.
The hands and minds that make technology will have a direct impact on us as humans and on the world around us.
I can’t point to a more topical space than AI and machine learning. It’s coming into almost everything we do — home, finances, shopping, entertainment…you name it.
So, aside from the obvious, why does this matter?
Diversity in, diversity out
You could make the argument that AI is positioned to make one of the largest, most profound changes to humanity that many have ever seen. It touches, or will touch, most of what we care about, and it will be built with the ethics, morals, biases, and access of the people who create it. This means we need to pay close attention to whether it represents all users.
But this isn’t a given. Fei-Fei Li, Chief Scientist of Artificial Intelligence and Machine Learning at Google, has worried about this for years.
“If we don’t get women and people of color at the table — real technologists doing the real work — we will bias systems. Trying to reverse that a decade or two from now will be so much more difficult, if not close to impossible. This is the time to get women and diverse voices in so that we build it properly, right? And it can be great. It’s going to be ubiquitous. It’s going to be awesome. But we have to have people at the table.” — Fei-Fei Li
Melinda Gates and Li have founded AI4All, a program that targets underrepresented 9th-grade students and exposes them to AI and machine learning. One of their biggest hurdles? The current pool of diverse AI technical leaders is so small that finding representative talent for programming takes a lot of searching and culling.
The values of the engineers building AI will be reflected in the solutions they bring to the table. This may not have an enormous societal impact if you’re building something that picks living room paint colors for you. But when you’re looking to do something like improve cancer care, that’s a different story.
IBM knows this, as they’ve built an avatar that does just that. And it’s genderless.
Harriet Green, IBM’s GM of the Watson IoT part of the business, suggests that the already-existing corporate culture that “lives and breathes diversity” led to this happening. She says, “IBM has mixed engineering teams of both gender and nationality, with members from China, Sri Lanka, Germany, Scandinavia and the UK.”
Manage the behaviors that machines perpetuate
Leah Fessler wrote an eye-opening piece after testing several personal assistant bots to see how they’d stand up to sexual harassment (literally: they sexually harassed the bots, which, by the way, default to female voices unless you change them).
Well, the findings weren’t exactly great. Instead of fighting back against abuse, each bot actually helped entrench sexist tropes through their passivity.
I was particularly drawn to the following quote:
“Siri, Alexa, Cortana, and Google Home have women’s voices because women’s voices make more money. Yes, Silicon Valley is male-dominated and notoriously sexist, but this phenomenon runs deeper than that. Bot creators are primarily driven by predicted market success, which depends on customer satisfaction — and customers like their digital servants to sound like women.”
We could get into a lengthy discussion on how this ties to capitalism and perpetuates historic norms, but Leah pushed even further. Beyond having these bots “be female,” what about how they were treated? What would they do?
Here’s a sampling Fessler provides from her work:
“…Siri and Alexa remain either evasive, grateful, or flirtatious, while Cortana and Google Home crack jokes in response to the harassments they comprehend.”
Leah goes on to give several other examples, all of which suggest that the programmers in charge of each of these bots had some level of awareness when putting together the responses, but fell short of treating this behavior as explicitly wrong until the word “rape” was introduced (and, as you can see above and in the other examples, some response sets were downright frightening… Siri practically wanted to flirt back!).
And finally:
“While the exact gender breakdown of developers behind these bots is unknown, we can be nearly certain the vast majority are men; women comprise 20% or less of technology jobs at the major tech companies that have created these bots. Thus the chance that male bot developers manually programmed these bots to respond to sexual harassment with jokes is exceedingly high. Do they prefer their bots respond ironically, rather than intelligently and directly, to sexual harassment?”
This is only one example of how an echo chamber of thought (otherwise known as a lack of diversity) on the engineering teams behind the technology that comes closest to interacting with us as humans can reinforce, perpetuate, and even exacerbate cultural and societal norms that many of us are working so hard to change.
The solution is more women on engineering teams.
There’s plenty of research that concludes that having more women at almost any level of your company — especially in leadership — will have a positive impact on results and a company’s bottom line. Yup, this means more money.
How about specifically for building stuff like, say, AI? Diversity of thought leads to better problem solving. Women are trusted and are more collaborative. Teams with more women are more productive, creative, and experimental than all-male teams. Women also write really awesome code.
If we all want to make AI-driven products that solve real problems and are sustainable businesses, we need the best. This is going to require a variety of minds on projects, and that means increasing the number of women on engineering teams.
So go ahead, you can thank us later!
Translated from: https://www.freecodecamp.org/news/why-we-desperately-need-women-to-design-ai-72cb061051df/