Robots are not just taking people's jobs, they are beginning to hand them out too


Robots are not just taking people's jobs away, they are beginning to hand them out, too. Go to any recruitment industry event and you will find the air is thick with terms like machine learning, big data and predictive analytics.

The argument for using these tools in recruitment is simple. Robo-recruiters can sift through thousands of job candidates far more efficiently than humans. They can also do it more fairly.

Since they do not harbour conscious or unconscious human biases, they will recruit a more diverse and meritocratic workforce. This is a seductive idea, but it is also dangerous.

Algorithms are not inherently neutral just because they see the world in zeros and ones. For a start, any machine learning algorithm is only as good as the training data from which it learns.

Take the PhD thesis of academic researcher Colin Lee, released to the press this year. He analysed data on the success or failure of 441,769 job applications and built a model that could predict with 70 to 80 per cent accuracy which candidates would be invited to interview. The press release plugged this algorithm as a potential tool to screen a large number of CVs while avoiding human error and unconscious bias. But a model like this would absorb any human biases at work in the original recruitment decisions.
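To make the mechanism concrete, here is a minimal, hypothetical sketch, not Mr Lee's actual model, trained on synthetic data in which the bias against older applicants is deliberately baked into the historical labels:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
age = rng.integers(18, 65, n)      # candidate age in years
skill = rng.normal(0.0, 1.0, n)    # a genuine merit signal

# Historical decisions: skill mattered, but recruiters also penalised age.
logit = 1.5 * skill - 0.08 * (age - 30)
invited = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Train on those past decisions, as a screening tool would be.
X = np.column_stack([age, skill])
model = LogisticRegression().fit(X, invited)

# The learned age coefficient comes out negative: the model has quietly
# absorbed the recruiters' bias and will now reproduce it at scale.
print(dict(zip(["age", "skill"], model.coef_[0].round(3))))
```

Nothing in the code tells the model to penalise age; it simply learns that the historical recruiters did.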

For example, the research found that age was the biggest predictor of being invited to interview, with the youngest and the oldest applicants least likely to be successful. You might think it fair enough that inexperienced youngsters do badly, but the routine rejection of older candidates seems like something to investigate rather than codify and perpetuate. Mr Lee acknowledges these problems and suggests it would be better to strip the CVs of attributes such as gender, age and ethnicity before using them. Even then, algorithms can wind up discriminating.

In a paper published this year, academics Solon Barocas and Andrew Selbst use the example of an employer who wants to select those candidates most likely to stay for the long term. If the historical data show women tend to stay in jobs for a significantly shorter time than men (possibly because they leave when they have children), the algorithm will probably discriminate against them on the basis of attributes that are a reliable proxy for gender. Or how about the distance a candidate lives from the office? That might well be a good predictor of attendance or longevity at the company; but it could also inadvertently discriminate against some groups, since neighbourhoods can have different ethnic or age profiles.
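A hedged sketch of this proxy effect, using invented numbers: gender is withheld from the model entirely, but a seemingly neutral feature (commute distance) correlates with it in the synthetic data, so the scores still split along gender lines.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000
female = rng.random(n) < 0.5

# Invented correlation: in this synthetic data women live farther from
# the office, making commute distance a proxy for gender.
distance_km = rng.gamma(2.0, 5.0, n) + 4.0 * female

# Historical labels: women recorded as staying past two years less often
# (perhaps because they left when they had children).
stayed = rng.random(n) < np.where(female, 0.45, 0.70)

# Gender is never shown to the model; only the "neutral" feature is.
X = distance_km.reshape(-1, 1)
model = LogisticRegression().fit(X, stayed)
scores = model.predict_proba(X)[:, 1]

# Yet predicted retention still splits along gender lines via the proxy.
print(f"mean score, women: {scores[female].mean():.2f}")
print(f"mean score, men:   {scores[~female].mean():.2f}")
```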

These scenarios raise the tricky question of whether it is wrong to discriminate even when it is rational and unintended. This is murky legal territory. In the US, the doctrine of disparate impact outlaws ostensibly neutral employment practices that disproportionately harm protected classes, even if the employer does not intend to discriminate. But employers can successfully defend themselves if they can prove there is a strong business case for what they are doing.
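In practice, disparate impact is often screened for with the "four-fifths rule" from US enforcement guidance: a group whose selection rate falls below 80 per cent of the most-favoured group's rate is flagged for scrutiny. A small illustrative check, with hypothetical numbers:

```python
def impact_ratios(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Map each group's selection rate to its ratio against the
    most-favoured group; ratios below 0.8 flag potential disparate impact."""
    rates = {g: selected / total for g, (selected, total) in outcomes.items()}
    best = max(rates.values())
    return {g: round(rate / best, 2) for g, rate in rates.items()}

# Hypothetical screening outcomes for two age bands: (invited, applicants).
print(impact_ratios({"under_40": (120, 400), "over_40": (30, 200)}))
# {'under_40': 1.0, 'over_40': 0.5} -- well below the 0.8 threshold
```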

If the intention of the algorithm is simply to recruit the best people for the job, that may be a good enough defence. Still, it is clear that employers who want a more diverse workforce cannot assume that all they need to do is turn over recruitment to a computer.

If that is what they want, they will need to use data more imaginatively. Instead of taking their own company culture as a given and looking for the candidates statistically most likely to prosper within it, for example, they could seek out data about where (and in which circumstances) a more diverse set of workers thrive.

Machine learning will not propel your workforce into the future if the only thing it learns from is your past.

