Computer Software Lawyer
Applying for a job these days can be a deeply anxiety-provoking experience: you submit an online application, upload a résumé and cover letter to the Internet only knows where, and cross your fingers that somewhere, someone will eventually look at your qualifications and call you back. Rarely do you sit down and shake hands with a Human Resources staff member before a computer reviews your résumé, and practically gone are the days when you could introduce yourself to company owners in person and explain why the company would benefit from hiring you. But why would you need any Utah lawyers’ help with getting a job? Computer software now reviews your application, and hopefully your résumé gets selected as one of the better candidates, good enough to get passed on to an HR hiring manager. But what if there’s a reason you’re consistently getting overlooked for jobs you’re qualified for?
Résumé-reviewing software in use in Salt Lake City might be infected with enough bias to get Utah lawyers involved
What if the computer software itself is discriminating against you? That “growing industry around doing résumé filtering and résumé scanning to look for job applicants” may be more unfair than we’d like to believe. And with so many companies in the Salt Lake Valley using such software to select potential job candidates, you can bet that Utah lawyers are keeping a keen eye on “if there are structural aspects of the testing process that would discriminate against one community just because of the nature of that community,” because that would violate anti-discrimination laws.
Before Utah lawyers could get involved, though, Utah computer scientists had to figure it out, and a team of researchers from the University of Utah, in collaboration with the University of Arizona and Haverford College in Pennsylvania, has worked out how to tell whether the algorithms used for “hiring decisions, loan approvals, and comparatively weighty tasks could be biased like a human being.” We’ll let you dig into the nerdy details of how the computer scientists determine “if these software algorithms can be biased through the legal definition of disparate impact,” but the point is that they can.
And that “disparate impact” is where the Utah lawyers come in, since that theory in U.S. anti-discrimination law holds that a policy “may be considered discriminatory if it has an adverse impact on any group based on race, religion, gender, sexual orientation, or any other protected status.” So if the résumé-scanning computer software can consistently and accurately predict a person’s race or gender from the data provided, “there’s a potential problem for bias” under that “disparate impact” definition.
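To make that test concrete, here is a minimal, purely illustrative Python sketch of the kind of check described above: train a simple model to guess a protected group label from résumé-style features alone, and treat high prediction accuracy as a red flag for disparate impact. The data, the feature names, and the 80% cutoff are all hypothetical assumptions for illustration, not the researchers’ actual code or dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Invented applicant data: three "neutral" résumé features plus a protected
# group label that the hiring software never sees directly.
rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, size=n)                 # e.g., group A vs. group B
features = np.column_stack([
    rng.normal(5 + 2 * group, 2),                  # years of experience
    rng.poisson(8, size=n),                        # number of skills listed
    rng.normal(1 - 0.5 * group, 0.5),              # employment gap, in years
])

# The bias test: can the protected label be predicted from the "neutral"
# features alone? If so, the features leak group membership, and a hiring
# model trained on them could produce a disparate impact.
X_train, X_test, g_train, g_test = train_test_split(
    features, group, test_size=0.3, random_state=0)
clf = LogisticRegression().fit(X_train, g_train)
accuracy = accuracy_score(g_test, clf.predict(X_test))

print(f"Protected group predicted with {accuracy:.0%} accuracy")
if accuracy > 0.8:  # illustrative cutoff; the research uses a formal error-rate test
    print("These features leak group membership -- potential disparate impact.")
```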
Apparently it’s a quick fix, though, as the lead researcher from the University of Utah reveals that “all you have to do is redistribute the data that is being analyzed—say the information of the job applicants—so it will prevent the algorithm from seeing the information that can be used to create the bias.” He’s hopeful that changing the way companies use the software will feed directly “into better ways of doing hiring practices.” And if there are enough lawsuits, it probably will, eventually.
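To show what “redistributing the data” might look like in practice, here is another small, hypothetical Python sketch: a rank-preserving repair of a single numeric feature, where each group’s values are mapped onto the pooled distribution so the feature no longer reveals group membership, while each applicant keeps their standing within their own group. This is a simplified stand-in for the researchers’ repair procedure, using invented data, not their actual method or code.

```python
import numpy as np

def repair_feature(values, groups):
    """Map each group's values onto the pooled distribution, preserving
    each person's rank within their own group (a simplified repair)."""
    values = np.asarray(values, dtype=float)
    groups = np.asarray(groups)
    repaired = np.empty_like(values)
    pooled = np.sort(values)
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        # Each applicant's rank within their group, expressed as a quantile.
        ranks = np.argsort(np.argsort(values[idx]))
        quantiles = ranks / max(len(idx) - 1, 1)
        # Replace the value with the pooled distribution's value at that quantile.
        repaired[idx] = np.quantile(pooled, quantiles)
    return repaired

# Invented example: "years of experience" that happens to differ by group.
rng = np.random.default_rng(0)
group = rng.integers(0, 2, size=1000)
experience = rng.normal(5 + 2 * group, 2)

fixed = repair_feature(experience, group)
print("Group means before repair:",
      experience[group == 0].mean().round(2), experience[group == 1].mean().round(2))
print("Group means after repair: ",
      fixed[group == 0].mean().round(2), fixed[group == 1].mean().round(2))
```

After the repair, a hiring model trained on the feature can no longer use it to sort applicants by group, which is the practical point the researcher is making.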
Free Consultation with a Computer Software Lawyer
When you need help from a computer software attorney, call Ascent Law for your free consultation at (801) 676-5506. We want to help you.
8833 S. Redwood Road, Suite C
West Jordan, Utah
84088 United States
Telephone: (801) 676-5506