When Algorithms Don’t Account for Civil Rights
As people live more of their lives online, the need to extend offline protections to online behavior grows more pressing. Bullying is one area where this problem is evident. Schools have long had punitive systems in place that, though far from perfect, sought to make their classrooms and hallways safe environments. Extending those systems to the online world has been a significant challenge: how can schools monitor what happens online and in private? And what is the appropriate punishment for bad behavior on the internet?
Another area that has proven difficult for this offline-to-online translation is the set of rules that protect Americans from discriminatory advertising. The internet is chock-full of ads, many of them uncomplicated efforts to get people to buy more home goods, see more movies, and so on. But things get much trickier when the goods being advertised, such as housing, jobs, and credit, have histories of being off-limits to women, black people, and other minorities. For these industries, the federal government has sought to ensure that advertisers do not perpetuate historical oppression, through laws such as the Fair Housing Act and the Equal Credit Opportunity Act.