Saturday, November 4, 2017

Educause 2017, Day 3 - Part 2


Safiya Noble
As promised, a different post on Algorithms of Oppression and Safiya Umoja Noble's discussion of how search engines reinforce racism.  For a short video on what she is talking about, see: https://www.youtube.com/watch?v=iRVZozEEWlE
Dr. Noble is faculty at the University of Southern California, focusing on racism, sexism, and other forms of oppression as they occur on the internet. Her talk was a clear call to action to improve our efforts in educating everyone to be digitally literate, so that people can tell opinion and advertisement apart from facts that are presented more objectively. Focusing on the Google search engine, as it holds practically a monopoly on the search market, Dr. Noble pulled together two points that we need to remember:
  1. People trust Google to give them the truth - a trust that comes primarily out of positive experiences with such things as getting good directions through Google Maps, helpful suggestions for what to do, and reliable information on common facts in physics, chemistry, and history (and of course it starts to get a little iffy when it comes to history).
  2. People forget that Google's search engine is an advertising machine, and that the search results can thus be manipulated to favor certain sites over others because a company pays Google or tweaks its metadata to game the engine.
If you take these two points in combination, it quickly becomes clear that we cannot trust our searches quite as easily as we have been thinking.  Her examples show that the algorithm, designed primarily by white men (this is a guess on my part), perpetuates stereotypes about women and minority populations in this country.  And while Google has publicly stated that the algorithm cannot be tweaked because it simply reflects the truth, evidence suggests otherwise.
Gillespie, 2012:  The algorithmic assessment of information, then, represents a particular knowledge logic, one built on specific presumptions about what knowledge is and how one should identify its most relevant components.  That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the Word of God.
Gillespie on algorithms

The recent UN ad campaign shows that Google's search suggestions are rather troubling.
When, a couple of years ago, a tweet went viral about what happens when you search for "three black teenagers" and then "three white teenagers," the search results changed miraculously overnight to appease the troubled customers.  However (and this is my side note), we should not underestimate that one of the driving forces behind these results is previous search histories, which suggests that this country may be in even more trouble than we thought.  Another example of such a claim to authentic search results was the search for unprofessional and professional hairstyles: all the examples of unprofessional hairstyles were images of black women wearing their hair naturally, while all the professional hairstyles were white women wearing carefully braided buns and other constricting hair constructions.

So, what to do about this obvious but nevertheless practically invisible racism and sexism? Not using search engines may no longer be an option, but remembering that Google is an information broker is essential, as is remembering the sources of more reliable information (e.g., government databases, library databases).  Her example of the South Carolina murderer Dylann Storm Roof is quite telling. In his online diary, he described how he could not believe that mainstream media had not been reporting on certain types of crime -- that moment of disbelief should have made him aware that his line of online inquiry had led him down a rabbit hole of fake fascist news. Unfortunately, because he already agreed with such views, he continued and ended up murdering black Americans in their church.
