Press "Enter" to skip to content

EP 3: Bigoted Machine Learning

Show:

This week on Valley Drag we talk about machine learning and how people's own biases can produce algorithms that are bigoted, racist, and sexist.

CW: Racism, Sexism

Show Notes:

  • First up, what’s the difference between algorithms and machine learning?
  • Sources of bias
    • Data itself: people are biased, so data generated by people may reflect bias
    • Methodology: researchers are people, so they have bias too
    • Society as a whole
  • How machines can execute our biases:
    • Google’s Sentiment Analysis thinks being gay is negative
      • Sentiment Analysis takes words or sentences and tries to say if they express something positive or negative. This can be useful for auto-moderating comments and detecting spam
      • For some reason, the system rated statements like "I'm a homosexual" or "I'm a Jew", as well as text containing "black" names, as negative, while rating the opposite statements as positive (a toy sketch of this failure mode follows the show notes below)
    • Criminal sentencing in the US is being handed over to machines that are racist
      • The intent of such tools can be to reduce individual human biases, but this assumes acknowledgement of biases by the people who build the tools
      • If the data chosen (for example, "what's the likelihood of being a criminal?") uses extant data ("what's the distribution of blacks vs whites in prison?"), it simply replicates the bias we already have (a second toy sketch below illustrates this)
    • Programming decisions into self-driving cars is an IRL trolley problem
      • Preserve the driver at all costs
      • or prevent greater harm to others at all costs?
    • A paper recently came out claiming that researchers could identify sexual orientation from photos of people's faces. Here's some discussion about why it's wrong, plus further follow-up from machine learning instructors
      • The problem here is that choices made by researchers were biased
        • Data came from biased sources: dating websites, social media, etc.
        • Researchers brought their own biased assumptions about homosexuality
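
For anyone curious what "sentiment analysis" actually computes, here's a minimal, hypothetical sketch in Python. This is not Google's actual system (which uses learned models, not a hand-written word list); the words and scores are made up. The point is the failure mode from the episode: whatever associations sit in the lexicon or training data come straight back out as a "sentiment" score.

```python
# Hypothetical lexicon: imagine these scores were learned from web text
# that tends to mention certain identity terms in negative contexts.
LEXICON = {
    "great": 0.8,
    "terrible": -0.7,
    "dinner": 0.1,
    # Bias absorbed from the corpus, not from what the words mean:
    "homosexual": -0.4,
    "straight": 0.2,
}

def sentiment(sentence: str) -> float:
    """Average per-word scores; words missing from the lexicon count as neutral."""
    words = sentence.lower().split()
    scores = [LEXICON.get(word, 0.0) for word in words]
    return sum(scores) / len(scores) if scores else 0.0

if __name__ == "__main__":
    print(sentiment("i'm a homosexual"))  # negative, purely because of the lexicon
    print(sentiment("i'm straight"))      # positive, for the same reason
```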
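And the sentencing example: a toy simulation (not any real risk-assessment tool; every number is invented) where two groups offend at the same rate but one is policed more heavily. A "model" that simply learns arrest base rates from that historical data scores the heavily policed group as higher risk, faithfully reproducing the bias in the records.

```python
import random

random.seed(0)

def make_record(group: str) -> dict:
    # Both groups offend at the same (made-up) rate...
    offended = random.random() < 0.30
    # ...but group "B" is far more likely to be arrested when they do,
    # so the recorded label reflects policing, not behaviour.
    arrest_rate = 0.9 if group == "B" else 0.3
    return {"group": group, "arrested": offended and random.random() < arrest_rate}

data = [make_record("A") for _ in range(10_000)] + \
       [make_record("B") for _ in range(10_000)]

def predicted_risk(group: str) -> float:
    """The 'model': just the arrest base rate per group in the training data."""
    rows = [r for r in data if r["group"] == group]
    return sum(r["arrested"] for r in rows) / len(rows)

print("risk score, group A:", round(predicted_risk("A"), 3))
print("risk score, group B:", round(predicted_risk("B"), 3))
# Group B scores roughly 3x higher even though underlying behaviour is identical:
# the model replicates the bias baked into the arrest data.
```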