1. Where am I? How far is my (unintended) reach?
2. I spent most of my queer childhood in a homophobic, misogynistic country. As I lingered on the periphery of online queer communities for years, I witnessed nameless, countless friends get outed, quit school (often a “recommended” leave), run away from home, survive on part-time jobs, quit because of abuse, take other part-time jobs, repeat, and, eventually, disappear. Fifteen-year-old Hane thought, if I became an Educated Adult, maybe I would gain enough respect in my field to not get immediately fired if I were discovered.
2.1. I am academically the 99th percentile outcome of the entire community, if not an outlier. There will be socioeconomic consequences.
2.1.1. This was already obvious by high school. I wanted to become a public role model and improve the reputation of queer people.
2.1.2. I want to give myself the responsibility of improving queer lives in that country.
2.1.2.1. Can this be done with technology?
2.2. What is the responsibility that comes with academic privilege? Is there any?
2.2.1. Is this elitist?
2.3. It is not without effort that I live here as a small, queer, assigned-female-at-birth person of color, but life has dramatically improved since I moved to Boston. I may be a token minority (minus religion), but at least I am a model one?
2.3.1. Repeat 2.1.2.
3. Engineering is my profession. I also happened to go to a pretty good engineering school.
3.1. I am a master’s student in the Opera of the Future group at the Media Lab. Before that, I was an electrical engineering undergrad at MIT.
3.2. I inevitably embed my biases in the systems I develop, and my work may receive more credit than it warrants on its own.
3.2.1. This is privilege.
3.2.2. What is the responsibility that comes along with technological-academic privilege? Explanations? Clarity? Dedication? Infallibility? Good Morals?
3.2.3. Am I elitist?
3.3. The current development of machine learning (and so-called AI) algorithms bothers me. These algorithms make decisions shaped by the biases in their training data while being trusted as fair, objective science, and they often disadvantage the already disadvantaged. The models themselves are more often than not unexplainable and uninterpretable, which makes matters worse (3.3.2 below sketches this concretely).
3.3.1. What can be done?
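3.3.2. To make the concern in 3.3 concrete, here is a minimal, hypothetical Python sketch, not drawn from any real system: the “hiring” scenario, the feature names, and every number in it are illustrative assumptions. A plain logistic regression trained on synthetic, historically biased decisions quietly learns the penalty against the disadvantaged group, even though the code reads like neutral statistics.

```python
# Hypothetical, illustrative sketch only: synthetic data, made-up numbers.
# A "neutral" logistic regression trained on historically biased hiring
# decisions learns to reproduce that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Two applicant features: a skill score and a group-membership flag.
skill = rng.normal(0.0, 1.0, n)
group = rng.integers(0, 2, n)  # 0 = majority, 1 = minority (toy encoding)

# Historical decisions: driven partly by skill, but penalizing group 1.
logits = 1.5 * skill - 1.0 * group
hired = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# The model looks objective, yet it has absorbed the historical penalty:
print("learned weights [skill, group]:", model.coef_[0])

# For equally skilled applicants, predicted hire rates diverge by group.
probe = np.zeros(100)
for g in (0, 1):
    p = model.predict_proba(np.column_stack([probe, np.full(100, g)]))[:, 1]
    print(f"mean predicted hire probability, group {g}: {p.mean():.2f}")
```

The learned weights expose the penalty here only because the model is tiny and linear; a large, opaque model would carry the same bias while being far harder to inspect, which is the interpretability worry in 3.3.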