Recently the Toastmasters club I'm part of had a meeting themed around MLK, Jr. Day. One part of every Toastmasters meeting is 'Table Topics,' a chance to do some impromptu speaking (as opposed to a prepared speech). You volunteer to get up, you're asked a question, and you have to answer it within one to two minutes. In our club the questions usually correspond to the theme.
I was asked, 'if you're on a bus with your niece and she turns to you and says, "why is that other kid brown?" what do you say to her?' A good question, and the Table Topics master asked three great questions that day.
But the question was based on an assumption: that my real or fictional nieces aren't brown. And that's true for two of the three of them. One of them has a dad who is Haitian-American, so she looks … funnily enough, sorta like Moana. In my answer I talked about (or attempted to talk about) how kids are amazing because they just ask questions – there is no agenda or purpose other than to learn. My wife, son and I live in Colorado, so my son will grow up seeing a whoooooole lot of white people, and it wouldn't surprise me to hear him ask such a question (though if he does, clearly we're not spending enough time with his cousins). And yes, it would make me uncomfortable and nervous and awkward, but hopefully the parent of the other kiddo would give me a look like, 'yeah, I've been there,' and we could talk about the simple fact that some people are short, some people are tall, some people have pale skin, some people have dark skin. They're physical attributes, and they're among the many wonderful differences in people, and that's why it's so amazing to live in a world where you can talk to other people and learn about them. Boom. (Psst. I said maybe 10% of that, and it was maybe 1% comprehensible.)
In the meeting I also talked about a pet peeve of mine in storytelling. I mentioned that I noticed white people do this a lot, but later I thought about it and realized my sample pool for anecdotes is pretty much all white, so anyone could be equally guilty of this. It annoys me when someone identifies a person’s race in a story when it doesn’t matter.
Here’s where it doesn’t matter: I was at the grocery store, and the clerk was the sweetest black woman.
Here's where it matters: you're at a convention center in Denver and the one black guy in the room is wearing an awesome t-shirt, and you say, 'oh man, check out that black dude's t-shirt, it's awesome.' I could be race-free and say, 'check out the uh … he's like, 4 o'clock … no, left a little more, kinda by that pole … no, not that weird beard-y guy, it's the … he's …' But that's just dumb. It's not racist to use someone's most distinctive physical characteristic to describe them. If the black dude were 9 feet tall I'd probably say instead, 'check out the frighteningly tall dude's awesome t-shirt, and also let's leave because his height scares me.'
Now, what’s all that got to do with racist algorithms? The video I attached is awesome and you should watch it. Really. It’s 2 ½ minutes. It’ll make you smarter unless you already know it. And it’s fascinating.
Here's my own example: we're letting computers figure stuff out these days, which is cool. Let's say a team of doctors takes a million brain scans and says, ok, we looked through these and 2,654 of these pictures have tumors; the rest are tumor-free. 'Computer, take a gander at these – here are the tumor-free ones, and here are the tumor ones.' And the computer goes, ok cool, got it. Then the team of doctors takes a new set of one million brain scans, gives them to the computer, and says, 'tell me what you think, boss, which of these have tumors?' And the computer comes back and says these 3,127 have tumors and the rest don't. And you go back and forth and back and forth, and the computer learns how to spot tumors.
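If you're curious what that back-and-forth looks like in code, here's a minimal sketch in Python using scikit-learn. Everything in it is invented – the 'scans' are random numbers standing in for real imaging features, and real tumor detection would use image models, not logistic regression – but it shows the train-then-predict shape the doctors and the computer are acting out:

```python
# A toy version of the label-then-predict loop described above.
# Everything here is made up: the "scans" are random feature vectors.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n, d = 100_000, 16  # shrunk from the post's million scans so it runs fast
scans = rng.normal(size=(n, d))
# Pretend feature 0 is "suspicious brightness": very high values mean tumor.
has_tumor = scans[:, 0] > 2.8  # rare, in the spirit of 2,654-in-a-million

# "Computer, take a gander at these" – fit on the labeled scans.
model = LogisticRegression(class_weight="balanced", max_iter=1000)
model.fit(scans, has_tumor)

# "Tell me what you think, boss" – predict on a fresh, unlabeled batch.
new_scans = rng.normal(size=(n, d))
flagged = model.predict(new_scans)
print(f"flagged {flagged.sum()} of {n:,} new scans as possible tumors")
```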
That's incredible. (And IMO, underutilized. Having my son in the NICU and knowing that they weren't keeping and analyzing the troves of data they were collecting on him was an absolute travesty to me. With machine learning, they might be able to predict when a premature baby's heart rate is about to suddenly drop, so a nurse is standing there waiting to intervene instead of sprinting into the room.)
Here's where it's bad. Let's say you take police data and say, 'hey computer, here's a bunch of data on crime, traffic incidents, just anything and everything the cops took notes on … what do you think, can you draw any conclusions, or guess when or where something bad might happen?' And the computer says, 'yeah dudes, but FYI, there's a definite risk that systemic, ingrained cultural biases factor into police work, and it's an incredibly complex topic and I'm not sure you or I are well enough equipped to handle it, but uh … I'm going to guess there will be some crime in the area where all the poor people live, especially the poor black people.' And then the police can go patrol that area more and re-emphasize the bias.
F-ing racist computer.
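If you want to see that feedback loop spelled out, here's a toy simulation – every number in it is invented. Two neighborhoods have identical true crime rates, but one starts out patrolled four times as heavily, so the recorded data (and therefore the model, and therefore the next round of patrols) stays skewed:

```python
# A toy feedback loop – all numbers invented. Neighborhoods A and B have
# the SAME true crime rate, but A starts out patrolled 4x as heavily,
# so far more of A's crime ends up in the data the model trains on.
import numpy as np

rng = np.random.default_rng(1)
true_crime_rate = np.array([0.05, 0.05])  # identical in A and B
patrol_share = np.array([0.8, 0.2])       # initial (biased) patrol split

for rnd in range(5):
    # You only record the crime you're around to see.
    observed = true_crime_rate * patrol_share * rng.uniform(0.9, 1.1, size=2)
    # "The computer" rates each area's risk from the recorded data …
    predicted_risk = observed / observed.sum()
    # … and the department sends tomorrow's patrols accordingly.
    patrol_share = predicted_risk
    print(f"round {rnd}: patrol share A={patrol_share[0]:.2f}, B={patrol_share[1]:.2f}")
```

The computer never sees the equal true rates; it only sees the patrol-skewed reports, so the starting bias never washes out.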
So. Watch out when you feel the need to identify a characteristic for one person or set of people that you don't for others. If you tell a story where you describe an old man, was his age relevant? What about her weight? What about his … etc., etc. It's tough to be aware of your language, but it's a good thing to shoot for.