Politically Correct, Green Computer Algorithms? Bullcrap!
In the last few days I've read several articles that all seem to say the same thing: the computer algorithms we use today are biased towards wealthy white men, because they are made by companies in wealthy countries and predominantly by white male developers. Therefore they are inherently racist, misogynistic and wasteful. Nice journalistic metaphors were trotted out: "sea of dudes", "discriminatory code", "green algorithms", and so on. I call bullshit!
Computer algorithms need to be, above all, money makers. If Facebook or Google tweak an algorithm one way or another, the result shows up immediately in the bottom line because of their huge user counts. It may be that, by some cosmic coincidence, wealthy white males are also the ones making the most purchases and moving the most money, and so the algorithm may appear biased. But it isn't. The algorithm performs exactly as intended. If a face recognition system labels black people as gorillas or flags Asian people as blinking, it's not because the algorithm is racist, but because the data it was given pointed to that result. If a search for a movie title returns torrent links rather than the film's official page, it's because that is what more people want. It's not a glitch; it's the way a machine understands reality. An algorithm is no more racist than the Nazi ovens or the Hiroshima bomb.
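To make that concrete, here is a minimal, hypothetical sketch of the search example: a ranker that orders results purely by past click counts. The site names and the click log are invented for illustration; the point is that a perfectly neutral procedure applied to skewed data produces a "biased"-looking ranking.

```python
# Minimal sketch (hypothetical data): rank search results by how often
# users clicked them in the past. The procedure itself has no opinion.
from collections import Counter

# Invented click log: most users clicked the torrent link,
# few clicked the official page.
click_log = ["torrent-site"] * 900 + ["official-page"] * 100

def rank_results(candidates, log):
    """Order candidates by historical click count, most-clicked first."""
    counts = Counter(log)
    return sorted(candidates, key=lambda c: counts[c], reverse=True)

print(rank_results(["official-page", "torrent-site"], click_log))
# -> ['torrent-site', 'official-page']: the ranking mirrors the data,
#    not any preference baked into the code.
```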
What I am trying to say is that code, especially now that it is becoming more and more entangled with machine learning (a much better term than the terribly misleading "artificial intelligence"), represents an intersection point between specifications, people's biases and data biases, to which you then add horrible bugs. Algorithms, just like the individual pieces of your brain, are but elements in a puzzle.
"Well, of course, and to make the entire puzzle more socially responsible, we need to make all pieces socially responsible!" That's stupid. It's like working on the paint of the car to make it go faster. Sure, you can use some over engineered paint to reduce drag, but the engine and the wheels are still going to be more important. Male developers don't decide to tweak an algorithm to make it disregard women any more than a human resources female employee doesn't decide to hire developers based on how much they value women. Owners, managers, money ultimately are what lead to decisions.
Stop trying to appear politically correct when you don't know what you are talking about. If a complex computer algorithm built on mathematics shows a bias, it's not because statistics are racist; it's because the data it was fed was biased. The algorithm in question doesn't reveal the small-mindedness of the white developer or the male mathematician, but a characteristic of the world it sees. Even when people feed them the wrong data, algorithms are more objective than humans, and that is a fact, because you often start developing them before you know what you are looking for; a person always works the other way around. Why not use code to show us where we are wrong, or biased, or angry at how the world is, or plain stupid? We have a wonderful tool for making judgements from formal principles that we can actually tweak, and instead of scrutinizing the principles, you go nitpicking at the developers and the algorithms. I find it especially humorous to see random data fed into a generic algorithm produce results that get called biased because you don't like what you see.
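Here is the same point as a toy sketch, with all data invented: one trivially simple learning rule applied to two hypothetical datasets. The math never changes between the two runs; only the data does, and with it the "bias" of the output.

```python
# Minimal sketch (hypothetical data): the same learning rule on two
# different datasets. Any skew in the output traces back to the input.
from collections import Counter

def majority_label(training_labels):
    """A trivially simple learner: predict the most common label seen.
    Ties are broken by insertion order (Python 3.7+ Counter behavior)."""
    return Counter(training_labels).most_common(1)[0][0]

balanced = ["hire", "reject"] * 50            # even 50/50 history
skewed   = ["hire"] * 10 + ["reject"] * 90    # historically skewed

print(majority_label(balanced))  # 'hire'
print(majority_label(skewed))    # 'reject': the data, not the math
```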
Bottom line: want to change the world and make it better? Here is an algorithm for you: take the world and make it better.
And BTW, I find the constant accusation that developers are white and male to be a form of sexist racism itself. What do you want me to do? Turn black? If you were truly unbiased, you wouldn't care about the social makeup of your IT department. It's only now, when computers matter so much, that you are bothered by how much the geeks are getting paid.