- Book Review: Weapons of Math Destruction
- How AI affects communities of color
- What is AI modeling?
Artificial intelligence is used in thousands of software systems today to automate processes and drive algorithmic decisions. These AI tools can provide tremendous benefits, but they can also cause harm. I am going to share how AI works in different contexts. Why? Because for several years, researchers have been uncovering what these algorithms produce and how it affects people.
Writers from Brookings shared how AI affects children of color. “Kids from Black and Latino or Hispanic communities-who are often already on the wrong side of the digital divide-will face greater inequalities if we go too far toward digitizing education without considering how to check the inherent biases of the (mostly white) developers who create AI systems. AI is only as good as the information and values of the programmers who design it, and their biases can ultimately lead to both flaws in the technology and amplified biases in the real world.” Full Article Here
MIT researcher Dr. Joy Buolamwini has also documented bias in facial detection. In her research at MIT, she discusses the biases embedded in code, and in her TED Talk she recounts, “My robot could not see me. It could not detect my face.”
Full TED Talk Here: How I’m Fighting Bias in Algorithms
Mathematician Dr. Cathy O’Neil details how algorithms create inequalities and how these systems are designed. I am going to share topics from her book, Weapons of Math Destruction, and I hope you will read it.
Weapons of Math Destruction explains how mathematical AI models are made. An AI model is built from a specific body of data that a machine learns from. Dr. O’Neil notes that every model requires an input of data and, in essence, produces an output.
What is a model, and how do you create one?
Below is an adapted example; the original example in Weapons of Math Destruction was about a food recipe.
- Information: notes, sheet music, rhythm, lyrics
- Output: decide what type of music to create
- Evaluate: harmonization, whether the song fits a genre, how many plays it gets
- Dynamic model: updates and adjustments over time
What could be missing from this model: the instruments being used, the length of the music, or whether it uses trending sounds.
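The four-part structure above can be sketched as a toy program. This is a minimal illustration of the idea, not anything from the book; every function name, rule, and number below is invented for the example.

```python
# Toy sketch of the four-part structure above: information in, output out,
# an evaluation, and a dynamic update. All names and numbers are invented.

def choose_genre(information):
    """Output step: decide what type of music to create from the input."""
    # Trivial rule: fast rhythms become dance tracks, slow ones ballads.
    return "dance" if information["rhythm_bpm"] > 120 else "ballad"

def evaluate(plays):
    """Evaluation step: did the song 'succeed'? Here, success = many plays."""
    return plays > 1000

def update_threshold(threshold, succeeded):
    """Dynamic-model step: adjust the rule when the output underperforms."""
    return threshold if succeeded else threshold - 5

song_info = {"rhythm_bpm": 128, "sheet": "notes", "lyrics": "words"}
genre = choose_genre(song_info)
succeeded = evaluate(plays=1500)
threshold = update_threshold(120, succeeded)
print(genre, succeeded, threshold)
```

Even this toy version shows O’Neil’s point: whatever the rule ignores (instruments, length, trends) simply does not exist as far as the model is concerned.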
Dr. O’Neil writes, “There would always be mistakes, however, because models are, by their very nature, simplifications. No model can include all of the real world’s complexity or the nuance of human communication. Inevitably, some important information gets left out” (p. 20). In the models behind the technologies we use today, needed information can be, and often is, left out, and that is one place where bias enters these systems.
To create a model, one must make choices, gather the relevant data, and present the information in an understandable form. Important facts must be inferred from the data. Such systems can be built as computerized or paper-and-pencil models.
An AI model organizes data so that a machine can perform tasks and recognize patterns.
Click Here to view Cathy’s Model
A simplified view of how AI models work:
- Machines are searching through the data for habits, hopes, fears, and desires.
- The computer follows the information that the data provides.
- Patterns are discovered through time and connections.
Ultimately, millions or even billions of data points are needed to create a model.
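To make the bullet points above concrete, here is one hypothetical way a machine “searches through the data” and “discovers patterns through connections”: counting which behaviors show up together across many records. The records and signal names are invented for this sketch.

```python
from collections import Counter
from itertools import combinations

# Invented records of user behavior; each record is a set of observed signals
# (stand-ins for the "habits, hopes, fears, and desires" mentioned above).
records = [
    {"late_night_browsing", "sports", "travel"},
    {"late_night_browsing", "sports"},
    {"cooking", "travel"},
    {"late_night_browsing", "sports", "cooking"},
]

# Count how often each pair of signals appears together across records.
pair_counts = Counter()
for record in records:
    for pair in combinations(sorted(record), 2):
        pair_counts[pair] += 1

# The most frequent pair is a pattern "discovered through time and connections".
pattern, count = pair_counts.most_common(1)[0]
print(pattern, count)
```

With millions of records instead of four, the same counting idea is how correlations, and the biases baked into the underlying data, surface at scale.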
Throughout the book, O’Neil shares a model that we all know too well: the models built within the criminal justice system. She notes that a predictive program called PredPol is used in police departments all across the U.S. PredPol’s software, she explains, “analyzes and predicts where crimes might occur. It looks at crime in one area, incorporates historical patterns, and predicts what could happen next.”
She remarks that, according to these data inputs, nonwhite prisoners from poor neighborhoods are scored as more likely to reoffend. This disparity is heightened by the lack of jobs in poor neighborhoods, lower education levels, previous run-ins with the law, and friends who are in the same situation.
This system, which we all know too well, is a clear example of how data is collected and used to create different penalties for an individual’s actions.
One example of this point system is how judges score individuals in Orlando, Florida. Click Here. These codes are tied to specific activities. Unfortunately, as this information is built into systems, the biases are integrated as well.
Another model is called the Level of Service Inventory.
Some of the questions asked cover:
- Criminal history, such as any prior convictions
- Whether the individual’s friends or family have criminal records
- Alcohol and drug use
- Family history and marital status
The data is clear that a child growing up in a poor neighborhood would answer these questions very differently than a child living in the suburbs. O’Neil notes that statisticians have used those results “to devise a system in which answers highly correlated to recidivism weigh more heavily and count for more points.”
Once the data is collected, the individual is ranked based on these points. The higher the number of points, the longer their sentence will be. This model is one example of how the criminal justice system works.
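The point-based scoring described above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not the actual Level of Service Inventory; the questions, weights, and answers below are all invented.

```python
# Hypothetical point-based risk score, loosely in the style of the
# questionnaire described above. Questions and weights are invented.
WEIGHTS = {
    "prior_convictions": 3,       # answers "highly correlated to recidivism"
    "family_criminal_record": 2,  # weigh more heavily and count for more points
    "substance_use": 2,
    "unemployed": 1,
}

def risk_score(answers):
    """Sum the weights of every question answered 'yes'."""
    return sum(WEIGHTS[q] for q, yes in answers.items() if yes)

person = {
    "prior_convictions": True,
    "family_criminal_record": True,
    "substance_use": False,
    "unemployed": True,
}
score = risk_score(person)  # higher score -> treated as higher risk
print(score)
```

Notice that several of these inputs (a relative’s record, employment status) track poverty and circumstance rather than the individual’s own conduct, which is exactly the bias O’Neil criticizes.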
Other Examples of Models & Data Sets in the Criminal Justice System
Within the system, judges use measures like the ones above to keep sentences “fair” and consistent.
“The question, however, is whether we have eliminated human bias or simply camouflaged it with technology. The new recidivism models are complicated and mathematical. But embedded within these models are a host of assumptions, some of them prejudicial.” (Dr. O’Neil)
AI Models in Higher Education
In Chapter 3, “Arms Race: Going to College,” Dr. O’Neil explains how higher education institutions are ranked by models, and how they compete to climb those rankings.
O’Neil describes how the models ranking these institutions began as “a series of hunches”: choosing variables that could be counted and deciding how much weight to give each one in the formula.
The rankers took proxies that seemed to correlate with success, such as SAT scores, student-teacher ratios, acceptance rates, and alumni giving. They did not look at a student’s happiness, confidence, friendships, or other aspects of the actual experience.
Previously, judgments of college quality were anecdotal. Once the rankings took hold, some schools paid students to retake the SAT, others sent false data to U.S. News, and still others simply worked harder to improve their scores.
O’Neil concludes the chapter by noting that these proxies were drawn from human opinions, which in turn carry prejudice and ignorance.
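The “hunches plus weights” approach can be sketched as a weighted sum of countable proxies. The schools, proxy values, and weights below are all invented for illustration.

```python
# Hypothetical ranking formula: a weighted sum of countable proxies,
# as described above. All weights and school data are invented.
WEIGHTS = {"sat_avg": 0.5, "student_teacher_ratio": -0.3, "acceptance_rate": -0.2}

schools = {
    "School A": {"sat_avg": 0.90, "student_teacher_ratio": 0.40, "acceptance_rate": 0.20},
    "School B": {"sat_avg": 0.70, "student_teacher_ratio": 0.20, "acceptance_rate": 0.60},
}

def rank_score(proxies):
    # Higher SAT helps; higher ratios and acceptance rates hurt the score.
    return sum(WEIGHTS[k] * v for k, v in proxies.items())

ranking = sorted(schools, key=lambda s: rank_score(schools[s]), reverse=True)
print(ranking)
```

Nothing in this formula measures happiness or learning; whatever proxies the formula rewards are what schools will optimize, which is how the gaming O’Neil describes begins.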
AI in Job Seeking
In today’s world, a job seeker has a low chance of having their resume reviewed by a human on the initial read. Read here how 75% of resumes are never seen by a person.
Dr. O’Neil notes the great disparities created by the computer programs that screen job seekers. These HR systems are trained to pull data about a job seeker’s skills and experience; once that data is scored, the decision passes to a manager or other human.
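A screener of the kind described above can be sketched as simple keyword matching. This is a hypothetical illustration, not any real HR product; the required skills, sample resume text, and threshold are invented.

```python
import re

# Hypothetical resume screener of the kind described above: it extracts
# keywords and scores a candidate before any human reads the resume.
REQUIRED_SKILLS = {"python", "sql", "communication"}

def screen(resume_text, threshold=2):
    """Count how many required skills appear; pass only at/above threshold."""
    words = set(re.findall(r"[a-z]+", resume_text.lower()))
    matched = REQUIRED_SKILLS & words
    return len(matched), len(matched) >= threshold

score, passed = screen("Experienced analyst: Python, SQL, reporting")
print(score, passed)
```

A candidate who phrases the same skills differently, or whose strengths are not on the keyword list, is filtered out before a human ever sees the resume, mirroring the disparity O’Neil describes.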
In conclusion, AI models have provided new insights into our lives and how we interact with data. AI can be found in credit cards, mortgages, insurance, education, banking, social media, automobiles, and many other industries. If you want to deepen your knowledge of AI, consider what O’Neil states: “The key is to learn what the machines are looking for. Our livelihoods increasingly depend on our ability to make our case to machines.”