Fair Representation In Arts and In Data
Principal Investigators
-
Jing Liu
Michigan Institute for Data Science
-
David Choberka
University of Michigan Museum of Art
-
Kerby Shedden
LSA Statistics
-
Sophia Brueckner
Stamps School of Art & Design
-
John Turner
University of Michigan Museum of Art
Affiliation
Michigan Institute for Data Science
“Fair Representation in Arts and Data” is a nearly year-long collaboration among data scientists, artists, and museum curators, funded by the President’s Arts Initiative. The project analyzed the collection at UMMA using several of the most common face detection algorithms, which are designed to distinguish a variety of attributes, including gender and race. Museum visitors can now get a first glimpse of the working group’s initial findings inside the “YOU ARE HERE” exhibit at UMMA.
Big Data and Artificial Intelligence (AI) have become major forces that impact our daily lives in essential ways, from how political messaging and marketing are designed, to automating decisions about who gets hired or which neighborhood is intensely patrolled. Big Data and AI can be important agents for social justice and equality, or they can be used to perpetuate injustice and harm populations that are already disadvantaged and marginalized. Artists have been at the forefront, together with scientists, in exploring ways in which AI systems can be made more equitable, transparent, and inclusive.
Jing Liu, lead investigator for the project and Managing Director of the Michigan Institute for Data Science, is quick to point out that these machines are filled with implicit bias. Liu explained that in order to identify faces based on certain features, the machines must be taught what a face is, including what the face of a man and the face of a woman look like. As a result, these machines can, and nearly always do, have bias built into them.
“I love the fact that artists and scientists are working together and challenging each other’s thinking,” said Liu. “We asked each other questions that might be considered obvious, then realized that those are not obvious at all, such as why a certain artwork was acquired, or why AI would misclassify a face as a man when it is so clear to the human eye that it’s a woman. This kind of dialogue has been great to broaden our views.”
Researchers on the project are still making sense of what all of the findings mean, since data collection involves looking through thousands of images, but some initial findings include:
- The diversity of the collection has gone up over the years.
- The collection is very white-heavy, something the researchers weren’t surprised to find.
- The algorithm failed badly at recognizing women in the collection, which also did not surprise the researchers.
- The most representative face in the collection is a clown.
- The “Take Your Pick” exhibition, which was crowdsourced, was the most diverse collection to date.
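One way a disparity like the algorithm’s failure to recognize women can be quantified, sketched here with hypothetical labels rather than the project’s actual data or tools, is to compare the algorithm’s output against curator-assigned labels and tally the error rate for each group:

```python
from collections import Counter

def error_rates_by_group(records):
    """Compute the misclassification rate for each curator-assigned group.

    Each record is a (curator_label, algorithm_label) pair; a mismatch
    counts as one error against the curator-assigned group.
    """
    totals, errors = Counter(), Counter()
    for truth, predicted in records:
        totals[truth] += 1
        if predicted != truth:
            errors[truth] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical audit records, not UMMA data: (curator label, algorithm output)
records = [
    ("woman", "man"), ("woman", "woman"), ("woman", "man"),
    ("man", "man"), ("man", "man"), ("man", "man"),
]
# Here women are misclassified in 2 of 3 records, men in 0 of 3
print(error_rates_by_group(records))
```

A gap between the groups’ rates, as in this toy example, is the kind of signal that would suggest bias built into the model’s training.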
“My favorite part of this project so far has been the regular meetings with people from across the university who are working in completely different areas and who I would have never met otherwise. Learning how to explain our work to each other hasn’t been easy, but it’s been fascinating!” reflected Sophia Brueckner, Associate Professor at Stamps School of Art & Design and member of the project team. “Each meeting involves artists, designers, data scientists, and curators coming to some shared understanding of machine learning, data, and bias. We all think differently and have very different vocabularies. In particular, I’ve loved seeing data science students and art/design students learn how to talk to each other and gain the confidence to ask questions of people working in other fields.”
The displays will remain up for the duration of the “YOU ARE HERE” exhibit, which runs through the end of 2022. A more expansive “Fair Representation in Arts and Data” exhibit is set to open at UMMA in winter of 2022.