How big data and machine learning encode discrimination and create agitated clusters of comforting rage.
In Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal—not an error—within big data and machine learning. These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Correlation, which grounds big data’s predictive potential, stems from twentieth-century eugenic attempts to “breed” a better future. Recommender systems foster angry clusters of sameness through homophily. Users are “trained” to become authentically predictable via a politics and technology of recognition. Machine learning and data analytics thus seek to disrupt the future by making disruption impossible.
Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe the attitudes of white U.S. residents toward living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are only validated if they mirror this data.
How can we release ourselves from the vice-like grip of discriminatory data? Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.
Wendy Hui Kyong Chun is Simon Fraser University’s Canada 150 Research Chair in New Media and Professor of Communication and Director of the SFU Digital Democracies Institute. She is the author of Control and Freedom, Programmed Visions, and Updating to Remain the Same, all published by the MIT Press.
Alex Barnett is Group Leader for Numerical Analysis at the Center for Computational Mathematics at the Flatiron Institute in New York. He has published more than 50 research papers in scientific computing, differential equations, fluids, waves, imaging, physics, neuroscience, and statistics.
Seller: Bay State Book Company, North Smithfield, RI, U.S.A.
Condition: good. Barnett, Alex (illustrator). The book is in good condition with all pages and cover intact, including the dust jacket if originally issued. The spine may show light wear. Pages may contain some notes or highlighting, and there might be a "From the library of" label. Boxed set packaging, shrink wrap, or included media like CDs may be missing. Seller Inventory # BSM.SGYB
Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: New. Barnett, Alex (illustrator). Seller Inventory # 46045628-n
Seller: Grand Eagle Retail, Bensenville, IL, U.S.A.
Paperback. Condition: New. Barnett, Alex (illustrator). "Chun investigates the centrality of race, gender, class, and sexuality to 'Big Data' and network analytics." Shipping may be from multiple locations in the US or from the UK, depending on stock availability. Seller Inventory # 9780262548526
Seller: Rarewaves USA, OSWEGO, IL, U.S.A.
Paperback. Condition: New. Barnett, Alex (illustrator). Seller Inventory # LU-9780262548526
Seller: Books Puddle, New York, NY, U.S.A.
Condition: New. Barnett, Alex (illustrator). Seller Inventory # 26398984003
Seller: Massive Bookshop, Greenfield, MA, U.S.A.
Paperback. Condition: New. Barnett, Alex (illustrator). Seller Inventory # 9780262548526
Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: As New. Barnett, Alex (illustrator). Unread book in perfect condition. Seller Inventory # 46045628
Seller: INDOO, Avenel, NJ, U.S.A.
Condition: As New. Unread copy in mint condition. Seller Inventory # RH9780262548526
Seller: INDOO, Avenel, NJ, U.S.A.
Condition: New. Brand New. Seller Inventory # 9780262548526
Seller: PBShop.store UK, Fairford, GLOS, United Kingdom
PAP. Condition: New. Barnett, Alex (illustrator). New Book. Shipped from UK. Established seller since 2000. Seller Inventory # GO-9780262548526
Quantity: 2 available