Technically Biased

Krystyn Gutu

The podcast that discusses bias in tech.
Technology

Episodes

Being Picky with Piki: Ironing out the Biases in Song Quality Algorithms
May 8 2024
Listen to Krystyn Gutu, M.S. introduce Sasha Stoikov, a senior research associate at Cornell Financial Engineering Manhattan (CFEM). His research covers algorithms in high-frequency financial trading, online ratings systems, and recommendation systems. Across these domains, he has encountered algorithmic biases such as survivorship bias, popularity bias, and inflation bias. He is also the founder of Piki, a startup that gamifies music ratings. Ratings produced by users on Piki can mitigate algorithmic biases, which he discusses in a recent paper aimed at answering a simple but provocative question: "Is the popularity of artists like Justin Bieber or Taylor Swift truly justified?" Check out his paper, Better Than Bieber? Measuring Song Quality Using Human Feedback, to find out. He also authored Evaluating Music Recommendations with Binary Feedback for Multiple Stakeholders and Picky Eaters Make For Better Raters.

In this episode, we discuss:
– the roles of survivorship bias, popularity bias, and inflation bias
– Piki, the music ratings app designed exclusively for those who know what they like
– how the algorithms behind Instagram, TikTok, and Spotify compare with how Piki analyzes song quality
– the data used to train these and other platforms
– how interfaces that collect data unintentionally encourage certain biases
– implicit vs. explicit data collection
– how data is collected and how that collection addresses the main concerns of users
– how Piki incentivizes its users to listen to a larger music selection and nudges them into being fair with their ratings
– the golden era of data and the tremendous opportunities and dangers that lie ahead

If you like what you hear, follow us on LinkedIn (@Gakovii) or on Instagram (@gakovii__). Technically Biased is available on Spotify, Apple, and Amazon, among others. #AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
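To make the popularity-bias point concrete, here is a minimal, hypothetical sketch (not Piki's actual algorithm, and with invented toy data): a ranking driven by implicit play counts can diverge sharply from one driven by explicit ratings, which is the gap Stoikov's work probes.

```python
# A minimal hypothetical sketch (not Piki's actual algorithm; toy data only) contrasting
# an implicit play-count ranking with an explicit rating-based quality ranking.
from collections import defaultdict

# (song, user, explicit_rating in [0, 1], play_count) -- invented examples
events = [
    ("Song A", "u1", 1.0, 3),  ("Song A", "u2", 1.0, 2),   # rarely played, loved when rated
    ("Song B", "u1", 0.0, 40), ("Song B", "u2", 1.0, 35),  # heavily played, mixed ratings
    ("Song B", "u3", 0.0, 50),
]

plays, ratings = defaultdict(int), defaultdict(list)
for song, _user, rating, count in events:
    plays[song] += count
    ratings[song].append(rating)

popularity_rank = sorted(plays, key=plays.get, reverse=True)
quality_rank = sorted(ratings, key=lambda s: sum(ratings[s]) / len(ratings[s]), reverse=True)

print("Ranked by play count (implicit):", popularity_rank)  # Song B first
print("Ranked by explicit rating:      ", quality_rank)     # Song A first
```

Explicit feedback is the harder signal to collect at scale, which is why the episode spends time on how interfaces and incentives shape the ratings users actually give.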
Algorithmic Auditing... it's Not Rocket Science, it's Astrophysics
Apr 24 2024
Tune in to learn about the applicability of algorithmic auditing. Guest Shea Brown, founder and CEO of BABL AI, joins us to share his perspective. An Associate Professor of Instruction at the University of Iowa, Brown holds a PhD in Astrophysics and specializes in AI ethics and machine learning, with a focus on algorithmic auditing and AI governance.

In this episode, we discuss:
– Brown's use of AI in astrophysics and its applicability, given the sheer volume of data from the sky
– Brown's transition to founding BABL AI after recognizing a big problem: countless examples of bias in AI
– BABL AI, the consulting work they do, and what the process entails
– algorithmic auditing which, like any audit, is a check and balance aimed at ensuring people and organizations meet the required standards and follow appropriate procedures (i.e., we don't want to harm people, we don't want to infringe on people's rights, people should be involved in algorithmic decision-making, etc.)
– algorithmic auditing, with a focus on the socio-technical aspects of how tech is used more broadly
– what to expect for the future of AI regulation and governance
– the importance of accountability and transparency, and the trade-offs that come with regulation
– algorithmic transparency: whether your audience knows an algorithm is being used and what data is being processed; accountability and transparency are important for building trust within a society, which should demand more from the companies using its data
– how historical data will always hold a level of bias that needs to be considered

If you like what you hear, follow us on LinkedIn (@Gakovii) or Instagram (@gakovii__). Technically Biased is available on Spotify, Apple, and Amazon, among others. #AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
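As a hedged illustration of the quantitative side of an audit (this is not BABL AI's methodology, just a common first check), an auditor might compare a model's selection rates across groups and apply the familiar four-fifths heuristic:

```python
# Minimal illustrative fairness check (not BABL AI's audit methodology):
# selection-rate comparison across groups, including the common "four-fifths" rule.
from collections import defaultdict

# (group, model_decision) pairs -- toy data
decisions = [("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
             ("group_b", 1), ("group_b", 0), ("group_b", 0), ("group_b", 0)]

totals, positives = defaultdict(int), defaultdict(int)
for group, decision in decisions:
    totals[group] += 1
    positives[group] += decision

rates = {g: positives[g] / totals[g] for g in totals}
ratio = min(rates.values()) / max(rates.values())

print("Selection rates:", rates)
print(f"Disparate impact ratio: {ratio:.2f}",
      "(flag for review)" if ratio < 0.8 else "(within the four-fifths heuristic)")
```

A real audit, as Brown describes, wraps checks like this in a broader socio-technical review of how the system is actually used.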
Algorithmic Accountability... and What That Means from a Human Rights Perspective | 1.15
Feb 7 2024
Damini Satija is a Human Rights and Public Policy professional, as well as Head of the Algorithmic Accountability Lab and Interim Director at Amnesty Tech. Satija has experience working on data and AI, with a focus on government surveillance, algorithmic discrimination, welfare automation, and tech equity and justice. She holds a Master of Public Administration (MPA) from Columbia University, with a specialization in tech policy, and a BA in Economics from the University of California, Berkeley.

In this episode, she and Gutu discuss:
– how bias and discrimination generally emerge in AI algorithms
– how human rights implications play a big role in data and, consequently, in policy and regulation
– what needs to be addressed to properly mitigate AI harms... is it the model that should be optimized or the data (i.e., model-centric vs. data-centric)? (see the sketch after this description)
– how our biases are codified
– how we can go about ensuring more inclusivity, more representation, and less bias in tech
– how net neutrality, encryption laws, copyright, and content moderation affect us
– how AI is playing an increasingly large role in Hollywood, art, and media. Is it possible to reclaim our data? Is data ownership a myth? What are the implications of assigning property rights to personal data?
– how the hype around ChatGPT and generative AI is overdone, and how environmentally unsustainable these systems are. Should ChatGPT be trained on people's writing, such as their books, articles, and/or poetry? How do property rights and copyright law apply?
– how to be more mindful with technology and the ways it uses our data

Check out our website, LinkedIn, or Instagram to stay up to date! #AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
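The model-centric vs. data-centric question can be made concrete with a small, purely illustrative sketch (not Amnesty Tech's or any organization's practice): a data-centric lever reweights underrepresented groups before training, while a model-centric lever adjusts the decision rule after training.

```python
# Illustrative sketch of two bias-mitigation levers (hypothetical, not any specific
# organization's practice): a data-centric fix (reweight underrepresented examples)
# and a model-centric fix (per-group decision thresholds applied after training).
def data_centric_weights(groups):
    """Give each example a weight inversely proportional to its group's frequency."""
    counts = {g: groups.count(g) for g in set(groups)}
    return [len(groups) / (len(counts) * counts[g]) for g in groups]

def model_centric_decision(score, group, thresholds):
    """Apply a group-specific threshold to a model score."""
    return score >= thresholds[group]

groups = ["a", "a", "a", "b"]
print(data_centric_weights(groups))                             # group b is up-weighted
print(model_centric_decision(0.55, "b", {"a": 0.6, "b": 0.5}))  # True under b's threshold
```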
Legal Decisions are Being Codified and the Models are Perpetuating Historical Biases | Episode 1.14
Dec 13 2023
Patrick K. Lin is a lawyer and researcher focused on AI, privacy, and technology regulation. He is the author of Machine See, Machine Do, a book that explores the ways public institutions use technology to surveil, police, and make decisions about the public, as well as the historical biases that impact that technology. Patrick has extensive experience in litigation and policy, having worked for the ACLU, FTC, EFF, and other organizations that advocate for digital rights and social justice. He is passionate about addressing the ethical and legal challenges posed by emerging technologies, especially in the areas of surveillance, algorithmic bias, and data privacy. He has also published articles and papers on facial recognition, data protection, and copyright law.

This podcast episode covers some of the many crazy topics Lin dives into throughout his book, including the following discussions:
– Robert Moses would often quote the saying, "Legislation can always be changed. It's very hard to tear down a bridge once it's up." Unsurprisingly, then, Moses had a lot of influence in shaping the physical layout and infrastructure of New York City and its surrounding suburbs (i.e., hundreds of miles of road, the Central Park Zoo, the United Nations (UN) Headquarters, Lincoln Center, and more). Today, the digital landscape is similarly being built on a foundation of bias.
– Can history be biased? How do we codify bias and build legal models that perpetuate discrimination in policy? Though not an easy question, the answer lies in the data. Lin ends his book by emphasizing that "If we are not willing to reflect on the effects of our history, then our technology will simply continue to mirror our past mistakes. History will be doomed to repeat itself." (178)
– It is important to understand what a model outputs and what inputs are considered in the overall assessment. Algorithms like COMPAS, which is used in the criminal justice system, consider variables such as education, which is indirectly classist, as education is a proxy for wealth. (120) (See the sketch below.)
– The government uses surveillance technology disproportionately to target immigrant communities, and new systems and technologies are usually tested on immigrants first. This is yet another example of how those most affected are those who are already most marginalized.
– Bias is present throughout all stages of policing – from the criminal trial (where judges use biased algorithms to validate their already biased perspectives, i.e., confirmation bias), to the recidivism assessment process (i.e., models like the aforementioned COMPAS), to cash bail, and many others.
– Automated License Plate Readers (ALPRs) might seem harmless until you realize that "ALPRs often capture photographs of the vehicle, driver, and passengers [and] all of this data is uploaded to a central server that is accessible by law enforcement… The government can use ALPRs to target people who drive to immigration clinics, Planned Parenthood health centers, gun shops, union meetings, protests, or places of religious worship. ALPR vendors have stated police can use the collected information to find out where a license plate has been in the past, determine whether a vehicle was at a crime scene, identify travel patterns, and even discover vehicles that may be associated with each other." (64-66)
– Generative AI uses nonconsensual pornography in its training data. How can we mitigate such breaches of privacy?
– Intellectual property and copyright law play an interesting role and work in the best interest of the AI industry, which is incentivized to keep the space unregulated.
– Overrepresentation is an indicator of discriminatory purposes in a model's training data. What can we do to hedge against such bias in an algorithm's early phases?

#AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
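A hedged sketch of the proxy problem in the COMPAS discussion above (toy numbers, not COMPAS's actual inputs or weights): a quick correlation check shows how a "neutral" feature such as education can stand in for an attribute the model never sees, such as wealth.

```python
# Illustrative proxy check (toy data, not COMPAS's actual inputs): a "neutral"
# feature can still encode an omitted attribute if the two are strongly correlated.
from statistics import correlation  # available in Python 3.10+

education_years = [10, 12, 12, 14, 16, 16, 18, 20]
household_wealth = [20, 35, 30, 55, 80, 75, 120, 150]  # arbitrary units, invented

r = correlation(education_years, household_wealth)
print(f"education vs. wealth correlation: {r:.2f}")
if r > 0.7:
    print("Education is acting as a strong proxy; a model 'blind' to wealth isn't.")
```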
Racial Bias in American Housing, From Founding Policies to Predatory Algorithms | Episode 1.13
Nov 15 2023
Listen to Krystyn Gutu interview Leah Rothstein, co-author of Just Action: How to Challenge Segregation Enacted Under the Color of Law. Written with her father, Richard Rothstein, the book acts as a follow-up to his 2017 work, The Color of Law: A Forgotten History of How Our Government Segregated America. Together, they analyze how housing and community development policies impact the ways people relate to their communities and to each other. Leah is a community and labor organizer with experience consulting on housing, police accountability, environmental justice, education, and worker health and safety issues. She has consulted on financial and policy topics for affordable housing developers, cities, counties, and redevelopment agencies, and has directed research on community corrections policies, practices, and populations to help promote a rehabilitative approach. Stay up to date on their work and how you can get involved by subscribing to their Substack.

In this episode, Leah discusses how bias infiltrated American housing policies and shaped segregation throughout the country. More specifically, we discuss how:
- American housing deeds allowed stipulations that excluded eligible homebuyers based on their race; in addition to discriminatory policies against African Americans, some deeds stated that houses should not be "occupied by any person or persons not of the white or Caucasian race or by any Mexican, Filipino or Hindu" (Just Action, 32)
- Discriminatory housing algorithms can be racist without "knowing" one's race, since this information can be deduced from one's name and zip code (see the sketch after this list)
- Predatory corporations like Kodak, Fannie Mae, the Baltimore Sun, and others influenced the segregation of neighborhoods through various tactics
- African American home buyers experienced (and continue to experience) racial discrimination in the housing process (i.e., via appraisals, assessments, property listings, credit scores, mortgage loans, taxes, [homeowner's] insurance, etc.)
- Exclusionary crime-free ordinances allow for the eviction of tenants based on any contact with law enforcement (including suspected criminal activity, as well as a connection to someone else's criminal activity)
- Houses were exploitatively sold to African Americans on contract rather than through mortgages

For more updates, follow Krystyn Gutu on LinkedIn, or Gakovii on LinkedIn or Instagram (@gakovii__). #AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
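On the point that an algorithm can be racist without "knowing" race: methods in the spirit of Bayesian Improved Surname Geocoding (BISG) combine surname and zip-code statistics to infer race probabilistically. The sketch below uses invented probabilities and a naive combination rule, so it illustrates the idea rather than reproducing the real method or any lender's model.

```python
# Minimal BISG-style sketch (toy probabilities, naive combination; real BISG applies
# Bayes' rule more carefully): surname and zip-code statistics together can "recover"
# a race attribute that was never collected.
p_race_given_surname = {"garcia": {"hispanic": 0.9, "white": 0.05, "black": 0.05}}
p_race_given_zip = {"10001": {"hispanic": 0.3, "white": 0.5, "black": 0.2}}

def infer_race(surname, zip_code):
    prior = p_race_given_surname[surname.lower()]
    geo = p_race_given_zip[zip_code]
    unnormalized = {race: prior[race] * geo[race] for race in prior}
    total = sum(unnormalized.values())
    return {race: round(p / total, 3) for race, p in unnormalized.items()}

print(infer_race("Garcia", "10001"))  # the model never asked for race, yet "knows" it
```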
Palestine and Israel: Technology’s Role in How We Record and Analyze Data
Oct 25 2023
Palestinians and Israelis have been in conflict for more than a century. To lay blame on one side and call it a day is unfortunately not as easy as it sounds. There are human rights abuses occurring on both sides, inflicted by parties representing each nation/nation-state. Listen to this episode to learn about the historical timeline leading up to today, and how harmful technology like ChatGPT primes people to feel hatred toward one group of people over another. Marc Getzoff shares his knowledge of the histories of Israel, Palestine, and the surrounding region.

We cover the following events:
1948: Israel declares independence / the Arab-Israeli War breaks out
1967: The Six-Day War (also known as The June War, The 1967 Arab-Israeli War, or The Third Arab-Israeli War)
1973: The Yom Kippur War (also known as The October War or The Ramadan War)
1987-1993: The First Intifada
1997: Economic developments and internal Palestinian politics
2000-2005: The Second Intifada (Al-Aqsa Intifada)
2008-2009: Israeli military operations
July 2023: Israel launches its biggest attack on the occupied West Bank in decades
October 2023: The Israeli government attacks and kills almost 6,000 civilians, with thousands more seeking shelter; Hamas-led Palestinian militant groups take several hundred hostages and terrorize civilians

Note that this conflict cannot be easily simplified and all violence is condemnable, regardless of which side the inflicting parties represent. The only way to resolve this issue is by finding a way to stop the bloodshed in both Palestine and Israel. As long as one of us falls victim to the violence of another, we are all imprisoned and shackled. Please spread love, kindness, and accurate information with one another. It is our mission, as those not on the ground in Palestine or Israel, to do our part and take care of our Palestinian and Israeli families abroad. No one deserves violence, and no one deserves heartbreak. Please, PLEASE, please be good to one another.

Check out the Technically Biased Podcast / Gakovii (parent company) mission at:
Website: Home - Gakovii
Instagram: @gakovii__
Company LinkedIn: Gakovii: Overview | LinkedIn
Founder's LinkedIn: Krystyn Gutu, M.S. | LinkedIn

#AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
Technology’s Role in the Policing and Erasure of the Chinese Uyghurs | Episode 1.11
Sep 6 2023
There are human rights abuses occurring in the Xinjiang Uyghur Autonomous Region of China, also known as East Turkestan, in the western part of China. The Chinese government is accused of detaining a significant number of Uyghur Muslims and other ethnic minorities in what it terms "re-education camps" or "vocational training centers." These camps reportedly involve forced labor, cultural assimilation efforts, and religious restrictions, among other violations. Listen to our latest episode to learn how predatory technology is being used against the Uyghur community. Arslan Hidayat shares his knowledge based on research and first-hand interviews.

We cover:
- Facial Recognition Technology (FRT), and how one part of the population (the Uyghurs) is monitored far more closely than the other (the Han Chinese)
- Police checkpoints, the routine and frequent searches Uyghurs must comply with
- WeChat, and how this app is used for everything, making it easier for the Chinese government to monitor data (which has been neatly aggregated in one place for them)
- Trustworthiness rankings, which work much like a credit score, except that instead of judging one's credit, they judge a person's standing as a respectable member of society (and must be scanned before entering and exiting a Uyghur neighborhood, when trying to buy groceries or gas, etc. – a poor score can result in denial of whatever the person was seeking)
- Predictive policing, and how only the predicted likelihood that someone might commit a crime matters... not the likelihood that the machine computed faulty data and produced a gibberish prediction, which can result in a potential lifetime in prison for the person being targeted
- Trusting no one: kids are taught in school to share information about their parents with teachers, employees are encouraged to share information about their peers, and so on

You can find it on our website at https://gakovii.com/podcast/ or on Spotify, Apple, Amazon, and any other platform that hosts podcasts. #AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
Gender Equality & Women Empowerment on an International Platform | Episode 1.10
Aug 30 2023
Get ready for an eye-opening discussion where Alia Flanigan helps us dive deep into the fascinating interplay of culture, language, and gender dynamics on a global scale.

We will be discussing:
– Women on the Global Stage: the international expectations placed on women and the many challenges of navigating a complex world while striving to shatter gender norms – from boardrooms to the household
– Power of Language: the captivating connection between language and biases; the words we use can unintentionally perpetuate stereotypes and impact our understanding of gender roles and societal expectations
– Cultural Lens: how our cultural background shapes the way we perceive the world around us – from our perspectives to our attitudes to our interactions – and how this gives us unique insight into the diverse tapestry of humanity
– Global Relations Impact: how gender dynamics intersect with international relations, and how we can gain insight from women across the globe, forging connections across borders and fostering a more inclusive global dialogue
– Empowerment Redefined: how the new Barbie movie highlighted the powerful message of empowerment for women of all ages and all backgrounds, and why media is so important in reshaping narratives and inspiring positive change

#AlgorithmicBias #PredatoryTech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
Searching for Rare Metals and Renewable Energy - At The Expense of Humans and the Environment | Episode 1.7
Jun 21 2023
Listen to Krystyn Gutu, M.S. introduce award-winning journalist and author Vince Beiser, whose work has been featured in Wired, Time, The Los Angeles Times, Harper's, and National Geographic, among others. His first book, The World in a Grain: The Story of Sand and How It Transformed Civilization, highlights how sand is the most important solid substance on earth – and what that means for the environment and its people. As we start to run out of sand and organized crime around it increases, it is important that we understand the downsides of the energy transition and the digital revolution. Beiser is now working on his next book, Power Metal, about the rare metals behind renewable energy, which may come at the expense of the environment and its people. Metals like lithium, cobalt, rare earth metals, and nickel are needed to build machines like solar panels, wind turbines, and electric cars. What's the trade-off?

In this episode, we discuss:
~ The importance of sand – for roads, buildings, glass, and technology
~ The exploitation of the environment for technological needs
~ The exploitation of people for the mining of rare metals
~ Illegal sand mining and organized crime
~ Whether we should turn to the ocean and outer space for our tech needs
~ Minimizing harm to the environment – on an individual and societal level

Tune into this episode of Technically Biased to learn more! You can find it on our website at https://gakovii.com/podcast/ Technically Biased is also available on Spotify, Apple, Amazon, and RSS, among others. If you like what you hear, feel free to visit our website and subscribe to our newsletter. You can also follow us on LinkedIn and Instagram. Tune in next week to learn about AI's role in creative and journalistic writing. https://gakovii.com/ #algorithmicbias #predatorytech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
Understanding Gender Bias in Tech, From Darwin to World War II to Present-Day | Episode 1.4
May 31 2023
This episode, which discusses Liza Mundy's book, Code Girls: The Untold Story of the American Women Code Breakers of World War II, delves into the history of American women cryptanalysts and code breakers.

Tune into this episode to learn:
- How science used pseudoscientific gender biases to justify the exclusion of women
- How Darwin influenced societal gender expectations and beliefs
- The difference between an American housewife and an American woman code breaker
- How education was seen as a bad thing for women to attain
- How educated women saved American troops during WWII
- How the 1950s narrative suggested women wanted to marry young, and the evidence that showed they'd rather work
- How women made up a combined ¾ of all code breakers in the US Navy and Army during WWII
- How the return of men from war meant women were slowly pushed back into the domestic role

Books referenced:
Code Girls: The Untold Story of the American Women Code Breakers of World War II by Liza Mundy
The Patriarchs: How Men Came to Rule by Angela Saini

Technically Biased is available on Spotify, Apple, Amazon, and RSS, among others. You can also check it out on our site at Podcast - Gakovii. You can check out our website for more book suggestions. If you like what you hear, feel free to visit our website and subscribe to our newsletter. You can also follow us on LinkedIn and Instagram. #algorithmicbias #predatorytech #TechnicallyBiasedPodcast #Gakovii --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
The Patriarchs: How Men Came to Rule with Angela Saini | Episode 1.3
May 24 2023
Listen to author and science journalist Angela Saini discuss the myth of "The Patriarchy." Rather than there being only one, there are and always have been many patriarchies. Her research illustrates how some societies used to be more egalitarian than they are today but have been influenced by the strict gender roles of more recent generations. By understanding the topics below, we may better understand how our perspective on gender influences the sexist biases being codified in tech.

In this episode, we discuss:
- How "we're constrained by our own experiences and beliefs" (The Patriarchs, 7)
- How "subservience does not, on the whole, come naturally to people," and how religion has been used to enforce women's subservience in society (The Patriarchs, 27, 66)
- Çatalhöyük, one of the oldest settlements in ancient civilization, and its more egalitarian gender roles
- Marija Gimbutas' theory that patriarchy in Old Europe dates back to around 3,000 to 6,000 years ago and came from the Russian steppes
- How Greek history saw its goddesses gradually pushed into the background, while Greek gods came to the forefront and began displaying more violent tendencies
- How the concept of the American woman was reimagined in the 1950s to adhere to a more specific housewife trope that had been absent in the previous decades
- How Native American women had more rights than American women, and how American civilization took those rights away
- How Iran views transgender people and subsidizes sexual reassignment operations
- Ways that women maintain sexist patriarchal forces
- What it means to "resign ourselves to the systems and institutions we have, even when we know they're not working," and the blurry "boundary between choice and coercion" (The Patriarchs, 202)
- How Kyrgyzstan still faces high rates of bride kidnappings – both real and traditional
- Women's timeless refusal to give in to patriarchal oppression

Unfortunately, not enough time could be allotted to cover everything, so do check out Angela Saini's new book, The Patriarchs: How Men Came to Rule.

Books referenced:
The Patriarchs: How Men Came to Rule by Angela Saini

Technically Biased is available on Spotify, Apple, Amazon, and RSS, among others. You can also check it out on our site at Podcast - Gakovii. You can check out our website for more book suggestions. If you like what you hear, feel free to visit our website and subscribe to our newsletter. You can also follow us on LinkedIn and Instagram. #algorithmicbias #predatorytech #TechnicallyBiasedPodcast --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech with Meredith Broussard | Episode 1.2
May 17 2023
The second episode of Technically Biased is out now! Listen to expert Meredith Broussard share her knowledge of how biases are codified in tech. She answers all the crazy questions, including:
~ What is technochauvinism?
~ Why shouldn't we use Facial Recognition Technology (FRT) in policing or EdTech?
~ Do the algorithms used to predict recidivism rates ensure that they do not use data on wrongful arrests?
~ Why is Google's skin cancer AI racist?
~ How do algorithms perpetuate systemic biases, using centuries-old data?
~ How did the NFL use bogus race science to perpetuate racism?
~ Why do Black people have to be significantly sicker than any other group of people to receive a kidney transplant? (See the sketch below.)
~ How do algorithms use students' grades to calculate imaginary scores and the likelihood that said students will commit a crime? How are these predictions racist and classist?
~ What are Y2K and Y2GAY?

Unfortunately, not enough time could be allotted to cover all her wisdom, so do check out her new book, More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech! Check out this episode to find out the answers to the above!

Books referenced:
More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech by Meredith Broussard

You can check out our website for more book suggestions. If you like what you hear, feel free to visit our website and subscribe to our newsletter. You can also follow us on LinkedIn and Instagram. Tune in next week for an interview with Angela Saini, who discusses her latest book, The Patriarchs. #algorithmicbias #predatorytech #TechnicallyBiasedPodcast --- Send in a voice message: https://podcasters.spotify.com/pod/show/technically-biased/message
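The kidney-transplant question refers to race "correction" in kidney-function estimates. As a hedged sketch, the 2009 CKD-EPI creatinine equation (coefficients shown approximately; the race term was removed in the 2021 revision) multiplied estimated GFR by roughly 1.16 for patients recorded as Black, which keeps their scores above eligibility cutoffs longer than for otherwise-identical patients:

```python
# Simplified sketch of the 2009 CKD-EPI creatinine equation (coefficients approximate;
# the race multiplier was dropped in the 2021 revision). It shows how the same lab value
# yields a higher eGFR for a patient recorded as Black, delaying transplant listing.
def egfr_2009(scr_mg_dl, age, female, black):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159   # the race "correction" at issue
    return round(egfr, 1)

# Identical patient data, differing only in recorded race:
print(egfr_2009(3.0, 55, female=False, black=False))  # lower eGFR, closer to eligibility
print(egfr_2009(3.0, 55, female=False, black=True))   # ~16% higher, listed later
```

Identical lab values with a different recorded race can mean different access to the transplant list, which is the kind of codified bias Broussard documents.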