The Shifting Privacy Left Podcast

Debra J. Farber (Shifting Privacy Left)

Shifting Privacy Left features lively discussions on the need for organizations to embed privacy by design into the UX/UI, architecture, engineering / DevOps and the overall product development processes BEFORE code or products are ever shipped. Each Tuesday, we publish a new episode that features interviews with privacy engineers, technologists, researchers, ethicists, innovators, market makers, and industry thought leaders. We dive deeply into this subject and unpack the exciting elements of emerging technologies and tech stacks that are driving privacy innovation; strategies and tactics that win trust; privacy pitfalls to avoid; privacy tech issues ripped from the headlines; and other juicy topics of interest.

Technology

Episodes

S3E10: 'How a Privacy Engineering Center of Excellence Shifts Privacy Left' with Aaron Weller (HP)
Apr 9 2024
In this episode, I sat down with Aaron Weller, the Leader of HP's Privacy Engineering Center of Excellence (CoE), focused on providing technical solutions for privacy engineering across HP's global operations. Throughout our conversation, we discuss: what motivated HP's leadership to stand up a CoE for Privacy Engineering; Aaron's approach to staffing the CoE; how a CoE can shift privacy left in a large, matrixed organization like HP's; and how to leverage the CoE to proactively manage privacy risk.

Aaron emphasizes the importance of understanding an organization's strategy when creating a CoE and shares his methods for gathering data to inform the center's roadmap and team building. He also highlights the great impact that a Center of Excellence can offer and gives advice for implementing one in your organization. We touch on the main challenges in privacy engineering today and the value of designing user-friendly privacy experiences. In addition, Aaron provides his perspective on selecting the right combination of Privacy Enhancing Technologies (PETs) for anonymity, how to go about implementing PETs, and the role that AI governance plays in his work.
Topics Covered:
- Aaron's deep privacy and consulting background and how he ended up leading HP's Privacy Engineering Center of Excellence
- The definition of a "Center of Excellence" (CoE) and how a Privacy Engineering CoE can drive value for an organization and shift privacy left
- What motivates a company like HP to launch a CoE for Privacy Engineering and what its reporting line should be
- Aaron's approach to creating a Privacy Engineering CoE roadmap; his strategy for staffing this CoE; and the skills & abilities that he sought
- How HP's Privacy Engineering CoE works with the business to advise on, and select, the right PETs for each business use case
- Why it's essential to know the privacy guarantees that your organization wants to assert before selecting the right PETs to get you there
- Lessons learned from setting up a Privacy Engineering CoE and how to get executive sponsorship
- The amount of time that Privacy teams have had to work on AI issues over the past year, and advice on preventing burnout
- Aaron's hypothesis about the value of getting an early handle on governance over the adoption of innovative technologies
- The importance of being open to continuous learning in the field of privacy engineering

Guest Info:
- Connect with Aaron on LinkedIn
- Learn about HP's Privacy Engineering Center of Excellence
- Review the OWASP Machine Learning Security Top 10
- Review the OWASP Top 10 for LLM Applications

Privado.ai - Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.
TRU Staffing Partners - Top privacy talent - when you need it, where you need it.
Shifting Privacy Left Media - Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.
Copyright © 2022 - 2024 Principled LLC. All rights reserved.
S3E9: 'Building a Culture of Privacy & Achieving Compliance without Sacrificing Innovation' with Amaka Ibeji (Cruise)
Apr 2 2024
Today, I’m joined by Amaka Ibeji, Privacy Engineer at Cruise, where she designs and implements robust privacy programs and controls. In this episode, we discuss Amaka's passion for creating a culture of privacy and compliance within organizations and engineering teams. Amaka also hosts the PALS Parlor Podcast, where she speaks to business leaders and peers about privacy, AI governance, leadership, and security and explains technical concepts in a digestible way. The podcast aims to enable business leaders to do more with their data and provides a way for the community to share knowledge with one another.

In our conversation, we touch on her career trajectory from security engineer to privacy engineer and the intersection of cybersecurity, privacy engineering, and AI governance. We highlight the importance of early engagement with various technical teams to enable innovation while still achieving privacy compliance. Amaka also shares the privacy-enhancing technologies (PETs) that she is most excited about, and she recommends resources for those who want to learn more about strategic privacy engineering. Amaka emphasizes that privacy is a systemic, 'wicked problem' and offers her tips for understanding and approaching it.
Topics Covered:
- How Amaka's compliance-focused experience at Microsoft helped prepare her for her Privacy Engineering role at Cruise
- Where privacy overlaps with the development of AI
- Advice for shifting privacy left to make privacy stretch beyond a compliance exercise
- What works well and what doesn't when building a 'Culture of Privacy'
- Privacy by Design approaches that make privacy & innovation a win-win rather than a zero-sum game
- Privacy Engineering trends that Amaka sees; and, the PETs about which she's most excited
- Amaka's Privacy Engineering resource recommendations, including: Hoepman's "Privacy Design Strategies" book; the LINDDUN Privacy Threat Modeling Framework; and the PLOT4AI Framework
- "The PALS Parlor Podcast," focused on Privacy Engineering, AI Governance, Leadership, & Security: why Amaka launched the podcast; her intended audience; and topics that she plans to cover this year
- The importance of collaboration; building a community of passionate privacy engineers; and addressing the systemic issue of privacy

Guest Info & Resources:
- Follow Amaka on LinkedIn
- Listen to The PALS Parlor Podcast
- Read Jaap-Henk Hoepman's "Privacy Design Strategies (The Little Blue Book)"
- Read Jason Cronk's "Strategic Privacy by Design, 2nd Edition"
- Check out The LINDDUN Privacy Threat Modeling Framework
- Check out The Privacy Library of Threats for Artificial Intelligence (PLOT4.AI) Framework
S3E8: 'Recent FTC Enforcement: What Privacy Engineers Need to Know' with Heidi Saas (H.T. Saas)
Mar 26 2024
In this week's episode, I am joined by Heidi Saas, a privacy lawyer with a reputation for advocating for products and services built with privacy by design and against the abuse of personal data. In our conversation, she dives into recent FTC enforcement actions, analyzing five FTC actions and some enforcement sweeps by Colorado & Connecticut. Heidi shares her insights on the effect of the FTC enforcement actions and what privacy engineers need to know, emphasizing the need for data management practices to be transparent, accountable, and based on affirmative consent. We cover the role of privacy engineers in ensuring compliance with data privacy laws; why 'browsing data' is 'sensitive data'; the challenges companies face regarding data deletion; and the need for clear consent mechanisms, especially with the collection and use of location data. We also discuss the need to audit the privacy posture of products and services - which includes a requirement to document who made certain decisions - and how to prioritize risk analysis to proactively address risks to privacy.

Topics Covered:
- Heidi's journey into privacy law and advocacy for privacy by design and default
- How the FTC brings enforcement actions, the effect of their settlements, and why privacy engineers should pay closer attention
- Case 1: FTC v. InMarket Media - Heidi explains the implication of the decision: data that are linked to a mobile advertising identifier (MAID) or an individual's home are not considered de-identified
- Case 2: FTC v. X-Mode Social / OutLogic - Heidi explains the implication of the decision, focused on: affirmative express consent for location data collection; the definition of a 'data product assessment' and audit programs; and data retention & deletion requirements
- Case 3: FTC v. Avast - Heidi explains the implication of the decision: 'browsing data' is considered 'sensitive data'
- Case 4: The People (CA) v. DoorDash - Heidi explains the implications of the decision, based on CalOPPA: companies that share personal data with one another as part of a 'marketing cooperative' are, in fact, selling data
- Heidi discusses recent state enforcement sweeps for privacy, specifically in Colorado and Connecticut, and clarity around breach reporting timelines
- The need to prioritize independent third-party audits for privacy
- Case 5: FTC v. Kroger - Heidi explains why the FTC's blocking of Kroger's merger with Albertson's was based on antitrust and privacy harms, given the sheer amount of personal data that they process
- Tools and resources for keeping up with FTC cases and connecting with your privacy community

Guest Info:
- Follow Heidi on LinkedIn
- Read (book): 'Means of Control: How the Hidden Alliance of Tech and Government is Creating a New American Surveillance State'
S3E7: 'Personal CRM: Embracing Digital Minimalism & Privacy Empowerment' with Chris Zeunstrom (Yorba)
Mar 19 2024
In this week's episode, I chat with Chris Zeunstrom, the Founder and CEO of Ruca and Yorba. Ruca is a global design cooperative and founder support network, while Yorba is a reverse CRM that aims to reduce your digital footprint and keep your personal information safe. Through his businesses, Chris focuses on solving common problems and creating innovative products. In our conversation, we talk about building a privacy-first company, the digital minimalist movement, and the future of decentralized identity and storage.

Chris shares his journey as a privacy-focused entrepreneur and his mission to prioritize privacy and decentralization in managing personal data. He also explains the digital minimalist movement and why its teachings reach beyond the industry. Chris touches on Yorba's collaboration with Consumer Reports to implement Permission Slip and create a Data Rights Protocol ecosystem that automates data deletion for consumers. Chris also emphasizes the benefits of decentralized identity and storage solutions in improving personal privacy and security. Finally, he gives you a sneak peek at what's next in store for Yorba.

Topics Covered:
- How Yorba was designed as a privacy-first consumer CRM platform; the problems that Yorba solves; and key product functionality & privacy features
- Why Chris decided to bring a consumer product to market for privacy rather than a B2B product
- Why Chris incorporated Yorba as a 'Public Benefit Corporation' (PBC) and sought B Corp status
- Exploring 'Digital Minimalism'
- How Yorba is working with Consumer Reports to advance the CR Data Rights Protocol, leveraging 'Permission Slip' - an authorized agent for consumers to submit data deletion requests
- The architectural design decisions behind Yorba's personal CRM system
- The benefits of using Matomo Analytics or Fathom Analytics for greater privacy vs. using Google Analytics
- The privacy benefits of deploying 'Decentralized Identity' & 'Decentralized Storage' architectures
- Chris' vision for the next stage of the Internet; and, the future of Yorba

Guest Info:
- Follow/Connect with Chris on LinkedIn
- Check out Yorba's website

Resources Mentioned:
- Read: TechCrunch's review of Yorba
- Read: 'Digital Minimalism: Choosing a Focused Life in a Noisy World' by Cal Newport
- Subscribe to the Bullet Journal (AKA Bujo) on Digital Minimalism by Ryder Carroll
- Learn about Consumer Reports' Permission Slip Protocol
- Check out Matomo Analytics and Fathom for privacy-first analytics platforms
S3E6: 'Keys to Good Privacy Implementation: Exploring Anonymization, Consent, & DSARs' with Jake Ottenwaelder (Integrative Privacy)
Mar 5 2024
In this week's episode, I sat down with Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy LLC. Throughout our conversation, we discuss Jake’s holistic approach to privacy implementation that considers business, engineering, and personal objectives, as well as the role of anonymization, consent management, and DSAR processes for greater privacy. Jake believes privacy implementation must account for the interconnectedness of privacy technologies and human interactions. He highlights what a successful implementation looks like and the negative consequences when done poorly. We also dive into the challenges of implementing privacy in fast-paced, engineering-driven organizations. We talk about the complexities of anonymizing data (a very high bar), and he offers valuable suggestions and strategies for achieving anonymity while making the necessary resources more accessible. Plus, Jake shares his advice for organizational leaders to see themselves as servant-leaders, leaving a positive legacy in the field of privacy.
Topics Covered:
- What inspired Jake’s initial shift from security engineering to privacy engineering, with a focus on privacy implementation
- How Jake's previous role at Axon helped him shift his mindset to privacy
- Jake’s holistic approach to implementing privacy
- The qualities of a successful implementation and the consequences of an unsuccessful implementation
- The challenges of implementing privacy in large organizations
- Common blockers to the deployment of anonymization
- Jake’s perspective on using differential privacy techniques to achieve anonymity
- Common blockers to implementing consent management capabilities
- The importance of understanding data flow & lineage, and auditing data deletion
- Holistic approaches to implementing a streamlined and compliant DSAR process with minimal business disruption
- Why Jake believes it's important to maintain a servant-leader mindset in privacy

Guest Info:
- Connect with Jake on LinkedIn
- Integrative Privacy LLC
S3E5: 'Nonconformist Innovation in Modern Digital Identity' with Steve Tout (Integrated Solutions Group)
Feb 27 2024
In this week's episode, I am joined by Steve Tout, Practice Lead at Integrated Solutions Group (ISG) and Host of The Nonconformist Innovation Podcast, to discuss the intersection of privacy and identity. Steve has 18+ years of experience in global Identity & Access Management (IAM) and is currently completing his MBA at Santa Clara University. Throughout our conversation, Steve shares his journey as a reformed technologist and advocate for 'Nonconformist Innovation' & 'Tipping Point Leadership.'

Steve's approach to identity involves breaking it down into 4 components: 1) philosophy, 2) politics, 3) economics & 4) technology, highlighting their interconnectedness. We also discuss his work with Washington State and its efforts to modernize Consumer Identity & Access Management (IAM). We address concerns around AI, biometrics & mobile driver's licenses. Plus, Steve offers his perspective on tipping point leadership and the challenges organizations face in achieving privacy change at scale.

Topics Covered:
- Steve's origin story; his accidental entry into identity & access management (IAM)
- Steve's perspective as a 'Nonconformist Innovator' and why he launched 'The Nonconformist Innovation Podcast'
- The intersection of privacy & identity
- How to address organizational resistance to change, especially with lean resources
- Benefits gained from 'Tipping Point Leadership'
- 4 common hurdles to tipping point leadership
- How to be a successful tipping point leader within a very bottom-up focused organization
- 'Consumer IAM' & the driving need for modernizing identity in Washington State
- How Steve has approached the challenges related to privacy, ethics & equity
- Differences between the mobile driver's license (mDL) & verified credentials (VC) standards & technology
- How states are approaching the implementation of mDL in different ways, and the privacy benefits of 'selective disclosure'
- Steve's advice for privacy technologists to best position them and their orgs at the forefront of privacy and security innovation
- Steve's recommended books for learning more about tipping point leadership

Guest Info:
- Connect with Steve on LinkedIn
- Listen to The Nonconformist Innovation Podcast

Resources Mentioned:
- Steve's interview with Tom Kemp
- Tipping Point Leadership books: On Change Management; Organizational Behavior
- Ethics in the Age of Disruptive Technologies: An Operational Roadmap
S3E4: 'Supporting Developer Accountability for Privacy' with Jake Ward (Data Protocol)
Feb 13 2024
This week, I chat with Jake Ward, the Co-Founder and CEO of Data Protocol, to discuss how the Data Protocol platform supports developers' accountability for privacy by giving developers the relevant information in the way that they want it. Throughout the episode, we cover the Privacy Engineering course offerings and certification program; how to improve communication with developers; and trends that Jake sees across his customers after 2 years of offering these courses to engineers.

In our conversation, we dive into the topics covered in the Privacy Engineering Certification Program course offering, led by instructor Nishant Bhajaria, and the impact that engineers can make in their organization after completing it. Jake shares why he's so passionate about empowering developers, enabling them to build safer products. We talk about the effects of privacy engineering on large tech companies and how to bridge the gap between developers and the support they need with collaboration and accountability. Plus, Jake reflects on his own career path as the Press Secretary for a U.S. Senator and the experiences that shaped his perspectives and brought him to where he is now.

Topics Covered:
- Jake’s career journey and why he landed on supporting software developers
- How Jake built Data Protocol and its community
- What 'shifting privacy left' means to Jake
- Data Protocol's Privacy Engineering Courses, Labs, & Certification Program and what developers will take away
- The difference between Data Protocol's free Privacy Courses and paid Certification
- Feedback from customers and trends observed
- Whether tech companies have seen improvement in engineers' ability to embed privacy into the development of products & services after completing the Privacy Engineering courses and labs
- Other privacy-related courses available on Data Protocol, and privacy courses on the roadmap
- Ways to leverage communications to surmount current challenges
- How organizations can make their developers accountable for privacy, and the importance of aligning responsibility, accountability & business processes
- How Debra would operationalize this accountability into an organization
- How you can use the PrivacyCode.ai privacy tech platform to enable the operationalization of privacy accountability for developers

Resources Mentioned:
- Check out Data Protocol's courses, based on topic
- Enroll in The Privacy Engineering Certification Program (courses are free)
- Check out S3E2: 'My Top 20 Privacy Engineering Resources for 2024'

Guest Info:
- Connect with Jake on LinkedIn
S3E3: 'Shifting Left from Practicing Attorney to Privacy Engineer’ with Jay Averitt (Microsoft)
Jan 30 2024
My guest this week is Jay Averitt, Senior Privacy Product Manager and Privacy Engineer at Microsoft, who transitioned his career from Technology Attorney to Privacy Counsel, and most recently to Privacy Engineer.

In this episode, we hear from Jay about: his professional path from a degree in Management Information Systems to Privacy Engineer; how Twitter and Microsoft each navigated their privacy setup, and how to determine privacy program maturity; several of his Privacy Engineering community projects; and tips on how to spread privacy awareness and stay active within the industry.

Topics Covered:
- Jay’s unique professional journey from Attorney to Privacy Engineer
- Jay’s big mindset shift from serving as Privacy Counsel to Privacy Engineer, from a day-to-day and internal perspective
- Why constant learning is essential in the field of privacy engineering, requiring us to keep up with ever-changing laws, standards, and technologies
- Jay’s comparison of what it's like to work for Twitter vs. Microsoft when it comes to how each company focuses on privacy and data protection
- Two ways to determine Privacy Program Maturity, according to Jay
- How engineering-focused organizations can unify around a corporate privacy strategy, and how privacy pros can connect with people beyond their siloed teams
- Why building and maintaining relationships is the key for privacy engineers to be seen as enablers instead of blockers
- A detailed look at the 'Technical Privacy Review' process
- A peek into Privacy Quest’s gamified privacy engineering platform and the events that Jay & Debra are leading as part of its DPD'24 Festival Village month-long puzzles and events
- Debra's & Jay's experiences at USENIX PEPR'23; why it provided so much value for them both; and, why you should consider attending PEPR'24
- Ways to utilize online Slack communities, LinkedIn, and other tools to stay active in the privacy engineering world

Resources Mentioned:
- Review talks from the University of Illinois 'Privacy Everywhere Conference 2024'
- Join the Privacy Quest Village's 'Data Privacy Day’24 Festival' (through Feb 18th)
- Submit a Proposal / Register for the USENIX PEPR ‘24 Conference

Guest Info:
- Connect with Jay on LinkedIn
S3E2: 'My Top 20 Privacy Engineering Resources for 2024' with Debra Farber (Shifting Privacy Left)
Jan 23 2024
In honor of Data Privacy Week 2024, we're publishing a special episode. Instead of interviewing a guest, Debra shares her 'Top 20 Privacy Engineering Resources' and why. Check out her favorite free privacy engineering courses, books, podcasts, creative learning platforms, privacy threat modeling frameworks, conferences, government resources, and more.

DEBRA'S TOP 20 PRIVACY ENGINEERING RESOURCES (in no particular order):
- Privado's free course: 'Technical Privacy Masterclass'
- OpenMined's free course: 'Our Privacy Opportunity'
- Data Protocol's Privacy Engineering Certification Program
- The Privacy Quest Platform & Games; Bonus: The Hitchhiker's Guide to Privacy Engineering
- 'Data Privacy: A Runbook for Engineers' by Nishant Bhajaria
- 'Privacy Engineering: A Data Flow and Ontological Approach' by Ian Oliver
- 'Practical Data Privacy: Enhancing Privacy and Security in Data' by Katharine Jarmul
- 'Strategic Privacy by Design, 2nd Edition' by R. Jason Cronk
- 'The Privacy Engineer's Manifesto: Getting from Policy to Code to QA to Value' by Michelle Finneran-Dennedy, Jonathan Fox and Thomas R. Dennedy
- USENIX Conference on Privacy Engineering Practice and Respect (PEPR)
- IEEE's International Workshop on Privacy Engineering (IWPE)
- Institute of Operational Privacy Design (IOPD)
- 'The Shifting Privacy Left Podcast,' produced and hosted by Debra J. Farber and sponsored by Privado
- Monitaur's 'The AI Fundamentalists Podcast,' hosted by Andrew Clark & Sid Mangalik
- Skyflow's 'Partially Redacted Podcast' with Sean Falconer
- The LINDDUN Privacy Threat Model Framework & LINDDUN GO Card Game
- The Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) Framework & PLOT4ai Card Game
- The IAPP Privacy Engineering Section
- The NIST Privacy Engineering Program Collaboration Space
- The EDPS Internet Privacy Engineering Network (IPEN)

Read "Top 20 Privacy Engineering Resources" on Privado's Blog.
S3E1: "Privacy-preserving Machine Learning and NLP" with Patricia Thaine (Private AI)
Jan 2 2024
My guest this week is Patricia Thaine, Co-founder and CEO of Private AI, where she leads a team of experts in developing cutting-edge solutions using AI to identify, reduce, and remove Personally Identifiable Information (PII) in 52 languages across text, audio, images, and documents.

In this episode, we hear from Patricia about: her transition from starting a Ph.D. to co-founding an AI company; how Private AI set out to solve fundamental privacy problems to provide control and understanding of data collection; misunderstandings about how best to leverage AI regarding privacy-preserving machine learning; Private AI’s intention when designing their software, plus newly deployed features; and whether global AI regulations can help with current risks around privacy, rogue AI, and copyright.

Topics Covered:
- Patricia’s professional journey from starting a Ph.D. in Acoustic Forensics to co-founding an AI company
- Why Private AI’s mission is to solve privacy problems and create a platform for developers to modularly and flexibly integrate it anywhere in your software pipeline, including model ingress & egress
- How companies can avoid mishandling personal information when leveraging AI / machine learning, and Patricia’s advice for doing so
- Why keeping track of ever-changing data collection and regulations makes it hard to find personal information
- Private AI's privacy-enabling architectural approach to finding personal data to prevent it from being used by or stored in an AI model
- The approach that Private AI took to design their software
- Private AI's extremely high matching rate, and how they aim for 99%+ accuracy
- Private AI's roadmap & R&D efforts
- Debra & Patricia discuss AI regulation and Patricia's insights from her article 'Thoughts on AI Regulation'
- A foreshadowing of AI’s copyright risk problem and whether regulations or licenses can help
- ChatGPT’s popularity, copyright, and the need for embedding privacy, security, and safety by design from the beginning (in the MVP)
- How to reach out to Patricia to connect, collaborate, or access a demo
- How thinking about the fundamentals gets you a good way on your way to ensuring privacy & security

Resources Mentioned:
- Read Yoshua Bengio’s blog post: "How Rogue AI's May Arise"
- Read: Microsoft's Digital Defense Report 2023
- Read Patricia’s article, "Thoughts on AI Regulation"

Guest Info:
- Connect with Patricia on LinkedIn
- Check out Private AI
- Demo Private AI
S2E39: 'Contextual Responsive Intelligence & Data Minimization for AI Training & Testing' with Kevin Killens (AHvos)
Dec 26 2023
My guest this week is Kevin Killens, CEO of AHvos, a technology service that provides AI solutions for data-heavy businesses using a proprietary technology called Contextually Responsive Intelligence (CRI), which can act upon a business's private data and produce results without storing that data.

In this episode, we delve into this technology and learn more from Kevin about: his transition from serving in the Navy to founding an AI-focused company; AHvos’ architectural approach in support of data minimization and a reduced attack surface; AHvos' CRI technology and its ability to provide accurate answers based on private data sets; and how AHvos’ Data Crucible product helps AI teams to identify and correct inaccurate dataset labels.

Topics Covered:
- Kevin’s origin story, from serving in the Navy to founding AHvos
- How Kevin thinks about privacy and the architectural approach he took when building AHvos
- The challenges of processing personal data, 'security for privacy,' and the applicability of the GDPR when using AHvos
- Kevin explains the benefits of Contextually Responsive Intelligence (CRI), which abstracts out raw data to protect privacy; finds & creates relevant data in response to a query; and identifies & corrects inaccurate dataset labels
- How human-created algorithms and oversight influence AI parameters and model bias; and, why transparency is so important
- How customer data is ingested into models via AHvos
- Why it is important to remove bias from Testing Data, not only Training Data; and, how AHvos ensures accuracy
- How AHvos' Data Crucible identifies & corrects inaccurate dataset labels
- Kevin's advice for privacy engineers as they tackle AI challenges in their own organizations
- The impact of technical debt on companies and the importance of building slowly & correctly rather than racing to market with insecure and biased AI models
- The importance of baking security and privacy into your minimum viable product (MVP), even for products that are still in 'beta'

Guest Info:
- Connect with Kevin on LinkedIn
- Check out AHvos
- Check out Trinsic Technologies
S2E38: "PrivacyGPT: Bringing an AI Privacy Startup to Market" with Nabanita De (Privacy License)
Dec 19 2023
S2E38: "PrivacyGPT: Bringing an AI Privacy Startup to Market" with Nabanita De (Privacy License)
My guest this week is Nabanita De, Software Engineer, Serial Entrepreneur, and Founder & CEO at Privacy License, where she's on a mission to transform the AI landscape. In this episode, we discuss Nabanita's transition from Engineering Manager at Remitly to startup founder; what she's learned from her experience in Antler's accelerator program; her first product to market, PrivacyGPT; and her work to educate Privacy Champions.

Topics Covered:
- Nabanita's origin story, from conducting AI research at Microsoft as an intern all the way to founding Privacy License
- How Privacy License supports enterprises entering the global market while protecting privacy as a human right
- A comparison of Nabanita's corporate role as Privacy Engineering Manager at Remitly with her entrepreneurial role as Founder-in-Residence at Antler
- How PrivacyGPT, a Chrome browser plugin, empowers people to use ChatGPT with added privacy protections, without compromising data privacy standards, by redacting sensitive and personal data before sending it to ChatGPT
- NLP techniques that Nabanita leveraged to build out PrivacyGPT, including 'regular expressions,' 'parts of speech tagging,' & 'named entity recognition'
- How PrivacyGPT can be used to protect privacy across nearly all languages, even where a user has no Internet connection
- How to use Product Hunt to gain visibility around a newly-launched product; and whether it's easier to raise a financial round in the AI space right now
- Nabanita's advice for software engineers who might found a privacy or AI startup in the near future
- Why Nabanita created a Privacy Champions Program; and how it provides non-privacy folks with recommendations to prioritize privacy within their organizations
- How to sign up for PrivacyGPT's paid pilot app, connect with Nabanita to collaborate, or subscribe to "Nabanita's Moonshots Newsletter" on LinkedIn

Resources Mentioned:
- Check out Privacy License
- Learn more about PrivacyGPT
- Install the PrivacyGPT Chrome Extension
- Learn about Data Privacy Week 2024

Guest Info:
- Connect with Nabanita on LinkedIn
- Subscribe to Nabanita's Moonshots Newsletter
- Learn more about The Nabanita De Foundation
- Learn more about Covid Help for India
- Learn more about Project FiB
S2E37: "Embedding Privacy Engineering into Real Estate" with Yusra Ahmad and Luke Beckley (The RED Foundation)
Dec 5 2023
S2E37: "Embedding Privacy Engineering into Real Estate" with Yusra Ahmad and Luke Beckley (The RED Foundation)
My guests this week are Yusra Ahmad, CEO of Acuity Data, and Luke Beckley, Data Protection Officer and Privacy Governance Manager at Correla, who work with The RED (Real Estate Data) Foundation, a sector-wide alliance that enables the real estate sector to benefit from an increased use of data, while avoiding some of the risks that this presents, and better serving society. We discuss the current drivers for change within the real estate industry and the complexities of an industry that utilizes incredible amounts of data. You'll learn the types of data protection, privacy, and ethical challenges The RED Foundation seeks to solve, especially now with the advent of new technologies. Yusra and Luke also discuss some of the ethical questions facing the real estate sector as it considers leveraging new technology. They come to the conversation from the knowledgeable perspectives of The RED Foundation's Chair of the Data Ethics Steering Group and Chair of the Engagement and Awareness Group, respectively.

Topics Covered:
- Introducing Luke Beckley (DPO, Privacy & Governance Manager at Correla) and Yusra Ahmad (CEO of Acuity Data), who are here to talk about their data ethics work at The RED Foundation
- How the scope, sophistication, & connectivity of data is increasing exponentially in the real estate industry
- Why ESG, workplace experience, & smart city development are drivers of data collection; and the need for data ethics reform within the real estate industry
- The types of personal data real estate companies collect & use across stakeholders: owners, operators, occupiers, employees, residents, etc.
- Current approaches that retailers take to protect location data, when collected; and why it's important to simplify language, increase transparency, & make consumers aware of tracking in in-store WiFi privacy notices
- Overview of The RED Foundation & its mission: to ensure the real estate sector benefits from an increased use of data, avoids some of the risks that this presents, and is better placed to serve society
- Some ethical questions on which the real estate sector still needs to align, along with examples
- Why there's a need to educate the real estate industry on privacy-enhancing tech
- The need for privacy engineers and PETs in real estate; and why this will build trust with the different stakeholders
- Guidance for privacy engineers who want to work in the real estate sector
- Ways to collaborate with The RED Foundation to standardize data ethics practices across the real estate industry
- Why there's great opportunity to embed privacy into real estate; and why its current challenges are really obstacles, rather than blockers

Resources Mentioned:
- Check out The RED Foundation

Guest Info:
- Follow Yusra on LinkedIn
- Follow Luke on LinkedIn
S2E36: "Privacy Engineering Contracting: State of the Market & 2024 Predictions" with Jared Coseglia (TRU Staffing)
Nov 21 2023
S2E36: "Privacy Engineering Contracting: State of the Market & 2024 Predictions" with Jared Coseglia (TRU Staffing)
This week, I welcome Jared Coseglia, co-founder and CEO at TRU Staffing Partners, a contract staffing & executive placement search firm that represents talent across 3 core industry verticals: data privacy, eDiscovery, & cybersecurity. We discuss the current and future state of the contracting market for privacy engineering roles and the market drivers that affect hiring. You'll learn about hiring trends and the allure of 'part-time impact,' 'part-time perpetual,' and 'secondee' contract work. Jared illustrates the challenges that hiring managers face with a 'do-it-yourself' staffing process, and he shares his predictions about the job market for privacy engineers over the next 2 years. Jared comes to the conversation with a lot of data that supports his predictions, and sage advice for privacy engineering hiring managers and job seekers.

Topics Covered:
- How the privacy contracting market compares and contrasts with the full-time hiring market; and why we currently see a steep rise in privacy contracting
- Why full-time hiring for privacy engineers won't likely rebound until Q4 2024; and how hiring for privacy typically follows a 2-year cycle
- Why companies & employees benefit from fractional contracts; and the differences between contracting types: 'Part-Time - Impact,' 'Part-Time - Perpetual,' and 'Secondee'
- How hiring managers typically find privacy engineering candidates
- Why it's far more difficult to hire privacy engineers for contracts; and how a staffing partner like TRU can supercharge your hiring efforts and help you avoid the pitfalls of a 'do-it-yourself' approach
- How contract work benefits privacy engineers financially, while also providing them with project diversity
- How salaries are calculated for privacy engineers; and the driving forces behind pay discrepancies across privacy roles
- Jared's advice to 2024 job seekers, based on his market predictions; and why privacy contracting increases 'speed to hire' compared to hiring FTEs
- Why privacy engineers can earn more money by changing jobs in 2024 than they could by seeking raises in their current companies; and discussion of 2024 salary ranges across industry segments
- Jared's advice on how privacy engineers can best position themselves to contract hiring managers in 2024
- Recommended resources for privacy engineering employers and job seekers

Resources Mentioned:
- Read: "State of the Privacy Job Market Q3 2023"
- Subscribe to TRU Insights

Guest Info:
- Connect with Jared on LinkedIn
- Learn more about TRU Staffing Partners
- Engineering Managers: Check out TRU Staffing's data privacy staffing solutions
- PE Candidates: Apply to open privacy positions
S2E35: "Embed Ethics into Your SDLC: From Reactive Firefighting to 'Responsible Firekeeping'" with Mathew Mytka & Alja Isaković (Tethix)
Nov 14 2023
S2E35: "Embed Ethics into Your SDLC: From Reactive Firefighting to 'Responsible Firekeeping'" with Mathew Mytka & Alja Isaković (Tethix)
This week’s guests are Mathew Mytka and Alja Isaković, Co-Founders of Tethix, a company that builds products that embed ethics into the fabric of your organization. We discuss Mat and Alja's core mission to bring ethical tech to the world, and Tethix's services that work with your Agile development processes. You'll learn about Tethix's solution to address the 'Intent to Action Gap,' and what Elemental Ethics can provide organizations beyond other ethics frameworks. We discuss ways to become a proactive Responsible Firekeeper, rather than remaining a reactive Firefighter, and how ETHOS, Tethix's suite of apps, can help organizations embody and embed ethics into everyday practice.

TOPICS COVERED:
- What inspired Mat & Alja to co-found Tethix, and the company's core mission
- What the 'Intent to Action Gap' is and how Tethix addresses it
- Overview of Tethix's Elemental Ethics framework; and how it empowers product development teams to close the 'Intent to Action Gap' and move orgs from a state of 'Agile Firefighting' to 'Responsible Firekeeping'
- Why Agile is an insufficient process for embedding ethics into software and product development; and how you can turn to Elemental Ethics and Responsible Firekeeping to embed 'Ethics-by-Design' into your Agile workflows
- The definition of 'Responsible Firekeeping' and its benefits; and how Responsible Firekeeping transitions Agile teams from a reactive posture to a proactive one
- Why you should choose Elemental Ethics over conventional ethics frameworks
- Tethix's suite of apps called ETHOS (the Ethical Tension and Health Operating System), which help teams embed ethics into their collaboration tech stack (e.g., JIRA, Slack, Figma, Zoom, etc.)
- How you can become a Responsible Firekeeper
- The level of effort required to implement Elemental Ethics & Responsible Firekeeping into product development, based on org size and level of maturity
- Alja's contribution to ResponsibleTech.Work, an open source Responsible Product Development Framework; core elements of the Framework; and why we need it
- Where to learn more about Responsible Firekeeping

RESOURCES MENTIONED:
- Read: "Day in the Life of a Responsible Firekeeper"
- Review the ResponsibleTech.Work Framework
- Subscribe to the Pathfinders Newmoonsletter

GUEST INFO:
- Connect with Mat on LinkedIn
- Connect with Alja on LinkedIn
- Check out Tethix's Website
S2E34: "Embedding Privacy by Design & Threat Modeling for AI" with Isabel Barberá (Rhite & PLOT4ai)
Nov 7 2023
S2E34: "Embedding Privacy by Design & Threat Modeling for AI" with Isabel Barberá (Rhite & PLOT4ai)
This week’s guest is Isabel Barberá, Co-founder, AI Advisor, and Privacy Engineer at Rhite, a consulting firm specializing in responsible and trustworthy AI and privacy engineering, and creator of the Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) framework and card game. In our conversation, we discuss Isabel's work with privacy-by-design, privacy engineering, privacy threat modeling, and building trustworthy AI, as well as Rhite's forthcoming open-source self-assessment framework for AI maturity, SARAI®. As we wrap up the episode, Isabel shares details about PLOT4ai, her AI threat modeling framework and card game, built on a library of threats for artificial intelligence.

Topics Covered:
- How Isabel became interested in privacy engineering, data protection, privacy by design, threat modeling, and trustworthy AI
- How companies are thinking (or not) about incorporating privacy-by-design strategies & tactics and privacy engineering approaches within their orgs today
- What steps can be taken so companies start investing in privacy engineering approaches; and whether AI has become a driver for such approaches
- Background on Isabel's company, Rhite, and its mission to build responsible solutions for society and its individuals using a technical mindset
- What 'Responsible & Trustworthy AI' means to Isabel
- The 5 core values that make up the acronym R-H-I-T-E, and why they're important for designing and building products & services
- Isabel's advice for organizations as they approach AI risk assessments, analysis, & remediation
- The steps orgs can take in order to build responsible AI products & services
- What Isabel hopes to accomplish through Rhite's new framework, SARAI® (for AI maturity), an open source AI self-assessment tool and framework, and an extension of the Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) framework (i.e., a library of AI risks)
- What motivated Isabel to focus on threat modeling for privacy
- How PLOT4ai builds on LINDDUN (which focuses on software development) and extends threat modeling to the AI lifecycle stages: Design, Input, Modeling, & Output
- How Isabel's experience with the LINDDUN Go card game inspired her to develop a PLOT4ai card game to make threat modeling more accessible to teams
- Isabel's call for collaborators to contribute to the PLOT4ai open source database of AI threats as the community grows

Resources Mentioned:
- Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai)
- PLOT4ai's GitHub Threat Repository
- "Threat Modeling Generative AI Systems with PLOT4ai"
- Self-Assessment for Responsible AI (SARAI®)
- LINDDUN Privacy Threat Model Framework
- "S2E19: Privacy Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)"
- "Data Privacy: a runbook for engineers"

Guest Info:
- Isabel's LinkedIn Profile
- Rhite's Website
S2E33: "Using Privacy Code Scans to Shift Left into DevOps" with Vaibhav Antil (Privado)
Oct 31 2023
S2E33: "Using Privacy Code Scans to Shift Left into DevOps" with Vaibhav Antil (Privado)
This week, I sat down with Vaibhav Antil ('Vee'), Co-founder & CEO at Privado, a privacy tech platform that leverages privacy code scanning & data mapping to bridge the privacy engineering gap. Vee shares his personal journey into privacy, where he started out in Product Management and saw the need for privacy automation in DevOps. We discuss obstacles created by the rapid pace of engineering teams and the lack of a shared vocabulary with Legal / GRC. You'll learn how code scanning enables privacy teams to move swiftly and avoid blocking engineering. We then discuss the future of privacy engineering, its growth trends, and the need for cross-team collaboration. We highlight the importance of making privacy-by-design programmatic and discuss ways to scale up privacy reviews without stifling product innovation.

Topics Covered:
- How Vee moved from Product Manager to co-founding Privado, and why he focused on bringing Privacy Code Scanning to market
- What it means to 'bridge the privacy engineering gap,' and 3 reasons why Vee believes the gap exists
- How engineers can provide visibility into personal data collected and used by applications via Privacy Code Scans
- Why engineering teams should 'shift privacy left' into DevOps
- How a Privacy Code Scanner differs from traditional static code analysis tools in security
- How Privado's Privacy Code Scanning & Data Mapping capabilities (for the SDLC) differ from personal data discovery, correlation, & data mapping tools (for the data lifecycle)
- How Privacy Code Scanning helps engineering teams comply with new laws like Washington State's 'My Health My Data Act'
- A breakdown of Privado's FREE "Technical Privacy Masterclass"
- Exciting features on Privado's roadmap, which support its vision to be the platform for collaboration between privacy operations & engineering teams
- Privacy engineering trends and Vee's predictions for the next two years

Privado Resources Mentioned:
- Free Course: "Technical Privacy Masterclass" (led by Nishant Bhajaria)
- Guide: Introduction to Privacy Code Scanning
- Guide: Code Scanning Approach to Data Mapping
- Slack: Privado's Privacy Engineering Community
- Open Source Tool: Play Store Data Safety Report Builder

Guest Info:
- Connect with Vee on LinkedIn
- Check out Privado's website
S2E32: "Privacy Red Teams, Protecting People & 23andme's Data Leak" with Rebecca Balebako (Balebako Privacy Engineer)
Oct 24 2023
S2E32: "Privacy Red Teams, Protecting People & 23andme's Data Leak" with Rebecca Balebako (Balebako Privacy Engineer)
This week’s guest is Rebecca Balebako, Founder and Principal Consultant at Balebako Privacy Engineer, where she enables data-driven organizations to build the privacy features that their customers love. In our conversation, we discuss all things privacy red teaming, including: how to disambiguate adversarial privacy tests from other software development tests; the importance of privacy-by-infrastructure; why privacy maturity influences the benefits received from investing in privacy red teaming; and why any database that identifies vulnerable populations should consider adversarial privacy testing as a form of protection. We also discuss the 23andMe security incident that took place in October 2023 and affected over 1 million Ashkenazi Jews (a genealogical ethnic group). Rebecca brings to light how privacy red teaming and privacy threat modeling may have prevented this incident. As we wrap up the episode, Rebecca gives her advice to Engineering Managers looking to set up a Privacy Red Team and shares key resources.

Topics Covered:
- How Rebecca switched from software development to a focus on privacy & adversarial privacy testing
- What motivated Debra to shift left from her legal training to privacy engineering
- What 'adversarial privacy tests' are; why they're important; and how they differ from other software development tests
- Defining 'Privacy Red Teams' (a type of adversarial privacy test) & what differentiates them from 'Security Red Teams'
- Why Privacy Red Teams are best for orgs with mature privacy programs
- The 3 steps for conducting a Privacy Red Team attack
- How a Red Team differs from other privacy tests, like conducting a vulnerability analysis or managing a bug bounty program
- How 23andMe's recent data leak, affecting 1 million Ashkenazi Jews, may have been avoided via Privacy Red Team testing
- How BigTech companies are staffing up their Privacy Red Teams
- Frugal ways for small and mid-sized organizations to approach adversarial privacy testing
- The future of Privacy Red Teaming, and whether we should upskill security engineers or train privacy engineers on adversarial testing
- Advice for Engineering Managers who seek to set up a Privacy Red Team for the first time
- Rebecca's Red Teaming resources for the audience

Resources Mentioned:
- Listen to: "S1E7: Privacy Engineers: The Next Generation" with Lorrie Cranor (CMU)
- Review Rebecca's Red Teaming Resources

Guest Info:
- Connect with Rebecca on LinkedIn
- Visit Balebako Privacy Engineer's website
S2E31: "Leveraging a Privacy Ontology to Scale Privacy Processes" with Steve Hickman (Epistimis)
Oct 10 2023
S2E31: "Leveraging a Privacy Ontology to Scale Privacy Processes" with Steve Hickman (Epistimis)
This week’s guest is Steve Hickman, the founder of Epistimis, a privacy-first process design tooling startup that evaluates rules and enables the fixing of privacy issues before they ever take effect. In our conversation, we discuss: why the biggest impediment to protecting and respecting privacy within organizations is the lack of a common language; why we need a common Privacy Ontology in addition to a Privacy Taxonomy; Epistimis' ontological approach and how it leverages semantic modeling for privacy rules checking; and examples of how Epistimis' privacy design process tooling complements privacy tech solutions on the market, rather than competing with them.

Topics Covered:
- How Steve's deep engineering background in aerospace, retail, telecom, and then a short stint at Meta led him to found Epistimis
- Why it's been hard for companies to get privacy right at scale
- How Epistimis leverages 'semantic modeling' for rule checking, and how this helps to scale privacy as part of an ontological approach
- The definition of a Privacy Ontology, and Steve's belief that everyone should use one for common understanding at all levels of the business
- Advice for designers, architects, and developers when it comes to creating and implementing privacy ontologies, taxonomies & semantic models
- How to make a Privacy Ontology usable
- How Epistimis' process design tooling works with discovery and mapping platforms like BigID & Secuvy.ai
- How Epistimis' process design tooling works alongside a platform like Privado.ai, which scans a company's product code, surfaces privacy risks in the code, and detects processing activities for creating dynamic data maps
- How Epistimis' process design tooling works with PrivacyCode, which has a library of privacy objects and agile privacy implementations (e.g., success criteria & sample code), and delivers metrics on how the privacy engineering process is going
- Steve's call for collaborators who are interested in POCs and/or who can provide feedback on Epistimis' PbD process tooling
- What's next on the Epistimis roadmap, including wargaming

Resources Mentioned:
- Read Dan Solove's article, "Data is What Data Does: Regulating Based on Harm and Risk Instead of Sensitive Data"

Guest Info:
- Connect with Steve on LinkedIn
- Reach out to Steve via Email
- Learn more about Epistimis
S2E30: "LLMs, Knowledge Graphs, & GenAI Architectural Considerations" with Shashank Tiwari (Uno)
Oct 3 2023
S2E30: "LLMs, Knowledge Graphs, & GenAI Architectural Considerations" with Shashank Tiwari (Uno)
This week's guest is Shashank Tiwari, a seasoned engineer and product leader who started with algorithmic systems on Wall Street before becoming Co-founder & CEO of Uno.ai, a pathbreaking autonomous security company. Along the way, he built Silicon Valley startups, with previous stints at Nutanix, Elementum, Medallia, & StackRox. In this conversation, we discuss ML/AI, large language models (LLMs), temporal knowledge graphs, causal discovery inference models, and the Generative AI design & architectural choices that affect privacy.

Topics Covered:
- Shashank's origin story: how he became interested in security, privacy, & AI while working on Wall Street, and what motivated him to found Uno
- The benefits of using 'temporal knowledge graphs,' and how knowledge graphs are used with LLMs to create a 'causal discovery inference model' to prevent privacy problems
- The explosive growth of Generative AI, its impact on the privacy and confidentiality of sensitive and personal data, and why a rushed approach could result in mistakes and societal harm
- Architectural privacy and security considerations for: 1) leveraging Generative AI, and when to avoid certain mechanisms at all costs; 2) verifying, assuring, & testing against 'trustful data' rather than 'derived data;' and 3) thwarting common Generative AI attack vectors
- Shashank's predictions for Enterprise adoption of Generative AI over the next several years
- Shashank's thoughts on how proposed and future AI-related legislation may affect the Generative AI market overall, and Enterprise adoption more specifically
- Shashank's thoughts on the development of AI standards across tech stacks

Resources Mentioned:
- Check out episode "S2E29: Synthetic Data in AI: Challenges, Techniques & Use Cases" with Andrew Clark and Sid Mangalik (Monitaur.ai)

Guest Info:
- Connect with Shashank on LinkedIn
- Learn more about Uno.ai