Programming Note: I am travelling for work, so I am re-running a previous episode. This is the episode where we lay out Apple's plans to scan your iCloud photos for Child Sexual Abuse Material (CSAM). This comes off the back of Google recently shutting down the account of someone accused of having CSAM, when it turned out he didn't.
-------------------------------------------------------------------------------
Apple searching your iCloud photos
Apple has come under fire recently over the new machine learning system they're adding to identify Child Sexual Abuse Material. What are Apple trying to do? Are they looking into your phone and creating a backdoor? Why are people against this? Ehsan and I discuss.
Thank you for listening!
Read my latest blog post here
Click here to subscribe to this podcast on Apple, Google, Spotify, or wherever you get your podcasts.