
RHJ & Hacking//Hustling’s Statement at Perspectives on Encryption and Child Safety


Earlier this year, Apple announced a set of interventions regarding young people's access to technology. After criticism, Apple asked a narrow set of advocates to give line edits on the policy and held a series of conversations fronted by these hand-picked advocates.

Many things were missing.

Missing were the marginalized communities who will be most impacted by these changes, and who have struggled to be heard in tech spaces that value profits over people. Missing were conversations on the efficacy of these interventions, and any evidence that they would succeed in addressing the identified harms. Missing were conversations about how the program's success would be evaluated, and transparency about how often these identifications would be wrong. Missing were any protections against misuse, any process for reporting misuse, or any assurance that these narrow intentions would not be forgotten during negotiations with authoritarian governments. Missing was the context that this comes amid years of government demands that private companies, including Apple, surveil their users to a degree the Fourth Amendment bars the government itself from doing, and then report on those users in an effort to expand policing.

We are here to offer the perspective of sex worker activists, and our concerns are grounded in our organizing experience. We are not speaking from conjecture, but from a history of watching tech policies written in the name of protecting children be used to harm sex working communities, including youth who trade sex.

We also bring concerns about the underlying assumption that Apple is neutral on the sex trades. Apple products have never been a safe space for adults in the sex industry. In 2010, Steve Jobs said, "We do believe we have a moral responsibility to keep porn off the iPhone… Folks who want porn can buy an Android." Right now you cannot have sex-related apps on the App Store, meaning that we can develop harm reduction apps and create bad date lists, but we cannot effectively distribute them because of Apple's decision to prioritize its moral stance over our community's wellbeing.

This conversation does not exist in a vacuum and we should not treat it as such.

Privacy on our devices and encryption in our communication platforms have given sex workers the ability to connect to community, negotiate with clients, and practice the basic harm reduction that keeps people safe. We are a community regularly erased from and barred from technology: banned or censored by Twitter, our fundraisers deleted by GoFundMe, our presence targeted by users weaponizing content moderation systems against marginalized communities, our accounts closed and our money stolen by PayPal.

Pretending that this specific intervention does not sit within the context of a host of abuses against sex workers, including Apple's explicit antagonism toward us, ignores everything we know about how tech companies have treated the sex trades. And all of these erasures begin with being left out of every decision-making conversation.

There is no precedent for an expansion of policing and surveillance, built for one specific intervention, staying within its original parameters. Erasing users' privacy should not be framed as a novel conversation without wider implications.

When we think about who will inevitably be harmed by compromising privacy, we can simply look at those who already struggle to safely access information and connection. That means sex workers. That means LGBTQ youth seeking information on their bodies, communities, and sexuality. That means anyone in Texas scrambling to find information on abortion care, something we have already seen Instagram compromise. That means anyone who does not consider home a safe place, and for whom parental control over their access to technology means isolation and the potential for abuse.

Like similar policies passed without community feedback, these policies will increase harm to the communities they purport to protect. Let's not have a siloed conversation when we have never had a siloed intervention, and let's not ignore the context we live in. Let's begin our conversations by addressing harm, not by hiding it and calling the damage done "unintended consequences."

Shared on Oct 12, 2021 at the Electronic Frontier Foundation's Perspectives on Encryption and Child Safety.

Learn more about how we can support you on these issues: https://www.reframehealthandjustice.com/services
