On Rejections and Goodbyes of the Online Sex Work Industry
Joyce Leung
March 6th, 2021

On a cloudy afternoon, you decide to go on a third-party website to continue the romantic comedy TV show you have been bingeing for the past week. While you wait for the video to start, an advertisement in the website's right-hand corner catches your eye: "REAL BABES NEAR YOU / sign up now for live webcam sessions" flashes repeatedly in a wink-like motion, accompanied by an image of a smiling woman. Your immediate thought is to click the cross icon at the top left of the advertisement, and you hover your mouse over the "report this ad" bar. Your finger is about to press down on the mouse, but you hesitate, wondering why the advertisement is appearing here and what your action might mean for the workers on the other side of it.

During our class's panel session with Ana Valens, Ana mentioned that "you're usually at the last of the chain when you encounter censored content." This idea particularly stuck with me, because when we see online advertisements with sexual or pornographic content, many of us find ourselves in a situation much like the opening scenario, where our first instinct is to report the advertisement. Although these advertisements are not strictly censored, even their "surface forms," the forms we interact with on the website's interface, have usually already been heavily filtered: the content is edited and organized for marketable appeal, and we as users play an important role in the process of consumption by choosing either to accept the content or to request its removal, often with a single spontaneous click.

Example of online webcam ads


In fact, one of the main ideas introduced in Ana's "E-Viction" article is the notion of digital gentrification. She posits that this phenomenon "is complicated because there are no apartment buildings, no mom-and-pop businesses, no block party spaces stalked by squad cars to document" (Valens). Indeed, the highly privatized nature of the online sex work industry makes it difficult for its underlying issues to be clearly uncovered, since the policing is done internally. At the top of the chain, corporate heads may exploit workers by restricting their artistic freedom and the content they share so that it aligns with the corporation's own interests. And then there are the rest of us, who may also contribute to the process of gentrification without knowing it: when we as users report the advertisements displayed to us, we give officials and internet agencies an opening to silence sex workers by removing this content from online platforms, simply because it is deemed "inappropriate." What makes digital gentrification so dangerous and complex is that, in Ana's words, we are part of the censorship infrastructure, the "last of the chain" who ironically consume this content while dismissing it at our own convenience. There is little possibility of protection, documentation, or a space for sex workers to thrive in the first place, because we are often the indirect perpetrators ourselves.

The article also highlights the particular exploitation of sex workers who identify as queer, trans, or people of color, who have not only faced backlash from other marginalized communities in the past, but are still subjected to oppression as tech companies implement new policies that eliminate their community spaces. For example, in 2018, Craigslist removed its "personals" section, which led to the erasure of advertisements often tied to the LGBTQ+ and BDSM communities (Dommu).

Craigslist's removal of the "personals" section (left)
Craigslist's homepage, Toronto version (right)


Similarly, Instagram's algorithms appear to flag and remove queer-related content at higher rates than comparable non-queer content, whether deliberately or not (Cheves). Here, we see a kind of tokenization at play in the media landscape: this content is labelled "sensitive data" in the sense that it is considered taboo, and it is removed so that "non-sensitive data," content that carries no outlawed values, can dominate. How can we say that technology helps advocate for the freedom of sex workers and adult content creators, particularly those from marginalized communities, when we are getting rid of the very spaces that enable their self-expression and source of income in the first place?

"Shadow banning": the erasure of content labelled with queer community-related hashtags


Ana describes "E-Viction" as "a microcosm of what once was and what we could lose as technological whorephobia continues to grow. But it's just as much a warning as it is an invitation for change" (Valens). By initiating the silencing act themselves, sex workers call attention to the restrictions on expression imposed on them, while simultaneously rejecting those acts of rejection themselves. I was reminded of an academic thesis written on furry communities; as a member of the community, the author felt that in "many furry spaces [he] did not even have to 'come out', as sexual expression was as commonplace as any other kind. The Internet allowed [him] to connect with others who shared [his] niche interest" (Silverman 27-28). He highlights how he never needed to 'come out' and could share his interest with others, a situation somewhat similar to that of the online sex work community. Within online sex work communities, sex workers' identities are not discriminated against, as they have the agency to share and advertise their work in spite of their different backgrounds and experiences. However, as Ana's description shows, a time for change becomes necessary when these established communities are taken advantage of and have their community power taken away.

So, yes, we should be conscious of what we consume, but what is even more important is to be cognizant of what we do after these very acts of consumption, and of the implicit biases we may hold toward the creators of content considered inappropriate by traditional standards.


Thank you for stopping by! :-]



References:

Cheves, Alexander. "The Dangerous Trend of LGBTQ Censorship on the Internet." Out Magazine, 6 Dec. 2018, www.out.com/out-exclusives/2018/12/06/dangerous-trend-lgbtq-censorship-internet.

Dommu, Rose. "Craigslist Kills Personals Section After Congress Passes FOSTA." Out Magazine, 24 Mar. 2018, www.out.com/news-opinion/2018/3/24/craigslist-kills-personals-section-after-congress-passes-fosta.

Silverman, Ben. "Fursonas: Furries, Community, and Identity Online." MIT, 2020, dspace.mit.edu/bitstream/handle/1721.1/127662/1192966622-MIT.pdf?sequence=1&isAllowed=y. Accessed 3 Mar. 2021.

Valens, Ana. "'E-Viction' Sex Work Event Sheds Light on 'Digital Gentrification' by Self-Destructing." The Daily Dot, 20 Aug. 2020, www.dailydot.com/irl/sex-work-e-viction-digital-gentrification/.


Pictures:

Bollinger, Alex. "Craigslist Closes Its Personals Section after Sex Trafficking Bill Passes." LGBTQ Nation, 23 Mar. 2018, www.lgbtqnation.com/2018/03/craigslist-closes-personals-section-sex-trafficking-bill-passes/.

Craigslist. Craigslist Toronto, toronto.craigslist.org/.

Erlick, Eli. "How Instagram May Be Unwittingly Censoring the Queer Community." Them., 30 Jan. 2018, www.them.us/story/instagram-may-be-unwittingly-censoring-the-queer-community.

MEME. "Live Webcam Girls Picture." Me.me, me.me/i/live-webcam-girls-video-chat-with-girls-ad-free-be95757b7612482e8cbace7ff44fe2f2.






