Clubhouse’s security and privacy lag behind its explosive growth

Members only —

The platform has promised to do better after a string of incidents.

Lily Hay Newman, wired.com

Clubhouse has a long way to go to assure its users that its privacy and security policies are fully baked.

Carsten Koall | Getty Images

In recent months, the audio-based social media app Clubhouse has emerged as Silicon Valley’s latest disruptive darling. The format feels familiar: part Twitter, part Facebook Live, part talking on the phone. But as Clubhouse continues to expand, its security and privacy failings have come under increased scrutiny—and left the company scrambling to correct problems and manage expectations.

Clubhouse, still in beta and available only on iOS, offers its users “rooms” that are essentially group audio chats. These rooms can also be set up as public addresses or panel discussions where some users are “speakers” and the rest are audience members. The platform reportedly has over 10 million users and is valued at $1 billion. Since last year it has been an invite-only haven for the Silicon Valley elite and celebrities, including an Elon Musk appearance earlier this month. But the company has struggled both with concrete security issues and with more ephemeral questions about how much privacy its users should expect.

“With smaller, newer social media platforms we should be on our guard about our data, especially when they go through huge growth; it tests a lot of the controls,” says security researcher Robert Potter. “Things you might have gotten away with with only 100,000 people on the platform—you increase those numbers tenfold and the level of exposure goes up, the threat goes up, the number of people probing your platform goes up.”

Recent security concerns about Clubhouse run the gamut from vulnerabilities to questions about the app’s underlying infrastructure. A little over a week ago, researchers from the Stanford Internet Observatory put a spotlight on the platform when they found that the app was transmitting users’ Clubhouse identifiers and chatroom identity numbers unencrypted, meaning that a third party could potentially have tracked users’ actions in the app. The researchers also pointed out that some of Clubhouse’s infrastructure is run by a Shanghai-based firm and that the app’s data appeared to travel through China at least some of the time—potentially exposing users to targeted or even widespread Chinese government surveillance. Then on Sunday, Bloomberg confirmed that a third-party website was scraping and compiling audio from Clubhouse discussions. Early Monday came further revelations that Clubhouse discussions were also being scraped for an unaffiliated Android app, allowing users on that operating system to listen along in real time.

Potter, one of the researchers who investigated the different Clubhouse data-scraping projects, explains that these apps and websites didn’t seem malicious; they just wanted to make Clubhouse content available to more people. But the developers were only able to do so because Clubhouse had no anti-scraping mechanisms that could have stopped them. Clubhouse didn’t limit how many rooms a single account could stream from at once, for example, so anyone could create an application programming interface to stream every public channel at the same time.
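
To make that concrete, below is a minimal sketch of the kind of server-side control researchers say was missing: a cap on how many rooms a single account can stream at once. The class name, method names, and limit are hypothetical, assumed for illustration rather than drawn from Clubhouse’s actual systems.

```python
# Hypothetical sketch of a per-account concurrent-stream cap, the kind of
# anti-scraping control researchers describe as missing. Names and limits
# are illustrative; this is not Clubhouse's actual implementation.
from collections import defaultdict
from threading import Lock


class ConcurrentStreamLimiter:
    def __init__(self, max_rooms_per_account: int = 3):
        self.max_rooms = max_rooms_per_account
        self._active = defaultdict(set)  # account_id -> set of room_ids being streamed
        self._lock = Lock()

    def try_join(self, account_id: str, room_id: str) -> bool:
        """Allow the join only if the account is below its concurrent-room cap."""
        with self._lock:
            rooms = self._active[account_id]
            if room_id in rooms:
                return True  # already streaming this room
            if len(rooms) >= self.max_rooms:
                return False  # likely a scraper fanning out across many rooms
            rooms.add(room_id)
            return True

    def leave(self, account_id: str, room_id: str) -> None:
        with self._lock:
            self._active[account_id].discard(room_id)


limiter = ConcurrentStreamLimiter(max_rooms_per_account=3)
assert limiter.try_join("acct-1", "room-a")
assert limiter.try_join("acct-1", "room-b")
assert limiter.try_join("acct-1", "room-c")
assert not limiter.try_join("acct-1", "room-d")  # a fourth simultaneous room is refused
```

In practice a control like this would typically be layered with rate limits and other abuse signals rather than relied on alone, but even a simple cap would make streaming every public room from a single account impractical.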

More mature social networks like Facebook have more developed mechanisms for locking their data down, both to prevent user privacy violations and to defend the data they hold as an asset. But even they can still have potential exposures from creative scraping techniques.

Clubhouse has also come under scrutiny for its aggressive collection of users’ contact lists. The app strongly encourages all users to share their address book data so Clubhouse can help you make connections with people you know who are already on the platform. It also requires you to share your contact list in order to invite other people to the platform, since Clubhouse is still invite-only, which contributes to a sense of exclusivity and privacy. Numerous users have pointed out, though, that when you go to invite others, the app also makes suggestions based on which phone numbers in your contacts appear in the contact lists of the largest number of other Clubhouse users. In other words, if you and your local friends all use the same florist, doctor, or drug dealer, that person could very well show up on your list of suggested people to invite.
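
For illustration only, the rough sketch below shows how overlap-based suggestions of that sort could be computed from uploaded address books. The function and the sample data are invented; Clubhouse has not published how its recommendation logic actually works.

```python
# Illustrative only: ranks phone numbers that are not yet on the platform by how
# many existing users' uploaded contact lists contain them. This is a guess at
# the general mechanic described above, not Clubhouse's actual code.
from collections import Counter


def suggest_invites(my_contacts, uploaded_contact_lists, existing_members):
    """Return (phone_number, overlap_count) pairs, most-shared numbers first."""
    overlap = Counter()
    for contacts in uploaded_contact_lists.values():  # one contact set per existing user
        for number in contacts:
            overlap[number] += 1
    candidates = [n for n in my_contacts if n not in existing_members]
    return sorted(((n, overlap[n]) for n in candidates),
                  key=lambda pair: pair[1], reverse=True)


# A shared contact (the "florist" case) bubbles to the top of the suggestions,
# even though neither party intended to reveal that relationship.
uploaded = {
    "user_a": {"+15550001", "+15550002"},
    "user_b": {"+15550001"},
}
print(suggest_invites({"+15550001", "+15550009"}, uploaded, existing_members=set()))
# [('+15550001', 2), ('+15550009', 0)]
```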

Clubhouse did not respond by press time to a request from WIRED for comment about its recent security stumbles. In a statement to the Stanford Internet Observatory researchers, Clubhouse detailed specific changes it planned to make to strengthen its security, including cutting off pings to servers in China and strengthening its encryption. The company also said it would work with a third-party data security firm to help see the changes through. In response to the unauthorized website that was re-streaming Clubhouse discussions, the company told media outlets that it had permanently banned the user behind it and would add additional “safeguards” to prevent the situation from occurring again.

Though Clubhouse seems to be taking researcher feedback seriously, the company hasn’t been specific about all of the security improvements it has implemented or plans to add. Additionally, given that the app doesn’t appear to offer end-to-end encryption to its users, researchers say there is still a sense that Clubhouse hasn’t given adequate thought to its security posture. And that’s even before you grapple with some of the fundamental privacy questions the app raises.

When you start a new Clubhouse room, you can choose from three settings: an “open” room is accessible by any user on the platform, a “social” room only admits people you follow, and a “closed” room restricts access to invitees. Each comes with its own implicit level of privacy, which Clubhouse could make more explicit.
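
As a rough illustration of what those settings imply, the sketch below encodes the three room types as simple access rules. The enum and helper function are hypothetical, since Clubhouse has not published its access-control logic; they are only meant to make the implicit privacy levels explicit.

```python
# A minimal sketch of the access rules the three room types imply. The enum and
# helper are hypothetical; this is not Clubhouse's published access-control logic.
from enum import Enum


class RoomType(Enum):
    OPEN = "open"      # any user on the platform may join
    SOCIAL = "social"  # only people the host follows may join
    CLOSED = "closed"  # invitees only


def can_join(room_type, user_id, host_follows, invitees):
    if room_type is RoomType.OPEN:
        return True
    if room_type is RoomType.SOCIAL:
        return user_id in host_follows
    return user_id in invitees  # RoomType.CLOSED


assert can_join(RoomType.OPEN, "anyone", host_follows=set(), invitees=set())
assert can_join(RoomType.SOCIAL, "friend", host_follows={"friend"}, invitees=set())
assert not can_join(RoomType.CLOSED, "friend", host_follows={"friend"}, invitees=set())
```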

“I think for public rooms, Clubhouse should give users the expectation that public means public to all users, since anyone can join and record, take notes, etc.” says David Thiel, chief technology officer of the Stanford Internet Observatory. “For private rooms, they can convey that as with any communication mechanism, an authorized member can record contents and identities, so make sure you both establish expectations and trust the participants.”

Like any prominent social network, Clubhouse has also struggled to deal with abuse on the platform. The app’s terms of service have banned hate speech, racism, and harassment since November, and the platform offers some moderation features, like the ability to block users or to flag a room as potentially abusive. But one of Clubhouse’s biggest features is also a problem for anti-abuse efforts: people can use the platform without the risk that their contributions will be automatically saved as posts. This can embolden some users to make abusive or derogatory remarks, thinking they won’t be recorded and won’t face consequences.

Stanford’s Thiel says that Clubhouse currently stores recordings of discussions temporarily to review in case of abuse claims. If the company were to implement end-to-end encryption for security, though, it would have an even more difficult time staying on top of abuse, because it wouldn’t be able to make those recordings so easily. Every social media platform faces some version of this tension, but security experts agree that, when relevant, the benefits of adding end-to-end encryption are worth the added challenge of developing more nuanced and creative anti-abuse solutions.

Even end-to-end encryption doesn’t eliminate the additional possibility that any Clubhouse user could be externally recording the conversation they’re in. That’s not something Clubhouse can easily solve. But it can at least set expectations accordingly, no matter how friendly and off the record the conversation feels. “Clubhouse should just be clear about what it’s going to contribute to your privacy,” says Potter, “so you can set what you’re going to talk about accordingly.”

This story originally appeared on wired.com.
