The biggest flaw of Discord is how siloed it is: each server is disconnected from every other. As a casual game enjoyer, I found myself somehow juggling over 50 Discord servers circa 2021, each with its own rules and conventions for how to use @everyone tags, alerting thresholds, etc.
It's too much burden on the user to manage the incoming information. It produced a kind of anxiety about the red-marked unread messages, and frustration at realizing I didn't care about 95% of them, yet I was unwilling to completely separate myself from a community (e.g. by quitting or muting the server).
It becomes a question of which friends you want to implicitly abandon, and I ultimately decided to just abandon them all.
If a competitor ships even a slightly more unified product, it could easily displace Discord. Porting a bot from one chat platform to another isn't some insurmountable moat, and I think it's likely this project could support existing bots with a few changes on its side, with little or no modification required of the bots themselves.
I don't think that's an inherent notion of Discord at all, although it is a case of poor defaults. I turn on most of the muting settings immediately when I join a new server (notify for: nothing, suppress @everyone and all role @mentions, etc). Throughout the day, I'll mainly click between the 2-3 servers I actually care about, and every few days I'll go through some of the others. New messages are still marked once I click in so I know where I left off, pings are still highlighted so they catch my eye as I'm scrolling through, but if I don't care about a conversation in a channel, I can just scroll to the bottom and it's all immediately marked as "read."
It's annoying UX, but unfortunately I've come to the conclusion that the alternative is worse. When channels are opt-in, it makes discoverability effectively infeasible in practice. This is what the Element clients that I've seen do (following the IRC convention), and it just means that everyone clusters in the default channel and the others all wither on the vine.
That said, maybe there's a middle ground. If a server could mark, say, up to 20 channels as default/opt-out and the rest as backrooms/opt-in, that might suffice for 80% of servers while avoiding the long-tail worst-case UX of manually muting 100 channels in a server because only one interests you.
Another alternative would be to use threads more. But threads are terrible in Discord, even worse than in Slack. When two Discord users are chit-chatting about their dogs all through the night, a mod should be able to just drag-select all of those messages and move them into a "dog chat" thread.
I still want to try Zulip to see how well it works.
There's also the shady components of discord. All manner of illegal activity thrives behind custom access control.
The most notable instances in the media are the leaking of classified materials, and the swatting/DDoS communities that gave us the 'Big Balls' hacker employed by DOGE. More sickeningly, it recently allowed this doctor to successfully target countless children, including convincing a 13-year-old girl to hang herself on a live Discord call. [0]
There is a problem with too much protection of freedom and secrecy.
I don't think Discord has any more shady activity than any other large-scale social media platform. When I helped moderate a very large server, we had access to Discord's Trust and Safety team, and they were fighting against what is a massive flood. Automated moderation is extremely difficult even with all the AI tools, unless you block 100% of NSFW content and sexual messaging, and even then you will get false positives.
I do find it interesting that we hold these platforms liable but not the phone/pager/mail service. If this doctor had called this girl on her cell phone, no one would be mad at Verizon.
Part of the problem is that most parents have no clue about social media and communication tools beyond what they themselves use. At my church, I gave a presentation about Discord, and it was shocking to see how clueless parents were.
It is trivial to find servers with adult topics (BDSM) targeted at minors. It is trivial to find servers that combine those topics with problematic age ranges (like BDSM-themed servers with no ID verification, 'ages 14-28 welcome'). It is trivial to find servers with minors openly selling "content".
Disboard isn't Discord but these things aren't even being remotely "hidden", it is these servers' sole 'purpose'.
> I do find it interesting that we hold these platforms liable but not the phone/pager/mail service.
Phone and mail networks obey the law in letter and spirit: they don't listen to people's private conversations, and they hand over complete user info on request from law enforcement. There is nothing more they can do.
Discord has built the capability to read every message, public or private, that any user sends. So they are ethically obligated to stop bad things happening on their platform. Whatsapp/Signal have built their platforms so they can't read user private messages, so they have no ethical obligations to stop bad things happening on their platform, beyond banning users in response to legal orders.
> Whatsapp/Signal have built their platforms so they can't read user private messages, so they have no ethical obligations to stop bad things happening on their platform
Why draw the line there? Why don't those platforms have an ethical obligation to build the features that would allow them to stop bad things happening on their platform? Especially if they knowingly developed the current implementation specifically to avoid ethical culpability?
That's a political statement. The position is that people should have the right and ability to communicate with each other securely. If you deny this right and build structures to monitor all communication, then when "bad" people take over the government, you end up in a dystopia. Building any sort of anti-government political movement becomes very difficult, because they can hear whatever you say.
So I am glad that software like Signal/Whatsapp exists that allows secure communication. [1] I would rather take the harms caused by them being unmonitored than the harm of future dystopian governments. Given how the cryptography works, I don't think there is much middle ground here.
[1] I would prefer open source, more community owned platforms take over than these two.
Sorry, but you were arguing that Discord should snoop even more into what we're doing for ethical reasons, and now you're saying privacy is a virtue. Do you have a reason to think Discord doesn't already do these things and just doesn't get it right every time?
Phone and email aren't more private than Discord either; arguably less. It's difficult to get a phone these days without buying it on camera, and a phone company will hand over all your messages.
There are two competing principles here: (1) Privacy for individuals (from government and non-government entities), and (2) generally do things in a way that minimize crime. Both are good, and generally I want communication platforms to conform to (1) rather than (2).
Whatsapp/Signal are as close to (1) as possible by design, and can't actually do (2) at all. Phone/mail are somewhere in the middle, but quite close to (1): in most countries in the world, there is no mass recording of phone calls or mail, despite it being trivially easy to do. Moreover, due to long historical legal precedent, I don't think phone/mail companies have much freedom to do things differently. They are pretty much constrained to do exactly what the government tells them to do.
Discord, on the other hand, does not respect (1) at all. In fact, it very intentionally records and reads everything for profit, and it hands over any info requested by law enforcement. So, ethically, either they rewrite Discord to respect (1) or they should do (2). I don't think they do either. As others have noted in this thread, it is trivial to find servers that are clearly criminal.
One might argue that this is impossible because there are so many servers. My second political position is that if your public platform is so large that you can't effectively moderate it, that is not an excuse; you are culpable. Simply stop your platform from growing past the point where you can no longer effectively stop bad things from happening. You don't have the right to profit while enabling them.
I haven't heard of any other platform that uses cartoons in the UI, actively associates itself with kids' hobbies, and also makes it one click to join active pedo grooming communities.
"very good moderation" makes me believe you work for them because that is a laughable notion.
I would bet that the Big Balls swatting/SIM-swapping/DDoS community "the Comm" still has dozens of Discord servers that have been up for years.
Any social app with one-click community joining (i.e., all of them) will fall under this.
And how out of touch with teenagers are you if you think cartoons in the UI (whatever that means) are why they use it? What is a kids' hobby? Gaming? You cannot call that a kids' hobby.
I guess my point is: do we as a society want our children's Roblox communities to share a platform with virtually every cyber criminal, behind security and secrecy measures set entirely at the whim of arbitrary server owners?