I'd be interested to see a lawyer's comments on Discord's and Veratad's policies around this.
Let's say you moderate a Discord server and someone sends content that gets blurred by Discord. We need to be able to verify for ourselves that the user actually posted something inappropriate before taking action; otherwise we're at risk of mod abuse.
The severe lack of transparency here is deeply concerning, especially if this gets extended globally instead of applying only to the UK and Australia.
If anyone wants to take a closer look and correct me, they're welcome to; the only relevant statement I could find was "we do not store sensitive personal information".
Stop crossposting, or even checking Twitter at all, sure; just make sure to keep the handle to prevent impersonation.
If they close their accounts, there's a chance someone could register an account under their name and attempt to impersonate them.
I think the correct advice is more "Private your account, and log out."
If you don't authorise the AI app, you are opting into nothing.
As for the server component, yes, Discord does need to implement permissions that prevent these apps from functioning at all, to protect images posted in servers.
The prompt appears when you attempt to use one of the AI apps, and cancelling it means you cannot use them. (Refusing does not affect the rest of Discord.)
OP's post was about how server owners cannot disable it, which is correct.
As for the policies: before being able to use any app, including these AI ones, you have to authorise it and agree to its policies.
Before being allowed to use the AI apps, you have to authorise them, which means agreeing to their ToS and Privacy Policy and granting the app a set of permissions.
You do NOT have to agree to the apps' policies to use Discord.
This happens with all apps, like Dyno, YAGPDB, etc.
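For context on what "authorising" means mechanically: installing any Discord app, AI-driven or not, goes through Discord's standard OAuth2 consent screen, which lists the scopes and permissions the app is requesting. Below is a minimal sketch of the kind of authorization URL behind that screen; the client ID and permission bitfield are hypothetical placeholders, and the exact scopes an AI app requests are an assumption.

```python
# Sketch of a Discord OAuth2 authorization (app install) URL.
# The application ID and permission bitfield are hypothetical placeholders;
# "bot" + "applications.commands" are the scopes apps commonly request (assumption).
from urllib.parse import urlencode, quote

CLIENT_ID = "123456789012345678"  # hypothetical application ID

params = {
    "client_id": CLIENT_ID,
    "scope": "bot applications.commands",  # what the consent screen asks you to grant
    "permissions": "277025770560",         # example permission bitfield (placeholder)
}

invite_url = "https://discord.com/api/oauth2/authorize?" + urlencode(params, quote_via=quote)
print(invite_url)
```

Clicking "Authorize" on that screen is the point where you accept the app's terms and grant it those permissions; nothing is agreed to before then.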
The best way to "opt out" is to never even test the services; by authorising the app, you are effectively opting into the service.
Unfortunately, because these are external apps rather than an internal feature, I don't believe Discord will add a permission that lets server owners block them.
- Disabling External Apps / Application Commands / Activities does not disable the feature; it only limits the output to being visible to the user running it (a quick permission-bit check is sketched after this list).
- The AIs can respond with 18+ content in non-explicit channels, particularly with "Roast".
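For admins who at least want to audit the "Use External Apps" permission on their roles, here is a minimal sketch that checks whether a raw permission bitfield has that flag set. The bit position (1 << 50) is my reading of Discord's documented USE_EXTERNAL_APPS flag at the time of writing, so treat the constant as an assumption.

```python
# Minimal permission-bit check for Discord's "Use External Apps" flag.
# Assumption: USE_EXTERNAL_APPS is bit 50 of the permission bitfield.
USE_EXTERNAL_APPS = 1 << 50

def allows_external_apps(permission_bits: int) -> bool:
    """Return True if the given permission integer grants Use External Apps."""
    return bool(permission_bits & USE_EXTERNAL_APPS)

# Example with a hypothetical role permission value (SEND_MESSAGES + USE_EXTERNAL_APPS):
example_role_permissions = (1 << 11) | (1 << 50)
print(allows_external_apps(example_role_permissions))  # True
```

As the first bullet notes, clearing this bit does not stop members from invoking the apps; it only makes the responses visible solely to the user who ran them.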