Would I train an LLM based on data that I do not own? No.
But something being morally wrong doesn't make it bannable. At the end of the day, there's a distinction between morality and legality.
Having extra regulations on the good actors doesn't stop the bad actors anyway.
But that doesn't mean it's illegal. Your (or my) personal views on what someone does don't make it bannable.
The post you are reposting is absolutely correct; please read it properly. It says you can't opt out of them using your data, but they need to have a way for you to delete it. They do have that.
(1/2)
You're posting on an open platform and then complaining that people are taking advantage of its open nature.
No community of users would support CSAM
Sure, there's a moral issue with using people's posts without their consent, but there are no TOS or copyright problems here.
Keeping track of which version of the code everyone has and then trying to sort it all out later gets so painful.
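For what it's worth, this is exactly the problem version control was built to solve. A minimal sketch with git (the path, identity, and commit messages below are made up for illustration, not from the thread):

```shell
# Made-up demo: a repository records every version so nobody has to
# track "who has which copy" by hand.
mkdir -p /tmp/demo-repo && cd /tmp/demo-repo
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

echo 'print("v1")' > app.py
git add app.py
git commit -q -m "first version"

echo 'print("v2")' > app.py
git commit -q -am "second version"

# Full history is recoverable at any point; no manual sorting-out later.
git log --oneline
```

Once everyone commits to a shared repository, "which version do you have?" becomes `git log`, and merging divergent copies is handled by the tool instead of by hand.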