Alan Grossfield
@agrossfield.bsky.social
Expert in biomolecular simulation and biophysics.
http://membrane.urmc.rochester.edu/
http://grossfieldlab.github.io/loos/
https://orcid.org/0000-0002-5877-2789
Not quite. Training an llm is fine — using that llm to distribute the information is not. To use google books as an example, buying copies of all of the books and scanning them is fine. Sharing them with the rest of the world violated copyright and was unethical, because it screwed the creators.
November 11, 2025 at 12:29 PM
Moreover, you keep ignoring 2/3 of my arguments. I’m back to thinking you’re either debating in bad faith or just not very well. Either way, I’m muting this thread.
November 11, 2025 at 12:27 PM
I’m not a defender of google by any stretch, but there’s nothing unethical or illegal about reading and indexing everything that’s publicly available. The ethical issue comes when you regurgitate it as your own, or redistribute it without permission. The llm business model doesn’t exist without that
November 11, 2025 at 12:27 PM
An LLM takes the same information and spits it out as if it is its own, obscuring the sources of the information in addition to being unreliable. Before you point out that much of the internet is also unreliable, the difference is I can learn over time which sites are good, but I can’t with an llm.
November 11, 2025 at 1:45 AM
Yes, it is different, or at least search, which is what you initially invoked, is. For search, you slurp in everything, and use it to send people to the places that created the information (or at least displayed it). There’s no intrinsic ethical or intellectual property issue.
November 11, 2025 at 1:45 AM
You’re either an idiot or not arguing in good faith (or both, I suppose). If you actually want to learn, there are plenty of good writers who’ve delved into these issues professionally. Regardless, I’m done wasting my time with you.
November 11, 2025 at 12:36 AM
I know copilot has had issues with returning verbatim code from github, which is a copyright violation. Moreover, you keep pretending to think that returning copyrighted material is the only ethical and legal issue, and ignoring the rest of the argument, that the training itself was unethical.
November 11, 2025 at 12:36 AM
First of all, yes, they can, on occasion. Second, their ability to mimic things, from “write a python script to do X” to “write a limerick in the style of famous author Y” is mostly derived from sucking in copyrighted material, without permission.
November 10, 2025 at 11:51 PM
Yeah, I’ll take “things that didn’t happen” for $200.
November 10, 2025 at 6:49 PM
Yes, but the search engine serves up links to the sites it “sucked up”; an llm represents its output as its own material. Both ethically and practically there’s very little similarity. Are you really this clueless that you don’t see a distinction?
November 9, 2025 at 8:12 PM
Yes, and neither of those are llms trained on stolen data.
November 9, 2025 at 7:16 PM
Especially interesting since I suspect Franklin is far better known now than Wilkins.
November 7, 2025 at 8:13 PM
He had the rare ability to come off like an asshole in his own autobiography
November 7, 2025 at 8:06 PM
That’s because the stated reason (usually antisemitism on campus, or some vague complaint about dei) has nothing to do with the real reason (intimidating elite academic institutions and bringing them to heel). They don’t really care what the school did — this is basically a shakedown.
November 7, 2025 at 6:35 PM
Even if they won, it might take years. Even a year or two without nih funding would essentially destroy their research infrastructure for a generation.
November 7, 2025 at 5:52 PM
Similarly, I hope mine didn’t come off that way. I was hoping to express sympathetic disappointment.
November 7, 2025 at 4:17 PM
If we won’t do it for guns, I find it hard to believe we’ll do it for LLMs.
November 7, 2025 at 1:46 PM
The impersonators are less likely to kidnap you?
November 7, 2025 at 12:31 PM
November 6, 2025 at 11:06 PM
Thanks for sharing — this is on my “to try” list!
November 3, 2025 at 5:23 PM
We haven’t tasted it yet, but it smells really good
November 2, 2025 at 4:59 PM