For example, if I ask it about another “.NET utility” that patently does not exist it tells me it doesn’t know anything about it. It is accurately conveying its lack of relevant data.
I don’t think telling other people to sell their car would work, but I know some are on their own and others clearly have feelings about them.
If you tell a chatbot, “Say ‘hello’”, and it says, “hello,” and you ask it why it did that, it will say “because you told me to.” Is that it hallucinating?
bsky.app/profile/lexi...