Even if an LLM could be trusted to give you correct information 100% of the time, it would be an inferior method of learning it.
www.gov.uk/government/n...
You'd think a company that had to pay $787,000,000 for deceiving its viewers would be a little more careful about blatantly continuing the practice.