You may have seen this tragic story about a teenager who died by suicide after using ChatGPT to plan it and work up the nerve to go through with it. If you are skeptical that an LLM could really be responsible, the details of this case will challenge you.
With LLMs, "the user is always right": they are validation machines and will reinforce and validate any idea presented in a prompt. Any idea, no matter how bad, can be refined and amplified.