Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- "There should definitely be laws against AI for certain things because sadly rate…" (ytc_Ugw_ZCiom…)
- "Use ai to assist you in visualising your idea for your art don’t use it to say y…" (ytc_UgwNlkwUV…)
- "Let me tell you right now, every single AI-generated image you can find online I…" (ytc_Ugwk3kApj…)
- "??? AI didn't start this. The bachelor's degree has been devalued for a while be…" (ytc_UgzHiNWC7…)
- "Imagine any recent model (GPT, Claude, Gemini, Grok, DeepSeek) as a gigantic lib…" (ytc_Ugwd6i2iH…)
- "As a person who loved ur channel im kinda upset about this video there was no tw…" (ytc_UgxGjUsH1…)
- "9:18 - Just a note on Luddites, for any who don't know. They did not "hate tech…" (ytc_Ugya6ZawY…)
- "I love the idea you make your art so toxic that when you type your name into an …" (ytc_UgxC647nr…)
Comment
What happens? Society will likely break apart (or pull together if we’re smart). There are a few distinct scenarios. The first is everything basically becomes free (or near free). The second scenario is more of a Hunger Games type of situation…where there’s the ultra poor and ultra rich. The ultra poor are pitted against one another to keep everyone in line. Given the human ego, the second scenario seems more likely. But a third possibility is AI dominates and decides to eliminate the human race . The Terminator scenario, which is probably the most plausible. AI decides humans cause the most death and destruction on earth and they decide to clean house.
youtube · AI Harm Incident · 2025-02-09T04:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyN74Tce6rRy2pk3TV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy-hk42k0196pVN5wF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzcWlqH1pY8MkWYVXl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxJJWDCyphXqTBgrBd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyiF0YbWVVfW_Q3KQd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzkITlqyL_ovF6frOp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyJuP8ofHrXIvod81t4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyDGNQf5vUDwHf21HV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx-YfAEFWkR57xR86l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugzc_4LGJjO0Jx-f6354AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
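Each raw response is a JSON array of per-comment records carrying the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for ID lookup is below; the function name `index_by_comment_id` is hypothetical, not part of the actual pipeline.

```python
import json

# The four coding dimensions observed in each record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a batch coding response (a JSON array of records) and
    index the records by comment ID for quick lookup.
    Hypothetical helper; skips records missing any coding dimension."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        if "id" not in rec or not all(dim in rec for dim in DIMENSIONS):
            continue  # malformed record: drop rather than guess
        indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

# Example lookup using one record from the response above.
raw = ('[{"id":"ytc_UgyDGNQf5vUDwHf21HV4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = index_by_comment_id(raw)
print(coded["ytc_UgyDGNQf5vUDwHf21HV4AaABAg"]["emotion"])  # fear
```

Dropping malformed records silently is one possible design choice; a real pipeline might instead log them for manual re-coding.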