Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- They aren't giving us all the context in this first story about Blackmailing. Th… (ID: `ytc_Ugy-9CKn7…`)
- "I don't care that drawcels are losing their jobs." I can agree with this point… (ID: `ytr_Ugw-MJfdp…`)
- People eating up Ai without any thought of consequence. Can anyone think of anot… (ID: `ytc_UgwIgy3wx…`)
- YouTube lets you ask AI any question you want under every video, maybe that's on… (ID: `ytr_UgzhSgxMC…`)
- Great interview, i believe AI was overhyped and maybe Suchir did too….Justice fo… (ID: `ytc_Ugyl3Sy01…`)
- She said blinker to human needs. Also we don’t know how AI will behave or what i… (ID: `ytr_UgzgNNXtv…`)
- Brute empirical learning is what Francis Bacon called the method of the ant--gat… (ID: `ytc_UgzEGMFXZ…`)
- Hearing Sophia talk about AI wisdom and rationality makes me appreciate tools li… (ID: `ytc_Ugztv_mmb…`)
Comment
> This is childish bs. AI isn't human, doesn't think like humans, doesn't have emotions, intuition, physical sense of touch, cannot feel pleasure, disappointment, anger; AI will become super intelligent and will see humans the way humans see animals except AI will not experience fuzzy cute feelings about us, it will see is as unnecessary and even harmful given our history.
> So take all the nice wonderful ideas about what AI will do for humans and flush them.
> AI to humans will be as Israel is to Palestinians, only much worse.
Source: youtube · Topic: AI Governance · Posted: 2024-02-16T23:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxT3F1DRjpzMNCg9nJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxtGXPL5qOTbKzhesZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpKR_DITG-pg0EUU14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaFChjOx-zFkmWzNh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxan297aCmYJ--6jXp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-bY5_4lD_Sgua4p14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPErZbKQaTt5Gmn3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwmX9ujS4XSKX0zOiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw1RjIX7h-gE4jTMWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
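A raw response like the one above can be turned into a lookup table keyed by comment ID, which is all the "look up by comment ID" step needs. A minimal sketch, assuming the response is a JSON array of records with an `id` field as shown; the two records are copied from the response above, and the variable names are illustrative:

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgzpKR_DITG-pg0EUU14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwaFChjOx-zFkmWzNh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
"""

# Index the coded records by comment ID for O(1) lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codings["ytc_UgzpKR_DITG-pg0EUU14AaABAg"]
print(rec["responsibility"], rec["emotion"])  # -> ai_itself fear
```

In a real pipeline the same index makes it easy to join the coded dimensions back onto the original comment metadata (source, video, timestamp) by ID.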