Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "(Stealing this straight off the transcript, response below) But what’s the real…" (ytc_UgynRumz9…)
- "You all know that coward white supremacy was going to instill this racism in AI.…" (ytc_UgyHKjLPh…)
- "The scary thing about AI is that it doesn't actually work the way people think i…" (ytc_UgystbIJ1…)
- "By making art a technology you literally give corporations an advantage at it. A…" (ytc_Ugw0w8dLy…)
- "Generative AIs are a waste of time and effort exactly for what you've mentioned.…" (ytc_UgyNASgz4…)
- "I DO NOT WANT WHAT AI BRINGS / I DO NOT WANT WHAT AI TAKES. / She’s right.…" (ytc_UgwBHCrG4…)
- "We have also kinda programmed ourselves what to do in situations like this. We k…" (ytc_UgyHUa4Mw…)
- "i used ai back when it was realy bad but as i found out it was realy realy bad i…" (ytc_UgxOqFgV8…)
Comment
I feel like psychologists probably foresaw this. If you train goals through rewards, you get a system that optimizes for rewards. And if lying is more efficient to receive rewards than doing real work, then you optimize for lying.
I know they tried to train AIs for intrinsic value, but since they can only judge the outcome, they can never be sure if an AI actually means well or is just a very well trained liar.
Source: youtube · Posted: 2025-11-06T07:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxQs_rf83amVBeKNPN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCW19DkwO1YiyZ0dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzisCij5LKjgqIuWB54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1FtkdcGF-h0gHzvx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzXUjaxLh4DH13Yyc54AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4hWNqYmhr6uIchPh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKRoYdXiHDGNdDr5d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCytUnVqe692S4Xg94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyB5AXJYNED6QvKsch4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5SsWR8nMEfVOfWS14AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"fear"}
]
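The raw response above is a JSON array of per-comment codes. A minimal sketch of how the "Look up by comment ID" view might resolve one ID against such a response (the helper name is illustrative, not part of the tool; the sample rows are taken from the array above):

```python
import json

# Two rows copied from the raw LLM response above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyB5AXJYNED6QvKsch4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx5SsWR8nMEfVOfWS14AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
"""

def lookup_codes(raw: str, comment_id: str):
    """Parse the raw model output and return the coded dimensions for one comment.

    Returns a dict of dimension -> value (without the id), or None if the
    comment ID does not appear in the response.
    """
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return {k: v for k, v in row.items() if k != "id"}
    return None

codes = lookup_codes(RAW_RESPONSE, "ytc_UgyB5AXJYNED6QvKsch4AaABAg")
# codes == {"responsibility": "developer", "reasoning": "consequentialist",
#           "policy": "none", "emotion": "mixed"}
```

Note that this matches the Coding Result table above: the `developer` / `consequentialist` / `none` / `mixed` row corresponds to the `ytc_UgyB5AXJYNED6QvKsch4AaABAg` entry in the raw array.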