Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I'm going to comment before I watch the video. I think mixing of Worker and A.I.…" (`ytc_UgwW-wWgo…`)
- "We don’t even know what consciousness is. How could we expect AI to understand t…" (`ytc_Ugy3aBvh8…`)
- "I am so disappointed. How, HOW is digital art even CLOSE to being comparable to …" (`ytc_UgxIc1DLi…`)
- "Great. Now AI making accusations of these people. Even AI loves helping criminal…" (`ytc_UgzYLO11I…`)
- "> continue to have kids / But why though? Why risk the lives of people who don…" (`rdc_emnxg9c`)
- "The real reason it can go bad is because as you said most people care in gen abo…" (`ytc_UgwSNavfO…`)
- "At the first sight of Social Unrest, the law makers will hit AI companies with l…" (`ytc_UgzVIYeeL…`)
- "13:13 I encountered the SAME THING decades ago, on the job after our office impl…" (`ytc_UgzjsM_dd…`)
Comment
i think we fucked up the most when we thought it was a good idea to give these AI's "personalities".
I always felt that different PC's had personalities based on some working better than others, even with the same builds, so for this to be put into AI is dangerous because now we can only predict what it can do with the personality vs what it shouldn't or can't and thats such an experience i have no words for
Source: youtube · AI Harm Incident · 2025-09-10T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx1QC5Iu-IqctHoMBR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2RLbdwnyVAjh7q0t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzBGa1-oJIs0akrkNx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhuswUmvTw__WrTAJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwP-AwEeKE4snSaE2R4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoE8kBMSwq-hNgatJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwzmMLthmaY9a_xigJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzuQdZ03O1npCFsMyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyRJ33P7KYLNmNAu2h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwKLCaNovFohGmeeYt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
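A raw batch response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal, hypothetical validator: the codebook sets are assumed from the values visible in this page (the real codebook may define more categories), and `parse_coding_batch` is an illustrative name, not part of any existing pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption: the real codebook may contain additional categories).
CODEBOOK = {
    "responsibility": {"none", "distributed", "developer", "ai_itself", "user", "company"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "resignation", "outrage"},
}


def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and check it against the codebook.

    Returns a mapping of comment ID -> coded dimensions, and raises
    ValueError on records that are missing an ID, missing a dimension,
    or use a value outside the codebook.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded
```

Rejecting out-of-codebook values at parse time makes malformed or hallucinated codes surface immediately, instead of silently contaminating the coded dataset.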