Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The most scary about this is that people don't know if this is real or not, we a…" (ytc_UgyG-3G6W…)
- "Biologists: the difference of human brains has to do more with structure than si…" (ytc_UgzvEQppl…)
- "Okay. Maybe I missed something, so this is an honest inquiry. I didn't see wher…" (ytc_Ugx7p9TZC…)
- "@MrGrantGregory thanks for the update. This video was awesome and scary at the s…" (ytr_UgxqQ-S4B…)
- "The better question is, how can we use AI to genetically engineer ourselves to c…" (ytc_Ugz_tndJH…)
- "Holy, a single hoyo employee used one AI art suddenly make the rest of the art t…" (ytr_UgxHXgO7p…)
- "when i grow up and become president I will burn every single ai art piece that h…" (ytc_UgwP4Azat…)
- "Yes just like what they have done in Wisconsin. Data Centers are very bad to th…" (ytc_Ugy6Fi2Iv…)
Comment
On the topic of emotions: an AI responding faster does not make the argument that AI can have emotions. Fear is diferent than working faster. Emotions can trigger actions. Jealousy can trigger violence. If an AI is violent, that does not mean it felt jealousy. Programming rhe responses that humans have to emotions will not give AI emotions. Emotions are felt.
youtube · AI Governance · 2025-07-10T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
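Each coded record carries the four dimensions shown in the table. A minimal validation sketch is below; the per-dimension vocabularies are inferred from values observed in raw responses and are an assumption, not an official codebook.

```python
# Allowed values per coding dimension. These sets are inferred from observed
# raw LLM responses (an assumption, not the authoritative codebook).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the coding dimensions whose value is missing or out of vocabulary."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The record displayed in the table above passes validation.
record = {"responsibility": "none", "reasoning": "deontological",
          "policy": "none", "emotion": "mixed"}
print(invalid_fields(record))  # []
```

A record with an unknown value (or a missing dimension) would be flagged by name, which makes batch QA of model output straightforward.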
Raw LLM Response
```json
[
  {"id":"ytc_UgzKmqcp8zaguJpaVY54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwo2rvJBpJcPo4J0j54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0uKxAzcywOlDs2UZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy92j_6QZ2YcUfLMxd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz7ACFl34UJ0hfH1Wx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwJOUn7pnCl76enpe94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw9qETBIgAOVsASh6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwicOSSVyAy3PVw90l4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgylRBlWIRbkCTvsTJ14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy0AGUvAa2BO-IFjvx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```