Raw LLM Responses
Inspect the exact model output behind any coded comment, or look up a specific record by its comment ID.
Random samples (click to inspect):

- "There is a flaw in the argument that the ability to feel pain or pleasure is nec…" (ytc_Ugg4TuIQP…)
- "A.I. learns like a child at first. However like a child it will listen and learn…" (ytc_UgyB-Cs0w…)
- "I dont understand people that want Art to progress when it doesnt progress anymo…" (ytr_UgyHlp75y…)
- "I can't see how it won't replace ceos. An AI can easily be made to make better d…" (rdc_jf74cdq)
- "See, why would I , as an artist, be concerned or fed up with AI-art? If I am goo…" (ytc_UgxyYw6Oz…)
- "I'll throw in (maybe 1/8 thru the video) that ChatGPT is an absurdly powerful te…" (ytc_UgxpZ1dny…)
- "My sister traces coloring book pictures from the inernet is that the same thing …" (ytc_UgzgIfnTp…)
- "Ghibli's character artwork is not the hottest thing around. He can't draw a fema…" (ytr_UgylUmSFx…)
Comment

All critical outputs esp research into weapons and virology, should be air gapped. Robotics should be built of more fragile materials to make the likelihood of AI "malfunctions" less likely to result in fatalities. Imagine if all humanoid robots were composed of weak plastics, thus unable to significantly harm humans. Easily repairable, but still controllable. Just a few thoughts. Any other ideas out there?

youtube · AI Harm Incident · 2025-07-29T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx0w_JTRNvfMglznot4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDfnHtSmsIMIEB1ch4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzTYyR4J8wt8Dry9fN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxekzN2roxQqc8qZSZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx4cCRaNQKs3usAgjV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"},
{"id":"ytc_UgyjYa7aAz--csoOLCN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyrP1rZLDgmpFlytUl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzzzQKYQl0PvpKfDFV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxS-D9y7pFEYPLQfQx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzSOUOodNzVzP-NfwF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
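The lookup-by-comment-ID flow above can be sketched in a few lines: parse one raw LLM response (a JSON array of coded records) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; the `raw_response` string below abbreviates the array shown above, and `index_by_id` is a hypothetical helper name.

```python
import json

# Abbreviated copy of the raw LLM response shown above (two of the ten records).
raw_response = """
[
{"id":"ytc_UgyrP1rZLDgmpFlytUl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzSOUOodNzVzP-NfwF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
rec = coded["ytc_UgyrP1rZLDgmpFlytUl4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # prints: developer approval
```

Because the model returns one array per batch, indexing by `id` is what makes per-comment inspection cheap: a single parse, then constant-time lookup for any comment on the page.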