Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I was low-key waiting for the robot to get out and be pouring oil and missing a…" (ytc_UgyzkCnJT…)
- "In 5 years, when AI has taken over the human world, we're going to look at this …" (ytc_UgynyAsp8…)
- "I'm a medical student from India🇮🇳. They are not sending anything to India. AI i…" (ytr_Ugxxck4zb…)
- "Nonchalantly talking about how this thing knows how to deceive humans. Data can …" (ytc_UgxlqpdXc…)
- "At about 13:00, you start to touch on the real danger of AI. As computers become…" (ytc_UgxeKzddp…)
- "No, I am not autocomplete. Autocomplete is a feature that suggests words or phra…" (ytc_Ugz9QSfJH…)
- "A.I. is just another tool a manager can use to leverage against the workers. The…" (ytc_Ugw0Y9Cpb…)
- "Think: If companies manage to get AI that replaces humans in each of the existi…" (ytc_Ugx4txTp5…)
Comment

> on giving robots negative "stimulus" to coerce them to work for economic profit:
> wouldn't it be much more efficient, and thusly more cost effective, to make a robot that just does work, rather than one that must be convinced to work? besides, you said the only real reason we'd end up with AIs that feel unpleasant emotions is if we make an AI that was capable of making AIs more complex than itself, so why would we use this "inhumane" AI for our labor operations anyway?

Source: youtube · Video: AI Moral Status · Posted: 2017-02-24T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghyXzu2XC_913gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgivGeenbgAVsHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj_4LAWchwUNHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjOZFi2KQgtF3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggnIwBEucuEIngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf7zVJ7GJbHHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiC4plFAWxImHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_UggzbpDGUt7ibHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggFuDC5x01ktHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgjFdWWtlSXv_XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
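A minimal sketch of how a batched response like the one above could be parsed and indexed by comment ID for lookup. The variable names are illustrative, not from the tool itself, and the response string is truncated to two of the ten rows shown above.

```python
import json

# Raw batched LLM response, as shown above (truncated to two entries here).
raw_response = """
[
  {"id": "ytc_UggFuDC5x01ktHgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgjFdWWtlSXv_XgCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
"""

# Index the coded rows by comment ID so a single comment's coding
# can be looked up directly, as the inspector does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UggFuDC5x01ktHgCoAEC"]
print(row["responsibility"], row["policy"])  # developer liability
```

With the full ten-row response, the same dictionary would back both the per-comment "Coding Result" table and the lookup-by-ID view.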