# Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Amazon forgot if there were no workers in their workshop'S, they never could hav…" (`ytc_Ugy4dc3Yq…`)
- "I feel like other peoples because like their face it kind of looks like this I’l…" (`ytc_UgzK5fngF…`)
- "@RayzTheGreat I tried, but I never got more cartoonish characters right. But now…" (`ytr_UgzR4GX7G…`)
- "We already know it's misaligned. I'd strongly encourage reading a very recent st…" (`ytc_Ugylk8oft…`)
- "So he said hes been working with ai for 30 years and 30 years ago is when termin…" (`ytc_UgxCZ69Dl…`)
- "I feel like this is a paid promotion for grubby ai or whatever it is…" (`ytc_Ugwgy-qAG…`)
- "I definitely have blind faith to robo girl now days real girls are not faithful …" (`ytc_UgxkfRTrR…`)
- "The problem with AI is it doesn’t have heart, emotion, empathy, soul. And at th…" (`ytc_UgwGjXPCX…`)
## Comment
My perspective is that it is unethical to starve people deliberately. Creating a condition of "Ohp, we don't need your services. All poor people will now be without any means to feed themselves." is unethical. So, as long as the "get a job you lazy bum" attitude is in effect, replacing all jobs with robots is unethical. Note that this changes if we are able to move to a post-work society in which people can still feed themselves without having to have been born rich It is not artificial intelligence itself that is unethical. Rather, it is how it is applied that is unethical. Unfortunately, the people likely to decide how it is applied have a tendency to be unethical.
Source: youtube
Timestamp: 2014-09-17T02:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
{"id":"ytc_UgjpM9su4PUgOXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ughl5qc5S__IyXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiEfPymBkOFwngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UghGUSpj2mi6B3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj89ulpyU0Cn3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiTmXK1IfcrL3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjbtOR2O7rKYngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjx6F4Lrk3qFXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UghaqHhw_KUningCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgidYWIgHmWVzXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
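The raw response is a JSON array of coded records, one per comment ID. As a minimal sketch of how such a response could be parsed and sanity-checked, assuming the categorical values observed in this dump are representative (the real codebook may define more values than appear here):

```python
import json

# Allowed values per dimension, inferred only from the examples above;
# these sets are an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check each record's coded dimensions."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dump start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UghGUSpj2mi6B3gCoAEC","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
records = validate_response(raw)
print(records[0]["policy"])  # liability
```

A check like this catches the common failure mode of LLM coding pipelines, where the model returns an off-codebook label or a malformed ID that would otherwise silently pollute the coded dataset.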