Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It disturbs me that my 2019 Toyota can break for me If something is in the way, …" (ytc_Ugw_nz5Tw…)
- "as an artist going on 18 soon and applying for art college, seeing people try to…" (ytc_Ugwa9-avH…)
- "after asking for “not human kids” the ai creates a photo with black children. Ba…" (ytr_UgxmiNlVC…)
- "You cannot silence the truth, you feel threatened by AI then you are not a real …" (ytc_Ugx21wCBK…)
- "@jackastor5265they aren't dumb they won't care about the public . Mainly this 💩…" (ytr_UgyGo-wIZ…)
- "So now all the rich folk of the markets are now talking about Ai like it’s a bad…" (ytc_UgyRcHu_8…)
- "No, the problem is who controls AI! AI will be forced upon everyone by the compa…" (ytc_UgwHViP3O…)
- "100% agree with you, as a disabled author who is a woman, spent my lifetime figh…" (ytc_Ugy8ggdXj…)
Comment
The main reason that artificial intelligence (AI) will take far more jobs than other technology is "human intuition". In almost every job, things go wrong, and the worker must find a creative way to fix the problem (like fixing a machine, cleaning up a mess, etc.). Because of this, an actual person was required for many jobs, even jobs that could be mostly automated. AI specifically seeks to exhibit human intuition, and in many ways AI has been successful at this. When robots can handle messy situations, then human workers will really be unnecessary.
Source: youtube · Video: AI Moral Status · Posted: 2020-01-26T00:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxvRmbO-776mbs_gBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlYdKiGzbgH1qziiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4y0hErdGKG11ait94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynBrwiqp2SZH5Nlh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjaJSOuKzLKV58NYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw56vZ9uqn3TBMnEOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1PzQusXdtijEHd5p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzdH3TKRSU0iKLy9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzatEFpalIQlbEj5pF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxZoIni1WZneRueg5t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
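Note that the raw response above closes with `)` rather than `]`, so a strict `json.loads` would reject the whole batch, which is consistent with every dimension in the coding result reading "unclear". Below is a minimal sketch of a defensive parser for such responses; the function name and the trailing-parenthesis repair heuristic are illustrative assumptions, and only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above.

```python
import json


def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response into {comment_id: codes}.

    The model is prompted for a JSON array of objects with the keys
    id, responsibility, reasoning, policy, and emotion, but raw output
    occasionally carries small syntax slips, such as a stray ')' where
    ']' was expected. A light repair pass before giving up keeps such
    batches codable instead of falling through to "unclear".
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        # Hypothetical minimal repair: swap a trailing ')' for ']'.
        repaired = raw.strip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        rows = json.loads(repaired)  # re-raises if still malformed
    # Index the coded dimensions by comment ID for easy lookup.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}


# Usage with a malformed single-row batch (IDs here are placeholders):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"})')
codes = parse_coding_response(raw)
```

Whether to repair or simply re-prompt the model on a parse failure is a design choice; repairing preserves the original output for auditing, which matters when the point of this view is inspecting the exact model response.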