Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "I've seen current tech and robotics and most of them are modeled after humans, i…" (ytc_UgyOxEIKs…)
- "These AI centers are going to use up some much energy and water. Get ready.…" (ytc_UgyL3Ub_p…)
- "If they used ChatGPT they didn’t write anything, therefore it is not their essay…" (ytc_Ugys13H5A…)
- "I don't understand much of this. But I need you all to protect me from A.I.. P…" (ytc_UgyeVz3yi…)
- "i really only use ai images for my twitter pfp or discord like i dont sell them …" (ytc_UgzeO8xD4…)
- "So for me Elon I think emotions can be taught by showing how with understanding …" (ytc_UgyC7Is4Z…)
- "I think one of the most shocking insights Hao articulates in this interview is t…" (ytc_UgwP5GFwX…)
- "@nobleradical2158 ChatGPT, the most highly regimented, controlled and restricted…" (ytr_UgwYxNNdz…)
Comment
What do you think about jobs that require human interaction and maybe more compassionate or caring modalities where the accuracy is important, but the more important thing is emotional? I teach dropout recovery students and often they have discipline issues. Making eye contact and a person's energy can be important. What do you think about social workers, teachers, police officers? My instinct is very similar-- to say AI would not be as good at those social emotional tasks as humans, but I am not naive enough to think that organizations and companies won't hire them anyway if they're cheaper.
youtube · AI Governance · 2025-09-08T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwndxoSQxoIQWv_OId4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxPtCNZJk_QLEp0kml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyhluISH7ccqxIVWIV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHWV57lqkLnbxtuWZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzwfS4ECucHfa4StBR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgywIpBhUwSiakX5-yx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFmsdr0K1EF-BVDLN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyUvUv1FAgPt_QxExF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyCH8Z3inYBBm4Sdsd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVFwG2fUTVpYITA754AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
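The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of parsing such a batch and looking up a single comment's coding by ID (the `index_codings` helper is hypothetical; the field names are taken from the response above, and the sample record mirrors the last entry in it):

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response (JSON array) and index each record by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Sample record copied from the raw response above.
raw = """[
  {"id": "ytc_UgyVFwG2fUTVpYITA754AaABAg",
   "responsibility": "unclear", "reasoning": "mixed",
   "policy": "unclear", "emotion": "indifference"}
]"""

codings = index_codings(raw)
coding = codings["ytc_UgyVFwG2fUTVpYITA754AaABAg"]
print(coding["emotion"])  # indifference
```

Indexing by ID this way makes it cheap to join the model's codings back to the original comment records, which is what the inspect-by-ID view above appears to do.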