Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I don't fear autonomous weapons. I fear the people creating and programming them…
ytc_Ugj-QbhQa…
It’s insulting when CEOs state that AI will create other jobs; as if they will s…
ytc_UgzHZr9tn…
This man comes across as cold callous and a scare mongerer and devoid of humilit…
ytc_Ugze3_ugm…
I view humans in the face of AI as dogs to humans now. Through strict natural se…
ytc_UgzoNFcB3…
I genuinely think most AI bros think AI is better ironically as a sort of rage b…
ytc_Ugw0Wmn1v…
Senior tech Project Manager here…I am being told everyday that AI will do my job…
ytc_UgxbfHA1V…
rkraiem100
While I understand what you're getting across with that analogy, some…
ytr_Uggozw99v…
Yes manual skills will continue as long as there’s funds for them. Farming is af…
ytc_Ugx36t_1z…
Comment
It is strange to listen to this video. What is really missing in this talk is to seperate between types intelligence. They are just talking about the cognitive type of intelligence. What about social and emotional intelligence? AI may learn how to possibly react on emotional problems. But there is a difference between understanding a problem and feeling what the person with the problem feels. AI cannot think by heart, because it has none. And we need jobs where you can feel and not just think. A roboter can take care of a child for exemple. It can watch it, feed it, write how it develops but really see a human is just what a heart can do, not a brain
youtube
AI Governance
2025-12-13T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxdvm_ji62KuXX9Kcl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgztRgdjtpuxO8gIIup4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzIkHNhE_wcnY96Ntl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwtK4YZiJzXWLMawNF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzc1N0LTqIfyoflUSF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx540xP3BtNgkoU8ot4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyJwzgxJpdsbZ9TqyR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxUzTteBk-AsblDcBB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyqOhdy7WnRXkd3Tup4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzofneNXxK35Ai7nyx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
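The raw response above is a JSON array in which each object carries a comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before storing it; the label sets below are inferred from this one response and the table above (the real codebook may define more values), and `parse_coding_response` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed labels per dimension -- inferred from this single response,
# not confirmed against the actual codebook (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "government",
                       "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}


def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: labels}.

    Raises ValueError on out-of-vocabulary labels, so malformed model
    output is caught before it reaches the database.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        labels = {k: v for k, v in row.items() if k != "id"}
        for dim, value in labels.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = labels
    return coded


raw = ('[{"id":"ytc_Ugxdvm_ji62KuXX9Kcl4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_Ugxdvm_ji62KuXX9Kcl4AaABAg"]["emotion"])
# prints "fear"
```

Keeping the result keyed by comment ID mirrors the "look up by comment ID" flow above: the stored labels for any sampled comment can be fetched directly by its `ytc_`/`ytr_` identifier.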