Raw LLM Responses
Inspect the exact model output for any coded comment, or look a coding up directly by comment ID.
Random samples
- "You are wrong when you use uber eats. Yiur humanity and intelligence is FINISHED…" — ytc_UgwnSvBaf…
- "Imagine not understanding AI so hard that you fire the people that produce your …" — ytc_Ugzz4cXxM…
- "AI doesn't increase productivity, it gives you the illusion of it if you're a du…" — ytc_Ugzr7Yw40…
- "I seriously cared about the big data and AI dangers, but I cared wen it was time…" — ytr_UgxgYWfOG…
- "Because if someone asks "How can I, as a white person, improve myself?" they are…" — ytc_Ugx9YYoKz…
- "i dont know if anyones done it before but i name Chatgpt say the N word multiple…" — ytc_UgwFvMmDn…
- "wtf are you talking about ai? They've always done exactly this. ALWAYS! They don…" — ytc_UgxChGkI8…
- "Took me 2 secs looking at both IDs to tell they're not the same person, first cl…" — ytc_UgwLZaNDu…
Comment

> In my opinion humans will eventually create artificial intelligence,because of their need for a second intelligent "race".For instance everybody wants to find alien life that has the knowledge to communicate with us.The same situation happens with the robots.If we can't find someone intelligent we will make one.

Platform: youtube · Posted: 2014-03-26T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
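A coding result like the one above can be sanity-checked programmatically. Below is a minimal sketch of a validator; the allowed value sets are only those observed in the sample output on this page (the actual codebook may define more), and the function name is illustrative, not part of the tool.

```python
# Hypothetical validator for one coding result. The allowed value sets
# are inferred from the sample LLM output shown here and may be incomplete.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_coding(row):
    """Return (dimension, value) pairs whose value falls outside the codebook."""
    return [(dim, row.get(dim)) for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]

coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "indifference"}
print(validate_coding(coding))  # [] — every dimension is within the codebook
```

An empty list means the row matches the (assumed) codebook; any out-of-vocabulary or missing value is flagged with its dimension.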
Raw LLM Response

```json
[
{"id":"ytc_Ugh5V3YOV93rWngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg5goCP14AT1XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiXS1llO95FSXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UghfVCn1IgUzkngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi88SRx7ZiuM3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiiyHwWPyzCXXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghXv2x0v1pR-3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggQN5wdCp2br3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjk73oPCQ1qY3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjWDAxbrHyy63gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
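The lookup-by-comment-ID step above amounts to parsing a raw response like this one and indexing it. A minimal sketch, assuming the response is a JSON array of objects with the fields shown; the sample data and helper name are illustrative, not the tool's actual implementation.

```python
import json

# Abbreviated stand-in for a raw LLM response (structure matches the
# example above; the rows here are copied from it, not new data).
raw_response = """
[
  {"id": "ytc_Ugh5V3YOV93rWngCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiXS1llO95FSXgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_id(response_text):
    """Parse a raw LLM response and index each coding row by its comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

lookup = index_by_id(raw_response)
coding = lookup["ytc_UgiXS1llO95FSXgCoAEC"]
print(coding["emotion"])  # fear
```

Building the dict once makes each subsequent ID lookup O(1), which matters when a batch response covers many coded comments.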