Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Those fast fppd restaurant owners are ridiculous They triee to use ai at the dr…" (`ytc_UgwM4cJBS…`)
- "AI will only take you job if you yourself are too stupid to question whatever sa…" (`ytc_UgyoPb9hG…`)
- "The AI will be fully conscious when it starts to think everyone else is a mindle…" (`ytc_UgzphqjMV…`)
- "They really said “yeah this dude? he hasn’t done absolutely anything at all but …" (`ytc_UgxlOZUzN…`)
- "First they said 2025….then 2026…then 2027….then 2030….AI is not even able to lif…" (`ytc_UgytcYUlW…`)
- "Good talk. I'm not 100% in the same camp (my issues are more with the environmen…" (`ytc_UgxkRpfjZ…`)
- "@stevemoeller4522already have called and emailed my local representative and at…" (`ytr_Ugy4hRA_M…`)
- "Yeah, AI is never going back in the box. There's far too much money to be made u…" (`ytc_UgyXdnfZt…`)
Comment
> Thanks Sasha. Could you please explain how AI electronics system emit carbon. Nobody can guarantee biases free artificial systems because they will be created by human beings and human beings are natural intelligence systems created by God or nature. Unfortunately humanity can not exist without any kind of biases. Actually their survival mechanism is based on biases. Regards.
| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Responsibility | 2023-11-17T22:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy9GIq7u3cF4CgnN4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwI8HgK2aXkyCSJ3Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugza8zGRkNc2m3u-CDV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTJtdGgVQyqHC6Kc14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzkm-k9OVFneEaNdl94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx33c4NdMdI4YOFCX14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwdyJx3GX9fk6pZ0xB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwo8Hn-KQ8tvJxt4Kh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwVhoZ0THH-AYd7FFV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwZ3bVGiCasQQCmXXp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
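A raw response like the one above is a JSON array of coded records, one per comment, so it can be validated before the dimensions are written to the coding table. Below is a minimal Python sketch of such a check; the allowed label sets are inferred only from the values visible in this sample (the full codebook may include more), and `validate_batch` is an illustrative name, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from this sample batch only;
# the actual codebook may define additional labels.
SCHEMA = {
    "responsibility": {"none", "company", "user", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability", "regulate", "unclear", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the sample start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records
```

Rejecting whole batches on a single unknown label is deliberate: an out-of-vocabulary value usually means the model drifted from the prompt's coding scheme, and silently coercing it would corrupt the coded dataset.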