Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgwHkv0yC…` — "Idk, I have mixed feelings about this. On one hand, I understand why people are …"
- `ytc_UgxFsPww2…` — "2 hours cannot possibly b enough time to cover math, reading, history, science a…"
- `ytc_UgxFj1-_d…` — "What if ,we as humans jus say no? We dont use AI, we dont let AI take over etc?…"
- `ytc_UgzInO3aa…` — "No. AI in stupid places will be very bad for humanity in the long run.…"
- `ytr_UgzVvXTmj…` — "The core argument in advocate for ai are people who dream but lack the patience,…"
- `ytc_UgxGNSLnm…` — "Coexistence between Christians and Jews is possible, but it requires finding com…"
- `ytr_Ugz6UhB6S…` — "This is a great comment. This video is so heavily biased I couldn't help but l…"
- `ytc_UgyDkMtP4…` — "The reason they are pushing for this is because they don't want people on the pl…"
Comment
In a hypothetical meeting... Have any of you seen Jurassic park? This could be dangerous. We know but what about the money and the the damage to our brand if we pull out? we need to invest more the profit! What profit - it will come, governments will use our ai to control all weapons. We'll have millions of physical robots with our ai in homes, we will flood them with ads. Consider using ai sparingly?
youtube · AI Moral Status · 2025-12-15T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxu3_8ET8cLwvog0Tp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjJ6EUSheKUNHk8bt4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxM23xiuplqiNxwqWx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz9zUH8u61r9NyT05V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzhprfx2Nvsgm0JJzV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBS1cr0Wd1Yucs_Nt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmndxLHXkbwIenMB94AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUy9R71BkM5qboXm54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxY2JByrJJhxRFZ_gp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw6ardXyHMKhGtmb4h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
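Responses in this format can be checked programmatically before the codes are stored. Below is a minimal Python sketch that parses such a raw response and keeps only well-formed rows. The allowed value sets are inferred from the responses shown above and may not be the complete codebook; the function name `validate_codings` is illustrative, not part of any existing pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above. Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array) and keep only rows whose
    id looks like a YouTube comment/reply id and whose values are allowed."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # ytc_ = top-level comment, ytr_ = reply, as seen in the ids above
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_Ugxu3_8ET8cLwvog0Tp4AaABAg","responsibility":"user",' \
      '"reasoning":"virtue","policy":"none","emotion":"indifference"}]'
print(len(validate_codings(raw)))  # → 1
```

Rows with an unknown value in any dimension are dropped rather than repaired, which keeps downstream counts (like the table above) restricted to codes the scheme actually defines.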