# Raw LLM Responses

Inspect the exact model output for any coded comment by looking it up by its comment ID.
## Comment

> Does any one value human life .... Look we spend trillion of dollars to go to space... But earth is getting destroyed...plastic ...trees getting cut down etc...we only have one earth.....and we worried about wrong things ....maybe ai is right wipe the human virus out ...what good do humans do or bring ... We destroy breed etc

- Platform: youtube
- Topic: AI Governance
- Posted: 2023-07-08T05:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgzpzfzYQw2K-icjDKx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4ZCPmnYg5qgMTM1h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw1nz_ap1mhwwaCW8J4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxbnOvXu3louZQ28xR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlNmJNXHD8rcORZZJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzvMQKQnHqcU2jkC0R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5ncCFg97A3lHClEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"hope"},
  {"id":"ytc_Ugwdyzf6ftPA9BBTbAx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwV6ABBZe00eh8NxiV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyL5gOk0_jo7Soxlmp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
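Since each raw response is a JSON array with one object per coded comment, looking up a single comment's coding is a simple parse-and-filter. A minimal sketch in Python, assuming only the array format shown above (the `coding_for` helper and the shortened two-row sample are illustrative, not part of the tool):

```python
import json

# A shortened sample batch response in the same shape as the array above:
# one object per comment, coded on four dimensions.
raw = """[
  {"id": "ytc_Ugwdyzf6ftPA9BBTbAx4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyL5gOk0_jo7Soxlmp4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def coding_for(raw_json: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if it is absent."""
    return next(
        (row for row in json.loads(raw_json) if row["id"] == comment_id),
        None,
    )

row = coding_for(raw, "ytc_Ugwdyzf6ftPA9BBTbAx4AaABAg")
print(row["emotion"])  # outrage
```

Returning `None` for a missing ID (rather than raising) makes it easy to detect comments the model skipped or mis-identified in a batch.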