Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID, or inspect one of the random samples below:

- ytc_UgzKiNrJR…: "Ahhhh you are making it seem these AI programs are perfect they do not seem to b…"
- ytr_UgwXuA48C…: "The AI they use to make the art steals from actual artists against their wills.…"
- ytc_UgxMpqvkc…: "🎉Im 52, i said it when internet came out in 92. I said over the years, this tech…"
- ytr_Ugw-oXnv6…: "@Aubreykun1. Wrong. Does an Artist lesrn by looking at noise and trying to figu…"
- ytr_UgyK25aa2…: "Hi Sandra, you got the right answer. Kudos. The contest is over and winners have…"
- ytc_UgxtqeUv9…: "I get that some might say Oh but there is revert button and premade textured pe…"
- ytr_Ugx3M_Zwl…: "@chrisporter9397 You didn't double check your examples didn't you? None of them …"
- ytc_UgwJCEmZT…: "Damn right, we should be concerned about it. The pros are miraculous (curing can…"
Comment

> It's funny how a man who spent his life to develop a specific technology then suddenly alarms everyone that the very same technology can basically destroy everything. To me it just sounds like crazyness. Science sometimes can just create crazy, out of their head, annoying arrogant, greedy, materialistic, nihilists, feeling omnipotent people. That is your typical tech company CEO/employee who "wants to make the world a better place", talks and seems to reason very politely, then proceeds to invent the nuclear bomb, the next gen weapon or AI, then, MAYBE regrets to have done it.

youtube · AI Governance · 2025-06-18T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz9FbfchMBjPT3WB_94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyc-wwKGpuvyRaLD5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTK7WVxVE6s_N19nl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3bNlZZHkP0Z6ejqd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8KyJ17DGmDo6lcxx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwfhsOEvK5NMtwOdEp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy7kf24I0m5XPDTXsV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"sadness"},
{"id":"ytc_UgyMXSGhHs2qccH4PhB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz7joVTsIQjEvaq5lx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyELf_5uSO1YXy7-914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]
```
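A batch response like the one above can be turned into the per-comment coding view with a few lines of Python. The sketch below is illustrative, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, while the `index_codings` helper and the two-record sample string are hypothetical.

```python
import json

# Hypothetical raw batch response from the coding model: a JSON array in
# which each element codes one comment on four dimensions. The two records
# are copied from the response shown above.
raw_response = '''[
  {"id": "ytc_Ugz8KyJ17DGmDo6lcxx4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyMXSGhHs2qccH4PhB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model output and index codings by comment ID,
    skipping any record that is missing one of the four dimensions."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        if all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

codings = index_codings(raw_response)
print(codings["ytc_Ugz8KyJ17DGmDo6lcxx4AaABAg"]["emotion"])  # outrage
```

Indexing by comment ID is what makes the "look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan of the raw response.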