Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- "9:00 I feel like even a human tracing parts of other peoples' art is different f…" (ytc_UgzY1vms8…)
- "to be fair to McGee, what even is the point of autopilot/self driving that...doe…" (ytc_UgwNQlaSW…)
- "If AI fails, the economy collapses / If AI succeeds as promised, it takes all our …" (ytc_UgzvORYnd…)
- "Wouldn't it be sufficient to use AI to figure out how to exit the simulation BEF…" (ytc_Ugzi1ppnw…)
- "If i have to go through the same routine (school+work) for the rest of my life j…" (ytc_Ugy5KEnju…)
- "With all the uncertainty of A.I. roll in the future and with there super intelli…" (ytc_UgzgEvQSf…)
- "This seems like a pandora's box of circumstances, If you choose to sue Open Ai, …" (ytc_UgzPfA0bK…)
- "I don't think we're remotely close to having sentient robots and so calls to thi…" (ytc_UgyHQdLBG…)
Comment

> I liked Star Trek TNG way of dealing with it. They banned the production of intelligent AI along with genetic manipulation due to wars in their past and now they strive to better humanity. We are going down a path that is heading towards hell on earth, but I doubt we will get to explore the final frontier at the end of it.

youtube · AI Governance · 2025-12-04T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyx6V6zD3-bhjLipTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgweU4vRSkL2mPqQL4Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEgrg0tr4jZXs5L4Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxk-L_hCLUiM3EAab14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwehb2mxN_BQq1sivp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwUdClDsMbDrILFwZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyvb3NQYHoDp2Rv1sN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwYnAB9jRNojwJ4HIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyI48L2_V0gR3PJUJ54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzHMXezoTkUxM0VH-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
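The "look up by comment ID" operation above amounts to parsing the raw response and indexing the coded records by their `id` field. A minimal sketch, assuming the raw LLM response is a JSON array of records with the same fields as the dump (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name is illustrative, not the tool's actual API:

```python
import json

# Two records copied from the raw response above, used as sample input.
RAW_RESPONSE = """[
  {"id": "ytc_Ugyvb3NQYHoDp2Rv1sN4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzHMXezoTkUxM0VH-Z4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "mixed"}
]"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

lookup = index_by_comment_id(RAW_RESPONSE)
print(lookup["ytc_Ugyvb3NQYHoDp2Rv1sN4AaABAg"]["policy"])  # ban
```

With the index in hand, retrieving the coded dimensions for any comment is a single dictionary access rather than a scan over the full response array.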