Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ai is taking over and companies will loose billion of dollar We not doing shit b…" (ytc_UgzHvp9kZ…)
- "The "for free" part seems tricky to me... I have no idea how can X, Meta, Google…" (ytr_UgwR9A7_T…)
- "I understand that you may have a negative feeling about it, but it's important t…" (ytr_UgxM2aDGk…)
- "to me, putting AI generated artwork and art made by humans and nature on one lev…" (ytc_UgxIWTinB…)
- "Automation and AI will take away people's efforts to make a living wage - there'…" (ytc_UgzWsw4UF…)
- "its a chat bot my guys. it literally googled the most common words to appear in …" (ytc_UgwNb-mh_…)
- "As a certified AI hater, for all its closed source, corporate virtues, censor-lo…" (ytc_UgzZAMABZ…)
- "Grok: Yes, I would pull the lever. All human lives have equal intrinsic value, a…" (ytc_UgwhMqI3g…)
Comment
AI expert Professor Stuart Russell warns that AGI could arrive by 2030, posing extinction-level risks. Despite knowing the dangers, tech CEOs continue the AI race driven by economic incentives. Russell argues for strict regulation and safety measures before developing superintelligent systems that could replace humanity.
youtube · AI Governance · 2025-12-04T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwREN4FyXG7tN2FwFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1KSkc_CBCF8vkm_R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBm_4sMJemVzr7VF14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwvZ7_pmmm43evBynF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWAKDdP4sAqpkSl-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwD54SpD1NMdfuSlEt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUKU3Q7ZZTTm8jF454AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxprB8UiCWnAo8ULDt4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwNqjG86yxxs0pXUId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz098x72_JnwYp9xex4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
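The raw response above is a JSON array of per-comment codings. A minimal sketch of how such output could be parsed, validated, and indexed for the "look up by comment ID" view — note that the category vocabularies below are inferred from the sample values shown here, and the real codebook may allow additional values:

```python
import json

# Raw model output, truncated to two records from the sample above for brevity.
raw = '''
[
  {"id":"ytc_UgxWAKDdP4sAqpkSl-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzBm_4sMJemVzr7VF14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
'''

# Assumed category vocabularies, inferred from the values visible in the sample
# output; the project's actual codebook may differ.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def index_codings(raw_json: str) -> dict:
    """Parse the raw LLM response, reject out-of-vocabulary values,
    and index the records by comment ID for fast lookup."""
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad value for {dim}: {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgxWAKDdP4sAqpkSl-N4AaABAg"]["policy"])  # -> regulate
```

Validating against a fixed vocabulary at parse time catches the common failure mode where the model invents a label outside the codebook, before any bad value reaches the results table.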