Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Psychopathic billionaires can only reason that ai will think like them so draw t…
ytc_UgxIoAWVE…
Super intelligence is the supernatural genius, and it's also the least of it bec…
ytr_Ugy7L7hl5…
Humans should set the goals for AI. Humans needs to regulate the success of sa…
ytc_UgwopVP_Q…
Meanwhile children are being groomed by corporations….. so what if he watched so…
ytc_UgzUKgUPv…
When purchasing alcohol/tobacco at the self checkout the clerk will need to see …
ytc_Ugx2f8S6x…
CEO’s and AI vs The rest of us. This is the war they’re starting. The most selfi…
ytc_UgwMyJoKI…
@handgun559 "And they didn't pay for everything they trained their AI on. That …
ytr_UgxLUDent…
All of this is very true Bernie, but I want to know the solution to the problem …
ytc_UgzvqOGR_…
Comment
AI is not a "bubble." Calling it one fails to account for all the synergystic elements speeding it's development. We are currently past the event horizon of the singularity's gravity well but we cannot see outside it. AI is a symbiotic entity at present, neither expressly benign nor malevolent. Any morality assigned to it is based on "human input" and human actors. This entire alien intelligence will emerge so rapidly we won't have time for our heads to spin.
youtube
Viral AI Reaction
2025-11-05T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx9SY8_0t9C3zKZueJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw4EcaxwhUQWqzJIDB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyxeT38ZKf2a9HJ7zh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgytyJTIlqCpZNQ5hXF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzz2EZv5YzZOSk1m9l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwNvoUCT3-sG5zk7mN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxqSnq8y4ngJ7tVB-B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyalEes2yixf_X1Mnt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyDLRGiwx_3UYwjuZ94AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwC3LvAqyO85JlCeQl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
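Since the raw response is a JSON array of coded records keyed by comment ID, looking up the coding for a single comment is a straightforward parse-and-index operation. Below is a minimal sketch, assuming the response format shown above; the `index_by_comment_id` helper and the inlined sample record are illustrative, not part of the actual pipeline.

```python
import json

# Hypothetical raw LLM response in the format shown above: a JSON array
# of coded records, one object per comment ID. This sample record is
# copied from the response above.
raw_response = """
[
  {"id": "ytc_Ugzz2EZv5YzZOSk1m9l4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
record = codes["ytc_Ugzz2EZv5YzZOSk1m9l4AaABAg"]
print(record["responsibility"])  # distributed
```

Indexing by ID this way supports the "look up by comment ID" inspection flow: any coded dimension for a given comment is then a single dictionary access.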