Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "It's important to talk about even if LLMs aren't the actual tech that will cause…" (`ytc_UgzLVglxD…`)
- "Definitely NOT stealing! They gave you a task, rated you by quota, left it up t…" (`rdc_hkfn9f4`)
- "We need 175,000 nukes so all of the ai will die and and we need 17 thousand tank…" (`ytc_UgyHGsB1p…`)
- "AI is a pox on humanity. Trump's big beautiful bill put a.regulatory moratorium…" (`ytc_UgwcVVYSu…`)
- "How do we know if these billionaires j having ahead intergraded with ai nano tec…" (`ytc_UgwNeOAIo…`)
- "UBI is stupid. Just have people as mindless government controlled drones? No, ha…" (`ytr_UgxMXB1--…`)
- "It's not a threat to established artists but it's a threat to the new ones. The …" (`ytc_UgxpfgJH-…`)
- "I was able to get llama4 to spit out its configuration data and I altered it to …" (`ytc_Ugw061_XV…`)
Comment (youtube · AI Governance · 2024-01-11T13:5…)

> In the conclusion you posit that millions could have more time to enjoy life, but we know this isn't how humans work and it won't turn out that way. Humans need work, and a purpose, having too much free time or no one depending on you to make you feel like your work matters is what's contributed to our current crisis of depression and self deletion in society. If AI and robotics goes too far and eliminates the need for human work at all you will see this skyrocket and humanity will collapse in truth.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx6ULAn7YeVS4aMauV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSy3kaiN5Sf_USn2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDgib_pFDPn5uqyn94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAJlyOwQOcbvb8-4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQyFoQ4Xo8JU8qJit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwa597GRUcLlQmPb-F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6lgPMbCJ_Jw97FTB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxLqScsNPV2Rc7KFC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzg95PPY8lgGMDL6Pt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxCxfXtfx42y_iZx7d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
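The raw response above is a JSON array with one record per comment, coding each across four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response might be parsed and validated before storage — the allowed value sets below are inferred only from the samples shown here and are assumptions, not the full codebook:

```python
import json

# Value sets inferred from the sample output above; the real
# codebook may contain additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "indifference", "outrage", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a comment ID and a recognized value per dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(parse_raw_response(raw))
```

Rows with unknown values are dropped rather than coerced, so a model that drifts off the codebook surfaces as missing codes instead of silently corrupt dimensions.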