Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

I think that when AGI develops, humans probably won’t need to do as much thinking, and when robots take over most work, we won’t be needed to do much of anything. If that happens, humanity could shift toward doing what we want instead of spending our lives chasing survival. And sure, we might go extinct. But honestly, if it’s not AI, it’ll be something else. The chances of extinction feel high either way. I also imagine a future where, if AI or anything else doesn’t wipe us out, we start changing ourselves by editing our DNA or adding cybernetic parts until we’re no longer really the same species.

| Field | Value |
|---|---|
| Platform | youtube |
| Source | Viral AI Reaction |
| Posted | 2025-12-06T07:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxZJy25vGUKtUJz5Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyVDTDDOBU0_LZEtnl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxpUQ-Pkq_9Ix5FZeV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzql1q1Yq7x-ED5p314AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxG7LDqCh9MRA-Bgm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwBJALJuttRCmvg6xB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz657BBtyfgi7U6ENZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwn_gUTyr-8UCjS8B54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyPsUztZKDVj7WEe8t4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyj4_hm74xBxVjNBG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
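A response like the one above can be checked before it is written into the coding table. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the examples on this page, not from the project's actual codebook, so treat both the value sets and the function name as assumptions.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# These sets are an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"resignation", "outrage", "approval", "fear", "mixed", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose four
    dimensions all carry known values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_codings(raw))
```

Rows with an unrecognized value (a common failure mode when a model drifts from the prompt's schema) are silently dropped here; a real pipeline would more likely log them for re-coding.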