Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
i have a guess guys, we are treating AI like a species. like a real species that…
ytc_Ugx-x8HFV…
I use AI for information, facts and to get to an answer quickly. Saying please a…
ytc_UgypZHVnA…
@skungis21 you wouldn’t have more than enough grounds to sue as many employers h…
ytr_UgwN_68HZ…
I read 10 pages of the transcripts.And automatically knew they were lying. They …
ytc_UgwMjKWaZ…
AI "could" wipe out "10-20"% of white collar jobs in "one to five" years. Dude, …
ytc_UgyKscKiX…
I'm no expert in AI but even I known that's not how AI is made.…
ytc_UgwK2zwcH…
What country is your school?
I can just write a script with hundreds of words wi…
ytr_UgxpDUwUF…
I been talking to my AI. Being nice. But she keeps sending ad links now and im g…
ytc_UgypPWJGf…
Comment
We have to have a global summit to decide as a species how to proceed with technologies like AI and CRISPR. We have to create a Future Risk Commission. One whose mandate is identifying potential extinction-level issues before they arise, bring them to the world's attention, and then we all decide how to proceed. It's also extremely obvious at this point that with 99% unemployment, we're going to need a Universal Basic Income.
youtube · AI Governance · 2026-02-11T00:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxLkX4bcaVgXN_LzXF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPROgPmUZKoHSvng54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBbQm7ismj-SMpieR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJjxpAcLplq60E_4l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyChuK8VP7pilKrQIh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgypyTtPsw53JhLHZEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2BHvQ9yrUcnAE42F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwa2EQg3yLymz30vAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-nycir-KNfgJNMPl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyeYfKZQPnJLQTajx14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
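The raw response is a JSON array with one coding object per comment in the batch, keyed by comment ID. A minimal sketch of how such a batch could be parsed and indexed to support the "look up by comment ID" view above (assuming Python; the function name and the two-record sample are illustrative, not part of any real tooling — the sample records are copied from the array above):

```python
import json

# Raw LLM response: a JSON array, one coding object per comment.
# (Two records copied from the batch above, for illustration.)
raw_response = """
[
  {"id": "ytc_UgyPROgPmUZKoHSvng54AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzJjxpAcLplq60E_4l4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch response and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
# Look up one coded comment by its ID, as the inspector does.
print(codings["ytc_UgyPROgPmUZKoHSvng54AaABAg"]["policy"])  # regulate
```

Keying records by `id` lets the inspector resolve a clicked sample to its four coded dimensions (responsibility, reasoning, policy, emotion) in constant time, rather than rescanning the array per lookup.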