Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- You need to articulate specifically WHY A.I. is so dangerous. Because I don't se… (ytc_UgwVziG32…)
- That's a remote controlled robot though, right? The article in the OP is about f… (rdc_gs6rkhb)
- 1. companies will replace everyone with ai because it's cheaper. people will stop… (ytc_Ugzz-5MZ-…)
- Bernie Sanders, thank you so much for highlighting the major concerns about the … (ytc_UgyGr4kaP…)
- "Could AI eliminate humans in ten years?" Not bloody likely... please make it fi… (ytc_Ugy0CpXzv…)
- I think the assumption is pooled/fleet vehicles. There are a couple of inhibitin… (rdc_dbytuc7)
- i always find this argument a bit stupid. it is like comparing a nation firing n… (ytc_UggesFpy1…)
- Well since yall live under rocks and are not well versed with AI in general, her… (ytc_Ugx_PuuN8…)
Comment
> He’s saying how will ai know it’s smarter than us when we trained it
> It’s a good counter but unfortunately i believe AI will know,
> it’ll have a eureka moment and yes that will be that proto- conscious moment /AGI
> he’s not accounting for emergent behaviors in his argument which is not something we can account for what it will “emerge” with.
> We’re cooked
Source: youtube · AI Moral Status · 2025-07-03T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxvWZQwC15w7ssOLiF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzI_azA5RwTBwO6f-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7tHhfuZ7GqxYWVXh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwdQV72TltEFkramqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyi1eERQeZ2WBZzYtt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3vSZvCRBN3UImvp94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXupdi0pVh_K0MsYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzV3QcGZaU4X-0IQDt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnVUsD54bOEBPLPER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygvcndPTywJ1IjU_d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
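The raw response above is a JSON array of per-comment codings, each carrying the same four dimensions shown in the Coding Result table. A minimal sketch of how such output could be parsed and indexed for lookup by comment ID (the function name `index_codings` and the strict key check are illustrative, not part of the tool; the two sample records are taken from the response above):

```python
import json

# Two records from the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgxvWZQwC15w7ssOLiF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzV3QcGZaU4X-0IQDt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]'''

# Keys every coding entry is expected to carry (inferred from the response).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Raises ValueError if an entry is missing an expected key, so
    malformed model output fails loudly instead of silently.
    """
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"entry {rec.get('id', '?')} missing keys: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgzV3QcGZaU4X-0IQDt4AaABAg"]["emotion"])  # fear
```

With an index like this, the "Look up by comment ID" view is a single dictionary access per query rather than a scan of the raw response.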