Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by picking one of the random samples below.

Random samples
- "Food for thought: Can this be mitigated if all artists are given a part of the r…" (ytc_UgzlAM9y1…)
- "Any newspaper covering this story should have included a courtroom sketch of the…" (ytc_Ugzy5CKoI…)
- "aRt is the thing i do best, suRe, but it took me until i was 8 yeaRs old to figu…" (ytc_Ugw4t-3ZG…)
- "I agree AI is not ready for enterprise level applications sometimes you ask a si…" (ytc_UgzUrBmxa…)
- "It doesn't matter how good you are compared to other human workers. It's not a s…" (ytc_UgyecCkYp…)
- "Been there a few months ago. Great song. Tried to research about the artist. Zer…" (ytc_UgyBxqIyW…)
- "Bro ai really said "Arrest that man, HES BLACK!" holy shit I never new ai could …" (ytc_UgwrMZXrI…)
- "Even better, OpenAI is in the process of convincing know-nothings in the C-suite…" (rdc_mxzgpjj)
Comment
Human empire formation is futile because the founder is mortal and his progeny are randomly flawed.
AI empire formation is only futile if AI can't solve interstellar space travel. Because Earth is theoretically doomed by the death of our Sun. So if AI can't escape the solar system, and Earth, then it's empire would die.
Or perhaps just go dormant residing on one of the surviving outer planets?
Long time frame scenario, but AI thinks well beyond civilization life spans.
Source: youtube
Posted: 2025-11-18T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-DJmzl-_M3l1H3hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTe26_CcboNWZN1XZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAVUGcN10LG7FHdHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzcTnm2M0ew2T8st5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzbftfdQRkl0IR2jWF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAV-CxB5Q3kyaScXN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy-I4j8FvsZPixAB7N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwE_zgCF3VNLSk5BUR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugztrzg2rXnFHvLz8it4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwKywTbqJZqC6bL7w14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
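As a sketch of how a "look up by comment ID" view can be backed by the raw batch response above: the model returns one JSON array per batch, so indexing it by `id` gives constant-time lookup of any comment's coding. The helper name `index_codings` and the abridged two-record response here are illustrative assumptions, not the tool's actual API.

```python
import json

# Abridged raw batch response, in the same shape as the one shown above
# (two records kept for brevity; field names match the coding dimensions).
raw_response = """
[
  {"id": "ytc_Ugy-DJmzl-_M3l1H3hp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzcTnm2M0ew2T8st5h4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw JSON batch response and index the codings by comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
coding = codings["ytc_UgzcTnm2M0ew2T8st5h4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

A real implementation would also need to handle malformed model output (e.g. a `json.JSONDecodeError` when the response is not valid JSON), since the whole point of this view is to inspect responses that may not have parsed cleanly.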