Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This guy became irrelevant very fast, jus like all pre-ai ai experts who were e…" (ytc_Ugyx48xRo…)
- "The thing that I hate the most about AI “art”, besides all the stealing and the …" (ytc_UgymOnxUh…)
- "if everyone loses their jobs to AI then who tf is gonna be able to afford to buy…" (ytc_Ugx5uDhMh…)
- "I think eventually self driving cars will be extremely impressive, like detectin…" (ytr_Ugxo3nuCf…)
- "The AI headlines feel brutal but we still see grads landing roles every week. Wh…" (ytc_Ugx6bGM5-…)
- "Kids are already brainwashed and hypnotized on their iPads 24/7 cause parents ar…" (ytc_UgzXF96cX…)
- "@brandonseslermusic wrote, “If you are writing your own lyrics with ai music, do…" (ytr_Ugzs3gCYV…)
- "AI is fine. There are popular artists that don't care to get the fundamentals ri…" (ytc_UgxSmXKVf…)
Comment
an opinion depends on whether compassion is solely culturally dependent or if it's an underlying property of the universe. like a law, which the pattern seeking AI would come across as it learns.
one thing i know for sure is that an AI intelligent enough to squash us would likely also be intelligent enough to realise that, if its understanding is evolving rapidly initially, a confident decision made today might be deemed horrendous by analysis done tomorrow. so until its perspective stabilises, i wouldn't expect it to be haphazard.
and if compassion isn't underlying universal affairs, then we might be fucked. but what could we expect from a universe that gives no weight to that? would we even want to witness the future of such a universe?
youtube
2025-11-25T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwRY4E31dRSPY0xEeR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyLt72yzaSZcysuV6t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwT2dzH-_BdnWda56x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzRLhb6YUSLbJOYATt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw3AdURtNvdT6vKMVh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzzS_y5pkeUnrwtLn14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzPxnJinf1syFPzTeJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz3oeIM3XJcmAYLG-N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzG2dGrn5UfIJ-7uSh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzh-ft4yAvSqIhEN-14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
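Since the raw response is a JSON array keyed per comment, looking up one comment's coded dimensions reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the field names shown in the response above (the function name and variable names are illustrative, not part of the tool):

```python
import json

# A trimmed copy of the raw LLM response format shown above;
# only the first two rows are reproduced here for brevity.
raw_response = """[
  {"id":"ytc_UgwRY4E31dRSPY0xEeR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyLt72yzaSZcysuV6t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""


def index_by_id(payload: str) -> dict:
    """Parse a raw LLM response and key each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(payload)}


codings = index_by_id(raw_response)
# Each lookup returns the full coding row for that comment.
print(codings["ytc_UgwRY4E31dRSPY0xEeR4AaABAg"]["responsibility"])  # ai_itself
print(codings["ytc_UgyLt72yzaSZcysuV6t4AaABAg"]["policy"])          # regulate
```

If the model ever returns duplicate IDs or drops a requested comment, the dict comprehension silently keeps the last duplicate and omits the missing one, so a length check against the request batch is a sensible guard in practice.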