Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
JD Vance speech gave me hope about the progress of Ai & technology. E/Accelerati…
ytc_Ugz9wWLUv…
Well the likely forecast is for people to become more unhoused, more impoverishe…
rdc_kt69x11
Politicians only care about one thing: money. Public transportation in the US su…
ytr_Ugz9KTw59…
don't call it AI art, it just legitimizes prompt-ists. it's not art, and that's …
ytc_UgyxoXEIm…
You'd have thought they could have gotten Claude to give them a way out of that…
ytc_UgyeCExOv…
This video is actually so helpful and educational! I use AI to generate genaric …
ytc_Ugwhklw89…
We appreciate your interest in the video! If you have any questions about artifi…
ytr_UgyMvh8es…
Will AI and robots buy the goods and services they produce? Production and servi…
ytc_UgzVmy1Xp…
Comment
I think the Fermi Paradox and AI are closely linked.
AI might be the biggest wall that need to be broken for a civilization to reach other Star systems and expand.
But AI itself doesnt need. Creators need it. AI needs energy source. Thats why I think when civilizations fail to control it (because they seek work automation) the AI simply builds a Dyson Sphere around their closest Star, locking that infinity energy source for itself and making it vanish from the night sky
youtube
AI Moral Status
2026-01-18T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgymWwHngHj1eSjKt1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZ0z_3uoi70aGCrEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwse_L3mOImsePev7x4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyXfj3SUEFISZE0rTJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIstEO2jbRfSR1sYN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwmGa9A9yM4BYVFTnd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwaS5QJb2NvZSLFKwx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxoW_CZTjltSBTTjSF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy53B3WvgywWVmnkKd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFJGqCWLeRALnRNGd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
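The raw LLM response above is a JSON array of per-comment codes, and the "Coding Result" table is just the entry for the selected comment rendered as rows. A minimal sketch of that lookup, using the last entry from the response above (the helper names here are illustrative, not part of the actual pipeline):

```python
import json

# Raw LLM response: a JSON array, one object per coded comment,
# each with the four coding dimensions. This is the last entry
# from the response shown above.
raw_response = """
[
  {"id": "ytc_UgwFJGqCWLeRALnRNGd4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "mixed"}
]
"""

# Index the codes by comment ID so a single comment's result
# can be looked up and displayed as a dimension/value table.
codes = {row["id"]: row for row in json.loads(raw_response)}

row = codes["ytc_UgwFJGqCWLeRALnRNGd4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {row[dimension]}")
```

Matching against the table above, this entry yields responsibility = distributed, reasoning = consequentialist, policy = unclear, emotion = mixed.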