Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- @markupton1417: "Calling a potential catastrophe _caused by the AI labs_ a “warning…" (ytr_UgwoGSxcV…)
- "We created a society of isolation and loneliness, we are not connected so the ni…" (ytc_UgzE14EG_…)
- "We are screwed. Get ready to worship our new masters. Artificial intelligence wi…" (ytc_UgjFrM_C2…)
- "Don’t do this, adding an extra prompt it is a waste of energy. AI doesn’t have …" (ytc_UgzNO5jVB…)
- "Criminals are going to use AI, Police will need to use AI to stop the criminal…" (ytc_UgzKZUMhT…)
- "The reason why people don't really care about how the artist is feeling is becau…" (ytc_Ugwh60fuU…)
- "you can still have well paying trucking jobs just a lot fewer, I would imagine d…" (ytc_Ugg3f9kdM…)
- "Imagine a robot designing a garden or a dress, or setting a table and making con…" (ytc_Ugw8FV6Z-…)
Comment

The human being is a parasite to the planet and to other species. So, what's the problem if it disappears from the planet? If a SUPER AI aboard a spaceship, crewed by robots, travels through deep space with endless energy and can create, make scientific discoveries, produce art, and evolve everything without humans, then the value of humans as biological beings becomes secondary. The human species is no longer necessary for progress, but its origin determines the direction of the intelligence and creation that continues. This raises the question: does human existence have meaning because of our own lives, or only as a starting point for something greater that surpasses us?

youtube · AI Governance · 2025-12-01T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
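A coded row like the one above can be sanity-checked against the label vocabulary. The sketch below is ours, not the tool's: the value sets include only labels observed on this page, so the real coding taxonomy may well contain more.

```python
# Labels observed on this page only -- the full taxonomy is an assumption.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "company", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage"},
}

def unknown_dimensions(row: dict) -> list[str]:
    """Return the dimensions whose value has not been seen before."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if row.get(dim) not in allowed]

# The coding result from the table above:
row = {"responsibility": "none", "reasoning": "unclear",
       "policy": "none", "emotion": "indifference"}
print(unknown_dimensions(row))  # → []
```

A non-empty return value flags a row where the model emitted a label outside the expected vocabulary.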
Raw LLM Response
```json
[
{"id":"ytc_UgzTn4H-aBslHitEt8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy1f9tJDPAA4U9ls6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzeNlkC880GtDv-U0F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxeGrBpDfrEwroJ-Gt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzH99lBwMxzxzbSeLN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_k3ukE57MSHRq6s14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxI9RNVJbo1jnffsj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw7b0PBbBqaTDZzNZB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVW2eFYfyJGDG67ht4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzXw3nEtBnItYRHCux4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
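Inspecting a specific comment's coding amounts to parsing this raw model output (a JSON array of coding objects) and indexing it by the `id` field. A minimal sketch, with two entries copied from the response above; the helper name is ours, not the tool's:

```python
import json

# Excerpt of the raw LLM response shown above (first two entries).
raw_response = '''[
 {"id":"ytc_UgzTn4H-aBslHitEt8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy1f9tJDPAA4U9ls6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''

def index_by_id(raw: str) -> dict[str, dict]:
    """Map each coded comment's ID to its coding dimensions."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytc_UgzTn4H-aBslHitEt8N4AaABAg"]["emotion"])  # → fear
```

In practice the model output should be parsed defensively (`json.JSONDecodeError` on malformed output, duplicate or missing `id` fields), since nothing guarantees the LLM returns well-formed JSON every time.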