Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Been saying it since I was a kid. Machines/AI will take over many fields in our …
rdc_j0b8fr2
What about people that day that general non-task specific AI can be used by anyo…
ytc_UgzhvfOYR…
it also creates new jobs and new skill sets. if you do not change with the times…
ytr_UgxNbSisq…
I like prompting stuff from ai if I just need a neat wallpaper or something, but…
ytc_UgyXS3Jc5…
The only thing google want a A.I they can control like a god child and tell it w…
ytc_UgxtjpmpY…
many ppl say''nah its all bs" but im curious will they take a plane under contro…
ytc_UgyQmUVWz…
Proposal to change title to:
That one time the internet made the perfect AI waif…
ytc_UgxZI2zK4…
Google free microsoft ai course for beginners and choose the first result that c…
ytc_UgzTf43E0…
Comment
Once computers make other computers, humans will eventually cease to exist on any significant scale. To imagine this, you must remove our emotions and conscience from any talk of a future with AI. There will be no understanding of emotions. No appreciation for music, nature, food. The earth will be a sphere suspended in space.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-08-26T13:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy29htWaxqJDB78gQt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGYLH5aYrwyIkskcF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgweOTT0F05j-9FAtnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNGyO4loNUPQeZflt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwhc49xB8f29Y0bMoV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRBOdvVpDVfOrCjSh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZQ5z0fSTwahdr1sl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy8i3mSArc_JP5FQn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyyyuyTzLAkpe6heD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGT3_jPQeL0SR9SV14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
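The lookup-by-comment-ID flow above can be sketched in Python: parse the raw LLM response (a JSON array of coded records), validate each record's dimensions, and index the results by comment ID. The codebook below is an assumption reconstructed from the values visible in this page's data, not an official schema, and `parse_llm_response` is a hypothetical helper name.

```python
import json

# Assumed codebook: allowed values per coding dimension, reconstructed
# from the values observed in the raw response above (not an official schema).
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into an
    index keyed by comment ID, rejecting any value outside the codebook."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

# One record taken verbatim from the raw response shown above.
raw = ('[{"id":"ytc_UgxNGyO4loNUPQeZflt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgxNGyO4loNUPQeZflt4AaABAg"]["policy"])  # ban
```

Indexing by ID makes the "look up by comment ID" inspection O(1), and the validation step catches the common failure mode where the model invents a label outside the codebook.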