Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "As someone that works in IT I’m certain that your salary for this time period an…" (rdc_hkgg8gl)
- "We trained LLM's on the Internet... The one place where humans drop any and all …" (ytc_Ugw37FJG9…)
- "I don't understand what the problem was... That the guy calls himself an artist …" (ytc_UgyLpkVsN…)
- "Oh nice! My plan B was feeding it AI images to make those abominations that you …" (ytc_UgxNkbkZi…)
- "This planet is dying. The human rase is killing it. If the Earth dies, you die. …" (rdc_emp4cnz)
- "So basically A.I becomes super intelligent and decides it doesn't need humans to…" (ytc_Ugwvm5KHX…)
- "If the AI is so smart, why can't it deliver the theory of everything that combin…" (ytc_UgztSdIyI…)
- "Even right now in fairly harmless innocuous work, AI casual use in doing basic r…" (ytc_Ugz5nsYIQ…)
Comment
> give robots rights after we leave earth and tell no one/put nothing in the media or internet about it so it would take the robots a long time to find us.when you do that, give robots feelings and stuff, but don't teach them violence and make them incapable of learning it. its the only way to give robots rights and feelings without it becoming Terminator or the Sentinels from The Matrix. the more we worry about giving something rights that doesn't even exist yet, lets ponder the questions: should feeling or self aware AI exist without it trying to kill everything? or would a Skynet-like virus infect the self aware robots and make them genocidal? should their minds be the internet? what will they do?
Platform: youtube
Video: AI Moral Status
Posted: 2017-02-25T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugh_lhwycoYkl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggdcRoBuxL9z3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf3fSG7XhI2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNUjIVjFsz73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzsyIZT_K1A3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh32Vghx0DeXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh7NSvs2yysSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugiv_A9GlQMe1XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugh4RafM8_B_dXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjj2FpuF6iXDngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
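The raw response above is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a batch might be parsed and indexed for lookup by ID, assuming this exact schema (the two entries are copied from the response above; variable names are illustrative):

```python
import json

# A slice of the raw LLM response shown above (two of the ten entries).
raw_response = """[
  {"id":"ytc_UggzsyIZT_K1A3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugh32Vghx0DeXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]"""

codings = json.loads(raw_response)

# Index the batch by comment ID so a single coding can be inspected directly.
by_id = {c["id"]: c for c in codings}

coding = by_id["ytc_UggzsyIZT_K1A3gCoAEC"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

This entry matches the Coding Result table above (responsibility: none, reasoning: consequentialist, policy: regulate, emotion: fear), which is how a coded comment can be traced back to the exact model output that produced it.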