Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What I don't understand is how come some sharp attorneys have not attacked State…" (ytc_Ugxpyp4o3…)
- "I am senior developed with 20+ years of experience and recently i built a compl…" (ytc_Ugyo0WtDi…)
- "I have been in the software development world for almost 15 years, and about a y…" (ytc_UgzKvO3BY…)
- "Jobs to survive AI? Jobs that make connections with people. Skilled jobs like we…" (ytc_UgxUzltst…)
- "It’s hilarious how threatening a computer with set parameters are to artists 😂 A…" (ytc_Ugz1VvEHf…)
- "So an AI created to gain you $1billion that only gained you $1million would have…" (ytr_UgxWSkgot…)
- "I think he knows it will go badly for mankind..but like always we dont think abo…" (ytc_UgzAiyF-6…)
- "Didn't IBM said something about never put an ai in charge because ai can be bias…" (ytc_Ugwo2ycOQ…)
Comment

> Asimov had it right with the three laws hard coded in hardware, the problem is finding how to implement that in to all AI, AND all future AIs.
> The three laws are basically a conscience and an imperative to save human life.
> AI will not need to kill us off, we will do that to ourselves once we have nothing to work towards and for. It is already starting.

Source: youtube · AI Governance · 2025-06-21T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzN0bs51G41rymD7YN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwW5gQ13qPJ063wDPd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8uoOudT8_PC0gmZV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzw-KSz-5uyRuh0FBV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy89X0LCi34xRqU9v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyM-GnfqnywXA7B_Tp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzE8KCg9pg7SfV0Qhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzbsMzPFQs3QSQTIPR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyPBan2fxFW_WhDbMt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwaUQUSWgmUG_OAdZd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
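A batch response like the one above is worth sanity-checking before its rows are written into the coding table. The sketch below parses the raw string and rejects malformed or out-of-vocabulary rows; the allowed category values are inferred only from the labels visible on this page, so the real codebook may define more (names like `validate_batch` are hypothetical).

```python
import json

# Allowed values per dimension, inferred from labels seen on this page;
# the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and raise on any malformed row."""
    rows = json.loads(raw)
    for row in rows:
        # IDs on this page use ytc_ (comment) or ytr_ (apparently a reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: invalid {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_UgzN0bs51G41rymD7YN4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
rows = validate_batch(raw)
```

Failing loudly on an unknown label catches the common LLM failure mode of inventing a category mid-batch, instead of silently storing it and skewing later analysis.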