Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Comment
Rather than eliminate humans (as if global capitalism could dispense with both working class and consumers), advanced AI can only undermine the very foundation of the capitalist system of exploitation and oppression of humanity. One of the late great science fiction writers, Isaac Asimov, made it axiomatic in his Basic Laws for Robotics: robots must never harm human beings!
NOTE: From Google search:
["Isaac Asimov's Three Laws of Robotics are a set of rules for robots in his science fiction stories, designed to ensure their behavior is beneficial and harmless to humans. The laws are: 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm; 2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law; and 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."]
Platform: youtube | Topic: AI Governance | Published: 2025-09-08T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBy0BqvXo-NO5ujtx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwafRQGp_U9oB_WRPB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6zsUcLDLI-Iu8Fpl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxbFB02o_4bN_meXeV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzyb6FqrpO3hfIa1Xl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzKn4rOdSAP_NomJap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRgt23SjhfgfNr3W94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYSbdW5qMKOcMvvQd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxQKAxvyfgPugyZ50x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
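The look-up-by-comment-ID behavior described above can be sketched as follows. This is a minimal illustration in Python, assuming only the JSON array format shown in the raw response; `lookup` is a hypothetical helper, not part of the tool, and the two records are copied verbatim from the response above.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''
[
{"id":"ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6zsUcLDLI-Iu8Fpl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
'''

def lookup(raw, comment_id):
    """Parse the model's JSON array and return the record for one comment ID,
    or None if the batch contains no record with that ID."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

# The ID from the "Coding Result" table resolves to its coded dimensions.
result = lookup(raw_response, "ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg")
print(result["policy"])  # prints "regulate"
```

Indexing the batch into a dict makes repeated look-ups O(1) per ID, which matters if one raw response covers many coded comments.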