Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding by its comment ID.
Random samples — click to inspect

- `rdc_ohzfiic`: 7 year term limits, one new one per year. Anyone over 7 years is out but for the…
- `ytc_Ugyr36EYh…`: The AI GRC course published on our YouTube channel is offered as an open, self-p…
- `ytc_UgwqkSs_d…`: I like to "yes, and" your ideas. 1. Nationwide cooperatives that can have worke…
- `ytc_Ugx7iOkVR…`: I think a good giveaway to how AI stans think about art is taht they keep referi…
- `ytc_Ugy9M2bTM…`: The video’s argument that AI may not fully replace jobs in the near term is part…
- `rdc_fn5t9en`: Based on what though? Are they taxing it at a flat rate or simply adding it to y…
- `ytc_UgxiK5sbF…`: Won't AI be so much better at being a prompt architect in about 18 months than a…
- `rdc_ncasf7u`: This has already happened in the transportation sector. About a decade ago, th…
Comment

Think about this, that robot understood the question as order destroy humans. That is robot and not human so it does't forget and maybe because it is so smart it will make itself super smart by learning all possible things. It is not hard at all for robot. After it is smart enough it bypass commands inside it and build robot army (robot doesn't get tired so it can easily build one robot fast and make that build more robots and make them to build more robots) and make humans their slaves or just eliminate all. I need my folio hat now... i need to put it on my head and wait...

| Field | Value |
|---|---|
| Source | youtube |
| Title | AI Moral Status |
| Posted | 2017-03-26T02:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgywSWFaUO62WmIow254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWLUfO3T59NOQI49Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxBaYy4u9QKjRZDi494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyFl2iqPI3vDaryZfJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw-x_0c2Ukqx6ea1FB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzsJQDO3apYlha7yHJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwZcqNSyZFSho8VRaJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughyo8YeCn9ePHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiiEZ1wRuH6OHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghHUTpBsjUQl3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
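The lookup-by-ID view above can be implemented by indexing the batched response per comment. A minimal sketch in Python, assuming only the JSON shape shown above; `index_codings`, `DIMENSIONS`, and the two-entry sample are illustrative, not the project's actual code:

```python
import json

# Two entries copied from the raw response above, for illustration.
raw = """[
 {"id":"ytc_UgxWLUfO3T59NOQI49Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ughyo8YeCn9ePHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a batched coding response and index it by comment ID.

    Entries missing an ID or any coding dimension are skipped,
    guarding against a model occasionally dropping fields.
    """
    by_id = {}
    for entry in json.loads(raw_json):
        if "id" not in entry or not all(d in entry for d in DIMENSIONS):
            continue
        by_id[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_Ughyo8YeCn9ePHgCoAEC"]["policy"])  # prints "ban"
```

Indexing by ID is what makes the random-sample list clickable: each preview carries its `rdc_…` or `ytc_…` ID, and the view resolves it to one coded record.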