Raw LLM Responses
Inspect the exact model output for any coded comment. Every coded record can be looked up by its comment ID.
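Where exactly those records live depends on the pipeline, but as a minimal sketch, assuming the coded records are stored as a JSON array of objects shaped like the raw LLM response shown at the bottom of this page (the file name `coded_comments.json` is hypothetical), a lookup by comment ID might look like this:

```python
import json

# Minimal lookup sketch. Assumes the coded records sit in a JSON array of
# objects with an "id" field, as in the raw LLM response shown below; the
# file name "coded_comments.json" is hypothetical.
with open("coded_comments.json", encoding="utf-8") as f:
    records = json.load(f)

# Index every record by its comment ID for direct lookup.
by_id = {rec["id"]: rec for rec in records}

comment_id = "ytc_UgxNjsjed0Dsa_cjPJt4AaABAg"  # example ID from the raw response below
record = by_id.get(comment_id)
if record is None:
    print(f"no coded record for {comment_id}")
else:
    print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
```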
Random samples from the coded comments, shown with truncated previews and comment IDs:
| Comment (truncated) | Comment ID |
|---|---|
| It's kind of disappointing that these discussions are always about how superinte… | ytc_Ugw0ssw-Q… |
| Just like cigarette companies are legally held responsible for not having a nega… | ytc_Ugx3TkGqV… |
| There's a famous quote, often attributed to Alan Moore that goes something like … | ytc_Ugy2VzHGD… |
| What do AI developers actually do. Write codes in a computer? That creates an AI… | ytc_Ugw14NTio… |
| I can imagine a world where human police had been substituted by robot police ..… | ytc_UgxU7ijsi… |
| They should look at it like this... Can AI create something on it's own... would… | ytc_Ugzsn1m6z… |
| AI is now expected to reach or exceed human capabilities in ALL domains BY 2028.… | ytc_Ugw4Euf2u… |
| Also who was the persons watching that particular robot? Power it off after he g… | ytc_UgwwwxR-t… |
Selected comment
We were warned 30+ years ago about AI being a bad idea remember Terminator and Terminator 2 But we never listen to warnings. We always stop when it’s too late.
Here’s my theory about what AI is. I honestly think AI is us having found a way to speak not with aliens or even whatever God we may believe in, but we are getting close to breaking out of the simulation and And the AI’s are systems being taken over by the administrator to keep us in at all cost, even if it means destroying the world, the simulation created
Source: youtube · Topic: AI Moral Status · Timestamp: 2026-02-03T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
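The four coding dimensions in this table also appear in every record of the raw response below. As a rough sketch (not the project's actual code), one coded record could be modeled in Python as follows; the value sets listed are only the labels visible in this sample batch, and the real codebook may define more:

```python
from dataclasses import dataclass

# Labels observed in this sample batch only; the full codebook may be larger.
RESPONSIBILITY = {"company", "developer", "user", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "liability", "ban", "none"}
EMOTION = {"fear", "outrage", "indifference", "mixed"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    # "Coded at" appears in the result table but not in the raw model output,
    # so it is presumably stamped by the pipeline after parsing.
    coded_at: str | None = None

    def uses_known_labels(self) -> bool:
        """True if every dimension carries a label seen in this sample batch."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```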
Raw LLM Response
[
{"id":"ytc_UgxNjsjed0Dsa_cjPJt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzrfUqqWSIdI9zOVxt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxlNr2wZWsCJnoL12l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5lnxwagtIosXl2xh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZUrw8ZHB5sGNN2H14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzyXL-Oa76miQd5hG14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxzD9m0C3UNgDaz8ZZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzG9gWtjfyJCn2fwsB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzCQn_n9ZPRfigi-eJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxF06GHxQmS7K7NNM54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
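Two things worth noting about the array above: it is a batch of ten records, and the record with id ytc_UgzCQn_n9ZPRfigi-eJ4AaABAg carries the same labels as the Coding Result table (distributed / mixed / none / mixed), so it appears to be the record for the comment shown above. A minimal, hypothetical sanity check for a batch like this (not the project's actual validation code) could look like:

```python
import json

# Hypothetical sanity check for a raw batch response like the one above.
# The expected keys are taken from the records shown; nothing here is the
# project's actual validation code.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def check_batch(raw_text: str) -> None:
    """Parse one raw LLM batch response and flag malformed records."""
    records = json.loads(raw_text)
    print(f"{len(records)} records in batch")
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        extra = rec.keys() - EXPECTED_KEYS
        if missing or extra:
            print(f"{rec.get('id', '<no id>')}: missing={sorted(missing)} extra={sorted(extra)}")

# Example with a one-record batch shaped like the response above.
check_batch(
    '[{"id":"ytc_UgzCQn_n9ZPRfigi-eJ4AaABAg","responsibility":"distributed",'
    '"reasoning":"mixed","policy":"none","emotion":"mixed"}]'
)
```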