Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- But now that I think of it, how did they realize it was AI art?… (ytc_UgzjIxVQC…)
- Thank you so much much for adressing this. Its really annoying when people claim… (ytc_Ugz9jzEvF…)
- Two things. Much easier than automating a driving job on the road will be autom… (ytc_Ugia7XkgC…)
- First of all, AI prompters, who simply get a generated output, are not "artists"… (ytc_Ugz3gZGL3…)
- Something that would be really cool is if these cars could be fully autonomous o… (ytc_UgwS14D-d…)
- I stay up until five am just using character ai and I spend all day on my phone … (ytc_UgzzHR58B…)
- Must be great to buy everything you want and have no money. This ALT-Man is app… (ytc_UgwXhiciD…)
- Despite his brilliance and achievement, I find it hard to respect Mr.Hinton. If … (ytc_Ugwy1PXGe…)
Comment
I suppose robots (or AI) will need some sort of rights that will give them opportunities to live in our society, but they would probably depend on their programming. And also, we need to think about different outcomes, cause robots can be hacked, or controlled, and they can also start some sort of rebellion, thinking "We are superior to humans, we dont need them!". This topic is very complicated, so it's difficult to discuss it.
youtube · AI Moral Status · 2017-02-23T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgiLDZDsluuX7ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghO27xPtF4OL3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgiyMwZ_7WU5mHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh-nIhLVlynuHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugh6GzVlcqfQxHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggd7HuqJgAx-XgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiVAEnmcJsth3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjS4PQpHaKB33gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjZof-spcqFxngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UggrO82HB4K0HHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
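The raw response above is a flat JSON array, one object per comment, keyed by the comment ID and carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response could be parsed and indexed for per-comment lookup (the `index_codings` helper and its validation logic are illustrative assumptions, not the tool's actual code; the value sets for each dimension are only partially observed in the sample):

```python
import json

# Dimension names taken from the coding result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Two records copied verbatim from the raw response shown above.
RAW_RESPONSE = '''[
{"id":"ytc_UgiLDZDsluuX7ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Uggd7HuqJgAx-XgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and key each record by comment ID.

    Raises ValueError if a record is missing any expected dimension,
    so malformed model output fails loudly rather than silently.
    """
    indexed = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Uggd7HuqJgAx-XgCoAEC"]["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse of the batch response, then constant-time lookups per comment.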