Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click a sample to inspect it)
- ytc_Ugwlfv1GI…: It's wild that the potential of no laws for AI will be much worse than " electio…
- ytc_UgyU7TnR8…: The age of fitna, the AI itself is a huge deception and is out there to spread c…
- ytr_UgzWbXy2B…: We're glad you enjoyed the interaction with Sophia! If you're interested in more…
- ytr_Ugzy2FkWB…: AI is not being effectively or efficiently used for a better safer, healthier wo…
- ytc_Ugz8Rtpag…: WE. SHOULDNT. HAVE . TO . WONDER. theyre giving you access to Ai so you approve…
- ytr_Ugy7LkktT…: Thank you for your concern! In the video, Sophia and the presenter discuss the i…
- ytc_UgwzaYKy6…: AI companions are extremely dangerous and have already led people to take their …
- ytc_UgwxMmY3r…: AI is not really taking over jobs because it doesn't really need a chair and a d…
Comment
I would enjoy a 15min+ Kurtz!! Yes!!
Of course, lotsa’ work for you guys!
Robot Rights begin when one of them creates it. The biggest fear for man should be when a computer creates its own code, obviously it will understand how it functions...it will understand its own code and how it comes to its outcomes...and determine it is a being that is alike to humans in which it reacts to actions of its surroundings. A computer with code we cannot change is frightening.
Platform: youtube
Video: AI Moral Status
Posted: 2019-01-05T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwVREe--7fjdPHU_qx4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFceUcfDCVQZ6uZ-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxvDHMdTVCJ5ojJ3OV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxcbeOCfaB_OOi5yZ54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxc2LENyAbR_fNI_U54AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxXuSouTEyt3CePakB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynGQYOn_LvvaH1mLZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyJkzll8kFmEICRRFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwLcjrxahBGri2d4EF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw2VqIUoeHsTsMxyCt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
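The "look up by comment ID" step above can be sketched as follows: parse the batch response as a JSON array and index each coding by its `id` field, so any comment's four dimensions can be retrieved directly. This is a minimal illustration only; the function name `index_by_id` and the `DIMENSIONS` tuple are illustrative, not part of the actual pipeline, and the embedded array is abbreviated to two entries from the response shown above.

```python
import json

# Abbreviated batch response (two entries from the array shown above).
# Each element carries the comment id plus the four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgxcbeOCfaB_OOi5yZ54AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwVREe--7fjdPHU_qx4AaABAg", "responsibility": "none",
   "reasoning": "contractualist", "policy": "none", "emotion": "approval"}
]
"""

# Illustrative name: the four coding dimensions used in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and key each coding by its comment id."""
    rows = json.loads(raw)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgxcbeOCfaB_OOi5yZ54AaABAg"]["responsibility"])  # ai_itself
```

The same lookup generalizes to the full response: with the codings keyed by ID, the dashboard's per-comment view only needs the comment's ID to recover its coded row.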