Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This has been true of development for a really long time. Kind of... When devel…" (rdc_nbioy0o)
- "They used ai to make the video higher quality but it dident work but its real…" (ytc_UgyGxRVME…)
- "Nope. Labor is an innate gift virtually all of us posses. AI and automation in g…" (rdc_kiga694)
- "There are multiple problems with this. If they’re driving better than us, then t…" (ytc_UgzhrpfBs…)
- "AI has been developed by men, men are stunted, therefore it's a completely flawe…" (ytc_UgyF1AlHA…)
- "ai is also infuriating. i have taken a look at it to see what the fuss was about…" (ytc_UgzoMJw3x…)
- "This is incredibly upsetting. I've used DeviantArt for years and have a pretty e…" (ytc_UgysiAn9Z…)
- "Alright I'll bite. If the AI somehow becomes sentient and files for a copyright …" (ytc_UgySmDj1c…)
Comment

> IIRC SIRI already jokes about these things, does that make Apple's SIRI dangerous? No!, Has IBM's Watson become Skynet already? No!.
>
> Most people don't even have a notion of how the personal assistant in their phone works, no wonder this video is going viral as a way to discredit robotics. AI is not as you see it in movies. I don't even think Sofia fully understand what "humans" are at the point of the interview. Perhaps she can learn eventually with help of her creators. But we are not yet at the point were robots can program or develop their own AI by themselves alone. So if anything can kill someone, it will be humans using robots to do so.

youtube | AI Moral Status | 2016-03-24T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjW0kMAOvpWxngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggMlZs-8dwoCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgggMndQdvdfPXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgiitKinJW3cNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjU4AUl8gNG-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghnKF6FpqHR4ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiqkwnuaM937HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghCpRkW_EVKfHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiOQL2As6Fhg3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UghJmS-oFW6qWXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
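The raw response is a JSON array of per-comment coding records. A minimal sketch of parsing it and looking up a coding by comment ID (the schema is taken from the response above; the abbreviated two-record input here is illustrative):

```python
import json

# Raw LLM output: a JSON array of coding records, one per comment,
# each with id, responsibility, reasoning, policy, and emotion fields.
raw = """[
  {"id": "ytc_UgjW0kMAOvpWxngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiqkwnuaM937HgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

# Index the records by comment ID for O(1) lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

# Look up a single comment's coding by its ID.
print(codings["ytc_UgiqkwnuaM937HgCoAEC"]["emotion"])  # outrage
```

Indexing by ID this way mirrors the "Look up by comment ID" workflow: one pass over the model output, then constant-time retrieval per comment.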