Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzSXkMEm…: your employer and politicians are taking your job, not AI "They direct their at…
- ytr_Ugzfwnyb4…: Hey there! In the video, the presenter explains that Sophia is designed to simul…
- rdc_jtckqsd: The goal isn’t to use their likeness. The goal is to train an AI using a data se…
- ytc_UgyDlCnYo…: Actually, they fixed character AI so now we can actually talk to characters inti…
- ytc_UgwPMsHRy…: So the developers of AI sais AI will destroy us. Yet continues building it anywa…
- ytc_UgyAf4w3E…: honestly every time i see one of those shitty ass ai images it makes me want to …
- ytc_UgwfQ5Bql…: AI enables people to be art directors. So bad taste has leverage, so most AI stu…
- ytc_Ugy-7NBg6…: I think this is really cool and i wouldnt mind having a robot friend cuz that wo…
Comment

> Just being a devil's advocate... how about we just... don't make them? So we don't have to burden ourselves with the ethical questions? If they had rights, would they fight in wars? What if only them fought in wars? Would it then just be two or more countries playing a strategy game? Or imagine a hacker getting into your robot house maid. I mean they could spy on you and make the robot want to kill you or malfunction in a way that would do so... ever played watch dogs/2? Maybe because we can do something doesn't mean we should. Isn't there a point where it's laziness over connivence as well? Why can't I make my own damn toast or look up my own recipes or stock and keep track of the food in my own fridge? Maybe I'm paranoid and foreseeing a portal or westworld or iRobot type of scenario? Is my argument founded in meaningful opinion to you internet?

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2017-04-23T02:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghZxim60h8djXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjWFrifyXFxLngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg6_68H1uxBc3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggpPoRogRJoJngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgiDTskh2rn2yHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjGaC_PeYYcrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugg16N0dkIPH9XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgizKQfBDOEQFXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugg9hqGfomYCEngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgglqXCxOme6MXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
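
The model returns one JSON array per batch, with each row keyed by comment ID and carrying the four coding dimensions from the table above. A minimal sketch of how such a response might be parsed and validated before use; the allowed values in `SCHEMA` are inferred from the responses shown here, not from a documented codebook, so the real tool may accept more categories:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample responses above; the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every row against SCHEMA."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row.get(dim)!r}")
    return rows

# One row from the response above, used as a quick sanity check.
raw = ('[{"id":"ytc_UgiDTskh2rn2yHgCoAEC","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
rows = validate_response(raw)
print(rows[0]["policy"])  # -> ban
```

Rejecting the whole batch on the first bad value keeps malformed model output out of the coded dataset; a softer variant could instead collect and log the offending rows.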