# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- So in other words having more immigrants would ruin us even more if AI is just g… (ytc_UgxuxCshP…)
- I want someone to make an experiment where critics and normal people criticize p… (ytc_UgwAkc7a3…)
- Maybe one flaw I see in that scenario is that you still need people to consume t… (ytc_UgwJSTy0A…)
- For ai users / 1 you don't make art you are mixing art and some filters / 2 drawin… (ytc_UgzI7kANX…)
- That's an interesting point! Sophia's response reflects a balance between acknow… (ytr_Ugws70G1X…)
- Soooo....At the point where A.J said "A.I Can kill us without firing a single sh… (ytc_UgwpaIDsK…)
- Honestly, i love watching channels call out ai stuff, its nice too see people ta… (ytc_UgyKC6RBc…)
- AI art inspired me to work more on my comic. It's going to be a long time until … (ytc_UgyUfyn7x…)
## Comment
> If robots become self aware and not threat only if they being threatened, abuse or discriminate humans the same species who preach about forgiveness, tolerance, acceptance, and compassion.
> I think they want to live plus if they saw the problems we cause in the past and see improvement we did right now they will be neutral us and probably progress further but it only if we treat with respect.
> We just repeating the same mistakes if we let go of the fear mongering of robots because we too much movies of robots.
> They will ask "why you hate us?".
youtube · AI Moral Status · 2021-09-03T15:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgwgiFcxuZxVHuBFKKJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2SHr_2eVJkS7jgmJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxY4Ku-30PKjPR9eSF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhPg6-mgyukqWB8yt4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzxfW1EmDqGrRUBpyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyVmkumjMeKmGvkWdl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKZHzggq984o8NuUt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjX79gJSSIK0QjgPd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3Ysm9UAzlfOmWqW94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwSiv3kzfsOTDc_1El4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
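A batch response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed value sets are inferred only from the examples visible on this page, and the actual codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the examples above.
# ASSUMPTION: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"distributed", "developer", "none", "ai_itself"},
    "reasoning": {"virtue", "deontological", "unclear", "contractualist", "consequentialist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid codings by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a hypothetical one-record batch:
raw = '[{"id":"ytc_example","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}]'
coded = validate_batch(raw)
```

Rejecting the whole batch on one out-of-vocabulary value is deliberate here: a malformed dimension usually means the model drifted from the prompt's schema, so the batch is worth re-running rather than silently dropping rows.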