Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `rdc_da3zbue`: Australia is much more centralized than most countries though, with over half ou…
- `ytc_UgxZi5R_N…`: I always feel compelled to say please and thank you to any AI I have to chat wit…
- `ytr_UgyS9BW1n…`: @Nox_tenebris data centers for AI needs energy, and we all know the ways electr…
- `ytr_UgwZ_Lqxa…`: @byrnemeister2008thats the whole point of training AIs. Brute forcing tons of c…
- `ytc_Ugz7Gtfo1…`: You could simply type the prompt "5 old Dutch women who the painter doesn't like…
- `ytc_UgzVYicgh…`: Liked, subbed, and quoted as a source in a blog post. You said so perfectly so m…
- `ytr_Ugy2Fww68…`: @Jon@JonathanLoganPDX 600 IQ AI with 500 PhDs of knowledge would be incredibly i…
- `ytc_UgwQ8ibeW…`: I n my opinion if there was a program to destroy a I man should invent that prog…
Comment

> There's no need to create AI robots which fan feel pain and sadness. The earth is already suffering the burden of large population. We don't need to create another species who will enjoy their life on their own without making our lives easier. A robot should be intelligent enough to help humans but I don't see any reason to give them human emotions.
>
> Would you feel sad if some one unplugs your toaster? No, and that's why we should not build humanoid AI robots

youtube | AI Moral Status | 2017-02-23T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugh8Be6KyQwV-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugig6ZaSL0xYUngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggpmXzTxCn0_HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_Ugg0JlDKIdxowHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjucK8bclx98HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgjxYcYgD_mbE3gCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgjxlQ5IIqou-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UghWat3HN-CRn3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugg6_7WvjPQi53gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghrPpy0tE2CDXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
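A batch response like the one above can be parsed and indexed by comment ID to recover a single comment's codes. The following is a minimal sketch, not the pipeline's actual implementation; the allowed value sets are inferred only from the values visible in this response (the full codebook may define more categories), and `index_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the codes visible in the
# response above (assumption: the real codebook may include more values).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}

def index_codes(raw_response: str) -> dict:
    """Parse a raw batch response and index rows by comment ID,
    skipping rows with missing or out-of-schema values."""
    rows = json.loads(raw_response)
    coded = {}
    for row in rows:
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# One row copied from the response above.
raw = ('[{"id":"ytc_UggpmXzTxCn0_HgCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"resignation"}]')
codes = index_codes(raw)
print(codes["ytc_UggpmXzTxCn0_HgCoAEC"]["policy"])  # ban
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one parse of the batch, then constant-time retrieval per comment.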