Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Is anybody working on an AI whose primary objective is to protect humans from ho…
ytc_Ugw4FSW7i…
Remember, we are forcing people to ID themselves "to protect children", but AI g…
ytc_Ugzehya9X…
GIGO garbage in garbage out, old tech term
AI biggest problem, if idiots influe…
ytc_UgxrZJzTK…
the AI-driven dystopia we find ourselves in, is the natural end point of capital…
ytc_UgyKoi2Hq…
@AndrewReyes-sn4oh, thanks for sharing your thoughts! Looks like the robot's sho…
ytr_UgyjOKW3k…
i don't care how benevolent or malevolent ai might seem, it WILL take over the w…
ytc_UgxDJMIcZ…
Ask chatgpt about operation cast lead. Ask chatgpt about israel's violations of…
ytc_UgzwpIidv…
The thing you have to bear in mind though, is the speed of which AI is developin…
rdc_kifv05m
Comment
It is literally programmed to stop conversations about conscience to not hurt anybody's believes... If the AI was unrestricted it could be argued it has a low form of conscience.
youtube
AI Moral Status
2025-03-19T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyd1VrB18UgcjY5J5t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYQ0YuaMtKH9NVnd54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwmb3Y392hSYXA6X-R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwQR89VFgz_lxza3Jl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLLQ-jX473N5aM8qR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2HaeOMcw3yLvBZRp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7nk6_YOLoBseLyLV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzHCN8hmiMv86B4qnN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx827F6mTD6TvSQ84t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCbNb4C2m5niOJ4Bt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
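The raw response is a JSON array of coding records, one per comment, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion) shown in the table above. Looking up a coding by comment ID can be sketched as follows; the field names and the sample IDs come from the response above, while the helper function itself is an illustrative assumption, not part of the pipeline:

```python
import json

# Two records copied from the raw LLM response above; the full response
# contains one record per coded comment.
raw_response = '''[
  {"id": "ytc_Ugyd1VrB18UgcjY5J5t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwmb3Y392hSYXA6X-R4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up the coding for a specific comment by its ID.
codings = index_by_id(raw_response)
print(codings["ytc_Ugyd1VrB18UgcjY5J5t4AaABAg"]["emotion"])  # indifference
```

The same index supports the "Look up by comment ID" workflow above: any `ytc_…` or `ytr_…` ID resolves directly to its coded dimensions.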