Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Sora, a 3d animation software so realistic that you can't tell if it's real or n…" (ytr_UgxtZxRTw…)
- "@Sebastian_Terrazas AI doesn't store pixel perfect copies of billions of images,…" (ytr_Ugy5e044w…)
- "As an artist myself... please DO NOT redraw AI generated images, you would be ju…" (ytc_UgwDsbyOA…)
- "Annnnd as a scientist this is why I don't trust stochastic parrots to be anythin…" (ytc_UgwKowQJI…)
- "99% by 2030 is fearmongering. That’s such an insane number, Manuel jobs and serv…" (ytc_UgxRNocyN…)
- "Who decides for the world what that future will be like? One of these crazy ai s…" (ytc_UgwWW7uu4…)
- "You are absolutely correct in that most A.I. programs aren't sentient. They can …" (ytc_UgzkdXeVn…)
- "Its sucks that its happend to her but look at how celebrities like nicki minaj, …" (ytc_UgxwhESvu…)
Comment
It’s completely true that the top AI leaders are a bunch of guys who know each other, and they just want to win. But it’s also true that it is highly unlikely that China would just step back, even if everyone in the US stopped AI development today. Again, Eliezer has it right - we would essentially have to be willing to risk WW3 in order to stop it, and it definitely doesn’t look like this will happen…at least not fast enough.
| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2025-12-01T16:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
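Each comment is coded along four dimensions drawn from a small controlled vocabulary. As a minimal sketch of how a coding row could be validated, assuming the allowed values are exactly those visible in the responses on this page (the real codebook may define more categories, and `SCHEMA`/`validate_coding` are illustrative names):

```python
# Assumed controlled vocabularies, inferred only from values visible on this page;
# the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"distributed", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate_coding(coding: dict) -> list:
    """Return the dimension names whose value falls outside the assumed schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes validation.
example = {"responsibility": "distributed", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "fear"}
print(validate_coding(example))  # prints []
```

A check like this catches the common failure mode where the model invents an off-schema label instead of falling back to "unclear".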
Raw LLM Response
```json
[
  {"id":"ytc_Ugxwk3tmMDv7CKwv5I54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyLZo05xoQ-Mnjxegl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw6uUggqCeFxvMUlyd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzPBf2fn8xgifhFjJV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzJQklkoy7-1oKel6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwWWcTTDJw2ntGmYl94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxSUPF526z1W85vWCR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxI7a__RUxhTdMR2RJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwwM6Nqsah0pYFdAnh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyLoF6DNfxOvQ0g8PN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
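The "look up by comment ID" view above can be sketched as parsing the raw model output and indexing each row by its `id`. A minimal sketch, assuming the raw response is a JSON array like the one shown (the `index_codings` name is illustrative; the excerpt below uses two rows from the response above):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of per-comment codings.
raw_response = '''
[
  {"id": "ytc_UgyLoF6DNfxOvQ0g8PN4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzJQklkoy7-1oKel6F4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse the raw model output and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coded = codings["ytc_UgyLoF6DNfxOvQ0g8PN4AaABAg"]
print(coded["policy"])  # prints regulate
```

In practice the parse step would also need to handle malformed model output (e.g. a `json.JSONDecodeError` when the response is truncated).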