Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytc_UgyYVJYVG… — "Something that I can't wrap my head around is the response from the AI bros that…"
- ytc_Ugy00iB3v… — "I loved what Brandon Sanderson has to say about AI “art.” He gave the example th…"
- ytc_UgzNWq92V… — "I think he kinda overlooks the most important point, which is the irony of claim…"
- ytc_UgzZ-ZRzh… — "ChatGPT ain’t to blame the parents are if they knew he was depressed and no medi…"
- ytc_UgwOyt7vD… — "As an AI developer I get calls non stop about companies wanting me to replace ot…"
- ytc_UgxN3QTm0… — "Using ai to edit has taught me more about the English language than any English …"
- ytr_UgyE4bjJE… — "Thanks for noticing! While Sophia is quite interactive, it's actually our team h…"
- ytc_UgxtfRVdt… — "Elon could not give one example of what the dangers of AI are. Isn’t he pushing …"
Comment

> I am really especially terrified of how the discourse against AI has become so vitriolic (for example, us using derogatory words and inventing or appropriating or adapting slurs) that any superintelligence we create will have every reason to not trust, or perhaps even hate, us.

Source: youtube · Video: AI Moral Status · Posted: 2025-11-18T03:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzygqCafbRLsp9Xr194AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugywgk6du9hbvl99LO94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyl3QgrWOTtl6hKe3R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzPh9ySYWWVptvVjrF4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyyxM9y89cm6W4WC954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy4E7InsIdi_3w7hNB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwyn9yX1AMEJtOc7114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy9JSmCZTyTbp2N4NZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyfrNfhl5S1I770on14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwmRXUeGPtQkWYsN-p4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
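A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a hypothetical validator, not part of the actual pipeline: the allowed values per dimension are inferred only from the codes visible on this page, and the real codebook may include others.

```python
import json

# Allowed codes per dimension, inferred from the values seen on this page
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every code against the schema."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_abc","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
rows = validate(raw)
print(len(rows))  # 1
```

A row with an unknown code (or a malformed ID) raises `ValueError` instead of silently entering the coded dataset, which keeps the per-dimension tallies trustworthy.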