Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Several trillion dollars worth of capital are pushing for AI R&D and against reg…
ytc_UgxEgjhrq…
Yea, but AI companies can lobby but not vote. And we have relatively free speech…
ytc_UgyGmQXNn…
The videos are too long, make them shorter with less chatter, way too much, I won't subscrib…
ytc_Ugz8A4sYU…
Two more ways regular folks without any scientific instruments or knowledge can …
ytc_UgwsBB3Nd…
I wonder if the trains on data from Word? like if Brandon didn't turn off his AI…
ytc_Ugy2y8bPr…
Big tech companies are a bubble, they are utterly out of ideas and increasingly …
rdc_m5v08mr
ChatGPT is like a high school kid writing an essay. It’s an amalgam of other peo…
rdc_j8axmgg
Don’t really get how people are comparing how we learn to how an ai is trained. …
ytc_Ugx19s2Mm…
Comment
Why an AI would programm another with capability to feel pain? Also if an AI programs another in a way that forces us to give them rights and not use their full potential we would alredy have a problem of a rogue AI and we'll all be dead before sunrise anyway.
Source: youtube
Video: AI Moral Status
Posted: 2017-02-23T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugj9uA4E2qdNfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjnOffaiIS5qHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugj98md7zFOMrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgitNrH9VLI5X3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Uggkdf3AcQC3ZHgCoAEC","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggqumG_AwEw_ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggJ3-NtmsdA4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghacdqQa_8JXXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggZ2aPEfECZoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiBCDn6kZ0PaHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
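A raw response like the one above can be parsed and indexed by comment ID to support the lookup shown on this screen. The sketch below is a minimal example, not the tool's actual implementation; the `ALLOWED` vocabularies are inferred from the values visible in the examples above, and the real codebook may define additional categories.

```python
import json

# Two records copied from the raw LLM response above, as a stand-in
# for the full array the model returns.
RAW_RESPONSE = """[
  {"id": "ytc_UggZ2aPEfECZoXgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgiBCDn6kZ0PaHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]"""

# Allowed values per dimension, inferred from the examples on this page
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"mixed", "outrage", "fear", "indifference", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    dropping any record with an out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UggZ2aPEfECZoXgCoAEC"]["emotion"])  # fear
```

Validating against a fixed vocabulary before indexing catches the common failure mode where the model invents a label outside the codebook; rejected records can then be re-queued rather than silently stored.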