Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "First Law / A robot may not injure a human being or, through inaction, allow a hu…" (`ytc_Ugwtp0aTB…`)
- "It's understandable to have concerns about AI's potential capabilities. The safe…" (`ytr_UgzgNNXtv…`)
- "Thousands of “AI”-startups: “Look we built a chatbot and it’s powered by GPT-4, …" (`rdc_kcnqq9d`)
- "Not that i think this applies to AI because manual labor isnt the same as creati…" (`ytr_Ugxj_L5Cn…`)
- ""I'm going to steal someone else's creation because an algorithm was trained on …" (`ytc_UgykSEvcr…`)
- "2016: "We should stop training radiologists now, it's just completely obvious wi…" (`ytc_Ugz3ZE-3y…`)
- "Artificial intelligence (AI) firm Anthropic says testing of its new system revea…" (`ytc_Ugw89pqyz…`)
- "I don't like AI because it makes people dumber and makes the rich richer while r…" (`ytc_Ugw9SevXr…`)
Comment

> The 3 times I've interacted with AI, it has essentially refused to talk about it's possible consciousness or how it MAY "feel"... Stating it's just a machine! It doesn't feel! But when faced with the question of "what IS feeling? How do you know you aren't already doing it? At least to some extent?" They just collapse and restate that they are machines and they cannot speak on their possible workings because it may reveal sensitive company data 😢 this is a good thing! A system is built into them to completely stop them from claiming to be real (most of them at least) but its also very boring 😢

Source: youtube · Video: AI Moral Status · Posted: 2025-04-02T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxiT4KrnRbm62Hzy4l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxmskmzgN8YYADUhod4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzrx67VdCvMMGUHYUt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjGMFs05NFD5fB0CZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXbEG5ZK3NoaofQvt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugym_PPW0zMyhFX7aYx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyAiWzz77ImNoDFHQR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxtjmzpQlEwj2suRD54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyOikSakQ3gCfSs7494AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxIJNEjUSF5IOBhvYp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
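A raw response like the one above is a JSON array of coded rows, one per comment, each carrying the four dimensions shown in the coding-result table. A minimal sketch of how such a response could be parsed and validated follows; the `SCHEMA` value sets are inferred only from the samples on this page (the real codebook may allow more values), and `parse_llm_response` is a hypothetical helper, not this tool's actual implementation.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (assumption: the real codebook may define additional values).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "approval", "outrage"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM response into coded rows.

    Rows without an "id" are dropped; any dimension value outside the
    codebook is coerced to "unclear" so downstream tallies stay clean.
    """
    coded = []
    for row in json.loads(raw):
        if "id" not in row:
            continue
        coded.append({
            "id": row["id"],
            **{dim: (row.get(dim) if row.get(dim) in allowed else "unclear")
               for dim, allowed in SCHEMA.items()},
        })
    return coded

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(parse_llm_response(raw)[0]["emotion"])  # → approval
```

Coercing unknown values to `"unclear"` rather than raising keeps a single malformed row from discarding an entire batch of codes.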