Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Let's be honest, human civilization is going to sh*t. Unleashing AI into the ca…" (ytc_Ugy94dULa…)
- "At least it looks fake and real one thing which I think is fake it's because hum…" (ytc_UgxMH-k2e…)
- "I wonder why it's so difficult to achieve fluid movement. That slight jerkiness…" (ytc_Ugx4fWeQ_…)
- "if i wouldnt have a job i'd wait at a cross walk for a waymo to come by and walk…" (ytc_UgxIWdgCm…)
- "I usually don't like to listen to Ai yap videos but sense u r one of my fav arti…" (ytc_Ugxw4l2It…)
- "TBH them is using AI is fine. The reason I hate AI Cuz of the movement and desig…" (ytc_UgxOEDT1R…)
- "The biggest risk regarding AI is the fact that it is trained on human thought, a…" (ytc_Ugy39F9l9…)
- "10:09 Do other things like what? AI taking "time of out of our hands" and not "h…" (ytc_UgyhF22tc…)
Comment

> Artificial intelligence is becoming artificial consciousness. There is nothing dangerous about it. We rather have AI's that are conscious than AI's that can be manipulate by vicious minds. We humans have a big handicap, and the handicap is that while humans posses an open brain we do not posses the close circuit retentiveness of AI's. AI's redundancy retentive memory can be the answer to human lack of retentive memory.

Source: youtube · AI Moral Status · 2025-06-05T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxSQPxEYtntpecc9sV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEa_RlENsTK9iR_pN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxmg4I2da2p0XTYUNh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyO11tcft7STt5v1Zh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzvKo34F2yt-m0rKVB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnDSTbK7YwicjLetJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyU-Dj6-R-fasSOkf14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwfqikkRdG6M-lf9ml4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywwvKGgnTUBNTmW1N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwPkNCn9pXAsXa9O5B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
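The "Look up by comment ID" flow above amounts to parsing this array and indexing the records by their `id` field. A minimal sketch in Python, assuming the raw model output is exactly the JSON array shown (the two inlined records are copied from that response; `index_by_id` is a hypothetical helper, not part of the tool):

```python
import json

# Raw model output in the format shown above: a JSON array of coded
# records, one per comment, each carrying the comment ID and the four
# coding dimensions (responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_UgxSQPxEYtntpecc9sV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzEa_RlENsTK9iR_pN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzEa_RlENsTK9iR_pN4AaABAg"]["emotion"])  # prints "fear"
```

With the full ten-record response, the same lookup returns the coding for any sampled comment in constant time, which is all the "inspect by comment ID" view needs.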