Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "I had to laugh out loud when a trump administration official said when asked by …" (ytc_Ugy3iBu7X…)
- "You can defend that AI as you want, 90% of what you said is not real. it does n…" (ytc_Ugyv3qUxf…)
- "We need our society to understand... and beyond understanding care about the fut…" (ytc_Ugwl68Qim…)
- "That story from the echo AI wpuld make a really good movie or something. I would…" (ytc_UgxLyJuqv…)
- "AI will *of course* be used for dystopian profit and control. How could anyone t…" (ytc_UgxDotbea…)
- "Developed countries will develop get AI tools for their benefit. Creating damage…" (ytc_Ugz57Cvij…)
- "This video is not aging well. Anthropics system's incrementally are getting bett…" (ytc_UgwZO7EGj…)
- "what nonsense half of the sites you build are full of bugs and get hacked so yes…" (ytc_UgwnohWhY…)
Comment
i love Science. so you fed info into an algorithm that groups words and concepts into vectors. then you were astounded that when you reference an emotion the Vector Store pointed to where those words and concepts were stored, jumped to say "look, we identified emotion," when all it is doing is pointing to where its information is stored. no surprises here. no mystery. just a filing system and a misunderstood observation by someone who failed to realize the system was designed this way and in no way does this matter, how did you think AI knew how you were feeling? the words you use are stored in an associated location. NOTHIN NEW HERE, JUST SOFTWARE DESIGN, NO AI IMPOROVMENTS
Platform: youtube · Video: AI Moral Status · Posted: 2026-04-08T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyicWuBnm_eAXzxoEJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzbxNH6WTY8vstZK0t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMJFZC9Bo_c4KrGAt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6M7Rp0APULewzsDN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx6v8L2GoXL-udrukt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzEzkbIxVEthjWBHrt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwXRcy9n99CqutAQqB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzeGOon6aF8eX4K9nl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-4T7tRGyyrF2yvCJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwcyJCpquTqgV0pT_t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
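The raw LLM response is a JSON array of per-comment coding records, each keyed by a comment ID. A minimal sketch of how such a response can be parsed and indexed for the "look up by comment ID" view (the `raw_response` string below is truncated to two of the records shown above; the variable and function names are illustrative, not the tool's actual code):

```python
import json

# Raw LLM response text: a JSON array of per-comment coding records
# (only two of the ten records above are reproduced here).
raw_response = """
[
  {"id": "ytc_UgyicWuBnm_eAXzxoEJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzbxNH6WTY8vstZK0t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index records by comment ID so any coded comment can be looked up directly.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

# Look up one comment's coding result by its ID.
record = codes_by_id["ytc_UgzbxNH6WTY8vstZK0t4AaABAg"]
print(record["emotion"])  # fear
```

A real pipeline would also want to validate each record's dimension values against the allowed code set (e.g. `emotion` in {"fear", "outrage", "approval", "indifference", …}) before storing them, since LLM output is not guaranteed to stay within the codebook.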