Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.

Random samples
- "It's pretty ridiculous. I used Google lens also. It was a pic of the end of a ca…" (ytr_UgwhelrLU…)
- "Haha, exactly! If you see it, you see it 👀 AI is already all around us, quietly …" (ytr_UgyK3wefE…)
- "Lol, rights for machines is gonna be a HUGE step backwards in progress. Driving …" (ytc_UgxsMF7gH…)
- "This is idiotic, AI won’t be able to do half of what they say. Just scary ai boo…" (ytc_UgytwApj4…)
- "shocking, you built a chatbot that mimics people. if I set out to program a mac…" (ytc_Ugyd_jVi7…)
- "1. Programmers will not be replaced by random people using AI to vibe code. 2. …" (ytc_UgxoFVjEF…)
- "Sounds like the best way to alleviate the lack of jobs problem is to burn down t…" (ytc_UgxG5ieuZ…)
- "Just because you can doesn't mean you should.Nobody asked for self driving cars.…" (ytr_UgyqlNv3W…)
Comment
Listening to a Ai CEO talk about Ai is like listening to a Hippy from the 70's talk about peace and love they are so naive beyond belief. At their core they have an extremely narrow mindset while simultaneously disregarding EVERYING THING else. They clearly only care about themselves and the money they will make and F the human race. No acknowledgement for the MILLIONS of people who have no clue what is going on.
youtube · AI Moral Status · 2025-08-02T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwrltEGWaKpaOPZqKN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxhFtGIEcLMXpR6AiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwAV0K2jQQP8WWzCsZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyj7aiPsZKx8jgx-nR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyA6AyHRqEiXEhW3oV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNDFQRp0aoxBKA2wh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPoXfaldCy0ctCBHt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw4OY83m-fdhN8CYdp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzSJt5lF0J4X-KmfPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw8acS3PX-Fr3DcdB94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
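The raw response above is a JSON array of per-comment codings, from which the "Coding Result" table for a single comment is derived. A minimal sketch (assuming Python; the field names `responsibility`, `reasoning`, `policy`, and `emotion` match the JSON shown, everything else is illustrative) of parsing such a batch response and looking up one comment by ID:

```python
import json

# Two rows excerpted from the batch response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgwrltEGWaKpaOPZqKN4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzSJt5lF0J4X-KmfPx4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
"""

# Every row must carry the four coded dimensions plus the comment ID.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(text: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: coding}."""
    out = {}
    for row in json.loads(text):
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing: {missing}")
        out[row["id"]] = {k: row[k] for k in REQUIRED_FIELDS - {"id"}}
    return out

codings = index_codings(raw_response)
print(codings["ytc_UgzSJt5lF0J4X-KmfPx4AaABAg"]["emotion"])  # outrage
```

Validating the field set on each row catches malformed model output before it reaches the coding table, rather than failing later on a missing key.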