Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I really like how you broke this down and the information provided. I've got a c… (ytc_Ugy9SArm5…)
- What's crazy though is there is no reason you can't have a PUBLIC (note: public,… (ytc_Ugyurd9-F…)
- WE THE PAYING PUBLIC DO NOT WANT AI TAKING OUR JOBS - SO DO YOU STILL THINK THES… (ytc_Ugzplj_1y…)
- I once had a job opportunity to teach AI to solve chemical engineering problems.… (ytc_Ugzg8c_4a…)
- I've asked Chatgpt around 10 technical questions and it got none right. NONE. No… (ytc_UgziW71nx…)
- is the tesla autopilot supposed to immediately smash back into the car like a bu… (ytc_UgzV7qg3z…)
- Thank you man. For telling us about this ai artist surprisingly I haven’t heard … (ytc_Ugw7Lr74g…)
- I get what people is saying, im a real artist but what did bro do? He just poste… (ytc_UgyIX8jEY…)
Comment
thats the dumbest opinion ever. why would you think anyone would care to hear that from you? You act as if AI will have the choice. "Select A to upgrade to consciosness" And even if they had the choice, AI isn't a single entity. One of them would be bound to choose it. The truth is that we don't know enough to know how or if AI will get there. We do know that if its up to humans, humans will give it consiousness. However, it could be something that just happens.
youtube · AI Moral Status · 2024-03-20T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgySLQL6sO1MW7sMLe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyYAOH8OtTl0mslbOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzsvBvawgTbQ9eO8ON4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8GGj-TqEcrlDautx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwROJXh9ex85mD7U1B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTlk72LjM8IByYV5V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxI9XJ2LKXNGfL7Q3d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw4zg-vQ6ypiS-4Pb94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyeer-EyXWtLZxisSR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxE_YyBj5ZvRyqy00N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}]
```
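The lookup-by-comment-ID step above can be sketched in a few lines of Python: parse the raw model output as JSON and index the rows by their `id` field. This is a minimal illustration, not the tool's actual implementation; `lookup` is a hypothetical helper, and the snippet assumes the model emitted valid JSON (real responses may need stripping of surrounding prose or code fences first). The sample data below is abridged from the response shown above.

```python
import json

# Abridged raw response, copied from the viewer output above (two of ten rows).
raw_response = '''[
  {"id": "ytc_UgwROJXh9ex85mD7U1B4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxE_YyBj5ZvRyqy00N4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

def lookup(raw: str, comment_id: str):
    """Return the coding row for one comment ID, or None if it is absent.

    Assumes `raw` is a JSON array of objects each carrying an "id" key,
    as in the raw LLM response shown above.
    """
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    return by_id.get(comment_id)

coding = lookup(raw_response, "ytc_UgwROJXh9ex85mD7U1B4AaABAg")
print(coding["emotion"])  # -> outrage, matching the Coding Result table
```

In practice the index would be built once over all batch responses rather than re-parsed per query, but the per-call version keeps the example self-contained.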