# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.

## Random samples
- `ytc_Ugz1uPMtb…` — "It like playing Minecraft with cheats on like /give full natherite armor. That w…"
- `rdc_deuriy9` — "What's with this we shit? I didn't kill any rhinos. Don't volunteer me for inclu…"
- `ytc_Ugx-HBxcg…` — "Learning AI with python at the age of seventeen, this is going to be very useful…"
- `ytc_Ugz2Ci4yr…` — "That UBI future looks incredibly unrealistic, as long as that truck driver is no…"
- `ytc_Ugzyo0D7v…` — "There will never be a self driving truck before a self driving car and do we hav…"
- `ytc_Ugwb_pcwb…` — "Thank you for composing this. Communication through words alone is more of a str…"
- `ytc_UgyBe2cWo…` — "Stop helping the ai. She already accomplished more than most vtubers. The stream…"
- `ytc_Ugxxf4Exx…` — "Anything made my humans will always be better than anything made by ai, even if …"
## Comment

> The biggest problem is that therapists are also using AI.
> Source: I was a suicidal man and got sent to the hospital. In the psych ward my therapist told me how I felt and I said a word similar to feeling down. She said let's see what and pulled up Google Gemini. She types in words that sound like down feeling and she highlighted them as "Worrysome?" "Depressed?" "Saddened?" And this was not a new therapist she said. "I have treated a lot of patients. I have been doing this for 25 years."

youtube · AI Moral Status · 2025-08-23T19:4… · ♥ 29
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugzhr5YiABoG43VH3st4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwb7yOgmKRYbyqtYZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxrYrmW-_5_bzZ3GhN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzv827oB3mGGRYCsPZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy4m2gpG_JAu7EawZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzR32nZ9h536SKCenJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3irVGwQOBvRW7qYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJ28DKC06ddlWByMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyXK7Uxa_8P5XU5_gN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
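For programmatic use, a batch response in this shape can be parsed and indexed by comment ID. The sketch below is illustrative rather than part of the coding pipeline itself; it reuses two records from the response above.

```python
import json

# A batch response in the format shown above (two records copied for brevity).
raw_response = """
[
  {"id": "ytc_Ugzhr5YiABoG43VH3st4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
records = {row["id"]: row for row in json.loads(raw_response)}

coding = records["ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # outrage
```

The same dictionary also makes it easy to spot IDs the model coded that are missing from (or extra to) the batch that was sent, by comparing `records.keys()` against the submitted IDs.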