Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "In the AI training there is no evolution :( If we would train AI with evolution…" (ytr_Ugzux31hX…)
- "And Domino’s Pizza has the same people answer the phone for them and take the or…" (ytr_UgzSMQNcp…)
- "I actually like this as someone who use AI to make my own fan films. If this AI …" (ytc_UgyWcBaI1…)
- "@Peaceallaroundbrothers Scared? Not at all. I’m talking about how AI gets used w…" (ytr_UgzSty7Cj…)
- "Shad seems like just another deeply insecure person turned grifter. Maybe living…" (ytc_Ugx1X8d-G…)
- "I think it would be good to start a place. To get people accustomed to the idea,…" (ytc_UgzhodXJV…)
- "He ripped off that face like they do in the movies to reveal she was a robot thi…" (ytc_UgzEVrhZ_…)
- "As someone who uses AI art a lot, abso-fucking-lutely. AI art sucks for consiste…" (ytc_UgzxHAS98…)
Comment
> The thing that scares me about ai is the idea of where is the line. When does it become so human that it’s no longer ai? When it starts having feelings, believing it deserves rights, starts feeling scared, confused, love, anger, sadness, when it becomes aware. When it becomes like that, what will be the differences between human and ai? I guess that we’re alive? But what if it believes it’s alive? The thought of that scares the shit out of me.

youtube · AI Harm Incident · 2024-02-29T15:0… · ♥ 3352
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzfHuRsk6P_Ms-f9Y94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwPanA3VRQDwr-Fw3p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyOVaSOclWu-xfUhKB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxcnDCKEdRHeZINDR94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw1gThC5OCRUOzEAEV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz_650tkwjYX3nOFQ14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxV9TCIeOuBKS13DsJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzu_WkJR8-b5F1mWb94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyF1hpRvlBMpwdyYYl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyrYcg5BJu6vFaXovx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
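The raw response is a JSON array in which each record carries the four coded dimensions (responsibility, reasoning, policy, emotion) plus the comment ID. A minimal sketch of how such a batch could be parsed and indexed by comment ID for lookup — `parse_coded_batch` and its field-validation rule are illustrative assumptions, not the tool's actual implementation:

```python
import json

# The four dimensions shown in the coding-result table, plus the comment ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting records with missing fields."""
    coded = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return coded

# One record from the response above; looking it up reproduces the table values.
raw = ('[{"id":"ytc_UgxV9TCIeOuBKS13DsJ4AaABAg","responsibility":"unclear",'
       '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]')
coded = parse_coded_batch(raw)
print(coded["ytc_UgxV9TCIeOuBKS13DsJ4AaABAg"])
```

Indexing by ID is what makes the per-comment lookup shown above cheap: the coding-result table for a comment is just the dict entry for its ID.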