Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Small issue
AI isn't *that* good, yet
There's a trend of rehiring people back…
ytc_UgzcllOD7…
Yup. There like needs to be a digital marker in AI art, or a disclosure on AI ge…
ytr_UgyK4J77n…
@datdamadude tf? It is. Not many family members are aware of ai making nudes an…
ytr_UgxTmQn-s…
@HVM_fiwe already know how to solve that problem, actually. It's not complicate…
ytr_Ugy6Nz76p…
@theonetheycallheadspace2899 it looks the worse it'll ever look. It looks better…
ytr_Ugwz75rN-…
98%of robots are civilian civilised people that are sentient and have not killed…
ytc_UgyQSji5V…
Don't worry. Self driving cars won't take over anywhere close to 15 years from …
rdc_crxmrgi
SWE of 11+ years. Company hiring interns. Using multiple AI tools way over $100 …
rdc_obviz83
Comment
A company that makes an AI releases a non-peer-reviewed study that "proves" that AI is basically conscious and therefore much more intelligent than people think? Well that's surely trust worthy information, I doubt there's any hidden agenda behind that. I should probably buy some of their stocks to make sure they can stop this from happening!
youtube
AI Harm Incident
2025-09-10T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzEY0yU1dzfb1R-aJZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzuu-STTy7jObsp-5N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw3GeaR99a240lYSLt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzfz6ujfvlze9RAUgx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWcqfU3f-gqW2T16Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy7dE1_R27qI-Hj1MZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwV5VNsp7Qyg1cTkiZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnmxfSsvlqUV4ZdFt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzRN_kI3P5JdFnqQp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy13_P-cEdTqIEhyEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
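A raw response like the one above can be turned into a comment-ID lookup with a short parsing step. The sketch below is an illustration, not the project's actual pipeline: the field names come from the JSON shown, but the allowed value sets are only those observed in this one sample, and the real codebook almost certainly defines more.

```python
import json

# Dimension vocabularies observed in the sample response above.
# ASSUMPTION: the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "none", "user", "ai_itself"},
    "reasoning": {"mixed", "deontological", "unclear", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "resignation", "mixed", "fear", "approval"},
}

def parse_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (JSON array of coded rows) into a
    comment-ID -> coding mapping, rejecting rows with a missing id
    or an out-of-vocabulary dimension value."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded row is reachable in one dictionary access rather than a scan of the response.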