Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with comment IDs):

- ytc_UgwSG_9OU…: "This is messed up. The male robot just admitted he wants to take over the world …"
- ytc_UgxFjINs1…: "Well…, Cankles McTacotits done settled the problem for them! Thursday he signed…"
- ytc_Ugz6wTwsa…: "This video was from 2022. I'm commenting in mid-2024. Still, I think it's too ea…"
- ytc_UgxVMg3US…: "What if there was malfunction, the robot would have turned the gun on the human …"
- ytc_Ugxq7tdC4…: "Ai developers are going to hide in a wood if they keep develops the Ai😅😅…"
- ytc_UgxKjMQMi…: "1. Politicians don't care about you regardless of their party. 2. It does not ma…"
- ytc_UgyDIPQyQ…: "Let me be clear; this sucks. Anyone that is upset by this is completely justifie…"
- ytc_UgylQZ1MG…: "Yeah we need to keep developing AI on mimicking us So we can replace real human…"
Comment
Need to ban ANY employment, labour, or HR decisions from being made by ANY AI, automation, software, or machine. Decisions on people's hiring, careers, and employment in general should be made only by humans, under threat of prison sentences.

Personally, I had one experience years ago, around 2018 to 2020, where I applied to Unilever's graduate program, I think, and got a rejection within minutes on a Sunday evening. I never applied to that company again, and they had spent a bit of money on student events, which I took part in, aimed at getting people to apply for that program. All of that benefit was wiped away by leaving one recruitment decision to some kind of automation.
youtube · AI Harm Incident · 2026-02-23T16:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8bRNy8C0JMqldtHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw-1QvwpjlI_HyfMAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwza4EOEfVOSVpJTQZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzO7qslGu_xJlsmA7l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxbN2E-TpY7Y6xNnx94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzDu4q3MkxKYIKpb3R4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxb3TPJUeN9TYT6TSB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZA_Nwao4XyDS9cml4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwOamiHue2XiUVy3aZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgydrhRjn4vcw_Sg_il4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
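The lookup-by-comment-ID step can be sketched as follows: parse the raw LLM response (a JSON array of per-comment codes, using the schema shown above) and index each record by its `id` field. This is a minimal sketch, not the tool's actual implementation; the sample record is taken from the raw response above.

```python
import json

# Raw LLM response: a JSON array of coded comments (one record shown here,
# copied from the sample response above).
raw_response = """[
  {"id": "ytc_UgwOamiHue2XiUVy3aZ4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]"""

# Index every coded record by its comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by ID, as the inspection panel does.
coding = codes_by_id["ytc_UgwOamiHue2XiUVy3aZ4AaABAg"]
print(coding["responsibility"], coding["policy"])  # → company ban
```

Each dimension in the "Coding Result" table (responsibility, reasoning, policy, emotion) maps directly to a key in the matched JSON record.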