Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Im far from an artist but i want to improve to make things i enjoy. When i first…
ytc_UgwnYRPsU…
Well said, Lavender! 🤘🔥
A lot of artists have been told forever that their lives…
ytc_UgwswvPXW…
Soo… I have a friend that uses janitor ai… SOME KID ON THE BUS ASKED HER WHAT SH…
ytc_UgxsPOnFW…
The actual issue I take with AI has nothing to do with the "ethics" involved. Th…
ytc_UgzOZOWWc…
To the 'younger' software developers who are struggling to find employment, stic…
ytc_Ugxl9gYrL…
They are not hiding anything..the bible tells you what a.i. is about and people …
ytc_UgwO-csWx…
Slowing AI is pointless because one would have to slow down all versions, and we…
ytc_UgydFORy-…
The reality is that AI companies don’t need access to copyrighted works at all t…
ytc_UgzBR911L…
Comment
But even with those incidents,wayno is far safer. Yes, it will run over the cat, but so will human driver. Human driver will run over anything from time to time. We are chery picking automation errors like if what we have in place now (humans) will not do any errors. They will do different or for different reason, but statistics are still against human drived cars.
But yeah, we are illogical emotional beings, so we will point out errors on others to hide our own ... and so is this. Usual duble standards ;)
youtube
AI Harm Incident
2026-04-25T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx0ujQQ05Gjz0p3d5h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzdhR6DlwcUjcMB1AZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyQvH5KQSpgnpLTHnV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-Qiib9xKCEb-eblV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxWP9Y0akQ9s3KKOT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwX2T0ZQy_rw7K2B6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2lffcLF2s0pmCvL14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7Zn79pQdu1giLQBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8abM8XEKGgyOqKq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugws9YOJj50G70cfTWV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
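The raw response is a JSON array with one coding object per comment. A minimal sketch of how such a response could be parsed and a single coding looked up by comment ID (the schema follows the fields shown above; the two-row `raw` string here is an abbreviated stand-in for a full model response):

```python
import json

# Abbreviated stand-in for a raw model response: a JSON array of
# per-comment codings with the fields shown above.
raw = """[
  {"id": "ytc_Ugx0ujQQ05Gjz0p3d5h4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx-Qiib9xKCEb-eblV4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the codings by comment ID so any coded comment can be
# fetched directly, as the lookup above does.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugx-Qiib9xKCEb-eblV4AaABAg"]
print(coding["emotion"])  # outrage
```

A dict keyed by `id` keeps the lookup constant-time even when one response batch codes many comments at once.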