Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Random samples

| Comment ID | Preview |
|---|---|
| `ytc_UgzGAQkU5…` | you are selling air xD.. Ai is not that intelligence, you are responsible for th… |
| `ytc_Ugx7glTXB…` | In the short term, humans may be involved in trucking, but there is ZERO chance … |
| `ytc_UgxSe_HzB…` | Just retired so glad I did not bring a child into the world to suffer this socia… |
| `rdc_hj4fm32` | Sick of seeing this guys face..How do I get reddit recognition of this for my ne… |
| `ytc_UgwFYqZWd…` | I remember when AI became a thing and all the grease ball managers started touti… |
| `ytr_UgwCgJPfT…` | I'm disabled too - with both chronic pain and mobility problems as well and I ha… |
| `ytc_Ugw16LvDU…` | What I can't understand is why Tesla is allowed to essentially beta test AI driv… |
| `ytr_UgyKMJzAT…` | That fear is real and not irrational. But there's a difference between "eventual… |
Comment (platform: youtube · incident: AI Harm Incident · posted: 2025-11-25T06:1…)

> It's important to understand that chat bots don't have a sense of "self". They're just inputs and outputs just like any other program. If a chat bot seems smart it's because it's generating something that someone has already said in some capacity. That's it. They're not intelligent they're only capable of mimicking intelligence. They're more like a mirror than anything else.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyBT9poAAMTZCikqcZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzgUe6Zwi3KFYjq-4Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwp6itWhN9NK_yJWU14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxLTecsnkYpLpPn0rF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgylpSPoKXckh-4WczZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwzQV3gRkI-5pjk4Nh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzQwY7JjRucGYFY1bJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzRnD2Me5GfRxcS0nR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugx5KvVz2ofbLUET79p4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx-sYy84YtMcKXJ_tN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
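A raw response in this shape can be indexed by comment ID to support the lookup described above. A minimal sketch (the variable names are illustrative, and only two entries from the batch are shown):

```python
import json

# Raw LLM batch response: a JSON array of per-comment codings with the
# four dimensions (responsibility, reasoning, policy, emotion). Excerpt only.
raw_response = """[
  {"id": "ytc_UgyBT9poAAMTZCikqcZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRnD2Me5GfRxcS0nR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "resignation"}
]"""

# Index codings by comment ID so any coded comment can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgzRnD2Me5GfRxcS0nR4AaABAg"]
print(coding["responsibility"], coding["policy"])  # → developer liability
```

Because every object carries its own `id`, the array order in the model output does not matter for retrieval.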