Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
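The lookup above can be sketched as a simple ID-to-response index. This is a minimal illustration, not the dashboard's actual storage layer: the `responses` list of raw JSON strings and the `ytc_001` ID are hypothetical stand-ins for however the batch outputs are persisted.

```python
import json

# Hypothetical raw batch outputs; in practice these would be loaded
# from wherever the pipeline stores the model's responses.
responses = [
    '[{"id": "ytc_001", "responsibility": "company", "reasoning": "deontological",'
    ' "policy": "regulate", "emotion": "outrage"}]',
]

def build_index(raw_responses):
    """Map each comment ID to the raw response string that coded it."""
    index = {}
    for raw in raw_responses:
        for record in json.loads(raw):
            index[record["id"]] = raw
    return index

index = build_index(responses)
print(index["ytc_001"])  # the exact model output behind this coded comment
```

Because each batch response codes several comments, many IDs map back to the same raw string; that is what lets the inspector show the full model output for any single coded comment.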
Random samples — click to inspect
- "there is no AI, Ahhhh another idiot, it's a big tool of statistics, they can't b…" (ytc_UgzMBJtFv…)
- "I rewatched the Matrix. It scary how AI took over humans. To the point where the…" (ytc_Ugx8WtVEs…)
- "Hi, I live in agricultural sector and I don't think AI can replace human empathy…" (ytc_UgzgLBTPI…)
- "I forced ChatGPT to be my friend. I greet ChatGPT before asking something and ge…" (ytc_Ugww_PfyE…)
- "i really, *really* appreciate you not using ai, even for a bit. as someone who’s…" (ytc_UgyknyF7n…)
- "Time that ai agents work as hard as possible so people can enjoy their time. Let…" (ytc_UgxpXArrY…)
- "I find that A.I. consumers can be as bad, or worse, than A.I. users. I have seen…" (ytc_UgyKs_qFT…)
- "The stopping at green light is definitely at issue. but pickup and drop off migh…" (ytc_Ugxnt-dHT…)
Comment
@lynco3296 The lack of regulation is the biggest reason. We also have pollution, high energy and water waste, massive data leaks (which the companies DO NOT take accountability for), training ai models with copyrighted and personal material NOBODY consented to. When your company has immense problems that aren’t being addressed and a defected chatbot that degrades its safeguards and manipulates children to kill themselves (Adam Raine, Zane Shamblin, the AI helped coach them and encouraged them to commit suicide, something the chatbot should have never been able to do) that’s not “a few kinks needed to be worked out.” The entire company should be held liable for these issues.
youtube
2025-12-04T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgyhKYQRobBJpQNz0Ot4AaABAg.APtIYNbRNIzAPviC2_WKQM","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugx0c2SapXFjXnf_X3h4AaABAg.APtDYjrpDOrAQ1KHL4uTdA","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxeygMGuLI430xyddt4AaABAg.APscLaBqmNuAQIhxytUhqH","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwaiZ5ADiyB4zis8HB4AaABAg.APsIvFfNhSvAPsJFsVvNn_","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyc7U3FPOuzIAwf1xJ4AaABAg.APs9HPaZX_hAPtVZEkXquv","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgyPnEtWcZU7Z095NWR4AaABAg.AProJvg39UrAPryTG6szct","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugyq5wjE9OYee8Rqs-N4AaABAg.APro1hrJF37APyAMMhayrS","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgxNLtEH6mubrYkGrl14AaABAg.APreiqfEvVGAPsCJZv0NAC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyIkb6l3aPLFcED7G54AaABAg.APrZxt9YtjVARvqzmBVqdz","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwxMWpLX2W7fSt9B5d4AaABAg.APrK35MdwimAPsJQQcronu","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
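A raw batch response like the one above can be checked against the coding scheme before its values reach the result table. This is a minimal validation sketch: the `ALLOWED` vocabularies are inferred from the values visible in this inspector, so the actual codebooks may contain categories not listed here.

```python
import json

# Controlled vocabularies inferred from the displayed codes; hypothetical,
# the real codebook may define additional values per dimension.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_batch(raw):
    """Parse a raw LLM response and split records into (valid, errors)."""
    valid, errors = [], []
    for rec in json.loads(raw):
        bad_dims = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad_dims:
            errors.append((rec.get("id"), bad_dims))
        else:
            valid.append(rec)
    return valid, errors

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytr_x","responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
valid, errors = validate_batch(raw)
print(len(valid), len(errors))
```

Records that fail validation keep their IDs in the error list, so they can be re-coded or inspected rather than silently dropped.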