Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing the random samples below.
- "I agree with him AI will just increase the jobs, like machines did in industrial…" (`ytc_Ugy0vd3_p…`)
- "I don’t think you have fundamental understanding of how LLM works like those cha…" (`ytc_UgwpILzhv…`)
- "words feel to so ai feel in words like we do. con so us we are console hardsoft …" (`ytc_Ugx2dGxmE…`)
- "If you want AI art to \"die,\" stop giving it your energy. If human art is so much…" (`ytc_UgzvsCWMa…`)
- "It is using imagination, I have an idea, I give it to AI, just like you. Give an…" (`ytr_UgwCHEhUR…`)
- "I'm surprised how often the idea of using A.I as reference comes up. That's like…" (`ytc_Ugx2myuf_…`)
- "Sign me up! How can I band together with other artists to take down these AI idi…" (`ytc_UgzoG0517…`)
- "AI can be very very dangerous -right now. Forget about a \"fully conscious AI God…" (`ytc_UgyYZ-Cgq…`)
Comment
It’s wild how many people overlook what’s going on with self‑driving EVs like the Jaguar Waymo models. There are plenty of clips showing them freezing up, blocking intersections, or almost causing wrecks. There was even a case where two Waymo Jaguars bumped into each other at a Phoenix airport, and in San Francisco they’ve gotten stuck so badly that human staff had to come rescue them.
And that’s not even touching the lithium‑ion battery issue. These batteries can fail out of nowhere, and when they do, the fires are no joke. The Moss Landing facility in California erupted in a huge blaze that spread toxic metals into nearby areas — and yes, people can breathe that stuff in. Similar battery storage fires have popped up in other countries too as these facilities expand faster than safety regulations.
These problems are real, and ignoring them won’t make them disappear.
youtube · AI Harm Incident · 2025-12-12T23:0… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
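A coded record like the one in the table above can be represented as a small typed structure. The following is a hypothetical sketch, not the project's actual data model: the field names mirror the table's dimensions, and the category sets contain only the values observed in this section (the real codebook may define more).

```python
from dataclasses import dataclass
from datetime import datetime

# Categories observed in this section; the real codebook may define others.
RESPONSIBILITY = {"company", "developer", "government", "user", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "contractualist", "virtue", "unclear"}
POLICY = {"regulate", "liability", "industry_self", "none", "unclear"}
EMOTION = {"outrage", "fear", "resignation", "indifference", "approval", "mixed", "unclear"}


@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the observed categories."""
        for value, allowed, name in [
            (self.responsibility, RESPONSIBILITY, "responsibility"),
            (self.reasoning, REASONING, "reasoning"),
            (self.policy, POLICY, "policy"),
            (self.emotion, EMOTION, "emotion"),
        ]:
            if value not in allowed:
                raise ValueError(f"unknown {name} category: {value!r}")


# The record shown in the table above (ID is illustrative; the table omits it).
record = CodedComment(
    id="ytc_UgwYO_AnycqQFCEP11h4AaABAg",
    responsibility="company",
    reasoning="consequentialist",
    policy="regulate",
    emotion="outrage",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:53.388235"),
)
record.validate()  # passes: all values fall within the observed categories
```

Validating against an explicit category set catches the common failure mode where the model invents a label outside the codebook.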
Raw LLM Response
```json
[
  {"id":"ytc_Ugzh3kEj7uXYrS7RYGV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyLfw0_JBMULyiTOfJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwYO_AnycqQFCEP11h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzHGDfK2hN_8Zn8zCl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzxDZ3mVrWP6gKRXjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxyE0WvNDMJ3YvHlRl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgICezwwGPqk69zdZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyg7zY0rhE3yI8F1K14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxpxF9V0X7B7QHIFvZ4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzVBesaxN3UpdEjP0V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
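The raw response is a JSON array of per-comment codes, which makes lookup by comment ID straightforward: parse the array and index it by `id`. A minimal sketch, assuming the response text is exactly the array shown (no surrounding prose or markdown fences); the two records here are copied from the response above:

```python
import json

# A shortened batch response in the same shape as the raw output above.
raw_response = """
[
 {"id":"ytc_Ugzh3kEj7uXYrS7RYGV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
 {"id":"ytc_UgwYO_AnycqQFCEP11h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_by_id(text: str) -> dict[str, dict]:
    """Parse a batch coding response and index its records by comment ID.

    Raises ValueError if the payload is not a JSON array or a record is
    missing one of the expected keys.
    """
    records = json.loads(text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    indexed = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record missing keys: {sorted(missing)}")
        indexed[rec["id"]] = rec
    return indexed


codes = index_by_id(raw_response)
print(codes["ytc_UgwYO_AnycqQFCEP11h4AaABAg"]["policy"])  # regulate
```

Failing loudly on malformed records is deliberate: LLM batch outputs occasionally drop a field or wrap the array in prose, and silent skips would desynchronize the coded results from the source comments.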