Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgzYJaRFB… — Dude FUCK the people defending AI. Not the ones that are like "Hey I use ai for …
- ytc_UgyAx9nGp… — Please change title for clarity. It reads as the parents of the whistleblower go…
- ytc_UgzIbqx8z… — They're even developing AI ROOTS and Gonna use these aswell for Military pur…
- ytc_Ugwnmh5De… — We shouldn't have AI... why?.. just to increase profits and increase productivit…
- ytc_Ugzb9znGL… — I have to question your journalism WSJ for intentionally conflating Autopilot an…
- rdc_dy8n4k1 — Thing is Chiang Mai is pretty much owned by the Thaksin family - and they have p…
- ytc_Ugw-RiuIs… — I will worry about AI on the same day that Elon gets his hyper loop running. So …
- ytc_Ugyp_tHww… — Even $28 for a 10 mile trip is way too expensive vs owning a car, or even better…
Comment
I am not saying there is no danger, but I don't believe in these scenarios just because I think we still have control over AI software and hardware. If you think AI can wipe us out and take over the world, where will it get the energy to keep working? Who will do the necessary maintenance and administration on servers? If it becomes dangerous, it will just stop by itself. It can still have dramatic consequences, but I don't believe in the extinction of humanity.
youtube · AI Governance · 2023-11-12T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxcXCUZVDyI5ZsLaXh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy86nOF82nPD9E49pt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzb0YmjhrygCkYkHlR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxFJ13-3Uh6v1Yczx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxez2teLZCc6QJC17N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy-VxD2DwnmlvtfeK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyCODqq7IcVVYpftCJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQ1E0Zr-OuRTyJj3R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxO9j3QvrkDlitugFN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxbFTnpMjHf_iB9THp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
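The raw response above is a JSON array of per-comment coding objects, each keyed by a comment `id` with the four dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into an ID-keyed lookup, matching the "Look up by comment ID" feature — the function and variable names here are hypothetical, and the array is truncated to two entries from the response for brevity:

```python
import json

# Hypothetical variable holding the model's raw output; the schema
# (id + four coding dimensions) is taken from the response above.
raw_response = """
[
  {"id": "ytc_UgxO9j3QvrkDlitugFN4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxbFTnpMjHf_iB9THp4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID."""
    return codings[comment_id]

print(lookup("ytc_UgxO9j3QvrkDlitugFN4AaABAg")["emotion"])  # approval
```

In practice the lookup would also want to handle IDs missing from the response (e.g. `codings.get(comment_id)`), since a model may drop or mangle entries in batch coding.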