Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgygpmCP0…: "I think that there are ethical uses of AI, but not in art. The closest I can thi…"
- ytc_UgxU5r9HB…: "I said it once and I'll say it again. AI is gonna be the death of humanity.…"
- ytc_Ugyufqj5c…: "Power of AI is unimaginable… And radiologists are doing nothing special.. Radio…"
- ytc_Ugx2wfJE7…: "Dont use AI!! Our water resources are running out, glaciers are melting, and the…"
- ytc_UgxURAk59…: "God is both inside and outside time, and has control over every subatomic partic…"
- ytc_UgxHklIRc…: "I remember reading the word "Artificial intelligence" in my 3rd grade computer b…"
- ytc_UgwTxSv2g…: "i asked Chat GPT to tell me if it knows terminator movie, it confirmed it did. …"
- ytr_Ugwtodw50…: "There is no evidence, it'll achieve consciousness or free will in the way humans…"
Comment

> if they care about safety, all teslas should have LiDAR and radar as an always active backup - for cases where the cameras are unable to recognize objects. i get they want to use machine learning for everything but some things need to be hard coded in for safety.

youtube · AI Harm Incident · 2024-12-16T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgymJKDes8yGZ2J4I3F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyli0XSnyajAQLXU914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy_LAp4O74brh-vrrN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBa7yaKBnZTeElBR14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz0g8-AXLEXCx2HJxJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz0gIkC4YJsZ3zDHUJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzhnfEVLo1nG5-hnZV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzjv_dKGB0TMrYuaM14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwrYP7u8EmZb7PgKiN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyixLMQE_vZZbBPCXp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"resignation"}
]
```
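The raw response is a JSON array with one object per coded comment. A minimal sketch of turning one such response into the per-comment lookup that "Look up by comment ID" implies, with validation of each dimension. The allowed values per dimension are inferred from the samples shown on this page, not from a published codebook, so the sets below may be incomplete:

```python
import json

# Allowed values per dimension, inferred from the responses on this page;
# the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into an id -> record lookup, rejecting unknown dimension values."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        by_id[cid] = rec
    return by_id

# One record from the response above, as a smoke test.
raw = ('[{"id":"ytc_Ugz0g8-AXLEXCx2HJxJ4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugz0g8-AXLEXCx2HJxJ4AaABAg"]["policy"])  # regulate
```

Validating at parse time means a malformed or hallucinated label from the model surfaces immediately, rather than silently entering the coded dataset.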