Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I am so glad that you are addressing this topic Sam! This isn’t a situation that…" (ytc_UgxoNa13g…)
- "I did a thought experiment involving consciousness... Incredible... Its sentien…" (ytc_UgwzjIKe-…)
- "25% risk for humanity ending outright... or you could just not. Hrmmmmmm, seems …" (ytc_Ugy6x9zZN…)
- "@amehcakeface While I disagree w/ his statement about ti being **EXACTLY** the s…" (ytr_UgyB5hIPz…)
- "Allah means God in Arabic as ChatGPT said, So to translate it, making Allah equi…" (ytr_Ugxw0v8ky…)
- "Think about possessions and realise that AI is without a soul and how easy it wo…" (ytc_Ugw36ht6A…)
- "Model X 2019 did not have a cabin camera to tell if the driver is paying attenti…" (ytc_UgzpCongX…)
- "Tech firms like Block (see yesterday's news) can turn AI on itself real fast. Th…" (ytr_Ugzkn4OZs…)
Comment
To preface, I am not an Elon glazer. But, if you are holding down the accelerator pedal what is the car supposed to do? What if it hallucinates a stopped car in the middle of the highway, should it just brake in the middle of the highway? You have the ability to override the automated systems using the manual controls as it should be, this is user error through and through (in my opinion, not legal advice).
youtube · AI Harm Incident · 2025-08-15T19:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```
[{"id":"ytc_UgySchxXGxxnn8-BT0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugw7UQJaa_1z6pDts4h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_UgxLdhRKmmnbJGFMUYt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgxDZOliu7ZEYgQVA_B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgxgsHxIVKO-wRHH28B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},{"id":"ytc_UgxaQJAsbUlF8U6X7Ld4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgwLD_a_nEkGJxxNTbJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgyYFekFEimRpd-bUId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},{"id":"ytc_UgxlxyKFJUzWT2zL2Dp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugw3O72FrmFFZknKsKx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"})
```
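Note that the raw response above opens the array with `[` but closes it with `)`, so a strict JSON parse of it fails; a coding pipeline that falls back to "unclear" on a failed parse or a missing ID would then produce exactly the all-"unclear" row in the table above. A minimal Python sketch of such a parse-and-lookup step, with hypothetical helper names `index_codes` and `lookup` (the actual pipeline's internals are not shown in this page):

```python
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict:
    """Parse the model's JSON array and index each record by its comment ID.

    Returns an empty mapping when the response is not valid JSON, e.g. the
    array above that is terminated with ')' instead of ']'.
    """
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return {}
    return {rec["id"]: rec for rec in records if "id" in rec}

def lookup(codes: dict, comment_id: str) -> dict:
    """Fetch one comment's codes, defaulting every dimension to 'unclear'."""
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    return codes.get(comment_id, fallback)

# Well-formed single-record example (same schema as the raw response above).
raw = ('[{"id":"ytc_UgySchxXGxxnn8-BT0d4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
codes = index_codes(raw)
print(lookup(codes, "ytc_UgySchxXGxxnn8-BT0d4AaABAg")["responsibility"])  # → none
print(lookup(codes, "some_missing_id")["emotion"])  # → unclear
```

With the malformed array above, `index_codes` returns `{}` and every lookup yields the "unclear" fallback, which matches the Coding Result table.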