Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID, or inspect one of the random samples below:

- `ytr_UgywNxqCS…` — "These AI image analyzers are hopeless - if you tell them to find guns they are j…"
- `ytc_UgyMuG-wf…` — "The real reasons for this man's talk is exposed at the very end. He wants more f…"
- `ytc_UgyHAoKRm…` — "I predicted this tech a few years ago when A.i. popularity juat atarted getting …"
- `ytc_UgwahnabA…` — "The mother said loud and clear her son started to change she thought it was grow…"
- `ytc_UgzcfW_Ta…` — "I so wholeheartedly agree. I am a computer scientist, but in my spare time i cre…"
- `ytc_UgxCyGX2o…` — "Also, the story about the AI drone taking out its controller was a thought exper…"
- `ytc_UgwOR0xzq…` — "art has always been my way of bringing out my stories and personal experiences/i…"
- `ytc_UgyNGpS2x…` — "I still believe that entropy assures that AI left to itself will degrade, not im…"
Comment
I've had a Model S since 2017 and have never had FSD. I did upgrade to autopilot after about 3-4 years later but prior to that while using cruise control (no self steering) I was driving in the right most (centre most) lane of a 6 lane divided highway and slowly passing a car that was in the middle lane of the 3 lanes of my direction when suddenly the car veered towards that car. Luckily I had both hand firmly on the wheel and pulled hard towards the right to stay in my lane to avoid the collision. I spoke to Tesla at the time and they said it shouldn't have tried to take over.
Anyhow sadly it's in the very nature of AI that we never really know what it's thinking and we won't know when it's ready to go mainstream. I think it will get there, but probably not while I'm allowed to drive.
Source: youtube · Posted: 2025-06-01T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwLTdiPyBK3U44rfxd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxxg9d3cARS4v_LS2x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwhiCQScUlo5qMCiRB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-dqrTtk_fe0tqEoV4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3HeLHriecSH1RUAx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwUzbLLZZn7F78Q3Wh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyd4ejUfa9hb9gdL0p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXdVhIu79M8cfuznZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxWK_LJSzJYcAhSkfJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwGKyUFFUUKGLXfW_R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
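The raw response above is a JSON array with one record per coded comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed to support the ID lookup shown at the top of the page (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response above; the helper function and the abbreviated sample data are illustrative assumptions, not this tool's actual implementation):

```python
import json

# Abbreviated sample of a raw coding response; field names match the
# response shown above, but only two records are reproduced here.
RAW_RESPONSE = """
[
 {"id": "ytc_UgwLTdiPyBK3U44rfxd4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_UgxWK_LJSzJYcAhSkfJ4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and build a comment-ID -> coding lookup."""
    records = json.loads(raw)
    index = {}
    for record in records:
        # Reject records missing any coding dimension before indexing them.
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            raise ValueError(f"record {record.get('id')!r} missing {missing}")
        index[record["id"]] = record
    return index


codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxWK_LJSzJYcAhSkfJ4AaABAg"]["emotion"])  # -> resignation
```

Validating the field set up front means a malformed record fails loudly at ingest time rather than surfacing later as a blank cell in the coding-result table.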