Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyWN3T6N… : "How about the fact that not only AI is taking over jobs, but its made the job ap…"
- rdc_jkfi7h5 : "“Samuel Altman” lol. Funny how they were talking about his salary and he’s like …"
- ytc_UgzNR0soG… : "Every time I see these Tesla robots I have to laugh. Others have already develo…"
- ytr_Ugw2_svn6… : "you don't need an Ai for that, an tape recorded full of voice lines of bad ideas…"
- rdc_jsm5wzy : "Eh…That doesn’t seem to be the issue unless people on the internet have suddenly…"
- ytc_Ugzvd7ON_… : "As soon as we, if still possible, learn to allow LLMs to self-diagnose itself us…"
- ytc_Ugwc-OWG7… : "I agree with everything that you are saying. I find it interesting that curren…"
- ytr_UgzwJYdzv… : "using digital tools in a digital art program doesn't automatically generate a pi…"
Comment
The car will follow the traffic rules. It will minimize harm to the driver but still follow the rules of traffic. It will not start an internal AI based ethical dilemma debate in the 1/10th second it has to react to the data. So if a crash can't be avoided, it will not kill an old lady on the curb because she seem to worth less to society then the young driver of the car (except she is the CEO of a big company and important to keep thousand of jobs... better use face recognition software and google to check her profile in that split second...).
It will slow down as much as possible with a quicker reaction time any human has. So in reality the system will prevent a crash from happening in the first place by reacting to events you can't even see with your eyes but the radar detects.
youtube · AI Harm Incident · 2017-01-19T18:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
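Each coded record carries one value per dimension. As a minimal sketch of checking such a record, here is a validator whose allowed value sets contain only the categories observed in this page's sample output (the real codebook may define more; the names `ALLOWED` and `validate` are illustrative, not the tool's actual code):

```python
# Minimal validation sketch for one coded record.
# The allowed values below are only those seen in this page's sample
# output; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table above:
record = {
    "responsibility": "ai_itself",
    "reasoning": "deontological",
    "policy": "none",
    "emotion": "indifference",
}
print(validate(record))  # → []
```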
Raw LLM Response
```json
[
{"id":"ytc_UggAC0mV8oC9jngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjpcS32Uc2yJngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggfHBar3vbNengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiwvgjYZIffAngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjxDEIZXTjr23gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghcPoA1NFGlengCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgitDjAIO4MRV3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY90-a_EZ8FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiew_Ebk3iMfngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiX854HF1O3sHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
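The raw response is a JSON array with one object per comment, each keyed by its comment ID, which is what makes lookup by comment ID possible. A minimal sketch of that lookup, using a two-record subset of the response above (the names `raw_response` and `index_by_id` are illustrative, not the tool's actual code):

```python
import json

# A two-record subset of the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UggAC0mV8oC9jngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiwvgjYZIffAngCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the model output and index each coded comment by its ID."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytc_UgiwvgjYZIffAngCoAEC"]["emotion"])  # → outrage
```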