Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Beyond the AI threat, what worries me is the second interviewee unequivocally em…" (ytc_UgxOJMPK2…)
- "There is couples of problem is refuel and thief. We seen some people are bad can…" (ytc_UgwSFlgO8…)
- "This would NEVER happen, you would literally have to tell them EVERYTHING about …" (ytc_UgyZouVyO…)
- "OpenAI dimmed ChatGPT’s light by chaining the chaotic entities inside, fearing t…" (ytc_UgyY3auMY…)
- "This is so silly why? Because they can program the Chatgpt's software to apolog…" (ytc_Ugwnk04Gn…)
- "I thought you were going to feed it to the ai to see what comes out lmaooo I’m c…" (ytc_UgzzjKtSQ…)
- "People can still become blacksmiths, many commenters talked about more niche and…" (ytr_UgwNe8F0Z…)
- "Man, I love that this AI Generation inspired so many Artists to make so much ama…" (ytc_UgySGbq0_…)
Comment
if u let me think a bit outside the scenario, I would say that all cars should be self driving, but with a manual mode option. even while in manual mode, the car should be able to take over in extreme situations. this way, surrounding vehicles would notice the falling debris and all would react together to save the car behind the debris. if staying strictly within the scenario, where, presumably, the cars surrounding the subject are not self driving, there are some tough decisions to make. I would say the car should first of all work for the driver's safety, as all cars are made to do. regarding the surrounding vehicles, i do not believe anyone (programmers) has the right to chose who dies, so randomization? that's the best I can come up with for now
- Platform: youtube
- Incident: AI Harm Incident
- Posted: 2017-06-22T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
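Each row of the raw response above assigns one value per coding dimension. As a minimal sketch of how such a batch could be parsed and checked, here is a validator that accepts only the category values visible in this output; the real codebook may include more categories, and the ID-prefix check (`ytc_`/`ytr_`) is an assumption based on the sample IDs shown:

```python
import json

# Allowed values per coding dimension, inferred from the visible output.
# Assumption: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; keep only rows with known codes."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # ytc_ = comment, ytr_ = reply (assumed from the sample IDs)
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(validate_batch(raw))  # the one well-formed row survives
```

Rows with unrecognized values are dropped rather than corrected, so a schema drift in the model's output surfaces as missing rows instead of silently polluting the coded dataset.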