Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgycS9oa_…` — "I don’t know there’s lots of accidents involving Teslas on auto pilot. My robot …"
- `ytc_UgyMCQJND…` — "AI is destroying people's lives . I don't even know if the man above is AI . Thi…"
- `ytc_UgyyzhWmk…` — ""whats the business case for all this" @17:20 (regarding AI). --> Can you put a…"
- `ytr_UgwVZCTNd…` — "That is the only arguement of AI haters that I agree with. Not art, just images…"
- `ytc_UgxuBnKfq…` — "Loved this, but two thoughts: - Start a new thread on ChatGPT at certain points…"
- `ytc_Ugw1A1EDo…` — "For repetitive tasks, AI is useful, but Axalem enhances my critical thinking whe…"
- `ytc_UgxOM-KPc…` — "I feel like we will be cyborgs before full blown robots. You get the man power w…"
- `ytc_UgzhGndu7…` — "I don't hate AI, I just feel bad for people who can't distinguish between AI and…"
Comment
> If every car on the road would be self-driving and intelligent, I think these scenarios would be quite rare. However, it's difficult to produce an answer if faced with these ethical dilemmas. Rather, we should spend more of our efforts in preventing them.
> Still, Murphy's law is a bitch.
youtube · AI Harm Incident · 2015-12-08T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughhc-RnxMS1LXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghIQezVaUOb-3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgicExh_IjSyAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjRcySEHSlNsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugha7FPAvBu3AngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggqSiIbUqJPI3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghJ2QECp_kzO3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghQd27Kawk0s3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
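A raw batch response like the one above can be parsed and checked against the codebook before its records are stored. The sketch below is a minimal Python illustration, assuming the allowed value sets inferred from this sample (the real codebook almost certainly defines more categories, e.g. additional `policy` or `responsibility` values):

```python
import json

# Allowed values per dimension, inferred only from the sample response above.
# This is a hypothetical subset of the actual codebook.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records.

    A record is kept when it has an "id" and every coded dimension
    holds a value from the (assumed) codebook; everything else is dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: one valid record passes, a malformed one is dropped.
raw = (
    '[{"id":"ytc_example","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"resignation"},'
    '{"id":"ytc_bad","responsibility":"alien"}]'
)
kept = validate_batch(raw)
```

Validating before storage matters here because LLM batch output occasionally drifts from the schema (missing keys, invented labels), and silently writing such records would corrupt the per-dimension counts downstream.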