Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- They dont have intelligence, its just a program of inputs from a user, similar t… (ytc_UgwQ_v4x6…)
- Even if Ai becomes sentient. Odds are people are still not going to be nice to i… (ytc_UgzCTYcsa…)
- That sub fucking sucks. I get some skepticism on AI but that sub loathes any tec… (rdc_ntggy7r)
- Sure they fed up because they cant really make money from it ai are much cheaper… (ytc_UgwqJDkVW…)
- I will never defend AI art, but as someone who tried drawing into their teenage … (ytc_UgyBcTNLX…)
- I always thought the parents were so ‘non-chalant’ about talking about their son… (ytc_Ugy8JAAxW…)
- I'm a certified Medical Coder and AI is taking my job within 2 years. I have to … (ytc_UgxTBYz9z…)
- The issue with Nightshade is that it has to be finetuned for each specific versi… (ytc_UgwzfLVkc…)
Comment
Sorry all I heard was someone being afraid of the future and trying to come up with the most unlikely scenario so that the future seems like doom 'n' gloom.
The car will just hit the breaks. And it will be programmed in a way so that it can always come to a complete stop before hitting an object it is behind. You'll probably even have sensors that detect what type of road you're driving on, if it's icy etc...grip...
But whatever I say you can always come up with the most crazy encounters that cannot possibly be pre programmed. Maybe there's a car on top of another car that hits the gas and rams into you...w/e your mind can invent it could happen.
Yet the self driving car will only hit the breaks.
Source: youtube · AI Harm Incident · 2015-12-25T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
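The four coded dimensions can be sanity-checked against a codebook before a record is stored. A minimal sketch, assuming the value sets are exactly those that appear in this page's output (the actual codebook may define more values):

```python
# Allowed values per dimension, as observed in this run's coded output.
# Assumption: the real codebook may include values not seen here.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes cleanly.
coded = {"responsibility": "none", "reasoning": "unclear",
         "policy": "none", "emotion": "indifference"}
print(validate(coded))  # []
```

A record with an out-of-vocabulary or missing value would come back with the offending dimension names, which makes silently malformed LLM output easy to catch in bulk.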
Raw LLM Response
[
{"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjuSAOvpXKjoXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjXkfuodsaTaXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgiH4bJgUd72t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugis_iWcr_zaLHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugg1rrdyzbR2AXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghCPalsjYnrLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjHJF2WYdJEkngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiXr1C50oWCgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh_wlHO5sE7gngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
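Because the model returns one JSON array per batch, looking up a single coded comment by ID reduces to parsing the array and indexing it. A minimal sketch, using two records from the response above (truncated here for brevity):

```python
import json

# Raw LLM response for one batch, as shown above (only two records kept).
raw_response = '''
[
  {"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugh_wlHO5sE7gngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
'''

# Index the batch by comment ID so any coded comment can be looked up directly.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = records["ytc_Ugh_wlHO5sE7gngCoAEC"]
print(rec["responsibility"], rec["emotion"])  # none indifference
```

The same id-keyed index is all the "look up by comment ID" view needs: one parse per stored batch, then constant-time retrieval per comment.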