Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Just reacting to the thumbnail… because I'm not really interested - that was alr…" (ytc_UgwiRaNkB…)
- "I'm guessing that maybe since it was a video conference the quality wasn't the g…" (ytc_UgzaLJMST…)
- "For what? Them trying to benefit humanity but instead people who use Ai in bad w…" (ytr_UgwOJAKEn…)
- "If the ai is self aware i think we have created a form of sentient life. We stil…" (ytc_UgziUcCGO…)
- "The same way we do with politicians. The human representatives can come up with…" (rdc_jj8vkss)
- "We are already seeing AI when it comes to customer support in restaruants and pl…" (ytc_UgyyhGjNr…)
- "Obviously it would be foolish to give Ai power, such as controlling an airport, …" (ytr_Ugzp4gPrO…)
- "Interesting! I guess there are some who would hope for extinction by AI rather t…" (ytc_UgzYjl8_o…)
Comment (youtube · AI Harm Incident · 2025-08-16T16:0…)

> I can see it from both sides but the defendant’s claims that this guy wouldn’t have used his phone if the Tesla didn’t have auto pilot is so dumb. I see people day in and day out using their phones irregardless of their cars features even cars that do not have any lane safety or breaking features. Him being on the phone caused the accident and would’ve happened whether or not he was driving a self driving car. I don’t even have a Tesla and I know that when you put it into Auto pilot it says you must remain aware of road conditions. They just want a payout from Tesla
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugw0F3XLEUsfyFOVobN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZi3YnyHLC_KGKT-d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzb1C-4nFtWBrui0JJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxr1fXN1qD5Aw68emh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8qZAjzVbaFYzapzl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycVmgrDs7TRXJF90R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy8K9nkAsuSO_2tOCt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1FwFF1iyjqD3NyZ54AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_QWat5FfUuvuMAyZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyTv1eVoVxFtR3zybd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
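A response like the one above is only usable if it parses as a JSON array and every record carries valid values for the four coding dimensions. Below is a minimal parsing-and-validation sketch; the allowed category sets are inferred from the values visible in this page's data (the project's actual codebook may define more), and the function name `parse_llm_response` is illustrative, not part of the tool.

```python
import json

# Category sets observed in the raw responses on this page
# (assumption: the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "indifference", "fear", "approval", "mixed", "unclear"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: invalid {dim!r} value {rec.get(dim)!r}"
                )
    return records

# Example: one record taken from the response above.
raw = ('[{"id":"ytc_Ugw0F3XLEUsfyFOVobN4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
coded = parse_llm_response(raw)
```

A check like this catches the failure mode shown in the original dump, where the array was closed with `)` instead of `]`: `json.loads` raises immediately, so the bad response can be retried instead of silently producing "unclear" codes.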