Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't think this will be for a while. I work at bmw as a technician and I driv…" (ytc_Ugia_1ljV…)
- "I haven't watched even 1 min yet, but here's what I realized: many jobs that wi…" (ytc_UgxzLcpD5…)
- "Why Human being is developing AI? Answer For making better world i will say a b…" (ytc_Ugy-a9G9r…)
- "So please ai, take their jobs, because i dont trust those people doing any other…" (ytc_Ugz35kmnn…)
- "Too bad they don't exist. We are no closer t a realistically moving robot than w…" (ytc_UgzjMp_TA…)
- "This is ridiculous. If you knew what an actual AI looks like you would be dying …" (rdc_kvhvmi3)
- "YOU ARE SEEING THIS COMMENT AS A SIGN TO WAKE UP. JOE ROGAN IS NOT WHO HE SAYS H…" (ytc_UgyYCy20v…)
- "if he was right, \"i dont grade ai bullshit\" wouldve been a really iconic quote…" (ytc_UgxHSp35B…)
Comment
My main thought is accountability. Even if it becomes as safe maybe even safer than a person driving the car, who becomes responsible if an accident occurs? The car manufacturer? The dealership? The person inside the car? The developer of the AI or algorithm used?
If someone is killed in an accident, unlike before where a person could be held accountable, would somebody whos lost a family member, maybe a kid, be told “yeah sorry it was an accident, theres no compensation we can give”.
youtube · 2023-05-30T14:4… · ♥ 2880
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzqIwROMLlXrXOxkYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyW1zUrMur-RmkwOJt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgwYKJ1agdBYdsgDLGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwH2UcbhyFsRoiCKOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgysSjP8K4PF68CwP-p4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxUMF6KofN3wh9UWTR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOCWqbwSo_xvUanK14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyeldY1jQUV8PkCS1Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzuW8zsH9h6gViKa4d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy5eJPK0U2CBwv9kTl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
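The raw model output above is a JSON array of per-comment codings, one object per comment with the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). Looking a comment up by its ID, as this page does, can be sketched as below; the `lookup_coding` helper and the two-row sample payload are illustrative, not part of the tool itself (only two of the ten coded rows are reproduced here for brevity).

```python
import json

# Illustrative two-row excerpt of a raw batch response from the coder model.
raw_response = """[
  {"id": "ytc_UgysSjP8K4PF68CwP-p4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy5eJPK0U2CBwv9kTl4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]"""

def lookup_coding(raw, comment_id):
    """Parse the raw model output and return the coding dict for one comment ID,
    or None if the ID was not coded in this batch."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgysSjP8K4PF68CwP-p4AaABAg")
print(coding["responsibility"])  # distributed
```

Indexing the parsed rows into a dict keyed by `id` makes repeated lookups O(1), which matters once a batch holds thousands of coded comments rather than ten.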