Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzX2l6WN…`: looked forward to listening this researcher. but once again interviewer asking b…
- `ytc_UgxtBEF_i…`: (Animation filmmaker here with 25 years experience) - The ONLY thing that will s…
- `ytc_Ugw38Km7Q…`: It's not even about AI it's about the idiots that use it for EVERYTHING therefor…
- `ytc_Ugyj25iGd…`: They say it wasn’t roleplaying in such and such a situation; but isn’t any conce…
- `ytr_Ugxq5hGXK…`: @C@Compos-Mentis-123h you’re right on the idea of a robot mine. If a mine is aut…
- `ytc_Ugy9h7IPa…`: I focused on AI when I did my computer science degree. What surprises me more th…
- `ytc_UgzNZ4hGC…`: Is there such a thing as an Ai that only scrapes public domain art and photograp…
- `ytr_UgzumWCoH…`: @thepizzagod420 You can go to the library and make photo copies of a book, take …
Comment
My main concern is when/if there are 10% - 20% of self driving cars on the road how will pedestrians and other road uses work out if a car is self driving or not and how do you make "eye contact" with the "driver" of a self driving car. I think during the transition years towards 90% + of self driving cars things will be very messy. Also as a fire fighter who has driven over 300 times lights and sirens in a big red truck I wonder two things. First how will will self driving cars manage emergency services vehicles driving under lights and sirens. And eventually will we ever get to a time when new firefighter recruits will have to learn how to drive for the first time before they then learn how to drive code 1. Because there will never be self driving fire engines. At least not in my lifetime.
Source: youtube
Posted: 2025-10-01T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzvGcjUsUro21ZwU494AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyNfoAP2Nf4eEHjbUt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwdapuSd3U5ncy5KxZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgybpTt6P7Xz2TlO17p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy5m4aYonipIo0-SCJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwqxGvgvlsglg9HW2Z4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXLWx8Pv43I-OYv7Z4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyh97W5XCMyTSFdn0J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy8UKcC3TYyQ0w3Uz54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzu6FOpHYQdkjFhvx54AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"}
]
```
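A raw response like the one above has to be parsed and validated before the per-dimension values can be stored as coding results. The sketch below shows one minimal way to do that in Python: it drops any record that lacks an `id` or uses a label outside the codebook. The allowed-value sets are inferred only from the labels visible in this dump; the actual codebook may define more categories.

```python
import json

# Allowed labels per coding dimension, inferred from the values that
# appear in this dump -- the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it has an "id" field and every coding
    dimension carries a label from the allowed set.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # cannot link the coding back to a comment
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid


# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw))
```

Silently dropping malformed records is a design choice that suits batch coding; a stricter pipeline might instead log them for re-prompting.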