Raw LLM Responses
Inspect the exact model output for any coded comment, or look a coding up directly by comment ID.
Random samples
- "I think the reason I will always prefer actual art then "art" made by ai is beca…" (`ytc_UgyoCBYpa…`)
- "meh a lot of tech, finance, etc workers everywhere became lazy during covid and …" (`ytc_UgzOa5TQd…`)
- "The reason AI keeps advancing is because the big studios like Disney and the lik…" (`ytc_UgxyX6DCp…`)
- "You know what's the best part of this? It's the AI or automated call centers tha…" (`ytc_UgzgAFW1z…`)
- "Does AI become the customer too? I just don't think humans are going to set on t…" (`ytc_Ugy2N6q9U…`)
- "Just terminate ai crap right now it's making us all lazy and stupid here by the …" (`ytc_UgyqFabGa…`)
- "People have watched too many movies Y2K didn't happen the pandemic didn't take u…" (`ytr_Ugwn4R3SH…`)
- "Think EXPONENTIALLY, not LINEARLY. AI and robotics will easily solve so-called p…" (`ytc_UgzQpxHXo…`)
Comment
> What's bothering me is the transition phase.. we aren't in FSD yet so people who want to relax behind the wheel have to monitor the road etc..
> But they dont understand it's not FSD, they have bad reactions (swerving instead of braking, sometimes not even reacting at all) or just they let the autopilot do everything and not monitor anything.
> At least in France, i see people with manual car have way better reactions than people with semi-autonomous ones. They know if they misjudge the situation, don't break, don't do anything, the car will just continue and crash.. you can even see in the US videos, so much Tesla accidents could have been avoided if the owner really knew how to drive/was fully* aware of the situation. That's sad
Source: youtube, 2022-01-04T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
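The four dimensions in the table above map onto a simple record. A minimal sketch, using only the category values that actually appear in this tool's raw responses (the full codebook likely defines more values, so treat these sets as illustrative):

```python
from dataclasses import dataclass

# Category values observed in the raw LLM response on this page.
# Assumption: the real codebook may include additional values.
RESPONSIBILITY = {"user", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "unclear"}
EMOTION = {"fear", "outrage", "approval", "indifference", "mixed", "unclear"}


@dataclass
class Coding:
    """One coded comment: the four dimensions shown in the result table."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_known(self) -> bool:
        """True if every dimension uses a value observed in this tool's output."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The coding from the table above: user / consequentialist / none / fear.
c = Coding("user", "consequentialist", "none", "fear")
print(c.is_known())  # -> True
```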
Raw LLM Response
```json
[
{"id":"ytc_UgzqzqZCuaxc2Xi3Rhd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw7VkeZfk6iMyLHMdx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYyo3j8GDhqvKlEVR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxexkQg6tZdTYpmKgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxC0uBDLUrtMBMyFZx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_CFZRHmgDJIkCXKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzix9MI7_81v0ODBxh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzEuAJQ1XsUCCnexfp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxdZWdMNE5pYggbp3x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxN98Cl05jycq5Zf1l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
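The look-up-by-comment-ID flow that this page offers can be sketched in Python: parse the raw JSON array, validate that every row carries all four dimensions, and index the rows by `id`. The two sample rows are copied from the response above; the validation helper is an illustrative assumption, not part of the tool.

```python
import json

# Excerpt of a raw LLM response, as shown on this page.
raw_response = """
[
 {"id":"ytc_Ugw7VkeZfk6iMyLHMdx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxN98Cl05jycq5Zf1l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index the codings by comment ID."""
    rows = json.loads(raw)
    indexed = {}
    for row in rows:
        # Guard against malformed rows: the ID and every dimension must be present.
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row!r}")
        indexed[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return indexed


codings = index_by_id(raw_response)
print(codings["ytc_Ugw7VkeZfk6iMyLHMdx4AaABAg"]["emotion"])  # -> fear
```

Indexing by ID makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan of the array.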