Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ChatGPT: If there’s anything else I can help with, please let me know / Alex: Act… (ytc_UgwrY5V-k…)
- @tseringarts7780 no you are right, or should i say "write" as ai wrote this vide… (ytr_Ugz7k9UVL…)
- "lighten the load" yeah right, all gen ai is doing is taking and taking and taki… (ytc_UgyNDaGpE…)
- Even if employment is given to us, we work like slave bio robots throughout a da… (ytc_UgwnmkBbS…)
- I Love your art it's so beautiful No AI we don't need ai OK 😢… (ytc_Ugz0CEZjG…)
- Pffff .... have a close friend that is a teacher. THe headmaster wants to keep u… (ytr_UgyxzsEn_…)
- Tony Stark most certainly has a wardrobe change.... Aside, this is a fascinating… (ytc_UgzoDSrQf…)
- In the realm of photography there's this phrase going around: "there's nothing w… (ytc_UgxxJH_et…)
Comment
Very good video Chris but I would like to clarify/elaborate on a few items. FSD is really good but everyone needs to understand that it is SUPERVISED Self-Driving and the driver still needs to understand that they are responsible for the driving and that the car will occasionally make mistakes. I view FSD as a driving aid that takes a lot of the load off the human -- but the driver (supervisor) still needs to be paying close attention to what the car is doing. It's my opinion that TESLA still has a long way to go to get to unsupervised FSD -- I see the visibility issues (Dirty cameras/FOG/RAIN) and a need for computer redundancy as big mountains to climb.
Source: youtube · 2025-03-18T01:1… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzaJacAnc3XSWADMjx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disappointment"},
{"id":"ytc_UgyXtIfCAcz4QXgC-zd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6efYLYASBQ1O9AAp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzogDh3wbBu5bV3xZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwmio6B4_AtsrXHwBN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwUVLe4nw8RoXLprwR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7mbvAYGqX1DEBRWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwf7bhunxj1D3q1OsJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziuZQPsg_EAQwCbBF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMZAPj6qgBnPaUNaZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disappointment"}]
```
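The lookup-by-comment-ID view above can be reproduced directly from a raw batch response: parse the JSON array and index the rows by their `id` field. A minimal sketch in Python — the two sample rows are copied from the response above, and the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) follow the coding schema shown in the Coding Result table:

```python
import json

# Two rows taken from the raw batch response above (illustrative subset).
raw = '''[
 {"id": "ytc_UgzaJacAnc3XSWADMjx4AaABAg", "responsibility": "developer",
  "reasoning": "virtue", "policy": "none", "emotion": "disappointment"},
 {"id": "ytc_Ugz6efYLYASBQ1O9AAp4AaABAg", "responsibility": "user",
  "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Index the coded rows by comment ID so any comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

# Look up one coded comment by its ID.
code = codes_by_id["ytc_Ugz6efYLYASBQ1O9AAp4AaABAg"]
print(code["responsibility"], code["emotion"])  # user approval
```

If the model ever emits malformed JSON (e.g. a stray closing character), `json.loads` raises `json.JSONDecodeError`, which is a useful signal to flag that batch for re-coding rather than silently marking its dimensions unclear.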