Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The uber driver argument is real, I use Waymo now almost exclusively and haven’t had a human driver in months. The cars actually drive more defensively than most human drivers that have picked me up, and I feel safer in them now than with a human, and they are cheaper, I get to choose the music, and I don’t have to talk to anyone but my friends. Uber drivers who say “nobody can do what I do” unfortunately in my opinion Waymo does better now.

| Field | Value |
|---|---|
| Source | youtube |
| Category | AI Governance |
| Posted | 2025-09-06T04:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxqOj95XMKjdf0eX0N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxyvb9aX2JJWTv0IIx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyowqFNMVyQ99W_L_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygVhE-SVJILD7EVD94AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyjExM0gqbSqeuCan14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyA0dm3rXOUPjQkTst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuKCw0Dfo5Ht0ts6Z4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzntTTFQ24KZOALNVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwdjqM1kNEk3Mfl48Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyknIg6FVXSCs_qpup4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
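The raw response is a JSON array with one coding object per comment. A minimal sketch of how such output could be parsed into a lookup keyed by comment ID (the field names come from the response above; the `index_codings` helper and its skip-malformed-entries behavior are illustrative assumptions, not the tool's actual implementation):

```python
import json

# Abbreviated sample of the model output shown above
# (two entries, field names taken verbatim from the response).
raw = """
[
  {"id": "ytc_UgyowqFNMVyQ99W_L_R4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyuKCw0Dfo5Ht0ts6Z4AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions observed in the response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse the model's JSON array and key each coding by comment ID."""
    codings = {}
    for entry in json.loads(raw_json):
        # Drop entries missing a dimension instead of crashing the pipeline.
        if "id" not in entry or not all(dim in entry for dim in DIMENSIONS):
            continue
        codings[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codings

by_id = index_codings(raw)
print(by_id["ytc_UgyowqFNMVyQ99W_L_R4AaABAg"]["emotion"])  # → approval
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: once parsed, retrieving a comment's coding is a single dictionary access.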