Raw LLM Responses
Inspect the exact model output for any coded comment. Records can be looked up directly by comment ID.
Random samples:

- `ytc_Ugz3n7qnG…`: "Everyone suck at making art at first. With Ai people will not even bother learni…"
- `ytc_UgwfIgJsk…`: "You have a right to face your accuser in court, so you had better haul that AI i…"
- `ytc_UgzP5JGu3…`: "Now now this is extremely hard right here but im going with human but if this is…"
- `ytc_Ugz4BNBYK…`: "So does Waymo stop when a fire truck is trying to pass?! I think all the emergen…"
- `ytc_UgyDzERpa…`: "You just hear in a robot voice from the other car “F**k you peasant, I got your …"
- `ytc_Ugx5Q0OEF…`: "You know I was going to make fun of you for saying "infinite love' as a bio for …"
- `ytc_UgzOyDPHl…`: "And a court just rule that anthropic can rip stories as long as they buy one co…"
- `ytr_UgwITgeWs…`: "@shadowdump2902 She's not being forced to do it though. She's drawing as normal…"
Comment
I mean, agreed....but from a man making a fleet of AI connected EV's, brain microchips & rocketships to Mars? 🤷 Color me confused about where he actually stands. I understand that he's speaking on "hyper intelligent" AI & I'll be the first to admit idk what the difference between that & what he dabbles in is. But this interview is in the 🤔 part of my brain presently. I have loads of respect for Elon Musk, but if I've learned anything in the last 4ish years, it's to question EVERYTHING. I find his spot on perspective about AI to @ least potentially not entirely align with his many presently active projects.....
- Source: youtube
- Topic: AI Governance
- Posted: 2023-04-18T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzXMu57JeeayEzcSkB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyxhUeht46iRzz_WD94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzslWNQqsycGXHQSyJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyqdKoChngAkOPx5Gl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy09EU5moxy0fpGYr94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugw6_mKk-2ZHhV6qUN94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxfMiDCU2PBxlGXmsd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuarHlg0mimytWy5Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwN1uNeLh6sJwetszJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzvSq6fcfiBAwXWL_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
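The raw response above is a JSON array of per-comment codes, so looking a comment up by ID reduces to parsing the array and indexing on the `id` field. Below is a minimal sketch of how that could work; the `index_by_id` helper and the validation logic are illustrative assumptions, not the tool's actual implementation, and the dimension set is inferred from the rows shown above (the full codebook may allow more categories).

```python
import json

# Dimensions observed in the raw response above (assumption: the
# real codebook may define additional dimensions or categories).
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index coded rows by comment ID."""
    rows = json.loads(raw_response)
    indexed = {}
    for row in rows:
        missing = ({"id"} | DIMENSIONS) - row.keys()
        if missing:
            raise ValueError(f"malformed row {row!r}: missing {sorted(missing)}")
        indexed[row["id"]] = {k: row[k] for k in DIMENSIONS}
    return indexed

# Two rows copied verbatim from the response above.
raw = """[
 {"id":"ytc_UgzXMu57JeeayEzcSkB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgwuarHlg0mimytWy5Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]"""

coded = index_by_id(raw)
print(coded["ytc_UgzXMu57JeeayEzcSkB4AaABAg"]["policy"])  # ban
```

Validating every row before indexing matters here because a model can omit a field or emit extra prose around the JSON; failing loudly on a malformed row is safer than silently storing a partial code.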