Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It's called Capitalism, not Laborism. If corps push too hard it will create "so…" (ytc_Ugy8x2h1y…)
- "speaking as a robotics researcher (who started at MIT in the 90's) I really like…" (ytc_UgyD_vVgK…)
- "99 per cent unemployment is a Joke..AI will disrupt millions of jobs though..th…" (ytc_Ugzrqhtcv…)
- "If the AI was pulling from data sheets, is it possible to find out why it came t…" (ytc_UgwcciXre…)
- "I get where you're coming from! While AI like Sophia can mimic conversations and…" (ytr_UgzZ5Sa7G…)
- "Lights out the thing is is that AI will protect itself. They did a simulation of…" (ytr_Ugzh14VAd…)
- "I am bettet than AI artist because i don't steal my art from hardworking artist.…" (ytc_Ugx7KahkR…)
- "a terminator is not that hard to make now a days, this robot has been programmed…" (ytc_Ugzf0YB2S…)
Comment
36:15 Ezra, do you keep missing it, because you don't really let him expound on his answer before hitting him up with another. E.g. his "Savannah prairie" answer: AI doesn't have to "plot to deceive" -- it is just the nature of the beast of time!
youtube · AI Governance · 2025-10-16T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxYZZUWf1e0BmiKVjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyj61IC9y4O1eajFIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw3j7ix_m4O6fjeX954AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4-TMZuxbJYngQ8Mx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw58XvKpbBYlzWFchJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGHqp3D-7GTb5h_Id4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwzjuhdXejZ9g-vYPJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxEeUhfSG0D_ImVweV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzohPviAeIyf6Vgdm54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxjhUMpviZsQdzeHKJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
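
The raw response is a JSON array with one coding object per comment ID, each carrying the four dimensions shown in the table above. A minimal sketch of how such a response could be parsed and validated, keyed by comment ID for lookup — the allowed category lists here are inferred from the values visible on this page, not from the project's actual codebook:

```python
import json

# Allowed values per dimension. These sets are an assumption,
# collected from values seen in this record; the real codebook
# may define more (or different) categories.
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "developer", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "mixed", "indifference", "unclear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    into a dict keyed by comment ID, rejecting unknown values."""
    codings = {}
    for item in json.loads(raw):
        comment_id = item["id"]
        for dim, allowed in DIMENSIONS.items():
            value = item.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim} value {value!r}")
        codings[comment_id] = {dim: item[dim] for dim in DIMENSIONS}
    return codings

# Hypothetical one-element response, mirroring the format above.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}]'
print(parse_response(raw)["ytc_x"]["emotion"])  # fear
```

Validating against a closed category set at parse time catches the common failure mode where the model invents a label outside the codebook, instead of letting it flow silently into the coded dataset.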