Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below (truncated previews with their comment IDs):

- "We need 175,000 nukes so all of the ai will die and and we need 17 thousand tank…" (ytc_UgyHGsB1p…)
- "I definitely agree AI shouldn’t take art from other people, but I would like to …" (ytc_UgyfZM4Y1…)
- "There’s actually a website called AI or not which lets you talk to an AI or anot…" (ytc_UgxzLjETA…)
- "If they automate every job using AI, and human beings become consumers only, how…" (ytc_Ugxn083UH…)
- "I agree that AI has the potential to be very bad but I think it's far more likel…" (ytc_UgyqKYANB…)
- "AI has been the hardest thing to understand and accept it just don’t feel real b…" (ytc_UgzLofTik…)
- "What do we do if AI takes 45% of jobs in ten years? How does it affect our consu…" (ytc_Ugyo90phP…)
- "I love that artists can do this. If only it were so possible for musicians to po…" (ytc_UgzS-4v_W…)
Comment
This episode was very interesting. I believe Lex represents most citizens whom have some notion of AI, yet very clearly ignore the very real threats to humanity. Having said that, we need more more awareness, more debates, and more education on AI innovation; so ppl realize where the world is most likely headed.
Source: youtube · 2025-08-13T20:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
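
Each comment is coded on these four dimensions. A minimal sketch of the schema follows, using only the category values that actually appear in the batch shown below (the full codebook is not reproduced on this page, so treat the value sets as illustrative, not exhaustive):

```python
# Sketch of the coding schema, limited to values observed in this sample batch.
CODEBOOK = {
    "responsibility": {"distributed", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the known categories."""
    return [dim for dim, allowed in CODEBOOK.items() if record.get(dim) not in allowed]
```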
Raw LLM Response
[{"id":"ytc_UgwprATfFV36HDtMryd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQQD1DH02Ch4ywd5F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyvSfnbJpdRu6ptCHR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx1ZTEOhLM3wtuZjAB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxkGC0CE_7Lt4DWmxR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxGDwPcMoRiQbeUhAd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzQ9Db389WW2yzzBCF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgymQt_83X-2JdfliQx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5b_1ODkaHnfvmbMJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwIxh9EARldj4G_Aep4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]