Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Read Hugo de Garis's 2005 book, "The Artilect Wars". It was a $30 book in 2005, …
ytc_UgxftmeJI…
It's easy to be safe. If all fails - massive reset. You wipe everything with the…
ytc_UgyDitXDD…
for the second one, i feel like it's completely different, because AI is just ma…
ytc_Ugw1F5M5O…
This has nothing to do with AI, they will hire H1B workers or move those jobs to…
ytc_UgzTSYmMW…
Shouldn't they master the concept of redundancy before they try to have a space …
rdc_cjpkq6m
It's funny because back then I would've taken it like a compliment thinking they…
ytr_Ugw91CTL1…
Nothing new.... and it goes far beyond. Using Facebook sounds ridiculous , last…
ytc_Ugxiej-1G…
If someday AI decides that humanity is no longer "needed", so to speak (around Y…
ytc_UgyolxysN…
Comment
Every time I see these Tesla robots I have to laugh.
Others have already developed better robots and have gone further than a scammer like musk.
But ok, the whole video is half-baked and vague.
Like we're pre-robot/ai revolution, how ridiculous.
The world has more problems with the fact that its population is slowly becoming stupid and some professions have fewer and fewer people to build or maintain certain technologies.
It depends on how our engineers and planners seem to be getting less and less done.
Space travel is a big topic. (Where Musk is also x years behind and sells it as "progress")
youtube
AI Governance
2024-01-14T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzgvwysCOOqiBJHEyN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7RlpkQp9mj75ztYV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGv3bocHm-8qL1oZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGDFIcG55KSKJ5Q7l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXC2c7pVt18XFh5yd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTrZ-8jkwF6ihK4kl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwNyiggGumQcU-HVnV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzNR0soGx0UH2znu-F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXiOOtFW7FrC4JBNl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy4ihfRTilYJBadxth4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
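The coded dimensions in the table above come straight from this JSON array: each object carries the comment ID plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), and the first entry appears to correspond to the comment shown above. A minimal sketch of parsing such a response and looking up a row by comment ID — the function name and validation logic here are illustrative, not part of the actual pipeline:

```python
import json

# Two entries copied verbatim from the raw response above; the real
# response contains one object per sampled comment.
raw_response = '''[
{"id":"ytc_UgzgvwysCOOqiBJHEyN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGv3bocHm-8qL1oZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

# The four dimensions shown in the Coding Result table, plus the ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model output and index coded rows by comment ID,
    dropping any row that is missing an expected dimension."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows if EXPECTED_KEYS.issubset(row)}

codes = index_by_id(raw_response)
row = codes["ytc_UgzgvwysCOOqiBJHEyN4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

Indexing by ID rather than by list position makes the lookup robust when the model returns rows out of order or silently drops a comment.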