Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> In no way am I trying to be "smarter" than this guy, but... We've been hearing these types of warnings for many years now... The truth is it takes a long way to get to the masses. Not everyone is a rich elite who drives self-driving cars, not even electric or authomated cars... I just don't believe most jobs will be extint, that would make the economic sistems colapse. I am not saying we aren't getting there, but it's going to be much slower than all this podcasts interviews affirm. It's my take and my perspective also based on what I see and experience.
youtube · AI Governance · 2026-01-30T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNAkTJxAsXuDYIc9N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgznXzioPq2elAwruY94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxwFhl4Q3OL_pGmiOd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLNYjeCK1PF7jBGFx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwa-72hXb59ZNP-vO94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0Ilv2q78OY-ZbNml4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyft-0UAEGOtDhHXh54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx4W-2hMILUljVvoZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgysnJ2DFaa1D3jx1e94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzbzL6Yk8ZruN4LTZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
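The raw response is a JSON array of per-comment records, each carrying the four coded dimensions shown in the table above. A minimal sketch of validating one such batch before accepting it into the dataset — the allowed values per dimension are inferred only from the codes visible in this output, so the real codebook may permit additional categories:

```python
import json

# Dimension vocabularies inferred from the sample response above
# (assumption: the actual codebook may define more values).
CODEBOOK = {
    "responsibility": {"government", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset appear to use ytc_/ytr_ prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records
```

A check like this catches the common failure mode of LLM coders inventing out-of-vocabulary labels, so malformed batches can be rejected and re-queued rather than silently stored.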