Raw LLM Responses
Inspect the exact model output behind any coded comment: look it up directly by comment ID, or pick one of the random samples below. (A minimal parsing sketch follows the raw response at the bottom of this page.)
Random samples — click to inspect
- “A just machine to make big decisions, programmed by fellows with compassion and…” (ytc_UgyeAlHPY…)
- “1:03:33 no people dont repeatedly spitting the exact same wrong asnwer 1 billion…” (ytc_UgyI2vsmm…)
- “AI is not replacing these jobs, many of thee jobs are being out sourced which is…” (ytc_UgzeEK_nc…)
- “If I had this when I was younger I would have been the next mark zucker musk.…” (ytc_UgwAL8ytF…)
- “I only ask this as a normal question, what if it was the other way around? would…” (ytc_Ugw_6dFPZ…)
- “moodiness is an inherent property of llms and they never successfully programmed…” (ytc_Ugw2fGGZs…)
- “Lovable now does frontend and backend, with fully wired up db. Only a matter of …” (ytr_UgzJ6zhrX…)
- “Thank you for your comment! In the video, Sophia discusses embodying wisdom and …” (ytr_UgyaNzF47…)
Comment
I’m only 12 minutes in, but am I the only one who listens? The whole claim of the video is the AGI is gonna come and take all our jobs in two years. In the beginning, he says that we have what scientist 20 years ago would recognized as AGI. So therefore that’s good enough for it to be AGI now for his purposes ( making money scaring you I imagine) because people in the old days thought that’s what it was. But in reality we don’t have AGI now and ‘smart people’ (notice it’s always the ‘smartest people’ they claim make AI) posit we may never have AGI in the way it’s been sold and we almost certainly won’t have it in two years.
youtube · AI Governance · 2025-09-30T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
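The coder assigns exactly one label per dimension. As a minimal validation sketch in Python, assuming the label sets below (they are only the values visible on this page, including the raw response underneath, and not necessarily the full codebook):

```python
# Label sets observed on this page, per coding dimension.
# Assumption: the real codebook may contain labels not seen here.
OBSERVED_LABELS = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems with one coded record; empty if it looks valid."""
    problems = []
    if "id" not in record:
        problems.append("missing comment id")
    for dim, labels in OBSERVED_LABELS.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in labels:
            problems.append(f"unrecognized {dim} label: {value!r}")
    return problems
```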
Raw LLM Response
[
{"id":"ytc_UgxSh4VNQQuDW_sN0aB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyVGf25fOUhyKbajml4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzGmaRQp0JUp6HEVoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtCoHubbCTL2gFC7R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOhiqZhsW2J7gze_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8CcfDR6m4hHeIb314AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxC_vpAEDw_TjVpz0d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyX6nnqqZ_qHjot8_V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwngZICcsDSyNegXEp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxbrdXoTDacse0RN94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
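A minimal sketch of the lookup described at the top of this page, assuming the raw responses are stored as JSON arrays in the format shown above (the file name raw_responses.json is a hypothetical placeholder):

```python
import json

# Hypothetical dump of raw LLM responses: one JSON array of coded
# records per batch, in the format shown above.
with open("raw_responses.json", encoding="utf-8") as f:
    records = json.load(f)

# Index every coded record by its comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

# The record behind the Coding Result table above.
rec = by_id["ytc_UgwtCoHubbCTL2gFC7R4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# -> company consequentialist none outrage
```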