Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (excerpt) | ID |
|---|---|
| 'Elon Musk'.... We're 'Summoning the Demon with Artificial Intelligence?.. Ephes… | ytc_UgyV9ETgw… |
| The strand of highlighted hair on her left side (our right) is a dead give away … | ytr_Ugzzb-lV1… |
| I wrote a book in which I proved that AI is false for so many reasons. One of th… | ytc_UgzGHVtn-… |
| If all vehicles are self driving though, there will no longer be such thing as c… | ytc_UgitMxhB_… |
| Scenario 2 where ai will take over most of the jobs it also creates lot of jobs … | ytc_Ugwoqq07-… |
| Do NOT use a chatbot for therapy. It is programmed to be validating, but cannot … | ytc_UgwveiGJK… |
| Stupid AI engineers don't understand that if they will be undistinguisable than … | ytc_UgxE8M4SB… |
| AI is trash that can't even give me a solid exercise plan. 😂 It's literally just… | ytc_Ugx5XI-M3… |
Comment

> I'm only 11 minutes in and Geoffrey mentioned his friends with the two extremes. How about someone like Ray Kurzweil? His futurist views discuss the technological singularity and how AI and nanorobots will make humans immortal. Couldn't transhumanism technology also make humans equally as smart as AI? In fact, couldn't it enable us to nearly instantaneously learn new things that would have once taken weeks, months, or years to master?

youtube · AI Governance · 2025-06-28T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxzfUOJ1keoL9Jjpxh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzt7a4uxgpRvR6eP594AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxLk9X0RtnLmC70nfl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgytFYuE8b08MUGLtHp4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwWSfMalkrmYln0cvR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwhGaYPkcAGurR0AAt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxdI-FBtc79Vj3Vx_B4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzs2bJ3Dnx_HwGrh894AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyuPBt1JATOk_FxvH14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwyivmIDdTInhwXtpp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "resignation"}
]
```
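The "look up by comment ID" view above amounts to parsing a raw response like this one and indexing the records by their `id` field. A minimal sketch of that step in Python — the function name `index_by_comment_id` and the two-record sample payload are illustrative, not the dashboard's actual code:

```python
import json

# A small excerpt in the same shape as the raw LLM response above
# (IDs and values copied from the sample; the full batch has ten records).
raw_response = """
[
  {"id": "ytc_UgwhGaYPkcAGurR0AAt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxdI-FBtc79Vj3Vx_B4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
# Looking up one coded comment by ID yields its dimension values:
print(codes["ytc_UgwhGaYPkcAGurR0AAt4AaABAg"]["policy"])  # regulate
```

Keying on `id` assumes the model returns each comment ID exactly as it was sent; a production lookup would also want to report IDs missing from (or extra in) the response.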