Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "By experience, im a junior python DEV and i did a whole site with Ai because i w…" (ytc_UgyO7a2Mm…)
- "Let's all understand something here, digital art is on the same level as traditi…" (ytc_UgxJFZbDm…)
- "This has to be single handedly best AI talk on startalk. What an amazing guest a…" (ytc_UgzfZF0xz…)
- "the tech elites DON'T CARE ABOUT THE REST OF US - actually they would prefer us …" (ytc_Ugx5KRLDO…)
- "YouTube really does recommend me a LOT of random things and it's usually right t…" (ytc_UgzBIZL3P…)
- "Why won't AGI take our jobs? His example was horses to cars for something that …" (ytc_UgxuPj0YU…)
- "---update---> came back to tell a troll to F- off >---watched more - 3 points …" (ytc_UgzjogXIp…)
- "_\"Do not become a product of an idealogy that sounds sensible\"_ Like adopting a…" (ytc_UgwbJKyRr…)
Comment

> I am just curious, If we live in simulation, then why does he care about AI and infinite life? It doesn't matter, since the simulation loses sense if it runs forever. The goal of every simulation is to yield result, but if the simulation is infinite, the goal can't be achieved, so it inevitably will be cancelled by those who run it. Like we do with the software that went into an infinite loop, you simply kill the process.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Date | 2025-09-07T01:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz0ZoTGeW5tkDY3jKd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyO4LiTX6n-ghy7RKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx3JgU_6iUZpejvNb94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaCYgjseO0F00ezG94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzYmU4f6sOub8Vi4954AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-7WTfycmsA04yU2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx4dyLDCEmSfaIL6Th4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwKao-hPMKf2q5JgHF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwO5jifwY-Cr93RNEV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz-QPJ_2BWAf0iC7uZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
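The "look up by comment ID" step described above can be sketched in Python: parse the raw response as JSON, validate that each row carries the four coding dimensions, and build an ID-keyed lookup. This is a minimal sketch, not the app's actual implementation; the function name `index_by_id` and the key-validation logic are illustrative assumptions, and `raw_response` here is a two-row excerpt of the array shown.

```python
import json

# Excerpt of a raw LLM response (two rows from the array above).
raw_response = '''
[
 {"id":"ytc_Ugz0ZoTGeW5tkDY3jKd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzYmU4f6sOub8Vi4954AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]
'''

# The four coding dimensions plus the comment ID (assumed schema).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and build a comment-ID -> codes lookup."""
    rows = json.loads(raw)
    table = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            # A malformed row is surfaced instead of silently indexed.
            raise ValueError(f"row {row.get('id')} missing keys: {missing}")
        table[row["id"]] = {k: row[k] for k in EXPECTED_KEYS if k != "id"}
    return table

codes = index_by_id(raw_response)
print(codes["ytc_UgzYmU4f6sOub8Vi4954AaABAg"]["emotion"])  # fear
```

Keying on the full `ytc_…` ID rather than the truncated display form avoids collisions between previews that share a prefix.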