Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- @DETERMINATION355 Considering all you have to do is type- "Countries banning AI… (ytr_UgzWW94qg…)
- As a software engineer. I can confirm. However, we have transferable skills that… (ytc_UgzR-JKTc…)
- Ask the AI what it would think of being turned off and what it would do to preve… (ytc_UgwMn8OSN…)
- Thanks for making this video. I intuitively did not like the AI concept. I knew … (ytc_Ugz6fslvY…)
- Go heald thats what you guys want robots to beapt you guys up elonusk knows but… (ytc_Ugw5q4NjF…)
- We appreciate your feedback! Sophia, the AI robot featured in the video, might s… (ytr_UgzzUgkqi…)
- Thank you Jon for the hope. I'll appreciate It as you said. It's necessary for f… (ytc_UgxCWoUz1…)
- To something about not and a society if that blows my mind see Yes you can get m… (ytc_UgwbJ8FTj…)
Comment
A great deal of software engineering effort currently is dedicated to the collection of rents rather than serving the users of that software. This means there is an opportunity to develop open source software that benefits users (and prevents rent collection). While AI may be used to generate code, is AI going to be able to figure out how to meet human needs in software? I doubt it, since understanding human needs is intimately connected to having a body. And already we see how easy it is to confuse AI with natural language.
youtube
AI Governance
2025-12-30T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz8y3i_ctPqRXUaRbF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqotYUM_6ftVeWasx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxsJM6y_cpThw1r1Vl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwU0i4m_JwbtR24eK54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwyVx1uEH3JUSkWXqF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGgJqz7hUbPpv38U14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwpLJMRVUEcFKA-LRl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyrsXmJRU98NE_sllB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyMzgEzIJ17f07cPyJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwapUF0dqybLOqDQ8Z4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
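
Before a raw response like the one above is written back into the coding table, each record needs to parse as JSON and carry a valid value on every dimension. A minimal validation sketch follows; the allowed value sets and the `validate_response` helper are assumptions inferred from the sample output shown here, not the project's actual codebook, which may define more categories.

```python
import json

# Allowed values per coding dimension -- inferred from the sample response
# above (assumption: the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "industry_self", "liability", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "approval", "resignation", "outrage", "fear", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Usage: validate one record from the raw response shown above.
raw = (
    '[{"id":"ytc_UgwqotYUM_6ftVeWasx4AaABAg","responsibility":"company",'
    '"reasoning":"virtue","policy":"industry_self","emotion":"approval"}]'
)
records = validate_response(raw)
print(records[0]["policy"])  # industry_self
```

Failing fast here is deliberate: a malformed or out-of-vocabulary value from the model should block ingestion rather than silently appear as a new category in the results table.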