# Raw LLM Responses

Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "Nuclear is an existential threat.. Just like A.I.. common denominator to prevent…" (ytc_UgwKinVxU…)
- "This was an important conversation. Stuart Russell is right that the real danger…" (ytc_Ugx5VZM7v…)
- "Without getting into AI art good vs.bad, just take a look at some of the commen…" (ytc_UgxUnbI8v…)
- "The cat thing wasn't a smart choice AI: Do lobsters need water? AI Overview …" (ytc_Ugz_SsUfA…)
- "@HunterS.Catson I think your is a bad representation of reality. Companies bene…" (ytr_Ugx57elO4…)
- "It's only for those who can afford to buy robot we poor people had to work hard …" (ytc_UgyfIk9vg…)
- "Why does the guy who’s narrating sound unintelligent and the robots sound more i…" (ytc_UgwA_wiVM…)
- "I feel this is both gullible and dangerous. A lethal cocktail of misinformation.…" (ytc_Ugw_sQOp0…)
## Comment

> “Hi there! I actually wrote a book called ‘Sapient, the rise of Artificial Intelligence’ that dives right into the very topic discussed in this video. In my book, I explore how a global regulatory body set up by national governments (WWB), together with a superintelligence called Sapient, manages the use, functions, and overall operation of AI. I also talk about a society where everyone has their own personal AI superintelligence as a kind of personal assistant to help rebalance social and work inequalities between people.”

youtube · AI Governance · 2025-12-04T07:2… · ♥ 2
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugwy-KvYmjs1umysa4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdrLipZe_1DyVc-Xp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyWipm0DRSgc6Cquy14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyox_-tFBy3xzLbV0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyoRZHXFhSVPft4V5F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxHk9HbfQUZOFGE1pd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzy6xVz4jZmATIE9Sd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy_kHXVLJmIMxDohhl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxAWFS0DQQOv5BC_vN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwyA0lTdJ2IcVgZMTJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"indifference"}
]
```
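The lookup-by-comment-ID step above can be sketched in Python. This is a minimal sketch, not the dashboard's actual implementation: the `lookup_coding` function name is hypothetical, and the inline `raw_response` string uses two records copied from the raw LLM response shown above.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# Two records are copied verbatim from the response above for illustration.
raw_response = """
[
  {"id":"ytc_Ugwy-KvYmjs1umysa4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwyA0lTdJ2IcVgZMTJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for a given comment ID, or None if absent."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

# The record found here matches the Coding Result table above.
coding = lookup_coding(raw_response, "ytc_UgwyA0lTdJ2IcVgZMTJ4AaABAg")
print(coding["responsibility"], coding["policy"])  # government regulate
```

Indexing the parsed array into a dict keyed by `id` keeps repeated lookups O(1) instead of rescanning the array per query.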