Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "loving the video so far, point to make: the reason they pirated the materials to…" (ytc_UgyZmlanB…)
- "How do you grow up with every movie we have and still think yeah let's develop a…" (ytc_UgxT33s8Y…)
- "@Hanzelleta no tech itself is not bad. It's the lack of moderation of it that…" (ytr_UgxjKY41w…)
- "An interesting case study here is the autonomous weapons we've already had aroun…" (rdc_kar7qoa)
- "Actually, the AI told the truth: No trick question- just an *honest answer for a…" (ytc_UgzJKt3mH…)
- "Is this guy stupid or what? Imagine a world where AI solves all our problems lol…" (ytc_Ugx9UwXW8…)
- "Exactly as described in the books 'The Digital Oligarchy: Algorithmic age' by Ro…" (ytc_Ugwi8VzRa…)
- "The best way to stop ai art is to gatekeep art from non-artists. Ai “artists” us…" (ytc_UgxR98c-n…)
Comment

> AI is a tool. Like a hammer or a gun or a car, it has the potential to help or hurt us. To me it seems a lot like social media in that if we assume it is neutral or only there to help us, we will face many unfortunate outcomes. But, if we know that it is a tool that we need to monitor, and ensure that humans have the final say on its decisions, then we should be ok. Beyond that, it is a force multiplier on human thought. Its greatest power will be to make smart people smarter.

youtube · AI Governance · 2025-10-15T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxvblgWarV_XSpi7Wt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnCXqm1RZbl5iWUFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZOx2TvyBPt-KsW8N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxaNnI_bAbU-dftvTx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgycipP42UwlR70Uzx14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyBiaANjE85vI0-lMV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxsdXfh3BkZ5rh9ast4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzW4q7r39b4P7UlFyh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzbCkJe-xFbqv7dytR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugz6cEq0UdFY6DRTflF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
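The lookup-by-ID workflow described at the top can be sketched in a few lines of Python, assuming the model's raw response is a JSON array of objects with the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two entries in the example are copied from the response:

```python
import json

# Raw model output, abbreviated to two entries from the response above.
raw_response = """
[
 {"id":"ytc_UgycipP42UwlR70Uzx14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgxsdXfh3BkZ5rh9ast4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(response_text):
    """Parse the model's JSON array and index the codings by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
coding = codings["ytc_UgycipP42UwlR70Uzx14AaABAg"]
print(coding["responsibility"], coding["policy"])  # prints: user industry_self
```

Indexing by `id` makes the per-comment lookup O(1), which matters when one response batch codes many comments at once.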