Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
51:30 Question: “What’s the point in trying if it’s impossible?” This reminds me of the Movie War Games, where the Mathew Broderick’s character makes the computer run multiple scenarios of the outcome of nuclear war based on the game tic-tac-tow, and the computer ultimately determines the war game it thinks it’s playing is un-winnable and quits. Maybe have A.I. run the scenarios for the impacts of AGI on humanity. But then again, maybe it would be so self serving that wouldn’t care.
Source: youtube · Topic: AI Governance · Posted: 2025-09-07T04:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugynqxqep33XKT0Drcd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzS04h_C9D5FYQS_0R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx_WTuHBQJiOZtqb0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz5jMhvbr4ssKj7m7R4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxrs_nz3eAkkxKKqEZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw2E0y_3y71Lt0MrOV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx230VedZd87OnEOYN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx-L_KJ5QiKOA4p0Nl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzZZ2a2GwmDRAFv0Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_TYtbtkacXu1jDUF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
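Since the model returns one JSON array per batch, a coded comment can be retrieved by indexing the parsed rows on their `id` field. A minimal sketch (using two codings taken from the response above; the variable names are illustrative, not part of any tool shown here):

```python
import json

# Raw batch response from the model: a JSON array of per-comment codings.
# Two rows copied from the response above, for illustration.
raw_response = """[
  {"id": "ytc_Ugx230VedZd87OnEOYN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx-L_KJ5QiKOA4p0Nl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

# Index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# The comment inspected above maps to the ai_itself / consequentialist row.
coding = codings["ytc_Ugx230VedZd87OnEOYN4AaABAg"]
print(coding["responsibility"])  # ai_itself
```

Note that the coding shown in the result table above corresponds exactly to the `ytc_Ugx230VedZd87OnEOYN4AaABAg` row of the raw response.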