Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment directly by its ID, or click one of the random samples below to inspect it; a minimal lookup sketch follows the sample list.

Random samples (click to inspect):
- "I have AI running on my computer at home, are you (or these protestors) suggesti…" (rdc_mbydh5m)
- "The fact there someone who actually trust self driving cars is crazy i dont even…" (ytc_Ugwvo1yUo…)
- "If these companies really want to save money they need to replace CEO's with AI …" (ytc_Ugz2P0Xap…)
- "This video is very interesting, however, a self driving vehicle, shouldn't find …" (ytc_UggWc282B…)
- "Hinton's fear about Youtube et al getting corrupted to c rate indignant viewers/…" (ytc_UgyriiNfi…)
- "This is a compelling video. It's interesting to consider that artists are inspir…" (ytc_UgzEapk5H…)
- "Why do employed journalists hate AI and all things tech? Answer : They know ther…" (ytc_UgwqjmpjC…)
- "Perhaps the solution is to achieve human superintelligence through science drive…" (ytc_UgwBYP2JF…)
Comment
The logic with "I'm old, so I'm going to be out of here soon" is flawed - and also dangerous for a society long-term. You do NOT know that this is the case, and since you've come here once - it's sound logic to assume it can happen repeatedly.
Furthermore - when he's arguing that complicated algorithms will have emotions, he really has no argument? He's just believes, just as others believe not, that the code will eventually spark authentic emotions. This dude is smart in some ways, but not very wise.
The "boat argument" for sentience does not hold up either - it springs from metaphysical materialism, which is still the most prevalent (and most philosophically problematic) metaphysical viewpoint we have. Yes, you can swap each part in a radio receiver, and it will still work - but it's useless without the signal.
Source: youtube · Topic: AI Governance · 2025-06-16T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
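A coding result is just the four dimension values plus a timestamp for when the code was assigned. A minimal sketch of such a record follows, assuming the value sets visible in the table and in the raw response below; the class and the allowed-value sets are illustrative, and the actual pipeline may define additional codes.

```python
from dataclasses import dataclass
from datetime import datetime

# Code sets observed in this page's table and raw response (possibly incomplete).
RESPONSIBILITY = {"none", "developer", "user", "ai_itself", "distributed"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"outrage", "fear", "approval", "indifference", "resignation", "mixed"}

@dataclass
class CodingResult:
    """One coded comment along the four coding dimensions shown above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def __post_init__(self) -> None:
        # Reject values outside the observed code sets.
        assert self.responsibility in RESPONSIBILITY
        assert self.reasoning in REASONING
        assert self.policy in POLICY
        assert self.emotion in EMOTION
```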
Raw LLM Response
```json
[
  {"id":"ytc_Ugzqiwu2RCG59s3tPLt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxrBwQzEF7KWJ826M14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwmjuzLcPFyJCw1eyV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy4jrszMCQ31L8WbPd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwi7f3XlkJb-RktjnZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxJNsA-p6MxTL2kCdd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzF1y4vpwHMJvXzJMx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxu9tuyKFyfGQkr-cJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxuyiY8He7Gc1oAHGt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyfpvnjseBXbS6G5jB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
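Because the model codes comments in batches, the stored raw response is a JSON array, and the row for a given comment has to be matched back by its `id`. A minimal parsing sketch, assuming the array format shown above (function and variable names are illustrative):

```python
import json

def parse_raw_response(raw: str, comment_id: str) -> dict | None:
    """Parse a batch coding response and return the row for `comment_id`.

    The raw text is expected to be a JSON array of objects, each carrying an
    "id" plus the four dimension fields, as in the example above. Malformed
    output returns None so it can be surfaced for manual review.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not isinstance(rows, list):
        return None
    for row in rows:
        if isinstance(row, dict) and row.get("id") == comment_id:
            return row
    return None
```

For example, `parse_raw_response(raw, "ytc_Ugwi7f3XlkJb-RktjnZ4AaABAg")` would return the fifth object in the array above.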