Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Generally good reporting. My only cringe moment is when lumping AlphaFold together with LLM's, and calling the whole bag "AI". Specialized AI (not "AGI") will continue to find the niche applications, and add value. It has been around awhile, previously under the heading of "ML". And they may not even require that much power consumption.
A similar confusion comes from commentators on the whole issue energy consumption of AI data centers. They say the extra energy requirements will be made up for the science discoveries made that will reduce energy consumption. E.g., materials science, optimization, plasma dynamics to enable fusion, ... But this is confusing a whole set of issues, under the large umbrella called "AI".
| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Responsibility |
| Posted | 2025-09-30T18:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyFN_cm6ZvO_e0sbCx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgySAYGn9BmPUZSGo8l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaPZ1QcM_hRm8JCHR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx5FjvL2X2St5KEusN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9y9o1-bkpnwSV64t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyfdZqFu_9p7Dod92x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzHeQ6NxpJg1MywgDp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz44jWKcoh2872wx_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQElbFIJLBf7HoYWh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwniEWLSn6KFbRsgx14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}]
```
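The raw response is a flat JSON array, one record per comment, with the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and checked against the label sets before use (a hypothetical helper, not part of the pipeline shown here; the allowed label sets below are inferred from the values visible on this page and may be incomplete):

```python
import json

# Label sets per coding dimension, inferred from the codes visible above.
# ASSUMPTION: the real codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "resignation", "approval", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose labels
    all fall inside the inferred codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

# Example: one in-schema record passes, one out-of-schema record is dropped.
sample = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"resignation"},'
    '{"id":"ytc_b","responsibility":"unknown","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"}]'
)
print([rec["id"] for rec in validate_codes(sample)])  # ['ytc_a']
```

Dropping (rather than repairing) out-of-schema records keeps the downstream counts honest; rejected IDs can be re-queued for re-coding.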