Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Most people won't understand what he's actually saying. He's speaking of the org… (ytc_UgzXGewUv…)
- Had an AI bro get pissed when I called them out for offering "commissions" in a … (ytc_UgzPiHS6j…)
- 1:25:94 I'm very biased that I do not like AI, There's so much I dislike about … (ytc_UgwsdV4pn…)
- I’m a lifelong professional illustrator… my only restriction when working in ind… (ytc_UgyaPxQ9L…)
- Do you understand the concept of "machine learning"? Because it's only a matter … (ytc_UgxQkGrss…)
- if AI would takeover humans. why would it do that? what's the ultimate purpose? … (ytc_UgxvdnFr1…)
- ai art is ALWAYS worse than human art and will NEVER be better than human art… (ytc_UgwJlnLre…)
- If all self driving cars have certain set of rules for safety that avoiding any … (ytc_Ugg6uIZz_…)
Comment
Well the bubble is inevitably going to burst within 6 months or so. It is uneconomical right now. Market correction will ensure it stops being feasable enough replacement for humans.
Also AI hype is failing in complex systems. What happens if a junior employee messes up? He is held accountable. You cant hold ai accountable. There is no threatening a computer program. Also the output is never 100% perfect. It always needs human review and refinement. And most models are very workflow specific. They fail at solving unpredictable problems or extrapolating above their training data. This video is no longer relevant.
Source: youtube | Video: Viral AI Reaction | Posted: 2025-12-27T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz7S_oeFbtFBau782p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxHyWueqHNecXtnxbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxySqRTB8ifsCr8zs14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwlV5GbbacBhur3hUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxn324wIZ0mdukBvu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyC7KgD3o0TR126aNd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzyG88oIBtIKpCrd6l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx-gEZzwgvoPPDhi_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwwDjVDKw6IpNCrRmN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzrsBEw73pecdgPQtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
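The lookup-by-ID view above can be sketched by parsing the raw model output and indexing each record by its comment ID. The field names follow the JSON shown; the validation step (dropping records missing a coding dimension) is an assumption about how malformed model output might be handled, not a description of the tool's actual behavior.

```python
import json

# Raw model output, truncated to two records from the response above for the sketch.
raw_response = '''
[
  {"id": "ytc_Ugz7S_oeFbtFBau782p4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxySqRTB8ifsCr8zs14AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index records by comment ID,
    skipping any record that lacks one of the required fields
    (assumed handling for malformed output)."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if REQUIRED_FIELDS <= r.keys()  # dict view supports set comparison
    }

coded = index_by_id(raw_response)
print(coded["ytc_UgxySqRTB8ifsCr8zs14AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the "Look up by comment ID" action a single dictionary access rather than a scan of the array.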