Raw LLM Responses
Inspect the exact model output for any coded comment.
Random sample (comment ID: excerpt):

- `rdc_kjklojm`: Yeah and it will be increasingly every year as long as open sourced ai stuff exi…
- `ytc_UgyGGmGAU…`: The pattern of companies rushing to replace humans with AI and then backtracking…
- `ytr_UgxoQPSqN…`: @anatoliyzhestov3915 I still believe the backlash would have been *way* smaller …
- `ytc_UgxovhGPQ…`: Calling yourself an artist just because you entered a prompt and got a drawing i…
- `ytc_UgxiEhRKd…`: I think AI art is another new branch of art. Some people will be good at it and …
- `ytc_UgwnzT7KO…`: He’s not lying. While watching at around minute 17, they were talking about almo…
- `ytc_Ugwe_LOw1…`: Exactly, humains are getting exceeded and useless now that AI is taking the lead…
- `ytc_UgxYLmM38…`: One of these days the companies will turn you on, steal your information, sell i…
Comment

> It’s wild hearing them list off the predictions—2026, 2030, 2035. The timeline keeps shrinking. As a dev, I feel the pressure to learn every model (Sora, Veo, the new LLMs) just to stay employable. That’s why I started using omnely; it bundles the video and text models together. I can’t afford to bet on the wrong horse in this race, so I just get access to the whole track.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-12-07T18:1… |
| Likes | 45 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgylOcMtmfYPRLyA_uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz0s_5F0fL7Yc6h9pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmHU68lQswaDEmhOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9c5M8aiFACFvwDkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzslRkuK_KSVVjq6CV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxqiq20CC4lLEtT6Oh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAcj9D3tb7hktFuIJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUI-XmKN6ijviTF6N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW72yvXfzgauYvOVF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
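The raw response is a JSON array of coded records keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before the per-comment dimensions are stored, assuming the value sets visible in this sample are the full codebook (the real codebook may contain more categories, and the function name `parse_coded_batch` is hypothetical):

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# response above; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes},
    silently dropping records with missing or unexpected values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# One record taken verbatim from the sample response above.
raw = ('[{"id":"ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
batch = parse_coded_batch(raw)
print(batch["ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg"]["emotion"])  # indifference
```

Dropping malformed records rather than raising keeps a single bad completion from aborting a whole batch; a stricter pipeline might instead log and re-queue the offending comment IDs.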