Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Although I agree with you Bernie, it does make me wonder. If what you say is tru…" (ytc_UgzsrUbnt…)
- "Its hard for me to believe this isn't seeded data. For example, there's no reaso…" (ytc_UgxZgZMZj…)
- "All a truly sentient robot ,with all access to knowledge about real world physic…" (ytc_Ugyo7Egjr…)
- "Does anyone that works closely with AI have anything not terrifying to say about…" (ytc_Ugze3xImu…)
- "To say it better: If AI succeeds, we lose our jobs permanently; if AI fails, we …" (ytc_UgwMv0Smt…)
- "@R11T16_ yea he didn't source his information but if you are actually interested…" (ytr_UgwWNFzNs…)
- "it sometimes seems like they're rushing this shit to try and create an ai so, id…" (ytc_UgyoD0ODY…)
- "Lanes? What the fuck are lanes? It's %90 dirt and gravel roads out where I live …" (rdc_d1kwa3j)
Comment
I've met a lot of tech-oriented people in my nearly 60 years, and I gotta admit... I don't trust that they are able to distinguish among imminent, immanent, and eminent. Also, for the record, the AI super-intelligence argument is total bullshit. In theory, it's possible but we are so far from that. The only people who are bought in are tech nerds with poor social skills. That is not to say AI will not have positive applications... but the much bigger problem is the over-valuation of AI companies. The values are so disconnected from reality... there's no business model... what we do have is endless hype, by guys like this, who for years have argued that superintelligence is right around the corner.
youtube · AI Jobs · 2025-11-19T09:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzUadrAKv9oBXXz-4R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwZFtJBiolJI31NIpF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxAGCzRp_BxCQWTKzd4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyC3I7NE59mKEJrd-N4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyW0oB4yKhAcu_3EWp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwRoz76g39vvMdQygR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugw3Gga0E7a27pNF-yp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy_6abBSyRiNlc7c3V4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxEgow7xcFSjJovfyZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyw55DyWdr8YEDZI-94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```