Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @kkloride Again... read the definition of "ad hominem" the online dictionaries a… (`ytr_Ugx1eZsPM…`)
- not defending ai art but it feels like y'alls fight with ai starts and ends at y… (`ytc_UgwgiFIS0…`)
- Weve known this for years, Stephen Hawking wrote a lot about the dangers of AI b… (`ytc_UgxM1wxil…`)
- Shitebags like Sam will have their Robot army while the rest of us live in Mad M… (`ytc_UgwvovrPo…`)
- No shit i also tried it I also tried this app and I did exactly what he did and … (`ytc_UgwJi8n1t…`)
- A very salient point at the end of the video, using AI to make my art (as a musi… (`ytc_UgyZNybAP…`)
- i think the Core problem with the AI crowd is they Only see art as a job As some… (`ytc_UgzBEmWm0…`)
- The 'luck' is that we face an aging population where we don't have enough manual… (`ytc_UgwT_RpTh…`)
Comment
The... and I mean THE absolute most critically important point in this video is made at 53:19
IMO the 36 seconds that follow encapsulate and 100% justifies the entire argument for forced worldwide stoppage of the development of General AI, and immediate implementation of regulations as strict as (if not more strict) than the current Nuclear and chemical development regulations. The fast track development of General AI with such "consequences be damed" mentality poses a greater, more immediate, and more accessible means for human extinction than any nuclear or biological weapon in existence... so why doesn't anyone need our consent to to keep going pedal to the metal on it??? Or are shareholder investors and megalomaniacal corporate executives the only folks who get a say in whether or not ALL of humanity can be subjected to such endangerment?
youtube · AI Governance · 2025-10-08T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwR9DLJmdBe_TExYmN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyEJHTB9bMCaT5BLAF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy3PcZTOnEl0f3_fjB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyh-NXRzVLGDBImhip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxS5_bvqxialGyXUOl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxNbNbq4Gyw6Bvm0gF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyeIcOXnTcj_XBySPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwPJa33Ji_gmGubl_54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXztiqo5W1nRuRpD54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOkUzRtABJjI28nV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
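A raw batch response like the one above is only usable if every record sticks to the codebook's categories. The sketch below validates a parsed batch against allowed code sets; note the `ALLOWED` sets are inferred from the values visible in this sample, and the record IDs in the example (`ytc_a`, `ytc_b`) are hypothetical — the real codebook may define more categories.

```python
import json

# Allowed codes per dimension, inferred from the sample response above;
# the actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def validate(records):
    """Return (id, dimension, value) tuples for every out-of-codebook value."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

# Two toy records: the first uses known codes, the second has an unknown emotion.
raw = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"ytc_b","responsibility":"developer","reasoning":"virtue",'
    '"policy":"regulate","emotion":"anger"}]'
)

print(validate(json.loads(raw)))
# → [('ytc_b', 'emotion', 'anger')]
```

Flagged tuples can be routed back for recoding rather than silently dropped, so coverage statistics over the batch stay honest.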