Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Wow, just wow. I enjoyed every single minute of this; both of you really nailed it. All of my unanswered or insufficiently answered questions prior to this have been addressed here. This situation with AI is the best example of Cassandra Syndrome, in which some of the best AI scientists are warning everyone but get ignored past the point of no return.
It was nice to know you all; we had a very good run. Not long if you view it from a timescale point of view, though.
youtube
AI Governance
2025-06-17T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz0dKrlNIJhbhGmqKp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx8Tv2eQ2CXG7aHA9J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzoWrWElP6IsRTidrN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznXgzve8wzenf3ZP14AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzEJxou91xxweCwK9B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyFhCBSeDD7FJ0U_k94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgybTeEMO4_dYftMI5l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTz9qvpNFQ0sEUNwN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8-xT4c_C4wmx6a5h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmBDz_aVirpsWEirR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
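The raw response above is a JSON array of per-comment codings. A minimal sketch (not part of the original tool) of how such output can be parsed into a lookup table keyed by comment ID — the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown, while the sample IDs and helper name are illustrative:

```python
import json

# Example raw model output in the same shape as the response above
# (the comment IDs here are placeholders, not real data).
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse the model output and key each coding by its comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Keep only the expected dimensions; fall back to "unclear"
        # if the model omitted a field.
        by_id[rec["id"]] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_example2"]["emotion"])  # → approval
```

Indexing by ID in this way makes it straightforward to join a coding back to its source comment, as in the "Coding Result" table above.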