Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Artificial intelligence researchers are only focus on technology and most of them rarely consider the real implications of the technology. Guy Hinton said it himself when he was inventing this tech he didn’t see the bigger problem and that is why AI researchers need to go back to the very roots of AI, which is rooted in Cybernetics (via Norbert Weiner). Cybernetics takes into consideration people, social customs & norms, and technology, and tries to understand the system of systems to assure these system of systems benefits humanity. We should not develop technology for the sake of technology or interesting problem…it has be beneficial for humanity.
youtube · AI Governance · 2025-06-17T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyBgRQls3umiTBdE4F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzKJrI_YWlwrjnSiLJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2QRg_VAeLzCNo1yF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDRq6T_QrPQ7hq8BJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVICtxjQ43YrpOSXt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz0N9kADycqEmpyqZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwRJRzEc_nMu0Agf-p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzguoMuQhlQugaawYV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMIPjjYr_QX1s-B7J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw7TKGcPxEOwDMmDx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
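Since each raw response is a JSON array of per-comment codes, looking up a coded comment by its ID amounts to parsing the array and indexing on the `id` field. A minimal sketch of that lookup (the function name `index_codes` is illustrative, not part of the tool):

```python
import json

def index_codes(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index the coding records by comment ID.

    Each record is assumed to carry an "id" plus the four coding
    dimensions (responsibility, reasoning, policy, emotion), as in the
    responses shown above.
    """
    records = json.loads(raw_response)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

# Example: one record in the format the model returns.
raw = '''[
  {"id": "ytc_UgzMIPjjYr_QX1s-B7J4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]'''

codes = index_codes(raw)
print(codes["ytc_UgzMIPjjYr_QX1s-B7J4AaABAg"]["policy"])  # regulate
```

Indexing once and looking up by ID keeps a comment's coding-result view (like the table above) in sync with the exact model output that produced it.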