Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugy0IVcLt…: "AI is total BS. Everything Glen is saying is fairy dust. Chat 5 is not hiding fr…"
- ytc_UgyxwT5d8…: "Well if robot dont thinks of them self living and have emotion its good a robot …"
- ytc_UgyClVfyN…: "I have yet to hear what AI is actually supposed to be good for. It doesn't know …"
- ytc_Ugzta81na…: "explain the extensionists and how AI would be the end of humanity if the extensi…"
- ytc_Ugw3F2ydT…: "Looks like we better get on the train or get left at the station .…"
- ytc_UgwidfAA6…: "I recently retired in information technology and worked extensively with machine…"
- ytc_UgzvQDRmY…: "Google saying they couldn't have created a sentient AI because there's a policy …"
- rdc_oi3rp7s: "Yup, and I think there's one more massive difference, which a lot of people igno…"
Comment
Each point that he is saying encloses something tremendous, so much for to say about the first part but difficult to be clear for all. Only I go to say I don't want than AI be working in to predict floods and earthquakes, I want AI be working in these not to be happening. If all the bad that AI have is so clear, why they have then that to sign and to ask like if the world would not understand..., they not have confidence about the brains than themselves were studying?...💫
youtube · AI Governance · 2023-05-11T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyQFjvcVySTrGpiZmd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgybpcFZyDvT1AkGt7d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgweoR673754y1Nq8XV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxm9l5ThKXak3yY4E94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4vLdlhiZ071_M_7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwWln2q-5Espth8lGV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwxKd4-Z67OpBLBQIh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxwioSmyy8w5l3QQJh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuAAw-ELBW2iwu9694AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwlH4vtNnvxpM7dllB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}]
```
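The raw response is a JSON array with one object per comment, each carrying an `id` plus the four coding dimensions shown in the table above. A minimal validation sketch for such a response follows; the `SCHEMA` sets are inferred only from the values visible in this section, so the project's actual codebook may define additional categories:

```python
import json

# Allowed values per coding dimension (assumed from the examples in this
# section; the real codebook may differ).
SCHEMA = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "indifference", "mixed", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    all fall inside the schema and that carry a comment id."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        in_schema = all(row.get(dim) in allowed
                        for dim, allowed in SCHEMA.items())
        if in_schema and "id" in row:
            valid.append(row)
    return valid

# Hypothetical example row (the id is made up for illustration).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(len(validate_codings(raw)))  # → 1
```

Rows with out-of-schema values are dropped rather than repaired, which keeps a bad model output from silently contaminating the coded dataset.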