Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I am a data engineer settled in Melbourne. AI is not something you can rely on. … (ytc_UgztkQa_s…)
- Because it boosts their share price. The only people who’re going to get upset … (rdc_nmaane4)
- I'm really getting a t-shirt: "Idiocracy was not meant to be a documentary", but… (ytc_UgyUa_f9P…)
- The idea that AI needs to be used because people are "creatively disabled" is in… (ytc_Ugwy1PpQc…)
- The USA practices individualized medicine that is largely subjective, often base… (ytc_Ugy8Xa97X…)
- They said the Nuke is such a fear and they never happened now they need a new bo… (ytc_UgwycqRdH…)
- It's a no-brainer that any AI will take any steps required to insure Its continu… (ytc_Ugwpsa62p…)
- AI computers require huge amounts of power. Humans will always have jobs, power… (ytc_UgwFq9NNm…)
Comment
We already can't "read it's thoughts". Having its own language doesn't make sense because it isn't how these models work. They tune parameters to predict the next words in a sentence. There is no logical process underneath, which is pretty apparent if you've actually tried to get these LLMs to do anything more than fancy google searches. To get AIs to use their own "language" would require jettisoning the current models, and would require US to invent the language that supposedly would be so much better than English. Also, this has pretty much been tried. Before LLMs people were working on training AIs with logical symbols, called symbolic AI. It had lackluster results, at least for anyone hoping to develop an "AGI".
youtube · AI Governance · 2025-11-27T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
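The table above is a direct rendering of one record from the raw JSON response below (the "Coded at" row is a server-side timestamp added at render time, not part of the model output). A minimal sketch of that formatting step — the helper name is illustrative, not part of the actual dashboard code:

```python
def record_to_table(rec: dict) -> str:
    """Render one coded record as the markdown Dimension/Value table
    shown in the Coding Result panel. Dimensions follow the order
    used in the dashboard: responsibility, reasoning, policy, emotion."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {rec[dim]} |")
    return "\n".join(rows)
```
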
Raw LLM Response
[
{"id":"ytr_UgzFdwW5frjuYVxv2214AaABAg.AQ0CTnnK2-5AQ0TaupbVSf","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzFdwW5frjuYVxv2214AaABAg.AQ0CTnnK2-5AQ25WwS9vD-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzFdwW5frjuYVxv2214AaABAg.AQ0CTnnK2-5AQ2N36kRjwd","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyLpFL5Q_QDzkcHJyp4AaABAg.AQ0Btg4EEmWAQ1OTf7Vyv1","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxAsRX1zhyUyEEDw3R4AaABAg.AQ07W4gVk7IAQ0Ixn9dERp","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyT4BscfbZpin4pNY94AaABAg.AQ-ulkRCqYfAQ1J9Mr21L8","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyT4BscfbZpin4pNY94AaABAg.AQ-ulkRCqYfAQ1onWknfp9","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyT4BscfbZpin4pNY94AaABAg.AQ-ulkRCqYfAQ23n0xXJb1","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyT4BscfbZpin4pNY94AaABAg.AQ-ulkRCqYfAR83hpZCxzI","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyT4BscfbZpin4pNY94AaABAg.AQ-ulkRCqYfAR9U6yh_RMQ","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
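Before a response like the one above is stored, it has to be parsed and checked against the codebook, which also gives the ID index behind "Look up by comment ID". A minimal sketch, assuming the allowed code values are those that appear in this sample (the real codebook may define more categories, and the function name is illustrative):

```python
import json

# Allowed values per dimension, inferred only from the sample response
# above; the project's actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse the raw LLM JSON array, reject records with unknown code
    values, and index the valid records by comment ID for lookup."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
        indexed[rec["id"]] = rec
    return indexed
```

With the response indexed this way, `parse_coding_response(raw)["ytr_..."]` returns the full coded record for a given comment ID.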