Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "ai we'll be hearing very soon that american airline companies are considering us…" — ytc_UgwfyjiFc…
- "The problem I see with artists is that they don't know how to separate art from …" — ytc_UgyPeZRLE…
- "So AI is giving teachers something they've needed more of for ages that seemed i…" — ytc_UgzxT2AQZ…
- "Kate: \"we should make AI beneficial instead of trying to get to AGI\" **claps** …" — ytc_UgwJ62VZb…
- "AI should be spoken too like a stranger. But, mix your emotions during your res…" — ytc_UgyVWknTD…
- "@koustabborah4959 Average AI user brain* , since all they do is typing the same …" — ytr_Ugz3h6scW…
- "I dont want to be watching fake ai crap anywhere , I dont want to listen to any …" — ytc_UgzksDwpK…
- "It won't be the AI to antagonize the human being provision, but the developer wi…" — ytc_Ugx0aw1d6…
Comment
@1:50 Hinton really has no clue. Surprising. A Bot "explaining" why jokes are funny is not thinking, it is constructing syntax from reams of data where humans have already explained why jokes are funny. The syntax is not the semantics. If, say, a rebooted Monty Python came up with a new trope sketch, the AI would have no idea why it was funny. But it has to be a new trope. A new original joke using an existing trope within the Bot's data set has a good chance of being syntactically "explained" by the Bot.
Source: youtube · AI Governance · 2023-05-12T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgzdK9RowD-QjAL1h4V4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugzo2VP3HKOJNpOn3Ah4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugw17CXthwiuZDpG2bR4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugx0mCUjC33Y9x0EHmJ4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugw0LgMQoei9z38xeRh4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgzFdUDxUGJ79DAAUBN4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgwrLNu8DICIzuPlARJ4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzpH11BQ-HRDLIkxq54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwdI9cgvLSrBd8gbqx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"}
]
```
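The "look up by comment ID" feature can be sketched directly against a response of this shape: parse the JSON array and index the coding objects by their `id` field. A minimal sketch in Python (the two example entries are copied from the response above; the variable names `raw_response` and `codings` are illustrative, not part of the tool):

```python
import json

# Raw model response: a JSON array of coding objects, one per comment.
# (Two entries copied from the response above for illustration.)
raw_response = """[
  {"id": "ytc_Ugw0LgMQoei9z38xeRh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwrLNu8DICIzuPlARJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID.
result = codings["ytc_Ugw0LgMQoei9z38xeRh4AaABAg"]
print(result["responsibility"], result["emotion"])  # developer mixed
```

Keying on `id` also makes it easy to join a coding back to the original comment text when both are stored by the same identifier.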