Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
problem are humans, human is hell. I was thinking about a possible solution for …
ytc_UgzFOeb62…
So, my question is: if humans stop creating and innovating art, what are ai supp…
ytc_UgxumONVz…
In US I would guess that it is the same as over here, that most work places also…
ytc_Ugx_1eR6S…
We are glad you found Sophia's responses intriguing! It's fascinating to see AI …
ytr_UgwiX7HM1…
I've seen to many AI patreon's holy shit dude! makes me sick! and they got the n…
ytc_Ugx3-ishf…
Good. Be mad waste your energy bro nobody cares💀 using ai and saying. “Ooo I’…
ytc_UgyZMSlkY…
Hi Alex, I was actually inspired by this video to make my own version of it afte…
ytc_Ugyu6LzWP…
The Supreme Court of the United States recently ruled that AI generated content …
ytr_UgzbFipqm…
Comment
But I don’t get it, why do these autonomous vehicles need to honk at each other? Shouldn’t they be like the way computers talk to each other? Unless an autonomous vehicle detects a non autonomous vehicle, I can see why use the horn, but they’re all autonomous within each other. I don’t know, I’m just spitballing here because I am curious. Not a math genius or anything like that.
Source: youtube · Posted: 2024-11-02T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx42Qlrx6UF2uasr8N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzM8nNdgIi2cgEq7aF4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwTsx1Zuw4AhGPPIGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxszYWmQ_JD6bTSRG54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwRi1_pf-MZWC0LvXZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw-wPqFRFfkJq8jN8x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwLsfpnEN97Ntl_xAt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz0QS4oeekyMal8yRJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyWpClDfpL5gA70L6R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwshAGZsSs0TSZ_R_d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"curiosity"}
]
```
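The raw response is a JSON array of per-comment codings across four dimensions. A minimal sketch of parsing and sanity-checking such a response; the allowed value sets below are inferred only from the values visible in this dump (the project's full codebook may define others), and the sample payload uses a made-up ID:

```python
import json

# Value sets per dimension, inferred from the visible samples above.
# Assumption: the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "curiosity", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, coercing unexpected values to 'unclear'."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                # Keep the row but mark the off-codebook dimension.
                row[dim] = "unclear"
    return rows

# Illustrative payload; the ID is a placeholder, not a real comment ID.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"curiosity"}]')
codings = parse_codings(raw)
```

Coercing unknown values to `unclear` keeps the pipeline running on malformed model output instead of failing the whole batch.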