Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Comma AI does the same as Tesla it's just they're behind a few years. Btw Tesla …" (ytr_UgxHjf_i9…)
- "Yeah same. This is the *worst* AI will ever be. And it’s already pretty fuckin s…" (rdc_k7lcik9)
- "For some really uneducated people who have zero music sensibility,AI may work fo…" (ytc_Ugz_3vZ8E…)
- "We want to stop the rise of ai we need to start spreading misinformation widely…" (ytc_UgxobmxRr…)
- "I think the main thing like Elon said about the cars is scalability. These other…" (ytr_UgwYjzUyO…)
- "There are already AIs that are being applied in that way too. A hospital in Norw…" (ytr_UgwzeWYgv…)
- "Ow. That hurt. I respect your point of view and didn't wish to come off insulti…" (ytr_UgzWToaaF…)
- "Cool, I'm gonna challenge a long distance runner on a race and while he runs I'm…" (ytc_UgxuQJNEI…)
Comment
Based on the assumption that AI will reach a point where it matches human intellect, is it not probable that as well as the creative, AI will also adopt negative traits that come with self-awareness? Such as "The Self-destructive human complex"? i.e start to devalue their own self-worth and other percieved inadequacies. My point being, as long as some mortal equilibrium is effective then I'm sure Bender will have use for a fleshling to for-fill banal inconveniences such bringing a frosty beer from his cooler. Which raises the question why would they have a human do this? My answer, the same reason we have Rumba whizz around after us. Novelty value. I'm not saying this is where all humans will end up. But its certainly motivation for kids to stay in school.
youtube
2013-12-03T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugj8AXUuhgfjUXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghEfYIiBlCtyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg6pJ8sg8sIuXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh4Izu1dFDCBngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggStT0fkttiU3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjHME_FVR-RjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiFPP6fP-f4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjk-OLPfqT00HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugj_Nwoh-nEukngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggdtWoUYVl_S3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
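As a minimal sketch of how such a batch response could be consumed downstream, the snippet below parses the JSON array into a mapping keyed by comment ID and checks that every record carries the four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). The function name and the fail-loudly behavior are illustrative assumptions, not the tool's actual implementation.

```python
import json

# The four coding dimensions seen in the "Coding Result" table above.
# The real codebook may allow more values per dimension than the
# samples shown here; this sketch only checks for their presence.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError when a record is missing its id or any coding
    dimension, so malformed model output fails loudly instead of
    silently producing partial codes.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{comment_id}: missing {missing}")
        coded[comment_id] = {d: rec[d] for d in DIMENSIONS}
    return coded
```

With the payload above, `parse_llm_response(raw)["ytc_Ugh4Izu1dFDCBngCoAEC"]["emotion"]` would return `"fear"`, matching the coded record displayed for this comment.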