Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Ask it to define intelligence. It can't because there is no definition. Therefor…" (ytc_Ugy3dpRKC…)
- "First, they came for the factory workers, but I did not care because I wasn't a …" (ytc_Ugw1k79uL…)
- "Massive amount of trash and some crap sticks to the wall…. ? I mean…. This guy C…" (ytc_Ugx7Bzkai…)
- "This is a very interesting take and one that I think I agree with. Whether somet…" (ytr_UgySm_d7c…)
- "I’m not concerned with AI run the world. I’m worried about the people running i…" (ytc_UgxmsAYWG…)
- "The only question today is: Will humanity be destroyed by artificial intelligenc…" (ytc_UgzZNnXXI…)
- "Yup. Didn't vote for him either, but he's killing it. Especially compared to Tru…" (rdc_fn5kbbs)
- "I have a prediction. I think that coding will adapt to the constraints of AI and…" (ytc_Ugx2fxktc…)
Comment
Yes, Čapek describes this in his book called R.U.R., robots are doing certain things and if they, for instance, put their arm somewhere they're not supposed to and some machine smashes their arm, they wouldn't care. Potential pain is increasing their awareness of danger behind certain actions.
In AI, actions are usually measured in fitness. Your AI is trying to get as much fitness as possible, so if you were supposed to (let's say) hurt someone, decreasing a fitness would be a virtual "pain".
Source: youtube · Posted: 2013-06-08T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxbkTqLkjU3uXlH5yV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzfaB7o_9NV7eTuvud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJE9RpB941BwPu5UR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6DkWZnmJ8bc1cgy14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwSVeMt9jSBCsBJQ9p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4Y_1s-dX3uM54znx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxlX9V7X6EzlbA6WTl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxia4Nuzlppp-ihIAh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwefLJ2c6fGafnjCW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8coPjG5_AdbX1Qh54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
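A raw batch like the one above can be checked mechanically before the codes are stored. The sketch below is a hypothetical validator, not the tool's actual code: it parses the JSON array and verifies that every record carries an `id` plus the four dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). The allowed value sets are inferred only from the values visible in this sample; the real codebook may define more.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "mixed", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the schema.

    Raises ValueError on malformed structure, missing keys, or unknown
    values, so a bad batch can be re-queued instead of silently miscoded.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for i, rec in enumerate(records):
        if "id" not in rec:
            raise ValueError(f"record {i}: missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={value!r} not in codebook")
    return records

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # indifference
```

Failing loudly here matters: an LLM occasionally emits a value outside the codebook, and rejecting the whole batch for recoding is safer than writing an unknown category into the results table.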