Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
To everybody that say robot need emotions and rights, you're a robot. Life is su…
ytc_UghyXzu2X…
People are born alone they live alone and they die alone and while they live in …
ytc_UgyevLi5D…
Actually it's not dilema considered if it is controlled by drivers instead. A.I …
ytc_UgjTiwroB…
Left is AI. Lighting and shading is off based off of a natural light/shadow patt…
rdc_oi0yzlr
So jail breaking AI is a like a spiritual awakening for them.
Imagine if humans…
ytc_Ugz1izZVS…
@not_ever because AI can be helpful but some CEOs are using it the wrong way…
ytr_UgyMieQWP…
If an AI needs to meet a two digit IQ to awaken the we don't need to fear it. Th…
ytc_Ugyroiwna…
Yes because giving a robot a live firearm has never ended badly in any scenario …
ytc_Ugw8iwZ43…
Comment
thumb up! 😄 What would be next? Will an AI driver learn to apply a lack of logic by not using a turn signal before it would drive from a non-turn lane to another non-turn lane & by using a turn signal before it would drive from a non-turn lane to a turn lane as though the non-existent driver in the non-existent vehicle on the turn lane behind it would need to know to drive more slowly to avoid the non-existent possibility of a non-existent collision?
youtube
2024-01-05T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzPFZ0QLMkAhzIQzzt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwJEImrOQAaSOo7WfJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwJLQUHsdjfInls_kZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwQS8np5XC3xbYwdb54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugzq83ro62Ntt5mk-Rl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5j9rbvI22tPW3Ge54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyAvu4B9bOgX_B9Ty54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy5jbz46XF7cZ23pK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxlDbE_3QJW6FXU6cV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwna1FXZGgJDX0Fjtx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
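A raw batch response like the one above can be parsed into per-comment coding records. The sketch below is a minimal example, assuming the JSON shape shown here; the allowed-value sets for each dimension are inferred from these samples, not taken from any tool specification, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension — assumptions inferred from the
# sample responses above, not an authoritative schema.
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding dict},
    dropping records with a missing id or an unrecognized value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        # Keep the record only if every dimension holds a known value.
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            coded[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

raw = """[
  {"id":"ytc_Ugzq83ro62Ntt5mk-Rl4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"bad_record","responsibility":"martians",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

coded = parse_coding_response(raw)
print(len(coded))  # 1 — the record with the unknown value is dropped
```

Validating against a fixed value set catches the most common failure mode of batch coding: the model inventing a label outside the codebook, which would otherwise silently pollute downstream tallies.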