Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.

Random samples:
- "if someone said ai will help disabled people, my friend (we're 10) his right han…" (ytc_UgzkcM0b9…)
- "What I got from this is that Eliezer Yudkowsky, has zero clue how natural select…" (ytc_UgxgtOpmk…)
- "Ai "artistis" are just lazy. I didn't put in years and years of practice and art…" (ytc_UgwCFZEGU…)
- "Nonsense. The students aren't at a disadvantage if their teachers are on the ba…" (ytr_Ugz7uPvAd…)
- "Agreed.. Especially when data seems to be the bottleneck to AI being useful, and…" (rdc_kr57tkm)
- "It seems that this young interviewer believes so deeply in AI intelligence becau…" (ytc_UgwMpWGj_…)
- "Thanks for your comment! Sophia definitely has a strikingly realistic appearance…" (ytr_Ugxdpt188…)
- "bottom line: its time to retire the "just put the fries in the bag" line when ev…" (ytc_UgyiDso22…)
Comment
This isn't science fiction, so any comparison to "The Terminator" or any other film where technology strikes back isn't really appropriate. Most scientists accept that AI is only a matter of time, and also see it as a huge threat to humanity unless we take literally every precaution. Among those precautions, the very first has to be the ability to turn it off, and the second should be the ability to change the programming. If you are creating autonomous killer robots, and hoping to make the best approximation of AI that we currently possess in them, then you are literally a danger to the human species. You're not a mere hawk, but the king of all mad scientists.
Source: youtube · 2015-07-30T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi2MCQ9VO6IRXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggDWsBkRxKisXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UghTZ_lDly-zXHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UggExqvaAIk2EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghNor-Hb1pOQ3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugh2_dcFzT9rr3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghjtX71T5z-o3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghcHbUQhrHIoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugj-HimAHpeHvXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiwZyGY4EMIIXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
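The lookup-by-ID workflow above can be reproduced offline from a saved raw response. A minimal sketch, assuming the model returns a JSON array of records like the one shown (the two sample records below are copied from the "Raw LLM Response" block; the function name `index_by_comment_id` is illustrative, not part of the tool):

```python
import json

# Raw batch response as returned by the coding model; only the first two
# records from the block above are reproduced here for brevity.
raw_response = """
[
 {"id":"ytc_Ugi2MCQ9VO6IRXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UggDWsBkRxKisXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugi2MCQ9VO6IRXgCoAEC"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing by `id` mirrors how the viewer resolves a truncated comment ID (e.g. `ytc_UgzkcM0b9…`) to its full coding record: one parse, then constant-time lookups per comment.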