Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:

- "For me, the fundamental difference that I perceive between humans and animals is…" (`rdc_iogzbd6`)
- "Those poor billionairs can't afford to test their systems properly in closed env…" (`ytc_UgyxF7LFm…`)
- "add the script to A.i and ask it to implement it and watch it align!…" (`ytc_Ugw59uimP…`)
- "It makes me despair that people say it works great, but then caveat that with 'e…" (`ytc_UgyFhITmm…`)
- "Why would this be shocking? It’s blatantly obvious that AI will be used a mechan…" (`ytc_UgxjwZv-J…`)
- "@f.r.oregan9975 real life documentation will remain like history documentation a…" (`ytr_Ugyh5NxMi…`)
- "I mean they do be killing people in record numbers like a bunch of animals. I se…" (`ytc_UgwFZqRhB…`)
- "I bet in 5 years, once ai, robots, and allat becomes extremely advanced (Elon sa…" (`ytr_UgwrmG_h0…`)
Comment

> Thing is, what about robotics? If we're meant to have AI 'take over all our jobs within the next ten years' then there's going to have to be a significant leap forward in robotics. At the moment, AI is really only a fancy, way more efficient version of Google. Without a significant improvement of complex robotics, AI's going to be somewhat limited in what it can do. AI might be able to tell me exactly how to make a sandwich, but it can't physically go and do it for me.

Source: youtube | Topic: AI Governance | Posted: 2025-09-14T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw-zCFUIlNY8CFIf_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyv5DGxY1ky8X6wo2t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxgkWBxZcmQj2HYK8l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyp_s4ajOiQ1t0KVg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwwSUUF0DgNiBiVRhZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwf1-CblfHIgiZacdF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyNygfVmVYUv7BRxCt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwVGgZ4l_bAfwniUJ54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxUl7KpIO1lyRPg2zt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwsgr-e0csHzMZMx2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
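Because the raw LLM response is a flat JSON array in which each record carries its own comment ID, looking up a coded comment by ID reduces to parsing the array and indexing it into a dictionary. A minimal Python sketch of that lookup (the `index_by_id` helper is illustrative, not part of the tool; the array is truncated to three of the records shown above):

```python
import json

# Raw LLM response text, as produced by the model: a JSON array of
# per-comment coding records. Truncated here to three entries.
raw_response = """
[
  {"id":"ytc_Ugw-zCFUIlNY8CFIf_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVGgZ4l_bAfwniUJ54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwsgr-e0csHzMZMx2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)

# Look up one coded comment by its ID.
record = codes["ytc_UgwVGgZ4l_bAfwniUJ54AaABAg"]
print(record["responsibility"], record["policy"])  # government regulate
```

In practice the model occasionally returns malformed JSON, so a production version of this lookup would wrap `json.loads` in error handling; the sketch assumes a well-formed response like the one shown.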