Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> AI in computers still relies on pre-defined code. You get the illusion of AI because as computers hold and process more data, they can react to more situations. As far as what goes on in Sci-Fi with robots turning against humans on their own, I don't see that happening any time soon. I would think that would require a processor "brain" to be able to rewire itself to create new code, kind of how your brain works. As you age, your brain rewires itself with experience. Wires etched into silicon can't do that. To avoid what they are talking about in the video, you just need a fail safe for every decision you program into the machine. You have to do this with just about any software you create.

Source: youtube · Posted: 2015-07-30T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Uggc-9bes9wUWXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugj9aX1JiUSK3XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UggnJEnC7z1pzHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugh984wo3xCWJngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggnR24j2_LMwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggNnprVproRXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghVP7t4IjdXLHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggSCIMbCmQoD3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
```
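A batch response like the one above has to be parsed and sanity-checked before its codes can be joined back to comments. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible on this page (the actual codebook may define more categories), and the function name `validate_batch` is not from the pipeline itself.

```python
import json

# Allowed values per dimension, inferred from the codes observed on this
# page; the real codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "company", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension holds an allowed code.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

A record whose value falls outside the inferred sets (an LLM hallucinating a new category, say) is silently dropped here; a production pipeline would more likely log it for re-coding.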