Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “The idea that AI has to exceed humans to replace humans is not needed. A lower a…” (ytc_Ugyh5Jwdz…)
- “@Aethereon01 Murica baby, where your life only matters if you create value for a…” (ytr_UgySyw-A0…)
- “Ai looks realistic despite its janky moments, if an artist uses it and they work…” (ytc_Ugz7UXMhD…)
- “I use AI at my job and it saves time in some places but causes Massive problems …” (ytc_UgzJlQL1f…)
- “Abusers love silence. Call them out on their behaviour and you will instantly be…” (ytc_UgxXMih-p…)
- “No shame or anything, I’m just trying to say A.I cannot take inspiration, becaus…” (ytr_UgyCoUpUH…)
- “Okay, I get not wanting to have students use ChatGPT to write an entire paper. B…” (rdc_kgrknmv)
- “Interesting video. But already at the beginning you've established a situation t…” (ytc_Ugi9KaGK7…)
Comment

> I don't know if we'll have STRONG A.I. anytime soon. Musk and Hawkings probably understand that as well. Which may also be the problem. Weak A.I. is what's being applied now, and what they're attaching to machines in the next few years (as far as I know of). The issues these guys are talking a lot about is having easily reproduced and extremely dangerous tools in hands of people who don't have humanitys' best interest in mind.

Source: youtube · Posted: 2015-07-31T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UghyligspsInLngCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiR781rAxGYn3gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugj5te92XVT4QXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggsHT54JBbk6ngCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugju1o0fz003UHgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgjIsz0jcDy5yngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgieaIBYq1XbdngCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgibTCMNgh18engCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugh6qLpEz9mISngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugis0cWW2lpi3ngCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
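The raw response above is a JSON array with one coding object per comment, each carrying the four dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. As a minimal sketch of how such a batch response could be parsed and looked up by comment ID (the function and variable names here are illustrative, not part of the tool itself):

```python
import json

# A batch response in the same shape as the raw LLM output above
# (two entries shown for brevity).
raw_response = """
[
  {"id": "ytc_UghyligspsInLngCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiR781rAxGYn3gCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the comment ID, per the response schema above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index entries by comment ID,
    skipping any entry that is missing an expected field."""
    codings = json.loads(raw)
    return {c["id"]: c for c in codings if EXPECTED_KEYS <= c.keys()}

by_id = index_codings(raw_response)
print(by_id["ytc_UghyligspsInLngCoAEC"]["emotion"])  # fear
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup possible: a single parse of the batch response, then constant-time retrieval per comment.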