Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Don't know how I started watching this, feels like i've been click-baited by AI,…" (ytc_UgxQxXc7Q…)
- "If ChatGPT is any indication, AI assistants that can do actual work for you like…" (rdc_j1y0ok4)
- "I don't get why people shit on AI so bad. They still put in the work with clean …" (ytc_UgyQnzX0G…)
- "I love how they all casually came together to mock AI 😭 The internet is sometime…" (ytc_UgxIo6HZ0…)
- "I agree more with Elon Musk. I think both have been active in pursuing business …" (ytc_UgxCFVgXU…)
- "AI from the 90s and the one we have now are fundamentally different. Even if bot…" (ytc_UgxktCehw…)
- "They've warned us years ago with the movie, Terminator and the movie with Will S…" (ytc_UgzyK-jmu…)
- "You have not solved immorality problem as AI wasn't alive in the first place.. s…" (ytc_UgxinNyo1…)
Comment

"The issue that people like Elon Musk, Thiel, Gates, etc.. have aren't with A.I., it's with ASI, Artificial Super Intelligence. The big issue being that it will largely be military tech. that once let out of the bottle will behave in likely unpredictable ways. We've never succeeded in stopping countries from having a weapons race, and an ASI is debatably the weapons race to end all weapons race."

Source: youtube | Posted: 2015-03-12T12:4… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg6_c_fnxJFiXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjBrm-BO4E1Z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggwq5VL_P9YvngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugi28m3CG46xzHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggmA4p100IU0HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgiqEwaXkqSM-ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi0PpcKcA8VCXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgghsB3quoCVXHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjzbO8DgHLWlngCoAEC","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiclBN6LTRIL3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]
```
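The raw response is a JSON array with one coding object per comment ID, carrying the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal Python sketch, using two rows copied from the response above, of how such a batch might be parsed, indexed by comment ID, and checked for completeness before being stored:

```python
import json

# Two rows copied from the raw LLM response; a real batch has one row per comment.
raw = """[
 {"id":"ytc_Ugg6_c_fnxJFiXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugi28m3CG46xzHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# The four coding dimensions, as listed in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Index codings by comment ID so a single comment's coding can be looked up.
codings = {row["id"]: row for row in json.loads(raw)}

# Reject any row that does not carry all four dimensions.
for comment_id, row in codings.items():
    missing = [d for d in DIMENSIONS if d not in row]
    assert not missing, f"{comment_id} is missing {missing}"

print(codings["ytc_Ugi28m3CG46xzHgCoAEC"]["policy"])  # regulate
```

Indexing by ID is what lets the tool above answer a lookup for any coded comment; the completeness check catches truncated or malformed model output before it reaches the database.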