Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by its comment ID.
Random samples
- ytc_Ugwi7vBOa…: "I asked AI if it understood the Geneva convention its reply , The Geneva Conven…"
- ytc_UgyXokL1A…: "I'm glad this is just AI because if this was real the world would be in trouble …"
- ytc_Ugxb72RPj…: "idk if im right... but ig we need deep curiosity? at the end, i felt so relatabl…"
- ytc_UgxV5V41n…: "I welcome the world of A.I’s. Sure wish I could live long enough to see it.…"
- ytc_UgwGjPnwY…: "I say get rid of the corporate social security checks and give them to the peopl…"
- ytc_UgxaV_fJX…: "F**k chat bot, I don't ask it anything. I research whatever I want there's enoug…"
- ytr_UgxG7Qm8a…: "i felt the existential dread when AI art first came out. then i tried out Stable…"
- ytc_Ugx_WscFT…: "Statement: Based on Dr. Roman Yampolskiy’s remarks, we are accelerating toward …"
Comment
"Unfortunately, you cannot stop AI, as anti-ai people suggest. You stop 6mo of it. The adversaries to use ai for bad will be ahead. It's not reality. Proceeding to create security systems that protect against ASI will be needed instead. Not stopping or pretending you can put it back into the bottle."
Source: youtube
Posted: 2024-05-14T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
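A coding row like the one above can be sanity-checked before it is stored. The sketch below validates a row against the value sets that appear on this page; this is an assumption, since the real codebook may define more values per dimension, and `validate_coding` is an illustrative helper, not part of the pipeline.

```python
# Dimension values observed in this page's sample output.
# Assumption: the actual codebook may allow additional values.
OBSERVED_VALUES = {
    "responsibility": {"unclear", "distributed", "ai_itself", "developer", "user", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"mixed", "resignation", "fear", "outrage", "approval"},
}

def validate_coding(row: dict) -> list[str]:
    """Return a list of problems found in one coding row; empty means it passes."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        if dim not in row:
            problems.append(f"missing dimension: {dim}")
        elif row[dim] not in allowed:
            problems.append(f"unknown {dim} value: {row[dim]!r}")
    return problems

# The coding shown in the table above passes cleanly.
row = {"responsibility": "distributed", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "resignation"}
print(validate_coding(row))  # []
```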
Raw LLM Response
[
{"id":"ytc_UgyiOM3SZ_5p0gRaWUV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzUmxvQ17p8o2uJSMZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGtJDDLXtgQPRqloZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyT7jrUo409g04jzpx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRAj1Bh7LkXOL4HXB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyMKHZGxn8k2LI47Yd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzu519XUaVjsnIkwBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgyD3g2tOz8-BWmK81t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqYLA5C32NFfw9jER4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxO2ZlFk368oh0Pgep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
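Since the raw response is a JSON array with one object per coded comment, looking a coding up by comment ID amounts to parsing the array and indexing it. A minimal sketch, using two rows copied from the response above (variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment
# (abbreviated here to two of the rows shown above).
raw_response = """
[
 {"id":"ytc_UgyiOM3SZ_5p0gRaWUV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugzu519XUaVjsnIkwBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugzu519XUaVjsnIkwBZ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed resignation
```

Building the dict once and reusing it is what makes the "look up by comment ID" view cheap even when one response codes many comments.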