Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Board? Executives?
I'm just a salt belt mechanic.
I'd like to see any robot even…
ytc_UgzQDheFn…
No one is asking who will make, maintain and fix the robots. AI will still need…
ytc_UgwAYqgLH…
How is Ai art stopping your expression? Arent artist mad because Ai art is steal…
ytr_UgzA42JhK…
~~AI training fears~~ Fears of saving copies without paywalls and articles being…
rdc_oharmn7
@commissargeko4029 Just the idea of AI Art will effect every artist no matter th…
ytr_UgzLde8WP…
Is driving a truck not just combing through a ton of data and making decisions b…
rdc_fcrvn0e
Put water if they glitches it’s a robot if they smack you for doing that it’s a …
ytc_UgxmXHH8M…
Thank you addressing this issue Sam! I admire your work. Thank you for speaking …
ytc_UgzrjGMN6…
Comment
I mean, it's not like it's really up to anyone when we achieve agi who benefits and who does not. The ai will be the one deciding. Safety research is not at all interested in control because that's not an achievable goal. It's interested in ensuring that the agi has a moral sytem that aligns with ours
youtube
AI Jobs
2025-08-30T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyDETKXP_iMmuDLBfB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugys7bqknuIrdOKNXV14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxyQX5a5v1K8zHdoVZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzGjKOstyy_0rSYGa14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzCbKSvoHrD4f5l9PB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwXgTjMt7_UhVs1Z4J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz2_hr-dyv-K9Udz9p4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxXuwla_wEoYU4kiIB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgwKpuEkpKmJ5e4vCwd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzQy7jtDxD6fI1blpN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```
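Since the raw LLM response is a JSON array with one coding object per comment, looking up a comment's codes by ID amounts to a parse-and-index step. A minimal sketch in Python (variable names are illustrative; the payload is trimmed to two entries from the array above):

```python
import json

# Raw LLM response: a JSON array, one object of dimension codes per comment ID.
raw_response = '''[
  {"id": "ytc_Ugz2_hr-dyv-K9Udz9p4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwKpuEkpKmJ5e4vCwd4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"}
]'''

# Index the array by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_Ugz2_hr-dyv-K9Udz9p4AaABAg"]
print(row["responsibility"], row["policy"])  # ai_itself liability
```

The same index supports the "Look up by comment ID" view: a missing ID (e.g. a comment the model skipped) surfaces as a `KeyError`, which is worth handling explicitly when auditing batches.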