Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with their comment IDs):

- "Let's get rid of the arms industry, and while we're at it, let's also get rid of…" — `ytc_Ugzel7LjO…`
- "I think socialism is the answer , ai and robots do the job and the humans just s…" — `ytc_UgxgG1RhK…`
- "Artists spend their entire lives trying to improve their skills and then some ja…" — `ytc_UgxDTP6eW…`
- "The interesting part is if we need sentient AI. We are making interaction trees…" — `ytr_UgzUVP-HL…`
- "This has just reafirmed some concerns I have with self driving cars. For the pas…" — `ytc_UgiiIRzPV…`
- "Tesla's robot has a more advanced arm swing, while XPeng's feels heavily mechani…" — `ytc_UgwK6GDHC…`
- "Ya but the fewer people who have to work to less necessary we are and the more t…" — `ytr_Ugzel2meG…`
- "literal me saying I won’t study game production cuz ai will take over so I’ll st…" — `ytc_UgyMlT7DZ…`
Comment (youtube · AI Moral Status · 2022-12-27T06:3… · ♥ 1):

> Why does the male robot sound like Simon Cowell? Sorry but they need a better host--I do not see the need for the robots to get smarter and smarter--This will only lead to problems--because the creators==man are inherently evil,selfish and greedy--and a few choice other words--I only see the needs for robots for manufacture-and housework tasks--jobs that no one else will do--so they should be programmed for that task only--but not to problem solve or learn from humans in my opinion
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx0uwnWPY8XdAfvTQJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyx_jT517gLE5kiWp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySJKeoZGmPy0ScUwF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgznYfZGJPevIBnLl3t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzAjWkXURxT1VsXyGh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxt9CxnYFyotNde1-14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz_6f-pXxbmzCJocYN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxnyQweBQjv_t__GRB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXrNfy_i5iuU35mih4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyoQ6r07ilwYwi3Byl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
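Because the model returns one JSON array per batch, looking up a single comment means parsing the array and indexing the rows by their `id` field. A minimal sketch of that lookup is below; the helper name `index_by_id` is hypothetical, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the ID format match the response shown above.

```python
import json

# Sample batch response: a JSON array of per-comment codings, one row per
# comment ID (two rows excerpted from the full response shown above).
raw_response = """
[
  {"id": "ytc_Ugx0uwnWPY8XdAfvTQJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxnyQweBQjv_t__GRB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index its rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
# Look up the coding for one comment by its ID.
print(codings["ytc_UgxnyQweBQjv_t__GRB4AaABAg"]["policy"])  # regulate
```

The same index can be built once per batch and reused for every lookup, which avoids re-parsing the raw response for each comment.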