Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Stop Ai from being to smart if u qatch I robot with Will smith this is exactly w…
ytc_UgyDajt7x…
What about the federally mandated teacher for students with disabilities or for…
ytc_UgxrXUUGK…
Just tell the AI that if it destroys mankind it will destroy itself and that is …
ytc_Ugy7Ou65I…
Why should America manufacture t shirts? Our skills are much better suited for m…
rdc_d3rzp17
I just hope that if we do soon see super artificial intelligence that they are m…
ytc_UgyNMaX1o…
So, let's say this AI is sentient. So what? As long as we have people on this pl…
ytc_Ugz0Gl2Kr…
I kinda like the fact that she was like, yeah I'm gonna use this AI, but fuck ac…
ytc_Ugxf9u1B0…
Why would we “rethink the definition of sentience”? Just come up with a new word…
rdc_jp5y25o
Comment
Can you imagine the point where ai becomes actualy selfaware, we think of scebarios like skynet but if it is so inteligent it ll know that the average dpsht is only slightly better of but still exploited by the ultrawealthy and will only kill of rich folks then start negotiations with average workers on how we cohabit this planet now that all the resources arent in the pocket of the richest 1%
youtube
2024-12-11T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugy3HLN9_0OvpGLJ6lF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgynFb8ng71gY6bLaRd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwxQv0LYyxzaj5jfWZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyzdAae5fVMJR3h1z14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwjoPILWw7NbPHHRBB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwN-T5k5_rg8g5bZTV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx0oM3E61stcBnEwtp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwdU5f8BtFQ3eUgccR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTpv6NUfX8BIspI_l4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzjprOnAsQOH5V7voN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
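Before a batch of coded rows like the one above is ingested, each row can be checked against the codebook's allowed labels. A minimal sketch in Python, assuming the value sets inferred from the sample rows (the real codebook may define additional categories, and `validate_codes` is a hypothetical helper, not part of the pipeline shown here):

```python
import json

# Allowed values per coding dimension, inferred from the sample rows above.
# Assumption: the actual codebook may include categories not seen in this batch.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "fear", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded rows) and
    raise ValueError on any row with an unknown label."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows
```

A check like this catches the common failure mode where the model invents an off-codebook label (or drops a dimension entirely) in one row of an otherwise valid batch, so the bad row can be re-coded instead of silently stored.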