Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
my concern is not for the totality of safety, but does that safety come via sub…
ytc_UgxXgJ_al…
Arthur C. Clarke nailed it - any sufficiently advanced technology will seem like…
ytc_UgyZhUv_F…
16:50 Something isn't quite right here ... you tell us not to have service and t…
ytc_Ugwmq5gok…
my biggest issue with AI is that it's soooooo so boring. There's never a story b…
ytc_UgzczsJ0F…
This is a stupid idea. Ask anyone who has experienced the driverless taxis. Pe…
ytc_UgygZYxVJ…
I hope AI will get president, i would vote for it, just to mock the humans…
ytc_UgzeEv2uB…
@MikeJones-um4mf I understand your comment, but I don't think so, because there …
ytr_UgzRNazR1…
So if your children ask you, what career to persue, tell them to become a stripp…
ytc_Ugz524hoQ…
Comment
20:27 actually that would be false. the vectorisation moddel does account for emotional outcries. its kinda a phycopathic understanding of it very clinical and non empathetic, you could fix it by asking the program to try and adjust for it however the leads to more problems then good. fundimentally the problem as i see it the algorithm aint quite big enough yet.
youtube
AI Moral Status
2026-01-27T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxSFCO02UrFuSpjfAZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxF4kpWe0I7ZJJbpEx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzLZBUlP_WkIYoHIGZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzi8TOcX6LUuErsqEN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyIv1-i443K6xz3Rg14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBJBT7wHYm_0MkzVF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1hEJ2nIIydyC3QnV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyXB-fI3s41Zj-aXeJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzDITD15vIXo4o9ADR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxjBCORjkNJr1CjiCF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
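A raw response like the one above can be parsed and sanity-checked before it feeds the coding table. The sketch below is a minimal example, not the tool's actual pipeline: the allowed value sets are assumptions inferred only from the codes visible in this response (the full codebook may include more), and out-of-vocabulary values fall back to "unclear", matching the convention in the Coding Result table.

```python
import json

# Allowed codes per dimension, inferred from the raw responses shown
# above (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "user", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, dropping malformed records
    and coercing out-of-vocabulary codes to "unclear"."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing the comment ID
        cleaned = {"id": rec["id"]}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim, "unclear")
            cleaned[dim] = value if value in allowed else "unclear"
        valid.append(cleaned)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)[0]["policy"])  # → regulate
```

Validating before display is what makes a row of all-"unclear" values (as in the Coding Result above) unambiguous: it can mean either that the model genuinely coded the comment as unclear or that its output failed validation.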