Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "If AI will inevitably do so much harm to humanity, then why don't we just stop a…" (ytc_UgznJvAVz…)
- "It really not that difficult. Just pass a law making it illegal to replace human…" (ytc_UgzxWUtEI…)
- "It sounds like that soon the existing human faces will also be replaced by Ai do…" (ytc_UgzIOAZ7_…)
- "and just like that true drone warfare and age of AI robots is upon us. Who need…" (rdc_ohmck8q)
- "The real artists’s work has so much more character and life to it, ya know? Like…" (ytc_UgwivqHjA…)
- "seeing how AI draws I don't trust it to do duties in the police force or hospita…" (ytc_UgzkX1U3s…)
- "I don’t have the 5 seconds to Learn art so im going to cry about it and steal ot…" (ytc_UgxIbbCQs…)
- "@obi_nathat is a answer that depends, Are they a PowerPoint(never venturing out…" (ytr_UgxshvXil…)
Comment
Legislation and liability will be breaking AI's neck. Either you take on full responsibility for whatever the AI coughs up, which no-one in their right mind will ever do as hallucinations are part of the system and never go away. Or nobody will actually adopt the system to the extent necessary to make it a really big and profitable industry. Without full culpability AI will remain a toy and will be priced as one.
youtube · AI Jobs · 2026-03-23T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzgW2wEMds-eaqRxl94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzPgfRsuMzOWSZ0KtZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbEoT9tOoBUJJJHTR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRmBrFjkGRcf84_I94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwCUZF-P5k5-WEtrvR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzWSvJbZASdlRb31C14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyf9x7KfWvR1VfoAi54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyeYoHqmPEEAVTbTSN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwCqB1x4Nov5NmuBEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKa9rP2c_1qj0OeVZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"})
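One plausible reading of the all-"unclear" coding result above, sitting next to a raw response that clearly carries values, is a parse failure: the array in the raw response closes with `)` rather than `]`, which a strict JSON parser rejects. The sketch below illustrates that reading; the helper names `index_codings` and `lookup`, the `DIMENSIONS` tuple, and the fallback behavior are assumptions for illustration, not the tool's actual code.

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse a batch coding response and index rows by comment id.
    A malformed payload yields an empty index instead of raising."""
    try:
        return {row["id"]: row for row in json.loads(payload)}
    except (json.JSONDecodeError, TypeError, KeyError):
        return {}

def lookup(codings: dict, comment_id: str) -> dict:
    """Return one comment's coding; any missing dimension (or a
    missing id) falls back to 'unclear'."""
    row = codings.get(comment_id, {})
    return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}

# Well-formed response: values come straight through.
ok = ('[{"id": "ytc_A", "responsibility": "company", '
      '"reasoning": "consequentialist", "policy": "liability", '
      '"emotion": "outrage"}]')
print(lookup(index_codings(ok), "ytc_A")["policy"])    # liability

# Array closed with ')' instead of ']', as in the dump above:
# parsing fails, so every dimension reads 'unclear'.
bad = '[{"id": "ytc_A", "policy": "ban"})'
print(lookup(index_codings(bad), "ytc_A")["emotion"])  # unclear
```

Catching the decode error and defaulting per dimension keeps a single malformed batch from crashing the whole coding run, at the cost of silently blanking every comment in that batch.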