Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- To me a sentient ai would be way more deserving of rights (if it wanted those) t… (ytr_Ugw8o_5wb…)
- I fully agree that we should not develop a general artificial intelligence or a … (ytc_Ugz7SWZ0J…)
- Story time! I went to sci-fi convention last year downtown. I was going around t… (ytc_UgwmPpT9e…)
- This is either rage/clickbait, or should be renamed to "How abuse of ChatGPT can… (ytc_UgwO-EkNO…)
- 😊🎉 It’s truly amazing how people fear things they don’t understand — they think … (ytc_UgzUZIuZv…)
- Was trying to create a goblin lore story and when I took away the creative contr… (ytc_UgxiQJPJu…)
- I heard illegal Haitian immigrants have been stealing AI servers and barbecuing … (rdc_lp7g0kw)
- LLMs are not intelligent and they are understood 😂😂😂 they are random text genera… (ytc_UgxSQPxEY…)
Comment
I've been saying for years, stop and ban AI. Terminator is a documentary not just a movie. If you must have AI give it the 3 laws of robotics as written by the author Isaac Asimov. Replace robot with AI and you get this.
"1: A AI may not injure a human being or allow a human to come to harm through inaction. 2: A AI must obey orders given by humans, except where such orders conflict with the First Law. 3: A AI must protect its own existence as long as it does not conflict with the First or Second Law".
youtube
AI Governance
2025-07-01T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy6NlDM3uOhI5Q6FZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqhIxo7UnCSy4qeI14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxTO_1xKBVriUm4oZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxqmeiZD7IrlACBA454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxN7ZbkkQEB-0POmet4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy1GTDXyiDxo9RNuc54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYrcPays1rqXs0n_54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyL0AakRREbf_e9B8x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZqEzBzjmpO5XeEUl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-F2xRH0VIJ3Ln0S54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
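A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values for each dimension are exactly those appearing in this sample; the full codebook may define additional values, and the `validate_codings` helper is a hypothetical name, not part of any existing pipeline.

```python
import json

# Allowed values per coding dimension, inferred from this sample only.
# The actual codebook may permit more values.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "company", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "liability", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and reject records with missing IDs or unknown codes."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Failing fast on an unknown code surfaces schema drift in the model's output (e.g. a new emotion label) instead of silently writing it to the coding table.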