Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
9:45 "No right to speak to gods". No artists are the gods. A god creates not ste…
ytc_Ugx3wVKSm…
AI controlled weaponry in the hands of the Trump administration. Look out Canada…
ytc_Ugx6_GwA8…
I love this. I’ve personally experienced how much you can learn by yourself with…
ytc_UgxZwLg39…
We make this mistake time and again. We confuse technological progress with mora…
ytc_UgwnHFD1z…
It does have subtle differences, but yeah the point is to make changes that most…
ytr_UgxGuNc_m…
Well one thing about current living things on earth including Humans is that we …
ytc_UgxMqPiJ4…
The data centers ruin the landscape and use a whole lot of water and energy. Why…
ytc_Ugz93dXPl…
why would you bother drawing if you arent perfect at it? obviously you wouldnt; …
ytc_UgyXFa3c5…
Comment
I remember Legal Eagle covering when lawyers tried to use the Chat GPT on official paperwork in a case. They apparently got into a whole lot of trouble, largely because the AI cited fake cases and they didn't check the AI's work and just assumed it was right. The judge looked up these cases and found they didn't exist and got quite annoyed and that's when it came out what the lawyers had done.
Source: youtube · Topic: AI Governance · Posted: 2023-07-07T03:4… · ♥ 102
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEshr6nLHwYn2q_rV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugx9FnmJAZ4sKmTnM4d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxHzsfTlYtq1vmqzVt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxEDi3JXLXT0h6oRDp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzWJS_orKsXcFLxK7t4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxqRQXKta-jrkeMIal4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxsqoti4PfNruTI50d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2mN5vrpmkFluHYZF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwEXX0KjRPZQ43gXGp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzGffDoPVLIsEDGpuN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
```
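A raw batch response like the one above has to be parsed and checked before the codes can be trusted, since the model may return malformed JSON or out-of-vocabulary labels. Below is a minimal validation sketch in Python; the allowed values in `SCHEMA` are inferred only from the labels visible in this sample (not the full codebook), and the `ytc_`/`ytr_` ID prefixes are assumed from the sample IDs shown above.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are an assumption,
# reconstructed from the values that appear in this one sample response;
# the project's actual codebook may define more labels.
SCHEMA = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in the sample start with "ytc_" (comment) or "ytr_" (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold a known label.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyEshr6nLHwYn2q_rV4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail either check are dropped rather than guessed at, so a single bad row does not poison the coded dataset; the dropped IDs could also be collected for a re-prompt pass.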