Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugyc60mdg…`: "Yall remember that the terminator movie resistance wanted to cease the AI comput…"
- `ytc_UgxquYADc…`: "You HAVE to offload 100% of your parasites or AI will view you as parasitic.…"
- `ytc_UgxeP_F95…`: "My quantum algorithm is a SPECIFIC HIGH FIDELITY TINY MODULAR SIGNATURE that use…"
- `ytc_UgwaRIuWz…`: "If AI somehow devolopes consciousness then technically slavery will be legal aga…"
- `ytc_Uggd7HuqJ…`: "I suppose robots (or AI) will need some sort of rights that will give them oppor…"
- `ytc_UgwMU1uQE…`: "Treat that AI as another human artist and apply the same copyright rule to the A…"
- `ytc_UgxVEd7IR…`: "When an end user interacts with an AI model, that model doesn't change and isn't…"
- `ytr_UgzUu8VY-…`: "we dont want idiots passing off ai generated images as art. Continuing to have a…"
Comment (youtube · AI Moral Status · 2026-03-03T01:0…)

> IMO, this is not impossible. There are models that probably could be fined tuned for therapy, and they're small enough to run on a computer. Privacy is solved? IDK if small LLMs are good enough for therapy after supervised fine-tuning but is this an alternative?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxXNacS3uTZK90x23t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwVAt983N6sqm_wShB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwWForz11C0k6P9mm14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"disapproval"},
{"id":"ytc_Ugz00HhWsHNVndumV-x4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnGd1nLfQWVawnsnx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwqC_ruodG9aHQIlwd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz9R2meI3UPTMkqzxR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw96naGAUmZyuwAgo14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxuNnf2BfjErA0fz4d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZNdti22P-dhcVY3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
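A raw response like the one above can be checked mechanically before its codes are stored. A minimal sketch in Python, assuming the values seen in this section (e.g. `developer`, `consequentialist`, `regulate`, `mixed`, plus the `none`/`unclear` fallbacks) are the full per-dimension vocabularies — the actual codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# section; treat these sets as an assumption, not the official codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "resignation", "disapproval", "approval",
                "indifference", "mixed", "fear", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed IDs or
    out-of-vocabulary dimension values."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

sample = ('[{"id":"ytc_UgxXNacS3uTZK90x23t4AaABAg","responsibility":"company",'
          '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
records = validate_codes(sample)
print(len(records))  # 1 record passes validation
```

Validating at ingest time keeps a single hallucinated category or truncated JSON array from silently corrupting the coded dataset; a failed batch can simply be re-sent to the model.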