Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Well I’m fucked. Been in customer service for 20+ years. Any jobs out there that…" — `ytc_Ugz7FQoRs…`
- "Reports suggest that a highly advanced and potentially dangerous AI development …" — `ytc_UgzETP5AN…`
- "We don't have until 2018. Esp considering the new people wouldn't take office un…" — `rdc_dhdq6hq`
- "Doesn’t look real enough for me. Her job isn’t open. I don’t how much she doesn’…" — `ytc_UgyMJTLwP…`
- "Any one of these false alarms could have started a real nuclear war, it was only…" — `ytc_UgzeUq8th…`
- "The anti-AI movement is destined to fail. It is already out there. There is no p…" — `ytc_UgzrucSVA…`
- "Nah, I think I will stick to myself driving. I trust myself more than an ai robo…" — `ytc_UgxjLDX3M…`
- "icl, I can't draw for shit, but I'd never resort to AI because it's just such a …" — `ytc_UgzbXfBLC…`
Comment
It’s interesting that the professor uses examples of convolutional neural networks, a specific type of neural network, to explain neural networks in general.
In practice, one usually starts with simple neural networks before moving to CNNs.
This might be because basic neural networks are more abstract and harder to visualize.
Also, any discussion of neural networks feels incomplete without mentioning derivatives and their ability to learn from large amounts of unlabeled data.
youtube · AI Moral Status · 2026-03-09T15:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyAXJG9ixghY5d5em54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxe4F5sPU4fjYquAlN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyqWT8fG3KOt6LcUtN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw9F2WpSpAJwOnEI0N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugydcc7CF37xS28GsD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwtsdB5rBXUtoAohhN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_XQpJLlYhdllxmEd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzwSrpqqYb-C5gOrF14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxRQ2T6527DO6WcdX14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwDqiN0twmoi07EOyV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
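The raw response is a JSON array of per-comment codings, so "look up by comment ID" reduces to parsing the array and indexing it by the `id` field. A minimal sketch of that lookup, assuming the JSON shape shown above (the function and variable names are illustrative, not the tool's actual API):

```python
import json

# Two records copied from the raw response above; a real lookup would load
# the full array returned by the model.
raw_response = """
[
  {"id": "ytc_UgyAXJG9ixghY5d5em54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugydcc7CF37xS28GsD14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_Ugydcc7CF37xS28GsD14AaABAg")
print(coding["emotion"])  # → outrage
```

Indexing once into a dict keeps repeated lookups O(1), which matters when inspecting many coded comments from the same batch response.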