Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "AI is a responsibility abstraction layer. That means you can have it do things y…" (ytc_Ugyxxtjt7…)
- "Interesting...he was disappointed in humans thinking they can control AI. Howev…" (ytc_UgxmLlalK…)
- "if you're using a glazeAI yes. realize it's possible to train them to push back …" (ytr_Ugzs8fCXt…)
- "Everything gov doing is weaken the population. Cause robots coming. Or already h…" (ytc_UgyIUk4ZB…)
- "I think the outcome will be that most code will be written with the assistance o…" (ytc_UgzEpnnHI…)
- "1:25 I definitely think AI *can* be used that way. Simple text-to-image isn't ev…" (ytc_Ugyr85iQp…)
- "Person of Interest was ahead of its time. For 1967, Colossus the Forbin Pro…" (ytc_UgxN33S0r…)
- "I hate being helped by AI.. it sucks..so generic and it rarely is every helpful…" (ytc_UgytAdutl…)
Comment
> For all we know the Covid thing was already the start of AI screwing with us.

youtube · AI Harm Incident · 2023-07-07T13:5… · ♥ 51
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyHr0NlnBNNZ3S2IGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCE9vZy1w9Nxe_-0l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWYN9-yWfVTvG6F6d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXwVVFUSjmc7T9PC94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwE0846lQ-qN7vdkx94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzkQfJAl1li8d-DIaB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwmRMtP316ptKhc5FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxpst7DW2VbEzg-BqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzp0R7Gm1FHAhK40bh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKF_NWQMyrmKz-sJR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
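A raw response like the one above can be checked and indexed before it is trusted. The following is a minimal sketch, not the project's actual pipeline: it assumes only what the response shows (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys), and the `parse_codes` helper and the two sample rows are hypothetical.

```python
import json

# Two rows copied from the raw response shown above (sample data, not the full batch).
raw_response = """[
  {"id": "ytc_UgyHr0NlnBNNZ3S2IGt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzkQfJAl1li8d-DIaB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> dict:
    """Parse a raw coding response and index the codes by comment ID.

    Raises ValueError if any row is missing a required key, so malformed
    model output fails loudly instead of silently dropping a dimension.
    """
    by_id = {}
    for row in json.loads(raw):
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing keys {sorted(missing)}")
        by_id[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return by_id

codes = parse_codes(raw_response)
print(codes["ytc_UgzkQfJAl1li8d-DIaB4AaABAg"]["policy"])  # regulate
```

Indexing by comment ID is what makes the "look up by comment ID" inspection above cheap: once parsed, any coded comment's dimensions are a single dictionary access.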