Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment excerpt | Comment ID |
|---|---|
| Are we living in a simulation or a black hole? I watched several very smart peo… | ytc_UgyCn80x0… |
| the more ai art grows and takes over human made things, itll start learning from… | ytc_Ugzejos7A… |
| Absolutely. Consciousness is only the creator of its own intelligence. Secondary… | ytc_Ugx9kFch7… |
| Logitics will be the first and biggest hit initially. Postmen and delivery pers… | ytc_Ugynk6CPU… |
| Summary: 1. AI is just a bunch of closed loops without its own intelligence. 2. … | ytc_UgxS3uGLS… |
| Came back to say we're at the crux between the old and the new. If greed and f… | ytc_UgxDD1zbo… |
| There was a primary study done in Detroit in 1980s as automakers wanted to repla… | ytc_UgzS35xQy… |
| People are to lazy to do simple tasks so they have to make a robot to do them bu… | ytc_UgygZq0Ag… |
Comment
Before calculators were widespread people used to be good at calculations on their own , now they've grown a dependency and it's hard for people to do even simple calculations by themselves. AI will be the same, people will just get dumber as they depend more on it. Yes, our capability will increase but ability to figure things out on our own will be severely hampered
Platform: youtube | Topic: AI Governance | Posted: 2023-04-18T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_ixTx6aJ5RIHYJMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyIluiG6Jl7rIe76_p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyyKbBxH5ZmTrfF7vR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwpL4HfYQR3UHJdgg14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy8yAKUc9skfDIAuxt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8yWC_Yk0Pc-iRVRx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxOuhA8nWTVNzWu0gF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzppKNxQDzxT0tS1ql4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYHtDbMzk-1Tv_H3t4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcfgbiIOnN4UWXEyJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
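The lookup-by-comment-ID workflow described above can be sketched in a few lines of Python: parse the raw batch response, keep only records that carry every coding dimension from the result table (responsibility, reasoning, policy, emotion), and index them by ID. The `raw_response` payload and its shortened IDs below are illustrative, not taken from the tool's actual data.

```python
import json

# Hypothetical raw LLM response in the batch-coding format shown above
# (comment IDs shortened here for illustration).
raw_response = '''
[
  {"id": "ytc_A", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_B", "responsibility": "government", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"}
]
'''

# Dimension names taken from the coding-result table above; treat this
# set as an assumption about the full schema.
REQUIRED_DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(response_text):
    """Parse a raw batch response and build an id -> record lookup,
    skipping any record missing a coding dimension."""
    records = json.loads(response_text)
    index = {}
    for record in records:
        if REQUIRED_DIMENSIONS <= record.keys():
            index[record["id"]] = record
    return index

index = index_by_comment_id(raw_response)
print(index["ytc_B"]["policy"])  # -> liability
```

Skipping incomplete records (rather than raising) keeps a single malformed line in the model output from blocking the whole batch; a stricter pipeline might log and re-queue those records instead.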