Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "As always, great stuff. I wonder if the only way we can truely have what we imag…" (ytc_Ugzscw7uP…)
- "We were promised the world and got a backyard shed's worth of surface area. Only…" (ytc_Ugw82BkYF…)
- "imagine an AI going into a doom spiral, like at the rate they can process inform…" (ytc_UgyLgnMEC…)
- "what if we think like if they were robot they would cover the side mechanism but…" (ytc_Ugzu--cRw…)
- "Notice how he said “The ai is pulling from biased sites” THEN REMOVE THOSE SITES…" (ytc_UgxcLZtzb…)
- ""How do we get from point "A" to point "B" is to take AI way from billionaires.…" (ytc_Ugw_Tfp_p…)
- "There should also be considerations for humans who become close friends or life …" (ytc_UgwNnDK-y…)
- "Its simple. When our children can't handle AI anymore, they will terminate it an…" (ytc_Ugz_5LSLj…)
Comment

> "We're building the road as we walk it, and we can collectively decide what direction we want to go in, together."
>
> I will never cease to be amazed at the utter disregard that scientists and inventors have for *history*. To even imagine that we humans are going to "collectively" make any decision about how this tool -- and this time, it's AI, but there have been a multitude of tools before -- will be developed is ludicrous. It absolutely will be decided by a very few people, who will prioritize their own profit, and their own power.

youtube · AI Responsibility · 2023-11-21T21:5… · ♥ 348
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxxkYEBG7B4u7c4H654AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-3PCgR-UrHE9l3Yt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzfl3rn8xehNcpOA4p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugweg_CK4C2IswW1E_Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz7mPr5ksVB_eT7WKh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxj4xWPxl7vJ5-OVJd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7Lh1aMrznFuRK0T54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCP7HbaWodzkGm6Ct4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzFuvwMcwKE4D84JNd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwbQh2mQGxrF9NnFlN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
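A response like the one above can be parsed and looked up by comment ID with a few lines of code. The sketch below is a hypothetical illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, but the allowed value sets are inferred only from the values that appear in this sample and may not cover the full coding scheme.

```python
import json

# Dimension values observed in the sample output above.
# ASSUMPTION: inferred from this one response; the real scheme may allow more.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID,
    dropping rows with a missing ID or out-of-scheme values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_Ugweg_CK4C2IswW1E_Z4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_response(raw)
print(coded["ytc_Ugweg_CK4C2IswW1E_Z4AaABAg"]["responsibility"])  # developer
```

Indexing by ID this way is what makes the "Look up by comment ID" view cheap: each lookup is a single dict access rather than a scan of the raw response.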