Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (truncated) | ID |
|---|---|
| AI needs to go. There's progess to benefit humanity and progress to eliminate hu… | ytc_UgzBd-kyi… |
| The problem with the reliability question is that it looks at the mistake rate o… | ytc_UgxiUOD8I… |
| Honestly I think the both looks like absolute garbage, the way the mouth moves i… | ytc_UgxfOiih1… |
| Universities are like flamingos with their heads in the sand over this. Fact is … | ytc_UgzJhQ0y6… |
| 39 and learning to code. Agree that LLMs are changing the game, but for now I on… | ytc_UgyQ-1GcQ… |
| 😒Literally ai should not be used to generate images and be taken as “I made this… | ytc_UgzMKoevj… |
| Your concerns are definitely valid! Sci-fi movies have shaped a lot of our perce… | ytr_UgxQmDy9_… |
| No offense it's 10 time smarter than most already here is why: knowledge is reta… | ytc_UgxLkSYjd… |
Comment
In reality most of the regulating agencies mentioned FAA, FCC, DOT, FDA were built by a government collaboration with the major industrialists at the moment. They mostly crushed innovation of smaller challengers.
Pan Am/Boeing and the FAA vs Hughes Aviation
GM/DOT vs Tucker
Kellogg/FDA/Dept o Ag vs Post and others
The EFF (Electronic Frontier Foundation...a volunteer think group) has been thinking about this decades before Musk became an IT guy and came to live in Cali. Maybe we should listen to them first.
Musk is lagging in AI development far behind his promises a decade ago. What better way to catch up and stay ahead than to regulate competition into the basement.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2023-04-18T10:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwfLP0cUJyzkNmv95x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxI1kCIu41A4GS67T94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzS_pEd-qLuIj9j1Pt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyIurwijJ3I3sdYy_R4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwTP8IU24uhNtIrTul4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwIB0UmPzM6eU0L1094AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzQxMdl2sZHmQkuKel4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyxTjmRqqfcjqQRq1d4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyEswdxkPkOXCP_RbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxIeOv6Q3sR66RW7IB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
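Every record in the raw response shares one shape: an `id` plus the four dimensions from the coding-result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a response, building the id lookup that "Look up by comment ID" implies, and flagging values outside the codebook. The allowed value sets below are inferred only from the samples on this page, not from the actual codebook, and the function names are illustrative:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may define additional categories (assumption).
CODEBOOK = {
    "responsibility": {"government", "developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and build an id -> record lookup."""
    return {record["id"]: record for record in json.loads(raw)}

def flag_unknown_values(records: dict) -> list:
    """Return (id, dimension, value) triples that fall outside the codebook."""
    return [
        (cid, record.get(dim))
        and (cid, dim, record.get(dim))
        for cid, record in records.items()
        for dim, allowed in CODEBOOK.items()
        if record.get(dim) not in allowed
    ]

# Hypothetical single-record response for illustration.
raw = (
    '[{"id": "ytc_example", "responsibility": "government",'
    ' "reasoning": "contractualist", "policy": "industry_self",'
    ' "emotion": "indifference"}]'
)
records = index_by_id(raw)
print(records["ytc_example"]["policy"])  # industry_self
print(flag_unknown_values(records))      # [] when every value is in the codebook
```

A record whose `policy` came back as, say, `"moratorium"` would surface in `flag_unknown_values` rather than silently entering the coded dataset, which is the main reason to validate before ingesting.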