Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why do people think it is humans that are going to create AI? maybe it is demons or aliens. If we believe those exist. They would probably look just like humans. Thing is, most people only believe what they can see. So they see humans doing all kinds of crazy things and slowly lose faith in humanity. Seems far out right! Well, we have imagination for s reason.
| Field | Value |
|---|---|
| Platform | youtube |
| Project | AI Governance |
| Posted | 2025-09-04T12:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugws31M3yD0g2EFR0P94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwLZPY3H2YXFksQhIF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzOEgg0zzVdo7nnzmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzThdn6MpE-uHgBS894AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxVAr5aLPlfmskOtyp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_vBkG3jEzs-0bvGB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwRTMEnWIrea5X3qXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx28batO97SO5wvOER4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxiL69Si4te5t04z994AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgytFzT8EEKi8R_Gh1F4AaABAg","responsibility":"ytc_UgytFzT8EEKi8R_Gh1F4AaABAg","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
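As a minimal sketch of how a response like the one above can be consumed downstream (the field names are taken from the JSON shown; the `parse_codes` helper and its validation rules are assumptions, not part of the actual pipeline):

```python
import json

# Excerpt of a raw model response: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytc_Ugws31M3yD0g2EFR0P94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzThdn6MpE-uHgBS894AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse the model's JSON and verify each record carries every dimension."""
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
    # Index by comment ID so a single comment's codes can be looked up directly.
    return {rec["id"]: rec for rec in records}

codes = parse_codes(raw)
print(codes["ytc_UgzThdn6MpE-uHgBS894AaABAg"]["emotion"])  # mixed
```

Indexing by `id` mirrors the tool's lookup-by-comment-ID view: the table shown for a comment is just that comment's record pulled out of the batch response.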