Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Rather than train humans and put millions into better education systems we are p…" (ytc_Ugz6Zn1oP…)
- "A bus driver honked at a Waymo car. I thought it was hilarious! Car wasn't movin…" (ytc_UgwPq5mcO…)
- "This is the most “get a load of this guy” comment I’ve ever seen. No because wha…" (ytc_UgyKg6Xqo…)
- "I don't speak english well, but if you're interested "artificial intelligence" i…" (ytc_Ugx2anC7q…)
- "May be I am wrong but I am just saying If Ai is becoming that much fast for eg. …" (ytc_UgxLKOWQD…)
- "I wouldn't have released AI Chat until I could get it to "LEARN" from it's inter…" (ytc_Ugz9aRh5H…)
- "Dude lost me when he started talking that simulation BS. It's a woowoo theory th…" (ytc_UgwPpeyUO…)
- "AI is biased towards efficiency. It sees no color. Humans that are left behind…" (ytc_UgzcwNCC5…)
Comment
Oh wow so google wants this super AI to basically put them and keep them at the top. I really don't know what google wants and what their intentions are but I don't think it's "positive" for the people. Meta universe, the matrix, etc... And everytime they test these super AI they always have to unplug it. Facebook, Google, Even quantum Computers. I think I can get a good idea of what they are trying to do.

youtube | AI Governance | 2023-04-18T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyzHbjn68cChDqNQ8h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyzedawhG8_lOgkb1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSXFXSlFlKPxGXCrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-nfbPQCIXek8wjnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw36Ia2Z-e2wtdNNkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxX1zwFnL4jW1ZBrZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1BlHzLf5RT3xhort4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6qsnc_WWI9ec3CRZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}]
```
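The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a payload might be parsed and validated before being written back to the coding table; the allowed value sets below are inferred only from the labels visible on this page and are assumptions, not the full codebook:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page -- the real codebook may define more categories (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record.

    Raises ValueError if the payload is not a JSON array; a missing or
    unknown dimension value falls back to "unclear" so one bad label
    does not sink the whole batch.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    cleaned = []
    for rec in records:
        out = {"id": rec["id"]}
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim, "unclear")
            out[dim] = value if value in allowed else "unclear"
        cleaned.append(out)
    return cleaned
```

Malformed JSON (an unbalanced bracket, a truncated final record) is a common failure mode for LLM-emitted arrays, so in practice `json.loads` is usually wrapped in a retry-or-repair step; the fallback-to-"unclear" rule above mirrors how the Coding Result table records dimensions it could not resolve.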