Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgxkJOYTU…: Our peer review model disagrees with Waymo because a human would not drive that …
- ytc_UgzN2IpxJ…: I love the attention to detail~ / The ai guy having 6 fingers / The heart cutout o…
- ytc_UgxMNJ9Zv…: He doesn't even know the definition of autopilot. Which company payed him? Waymo…
- ytc_Ugzpj0W2y…: Drivers will probably still be in the trucks. With all the crashes of autopilot…
- ytr_UgwG6JBr-…: @CC23-14plus And you cant say you made it as you didn't, the place you ordered…
- ytc_UgwbNh8ZU…: In this faar it is. I Calle it "woke i". / Real AI dosen't Care about politics o…
- rdc_ohye3te: The 37% "silent failure" rate you found is a perfect example of why "Contract Ha…
- ytc_UgwESEuEJ…: I was in the workforce back when classical Binary computers began impacting the …
Comment
If Humans don't use AI to completely eradicate Humans, we could have a utopia where no one has to work unless they want to. They can pursue things they want to pursue, and money could be insignificant and not needed if we have super intelligence and robots doing the jobs that humans do and doing it better. This is the best case scenario and something im hopeful for.
youtube · AI Governance · 2025-09-04T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzGAxR0ZpL827MQCBZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcL_rA1xddx1kpSXh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIz6QS8GTcYWvR6zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmP4RrA6s3nV2xsHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKYpAtqDa8dRwLZkN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzR8DNTzjuftKmImeZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyx6TztHxemBz32Wcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZ782WMmQBEi8M59l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwj6BClkejWt5vtWNl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqOAlBXHraWtg7rHd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
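Since the raw response is a plain JSON array of per-comment codes, looking up a coded comment by its ID reduces to parsing the array and indexing it by the `id` field. A minimal sketch, using two rows taken from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# A raw batch response in the format shown above: a JSON array where each
# element carries a comment ID plus the four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgzGAxR0ZpL827MQCBZ4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzZ782WMmQBEi8M59l4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the rows by comment ID so any coded comment can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding result.
row = codes_by_id["ytc_UgzZ782WMmQBEi8M59l4AaABAg"]
print(row["policy"], row["emotion"])  # regulate outrage
```

The same index also makes it easy to cross-check a comment's table in the UI (like the Coding Result above) against the exact model output it was parsed from.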