Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Smartest everything out of Grok is ffd up and your own A-I turned you in 6 to 8 …
ytc_Ugy_NPC2J…
@wbiro I wasn't thinking too deep with my word choice there, I was just trying t…
ytr_UgznYCuNH…
We understand your concern. While machines like AI can seem detached, the goal i…
ytr_UgwRDYPUc…
The guy in the red suit’s comment 1:08:28 Is troubling. Others laughed at the t…
ytc_Ugzz6UjD7…
Could you give an ai multiple llms to converse with as a way of simulating syste…
ytc_Ugw2zUO-e…
Mother is in files released by FBI. Photos may be AI but it does or take away hi…
ytc_UgxBZn2HG…
"The optimistic idea of an AI future where we can have machines do everything fo…
ytc_Ugwte6bBd…
I like your reasoning. But, don’t attack until the ai generated stuff gets too c…
ytc_UgwtnKE4s…
Comment
As I see it theres 3 possible futures. 1 we all fly around in flying cars 2. the human race becomes a slave race to AI ( that despite apparent threat still malfunctions an ATM network and expires/time out a weblink) 3. "The aliens" arrive and point their intergalactic remote at the computer and says right you lot, the computer's been shut down, everyone back to work. It clearly isnt going to be all 3 so which barmy future do you choose?
youtube · AI Governance · 2025-09-04T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzGAxR0ZpL827MQCBZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcL_rA1xddx1kpSXh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIz6QS8GTcYWvR6zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmP4RrA6s3nV2xsHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKYpAtqDa8dRwLZkN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzR8DNTzjuftKmImeZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyx6TztHxemBz32Wcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZ782WMmQBEi8M59l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwj6BClkejWt5vtWNl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqOAlBXHraWtg7rHd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
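The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a payload could be parsed and sanity-checked is shown below; the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the sets of allowed values are an assumption inferred only from the values visible on this page, not the project's full code book.

```python
import json

# Values observed in the sample response above. The real code book may
# allow more values per dimension -- this set is an assumption.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "government"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag unexpected dimension values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgzGAxR0ZpL827MQCBZ4AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
records = parse_llm_response(raw)
print(records[0]["emotion"])  # → indifference
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the code book, so bad records fail loudly instead of silently entering the coded dataset.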