Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It may one day be possible to live forever if we master advanced knowledge, simi…" (`ytc_Ugwbzg8HP…`)
- "In the endless pursuit of profit and power, neoliberal capitalism needs to demon…" (`ytc_UgyKzs76t…`)
- "That's frightening. Cop : the AI said it was him so I've gotta believe the compu…" (`ytc_Ugxg7--uA…`)
- "4th wall break here: This video is scripted and even my comment could be AI-gen…" (`ytc_Ugzf-oEot…`)
- "Inflation is causing many firms to lay off people, and many of those companies a…" (`ytc_UgypOPMIJ…`)
- "There will be a spike in vehicles that will fail readiness as a computer will be…" (`ytc_UgxafMs3K…`)
- "I Wonder how all those giant companies like Amazon will make further profits whe…" (`ytc_Ugwwxmib6…`)
- "Dunno maybe just me, but I would only consider something like Bethesda Game Stud…" (`ytc_UgwAGAknW…`)
Comment
Arguing for AI by saying humans won't build dangerous things because we have "agency" is without question one of the most ridiculous statement I've ever heard uttered. One of the first things humans did when raw ore was discovered was to hammer it into swords and spear tips. AI is the raw ore of the 21st century...the question is not whether or not it will be used to create a weapon (it will), the question is simply whether or not AI is the weapon that will end humanity and possibly all life on Earth. The odds are 99.99999% in favor that a truly sentient AI will see it's human creators as a threat.
Source: youtube · AI Governance · 2024-01-18T13:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyreUEyy6bDsfa3NdN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaWRVCOllZYcVvmQN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgywanQgjQKLMLrS0H14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7Sh_rbDPPSl0o6Md4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzLhHF6ZDYG1ICN7S54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwJZwpUcLpHC-C0x1l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwLdo5ox4NbEUb8qXV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyg85VDYc8ZQCvtqHt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx7-bNdT3R1vS6cZoF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGqMRtgXoqtGIqKUV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"resignation"}
]
```
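A raw response like the one above is a JSON array of per-comment codes, so looking up a single comment's coding by ID amounts to indexing the array. As a minimal sketch (the helper name `index_codes` is hypothetical, not part of the tool; the sample row is taken from the response above):

```python
import json

# One row copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxaWRVCOllZYcVvmQN4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "outrage"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and key each row by its comment ID."""
    rows = json.loads(raw)
    # Drop the "id" field from the value so each entry holds only the
    # four coded dimensions (responsibility, reasoning, policy, emotion).
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgxaWRVCOllZYcVvmQN4AaABAg"]["policy"])  # regulate
```

The dict lookup mirrors the tool's "look up by comment ID" view: the coding-result table for the comment above is exactly the entry whose ID matches it in the raw response.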