Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "US tech companies use AI for marketing purposes, gather as much data from the pu…" (ytc_UgxoQowPT…)
- "AI really is just another form of automation. There's always certain roles wher…" (ytr_Ugy2Hdku0…)
- "Nope! AI will NOT take 90% of all jobs! It can take some jobs from the IT sector…" (ytc_UgzlBCF5o…)
- "It's just a matter of time before everything becomes free. AI doesn't need a mas…" (ytc_Ugy5CDJ2b…)
- "AI can be used as a tool to help artists but what the hell does he mean that you…" (ytc_Ugy7RKxGy…)
- "Any materialistic civilization needs AI as the next step to grow higher into the…" (ytc_UgyMZwNMe…)
- "Ong if i care if its good ai art its cool, not cool enough to sell it but still…" (ytc_UgxAMMN-n…)
- "Sometimes I start to lose hope at a future in art cuz of AI but I'm not really g…" (ytc_Ugx1bu8ea…)
Comment

> we never learned from James Cameron's The Terminator.......
> that movie alone spurned this anxiety in me when i first saw it in home video. ai technology, if not done properly in its development, can and will destroy the entirety of humanity. of course, we won't notice this since we're too focused right now on the many human conflicts on the rise around the world. and since we are not aware of the progress of ai, we will never see it coming.

youtube · AI Governance · 2024-03-25T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzC_9shsfHCg5Uf4dp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwDZzAc4bl4fmonWGB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkxxDRLhtk58mCQsl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw6yt3y1wOtXbpCuPZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxvrvs2c8C7ik3aERF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzBrGR7J9Va1bBFwOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz-pXcynnjwfwegk0x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwTEUuBu1DZFlqJawZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzwLCPBPeONu4qgcJZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw7J7xCIB9kg8GIXNN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
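A raw response like the one above can be turned back into per-comment codings with a few lines of parsing and schema validation. The sketch below is a minimal, hypothetical example (the `parse_codings` helper and the two-record sample are illustrative, not part of the tool); the field names match the JSON shown: `id`, `responsibility`, `reasoning`, `policy`, `emotion`.

```python
import json

# Illustrative two-record response, copied from the array above.
raw = """[
  {"id":"ytc_UgzC_9shsfHCg5Uf4dp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzBrGR7J9Va1bBFwOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw_response: str) -> dict:
    """Map comment ID -> coded dimensions, rejecting records with missing fields."""
    coded = {}
    for rec in json.loads(raw_response):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codings = parse_codings(raw)
print(codings["ytc_UgzBrGR7J9Va1bBFwOF4AaABAg"]["policy"])  # regulate
```

Validating every record before indexing makes silent LLM formatting drift (a renamed or dropped key) fail loudly instead of producing partially coded data.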