Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
If he says we have about 5 years to put some sort of containment in place; that’s a very short timeframe. We are already beyond our capabilities and will be seduced into deeper water by exactly what he says — the benefits and conveniences.
No one forced us to allow Alexa into every facet of our lives. Alexa knows everything about your personal preferences, habits, timing, and it’s not AI yet. But it has all the data that AI needs to control our activities. People choose to give over their autonomy for convenience.
I am not a religious person, but this sounds like the apple in Eden. We are not forced out of our current safety and security; we are lured by promises of something better.
Source: youtube | Topic: AI Governance | Posted: 2023-05-25T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyjAeXNnDhoJZZPeB94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyPAJXCWT6w_1PmC_R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwAwIcPqQAp1RGKe1l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwuR-BHkyoNjHg1I454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMalVyFophqIKxhgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyExllfujrnBuY6iG94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxILtbbV8XLZmbHzxZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzTb1BfjunWxzj2jXl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzlC702j4PudGHjlwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgybpSZE4Uf8KcxpHpx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
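The raw LLM response is a JSON array of one record per comment, keyed by comment ID. Lookup by ID, as described above, can be sketched as follows; this is a minimal illustration (the single abridged record is copied from the response above, and the variable names are ours, not part of the tool):

```python
import json

# Abridged raw LLM response: a JSON array of coded records,
# one record per comment, each carrying the four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgwAwIcPqQAp1RGKe1l4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
"""

# Parse the array and build an index from comment ID to its record.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one comment's coding by its ID.
coding = by_id["ytc_UgwAwIcPqQAp1RGKe1l4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints: user fear
```

Indexing into a dict makes repeated ID lookups O(1), which matters when cross-referencing a large batch of coded comments against their source threads.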