Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- *A PROBLEM NO ONE IS TALKING ABOUT* what happens to your Ai costs when th… (ytc_UgxDkv0wD…)
- AI colonialism? This guy's a literal joke. The only real concerning thing here i… (ytc_UgzsLpnYk…)
- Sooooo… if anyone writes a song or poem or something and it makes it big.. is th… (ytc_UgydJbJKo…)
- your distopia is my utopia. We might need a separation of cultures along AI line… (ytc_Ugx5JCcwO…)
- Only semi-skilled and unskilled jobs are threatened. If you are in IT or other f… (ytc_UgwnyR0xb…)
- AI may be the end of humanity but I don't think it would be in their best intere… (ytc_UgzUKGoRD…)
- Maybe the medical ai and the police ai are working together to reduce crime rate… (ytc_UgwSeV3bX…)
- The most important thing when it comes to training A.I is the raw data you feed … (ytc_UgzWhVH5A…)
Comment
I think AI is very useful, but many people use it the wrong way. People stop learning and rely on AI for everything, even for the simplest tasks. People can be really lazy but still demand perfection. Maybe that’s why I use AI differently. I use AI a lot, but I don’t stop learning.
For example, to write this comment, I told my AI: “I want to improve my English, especially my writing. Help me check my grammar and vocabulary. I’ll try not to use Google Translate. I’ll do everything based on my memory alone.”
Of course, I make mistakes (10 grammar mistakes) but AI helps me correct them and explains why.
So, AI is created with good intentions—we just have to learn to use it wisely to make progress in our lives, not to become even lazier.
youtube · AI Governance · 2025-11-18T03:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgymImFukjkKowDi_QB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXPgfq5rxq5l0eYIh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyflcqIrd_7nxte2wp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw7mEYUujAL8NIVAl14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7TcqM0tP7DSFZAr94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLhiyQyqPzH4oE9-d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz05xUd5JKxSoAqdzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx0_g7TvHZhyOCF8Et4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy8doUzyuWvAZ4d9rF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgyudAJGKHVyYNg3NI94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```