Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugw22XpG4…: "Question is if AI replaces humans by 2030 and humans will be out of their jobs, …"
- ytc_UgwL5Ehiv…: "Anyone who utilizes AI, in any way, is directly responsible and part of the prob…"
- ytc_UgzBT8q05…: "The only argument I have for AI art is: Let's say that i ask the AI to draw me a…"
- ytc_UgzVvmoky…: "at this point I think women really should just go for revenge. we're under oppre…"
- ytc_UgxJzFN0q…: "I said it at the dawn of the first big chatbots: the second they decide the only…"
- ytr_Ugy5RI-3R…: "Brilliantly said. I think there has always been this assumption that technology …"
- ytc_Ugyp75P8N…: "Pretty sure AI "artist" can't even draw in digital with them saying that the pro…"
- ytc_UgxhRg2h4…: "Just makes sepatate road for driverless trucks. They can do the crap shifts and …"
Comment
Hello Wariko Sir,
I hope you’re doing really well!
I watched this your latest video where you discussed how the future of tech looks in terms of jobs. It was a great insight! However, I noticed the discussion mainly revolved around AI.
I understand that these days, whenever we talk about “tech,” AI naturally becomes the center of attention. But there’s also something many of us — including me — are preparing for: DSA (Data Structures and Algorithms).
I’d really love to hear your thoughts on this. Do you still think it’s worth focusing on DSA in today’s AI-driven world? Or should students shift more toward AI/ML fields?
If possible, please share your views or make a short video on this topic.
I’d be very grateful to hear your perspective.
Thanks and regards
Source: youtube · Posted: 2025-10-16T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwEnCfzfmGlDmJTlmR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyHMdl4s9w9g-qE-nF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyU3m3_697ZzA7AeAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz9wGSRETXgHom-BqR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxZB5hg_5-M0VP7ERx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxkv6m3rE3TO9GtSj54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz3fJKmRxFTtLPJEiV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxwdNFUU7CxTaTDLL94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxE98xBw4Zzlqk9SCt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgynND-3K6HFKWUY8JB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
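The raw response above is a JSON array of records, each keyed by comment ID and coded along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is below; the allowed vocabularies are inferred only from the values visible in this sample and the real codebook may include additional categories, so `ALLOWED` is an assumption.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Records with a missing id, a missing dimension, or an
    out-of-vocabulary value are dropped rather than guessed at.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Keeping validation strict like this makes it easy to spot when the model drifts from the requested schema: the dropped records can be counted and re-queued for recoding.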