Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Yesterday, a Waymo rolled through a police standoff with guns drawn. What are w…" (ytc_UgxC36n0B…)
- "9:30 glad you pointed this out because the whole benefit of deep learning is the…" (ytc_UgyycN8su…)
- "Interviewee: We are all going to die. Then people be like: how do I lubricate my…" (ytc_UgwKcAoGb…)
- "AI already has, look at the music from the early 80s til now, artificial, the mu…" (ytc_Ugw8aKzbI…)
- "Exactly. It's not sentient, it just spits out reorganized portions of petabytes …" (ytr_UgzpvYM3X…)
- "Ai needs to be regulated, what happens when a robot kills a human, a human child…" (ytc_Ugy9LB9rN…)
- "the one job I see humans are innately built for is compassion. people are using …" (ytc_UgwSMuEqO…)
- "I will become teacher so that I can teach AI how people are and feel.…" (ytc_UgxeYINTY…)
Comment
the job market's a bit chaotic right now with all those layoffs. seems like a lot of roles are at risk, especially in industries leaning heavily on AI. there’s this growing concern about whether jobs are getting automated or if it’s just restructuring.
i’ve seen discussions around how essential skills are changing. adaptability, tech savviness, and a strong personal brand seem key for staying relevant. it’s also interesting how education's role is shifting. like, do degrees matter as much anymore? people are saying it might be more about skills and real-world experience, especially in the AI era. some even suggest a 3-step plan to make yourself irreplaceable which sounds practical.
Source: youtube · Posted: 2026-02-28T07:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxKSNgDy0NVaE9BJFZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwfeYkmhsCt2TXOuP94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJW_ZP9MCrnyrxc254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxKTkcvkyGjHEyeI8d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTdQDPNCsiymUTKTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugygb6ksRZofs8d1HCN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyKJh3F9LeQhraBV_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwNwFMH-cuKotLjIu54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxk6TjkA_kgYUuMDZ14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBS5hcib5bJrGuMmh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
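To pull the codes for a single comment out of a raw response like this, the JSON array can be indexed by `id`. A minimal sketch in Python (the one-row `raw` string is a trimmed copy of a single entry from the response above, not the full array):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (one entry kept here).
raw = '''[
  {"id": "ytc_UgxKTkcvkyGjHEyeI8d4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Build a dict keyed by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

code = codes_by_id["ytc_UgxKTkcvkyGjHEyeI8d4AaABAg"]
print(code["emotion"])  # → fear
```

The same lookup works unchanged on the full response, since every entry carries its comment ID.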