Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytc_UgxyTizTg…: "The AI proponents always duck this question: When AI robots take all the jobs i…"
- ytr_UgypfZlNR…: "the thing about "AI" is it's just an enlarged version of neural network trainin…"
- ytc_Ugz8HTz3K…: "It's fine to get mad at the 1st controversy but you can't do anything about it. …"
- ytc_UgyeOPW4y…: "I've been using AI daily for 6 months. It may have more data at its fingertips b…"
- ytc_UgyaOyuLd…: "I mean you are using co pilot at least use something like Claude code or Windsur…"
- ytc_UgwIsBDNW…: "The only thing I’m sad and worried about is AI basically ruining everything ther…"
- ytc_Ugys6dFDB…: "no ill take ai movies, no more nonsense and no more hollywood who know who, let…"
- ytc_UgwEXpw0p…: "32:50 This is why anger should be directed at Sam, Dario and the leaders of thes…"
Comment
Everything we hear about AI is its bugs.
But wait until scammers with sophisticated social engineering skills start applying themselves to manipulating AI agents into doing things they're not supposed to do. AI is going to introduce an enormous new attack surface that no one in the industry seems to even be talking about.
In one case an AI app that translated police bodycam footage into a written report ended up documenting how an officer was transformed into a frog. Apparently there was a Disney film in the background and the AI got confused. But wait until scammers learn how to use techniques like this to deliberately trick AI.
All these tools are being rushed out now because there has been huge investment and they need some return now, even though the technology is clearly nowhere near ready to replace humans.
youtube · AI Jobs · 2026-02-05T15:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJAGJMGbfIvM-4xoZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKgBEgmNjQ9XZta4l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDbgMux-XYZ8gJTdZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTRhcPzdp5hdodyNB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzXeK7Tyq6IVMzhI6t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6_JlFlgX1OoXacvN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0vGZQ1LaMz7TZNOF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy_m8S-PXGeLTm4vhZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvV71TVVQDxReztaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZuQ2ZfvoA0NraHIB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
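The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response could be parsed, validated, and indexed for lookup by comment ID. The allowed label sets below are only what appears in this sample; the full codebook may permit other values, and the function name and structure are illustrative, not part of the tool:

```python
import json

# Label sets observed in this sample batch (an assumption; the
# actual codebook may allow additional values per dimension).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "mixed", "fear",
                "resignation", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index the codes by comment ID.

    Raises ValueError if any dimension holds a value outside ALLOWED,
    so malformed model output is caught before it reaches the database.
    """
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        # Store everything except the ID itself under that ID.
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id

# One entry from the batch above, used here as a lookup example.
raw = ('[{"id":"ytc_Ugy_m8S-PXGeLTm4vhZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugy_m8S-PXGeLTm4vhZ4AaABAg"])
```

Validating each dimension at parse time means a single off-codebook label fails the whole batch loudly, which is usually preferable to silently storing an unexpected category.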