Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Yep, traditional dev methods are pretty much done for. But writing code is the e…
ytr_UgwVq6J-_…
Please, don't create such things. I mean, investing in various things like that …
ytc_UgwS8qs17…
I swear to God if we start fighting for robots "rights" I'm out. Robots will nev…
ytc_Ugh8kE5Ra…
This makes me so angry.
I read Cathy O'Neil's book Weapons of Math Destruction,…
ytc_UgzXcIvw7…
All one needs to know about the future of A.I. can be learnt from watching the m…
ytc_Ugx4tMOmO…
I miss when AI was still so bad it was good, and produced funny uncanny valley c…
ytc_UgxV0aK64…
@ludogienezever You don't even have 1 bulletpoint on how it might ascend the …
ytr_UgxiaIYxY…
Once again a great example that how we've stuctured our world isn't working. Nob…
ytc_UgzRwxM_x…
Comment
The future in action and it looks like it will be a great and bright and very happy one. There is too many people who think just because some movie showed them the robots killing humans that it will be the same in real life with the new AI robots that were shown in the video. I have to tell you that the movies were made to have you afraid of what could happen to make you want to see the next one that director made and help them make more money. The robots that are being made will help humanity, and will not be anything like it is shown in those action movies.
youtube
AI Harm Incident
2023-12-28T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwlWXqCvCfloYXT9JZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEqoqPpr2DNn2fG1N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxcqaJCLtzt1iJOQMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPHVRbJ_eVBihodTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxbOce0JxHMQ-2Cih94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzPUp2LEZOBCax33_l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxe8VG_6PjonJ1Fxhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHftbS4jDqKwjvNOh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgybM6Xy0MwsZh7yBJ94AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxL21e0OYVw1QxZu514AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
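As a minimal sketch of how a lookup by comment ID could work, the raw LLM response above can be parsed as a JSON array and indexed by the `id` field (the two sample entries below are copied from the response; the indexing code is an illustrative assumption, not this tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two example entries copied from the response above.
raw_response = """[
{"id":"ytc_UgwlWXqCvCfloYXT9JZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwPHVRbJ_eVBihodTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

# Index codings by comment ID so a specific comment's coding
# (responsibility / reasoning / policy / emotion) can be inspected.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgwlWXqCvCfloYXT9JZ4AaABAg"]
print(coding["emotion"])  # -> approval
```

Indexing by `id` keeps the lookup O(1) per comment, which matters when a coding run covers many batched comments per response.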