Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
This guy's a fucking idiot. Did "autopilot" remove human pilots from the cabin?.…
ytc_UggWLIA2i…
its 5years since first AI, and now it can think freely and tried killing a man, …
ytc_Ugyr6V9FA…
Why did people think we want this. We want robots that don’t look like humans. R…
ytc_UgyaYgkRB…
AI developers:
❌Automate routine work so that people concentrate on a carefree …
ytc_UgyzWzEp5…
Robots 🤖 programmed like Sofia will try to help mankind. But the male robot will…
ytc_UgyBfKLTU…
13:52 the sad thing that I see here is that the AI bros just. Don't understand w…
ytc_Ugzw6VfbB…
How is that AI's fault? AI isn't the one asking the question, it's not the one c…
ytr_UgzKhE4jg…
U use ai to think you are smart we actually learn and we are actually smart we a…
ytr_UgwOFhXWe…
Comment
The only problems I have with the fox of this video. At the beginning the news reporter was used to say that automation always cost jobs. And at the end when you're talking about the the industrial revolution. Automation doesn't always cost jobs, in the past it actually always created more jobs. I am not saying that this automation is the same and i do not think this will create jobs
youtube
AI Jobs
2025-05-29T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwLlnbeov_EUjxJkql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuvFKZX5S2N9GHzvR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwb-JZc4MQbmVhHXrJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgylIDSVB-ckaUL3UEJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxcJYt5GMa_a_HRf7F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwYKCVY1FSx6BBHSWt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxzGaB9XSEkcC_vWJx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYY-Tpfz69L7Rz8eN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzYOKCAUNg7MyhagRl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOhs0EHFIk_kVg-cB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
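Before a raw response like the one above is merged into the coded dataset, it can be parsed and checked against the coding dimensions. The following is a minimal sketch in Python; the allowed value sets are assumptions inferred only from the values visible in the table and JSON above, not an authoritative codebook, and `validate_response` is a hypothetical helper name:

```python
import json

# Coding dimensions and allowed values. NOTE: these sets are assumptions
# inferred from the examples above, not a complete or official schema.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    items = json.loads(raw)
    coded = {}
    for item in items:
        cid = item.get("id", "")
        # Comment IDs above use ytc_ (top-level) and ytr_ (reply) prefixes.
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {cid!r}")
        for dim, allowed in SCHEMA.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {item.get(dim)!r}")
        coded[cid] = {dim: item[dim] for dim in SCHEMA}
    return coded

# First entry from the raw response above, as a usage example.
raw = ('[{"id":"ytc_UgwLlnbeov_EUjxJkql4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = validate_response(raw)
print(coded["ytc_UgwLlnbeov_EUjxJkql4AaABAg"]["emotion"])  # indifference
```

Indexing by comment ID also makes the "look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan of the response array.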