Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Self driving cars and the AI does not know how to get out of this?!…
ytc_UgzoKnOnp…
They probably didn't mention the annual cost to attend this school. I've seen si…
ytc_UgyWbwYIe…
@ not only is it good and way better than anything you could ever do but AI imag…
ytr_Ugx7zKvgn…
YouTube trash with bad science info is largely being generated by foreigners fro…
ytc_UgzJGgEat…
I’m not a big artist so I can’t really say for certain, but I feel the biggest t…
ytc_Ugzx0UI0L…
@shawnlinnehan7349 Then they better figure out a way to handle the civil disrupt…
ytr_Ugx7x5AK7…
Here’s an ethical dilemma for you: What do you do if the wrongful conviction was…
rdc_h53zgta
The trick is not the robot not being able to except the temporary lie that they …
ytc_UgxXFjTPS…
Comment
AI companies are in fact often arguing that it is not a violation of copyright to use copyrighted content to learn from, so dismissing that at the outset puts the entire counter argument on difficult footing. It is hard to argue that a computer doing what a human does when consuming content should have different standards applied, so there is a danger that consuming content could become a potential copyright violation for everyone.
youtube
2026-01-17T01:5…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw5iUqVlvcT2BUE40B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRIC3ZdQ22G7kB0R14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzuuG5bpXhLLUVS8DF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyU9fPrNuKROl1H_at4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyw1E87fJPKcqX85-x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUmsYj-YiCjJRA5Nh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6e0VgW_0nJ70b2oB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyylcy9q69jehy3vEl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCUAMcQRExoOAzNht4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzN9z7z0GN9s7Y2fR54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
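The raw response above is a JSON array of per-comment coding records, each carrying the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID for lookup; the function name `parse_coding_response` is hypothetical and not part of this tool:

```python
import json

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response (a JSON array of records)
    into a mapping from comment ID to its coded dimensions."""
    records = json.loads(raw)
    # Drop the "id" key from each record; it becomes the lookup key.
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

# Two records in the same shape as the response above.
raw = '''[
  {"id":"ytc_Ugw5iUqVlvcT2BUE40B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCUAMcQRExoOAzNht4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

coded = parse_coding_response(raw)
print(coded["ytc_Ugw5iUqVlvcT2BUE40B4AaABAg"]["reasoning"])  # deontological
```

Indexing by ID this way matches the page's "Look up by comment ID" workflow: once the array is parsed, each inspected comment's coding is a single dictionary lookup.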