Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Agi is right around the corner everyone, just like fusion has been for the past…" (ytc_Ugw_45m9j…)
- "It is not wrong to do that, is only wrong if the ai companies reproduces and sel…" (ytc_Ugxp2z2Xh…)
- "This is the best and most straightforward explanation of how AI art is generated…" (ytc_UgwqAMHKn…)
- "All these AI researchers are doing this AI Panic to oversell Ai to money hungry …" (ytc_Ugx7Kt9pG…)
- "Ai is 2 letters of the English language you can learn English during school hour…" (ytc_UgxehudTN…)
- "Well… I like using AI to create certain covers because my visual impairment does…" (ytc_Ugywur-cx…)
- "When I think of a futuristic Utopia that AI could help us get to would be simila…" (ytc_UgxA9t0nf…)
- "Honestly, saying "ai art bad cause it can do in 10 seconds what i needed 5 years…" (ytc_UgyWN5OxA…)
Comment
Humans made the robots, therefore there's always room for error. What if the robot malfunctioned and shot the man and other robots? Man is so eager to create, but forgetting the major fact; ANYTHING man creates has room for error. A robot can malfunction like anything else. It amazes me the total faith they have in robots. When it comes to things like this, ultimately a price is paid for denial of the truth and pride.
youtube · AI Harm Incident · 2024-03-29T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugx2aZvq5LiTfAFt2B14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFYsNY4YrPEYorOB54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwf1L7-bA8mpPFBCDh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwBPpaX6SXPG-Od-Ad4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz7Q4xyi_KIVmLBLpd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxwl_Z6WFRkU-iwXg54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgylT9uY3RvgKGsmxvJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKO4lhb0lcd3v5yaF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwow-2hXEo3HIUFbY14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz-P1ZoSzptTw4f2J94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
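The raw response above is a JSON array of per-comment codings, one object per comment ID. The "Look up by comment ID" view can be sketched as a simple index over that array; this is a minimal illustration (the `raw` string below is trimmed to two records copied from the output above, and the variable names are assumptions, not the tool's actual code):

```python
import json

# Raw model output: a JSON array of per-comment codings
# (only two of the ten records are reproduced here).
raw = '''[
  {"id": "ytc_Ugx2aZvq5LiTfAFt2B14AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz-P1ZoSzptTw4f2J94AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the codings by comment ID so any coded comment
# can be looked up in constant time.
codings = {rec["id"]: rec for rec in json.loads(raw)}

record = codings["ytc_Ugz-P1ZoSzptTw4f2J94AaABAg"]
print(record["emotion"])  # -> outrage
```

If the model ever emits malformed JSON (as in the stray `)` the original response closed with), `json.loads` raises a `JSONDecodeError`, which is a reasonable place to flag a coding run for manual inspection.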