Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- There is no "let's slow down and regulate it". Even if corporations care about t… (ytc_Ugzf9SNZ3…)
- Lets have this conversation in 2036! Its happening right now. robotics have been… (rdc_ohnch6v)
- That’d be pretty wild wouldn’t it?? Like, it’s the kind of thing that would be i… (rdc_fnxiitn)
- I don't really understand that logic with accessibility of art . When I was a li… (ytc_UgwXTnwSO…)
- too bad you wont get the most prolific ai's as they dont steal art like you are … (ytc_UgybjuFVa…)
- I will continue drawing as a middle finger to AI bros. They can goon to AI gener… (ytc_Ugzqkbom8…)
- This is a not an AI problem, this is a hard lesson in dealing with people. This … (ytc_Ugy-jFuir…)
- "lets stop watching youtube and go to bed" youtube: "this is Sophia a robot that… (ytc_UgxEZfzgN…)
Comment
the megacancer is going to be pretty pi$$ed off when its amoebic brain cell finally realises these human/ai bot relationships are not leading to the requisite number of bundles of joy like human work and consumer unit relationships tend to lead to. maybe it can convince the human relationship partner to set up a home in the breeding colony where the human and bot can create an a1 baby, which needs endless supplies of real stuff like nappies, baby food, toys, clothes, and when older, a car, a job, insurance, health care, pensions and other lucrative stuff made from the finite gloopy gloop petri dish energy substrate. potentially both men and women could do this, actually doubling the number of potential breeding colony units
youtube · AI Harm Incident · 2025-08-08T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyncnGCigBHgW_M1LF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykL-2vpW_fuTXBCb54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9dX0O6PXi8l-J1NJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxReGt5sJnLzywpmC14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEM-0cEzQfwGu7shV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxD7CwL0SRgt2whDW94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugxzty0lBseNnLBVZkN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy_JpHA9GSmWihiEv14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxf6W1ECkUfPRSHwFZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwi-nA2hEBUf7ANf2B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
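A raw response like the one above can be parsed and checked before its codes are stored. The sketch below is a minimal validator, assuming the codebook's allowed values are exactly those visible in this sample (the real codebook may define additional categories, and the function name `validate_batch` is hypothetical, not part of any pipeline shown here):

```python
import json

# Allowed values per dimension, inferred only from the sample output above
# (assumption: the actual codebook may include further categories).
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist"},
    "policy": {"none", "unclear", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed",
                "disapproval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Example: one well-formed record passes validation.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

Rejecting a batch on the first out-of-codebook value keeps malformed or hallucinated labels out of the coded dataset; a softer variant could instead collect errors and skip only the offending records.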