Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a response directly by comment ID, or click one of the random samples below to inspect it.
- INDIAN GOVERNMENT AND ALL STATE GOVERNMENTS SHOULD INTERVENE IMMEDIATELY AND ADD… (ytc_UgzAK22lO…)
- That was a crazy story. I kinda didn’t believe what I was hearing when she was t… (ytc_Ugys5lBzt…)
- This is straight up fear mongering. A.I. chatbot gpt maybe good at story telling… (ytc_UgysmOrmP…)
- I don't think AI and robots will be able to work effectively in manufacturing. … (ytc_Ugy5IYKPO…)
- This is false. Per question!? Lol. Gallons? No. A data center uses a lot of wate… (ytr_Ugz_IHPJx…)
- @rjvtechnologies AI companies stole artist's work that was used in training data… (ytr_Ugzg1AS_-…)
- Not yet huh Y wud a ROBOT tell us that we are close to be overrun by ROBOTS 😅😅😅… (ytc_Ugzy1fRyH…)
- And i wonder when do these AI's begin to take over like they did in Terminator? … (ytc_UgxYQ1OLH…)
Comment
Isaac Asimov
The Three Laws of Robotics.
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · AI Governance · 2025-06-18T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzKMW6Y0OT6hTaUGE54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugysbuq9403cPD3mZNt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwubkC7tifs_5n2CJx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZqxWqP4bKiWoIE0x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw9f9mpxFp6z1nIghR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxECXqYH8qfn5mzWq14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyiQg1FlDRUPSDGDrV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwmtC7fi939-RZLR_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz3CfR3V2PQ8YV0jCh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxOK-dI469z0yYDLKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
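A coded batch like the one above can be parsed and indexed by comment ID with a short helper. This is a minimal sketch, assuming the raw response is a valid JSON array as shown; the sets of allowed dimension values are inferred from the examples in this section and the coding-result table, not from a published codebook, and the `ytc_abc` ID in the usage line is a made-up placeholder.

```python
import json

# Allowed values per coding dimension (assumed from the examples above).
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "resignation", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (JSON array) into a dict keyed by comment ID."""
    rows = json.loads(raw_response)
    coded = {}
    for row in rows:
        # Reject rows whose dimension values fall outside the assumed codebook.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

# Usage with a one-row batch (hypothetical comment ID):
raw = '[{"id":"ytc_abc","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}]'
print(index_codings(raw)["ytc_abc"]["policy"])  # regulate
```

Indexing by ID makes the "look up by comment ID" view above a single dictionary access, and the validation step surfaces any coding outside the expected label set before it reaches the dashboard.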