Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

Random samples:
- "telling hayao miyazaki to his face that they want to replace artists like him an…" (ytc_Ugy-bPCFf…)
- "When defining humanity she does not include fragility, mortality, and the capabi…" (ytc_UgyStuh91…)
- "This same situation is going to affect a lot of jobs as AI figures out how to re…" (ytc_UgwcYs4Kx…)
- "Who decides if the AI gets the green or red light? It’s not based on the majorit…" (ytc_UgzHaIvm0…)
- "You're right in that he starts with a false premise but you are basing your rebu…" (ytr_Ugz1Mn0nD…)
- "Just don't use Google bard for anything medical or when shopping for a medicinal…" (ytc_UgyUX_ajF…)
- "So when humans are no longer required to do the work nor think, they cease to be…" (ytc_UgxK4zg71…)
- "Palestine was not a name indicating or referring to its inhabitants. It was a pl…" (ytc_UgwSw-FPJ…)
Comment

> The way, and certainty with wich, Ameca's creator talks about what the AI wants or feels (or not) is concerning. Especially since I've seen him telling Ameca to "shut up" in quite a disrespectful and unnecessarily rude manner during another interview. Programming an AI driven robot to be polite and nice in their interactions with humans while acting opposite to those instructions _and_ underestimating its intelligence seems like a recipe for disaster. Do as I say, not as I do teaching styles are rarely successful. Gives the AI a lived understanding of how many humans treat those deemed "inferior" (such as non-human animals, human children, others of "less intellectual capacity" and inanimate objects) though. Might prove a valuable experience on behalf of Ameca. Might prove detrimental to the humans engaged in such behaviour.

Source: youtube
Posted: 2024-02-11T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwepttiHeOB1qztBGR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw7V68XyvxsNpp9t714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzxcPibR6RSgWulnBZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxb7HTrFLAG8bIQmfF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugw68jGZmkbue0Mwaut4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgygViCsMvqTWCxXceV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzd_Ywx880HDV9wek94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynYXZlDBTD6SrW1od4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyN3YJFoz_Qw5DisUV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzZ6foBmYWf6CnBZgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
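A minimal sketch of how a raw response like the one above could be parsed and looked up by comment ID. The variable names and the single-entry example payload are illustrative; the entry shown is the one coded for the comment displayed above, and a real pipeline would load the full JSON array returned by the model.

```python
import json

# Illustrative raw LLM response: a JSON array of coded comments,
# one object per comment, matching the format shown above.
raw_response = """
[
  {"id": "ytc_Ugw68jGZmkbue0Mwaut4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "regulate",
   "emotion": "outrage"}
]
"""

# Parse the array and index the codings by comment ID for fast lookup.
codings = json.loads(raw_response)
by_id = {row["id"]: row for row in codings}

# Look up a single comment's coding by its ID.
record = by_id["ytc_Ugw68jGZmkbue0Mwaut4AaABAg"]
print(record["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one pass over the response builds the dictionary, and every subsequent lookup is constant-time.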