Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect

- "You are not too bright. Humans also use "Datasets" except we call these memories…" (ytc_UgzC7_mrC…)
- "This is all overhyped nonsense! No computer program has any consciousness whatso…" (ytc_Ugx_ixl_Z…)
- "They created something more intelligent than us, which they can not control, and…" (ytc_Ugz49e1dW…)
- "As long as you don’t claim it as your art ai art can be fun to mess around with…" (ytr_UgwPT52Hz…)
- "Health care or Health SCARE? Since 2020 plandemic and the introduction of the "s…" (ytc_Ugy0uOEQz…)
- "The movie of chihiro is a good example of something the ai wont be able to do in…" (ytc_UgxQx1srB…)
- "I still don't get the complaint. The AI does not reproduce the books. Just like …" (ytc_UgxzwYSYs…)
- "It's not India that's using AI in agriculture. Former software engineers got tir…" (ytc_UgzfPnFdI…)
Comment
you guys know what this means right???
if someone gets there hands on this. they could Murder multiple people without Getting shot or stunned With the pain, So this sadly for this reason unless Elon is a idiot actually. wont be released to the public T_T But maybe you can be trusted or maybe it will be MORE Expenssive With more saftey features and Maybe even a real police officer Always watching when its on Only when you pay for them to watch. So it either wont be released maybe or it will idk. BUT. They could still release a robot That has a button on it anyone can shut off By pressing or turn back on maybe to make it easier. AND, they might still be able to make things Like just hands with wheels on the ground still cool but not as capable of robbing or harming!
Platform: youtube | Topic: AI Responsibility | Posted: 2025-10-16T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzoqtxyMtJ40DUw06t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgykyUF-AmhnoN55FPF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1DM4WHJkLjquyeXl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwNtZAmj2KqURAD6s54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyuhYQhbcxBs23Tw054AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxEEf8T2IGpNSZyubR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoSGku3r62z2tNp7J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxASdUqCfFgx1DZUz54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzEGttcnWHHpPAjPrp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx8rNhtn9FTLiqVOZV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
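The lookup-by-ID workflow above can be sketched in a few lines: parse the raw model output as JSON, validate that each record carries the four coding dimensions, and index the records by comment ID. A minimal sketch, assuming the field names shown in the raw response; the helper name `index_codings` and the strict validation step are illustrative, not part of the tool itself:

```python
import json

# Raw LLM response, truncated to two of the records shown above for brevity.
raw_response = '''
[
  {"id":"ytc_UgwoSGku3r62z2tNp7J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxASdUqCfFgx1DZUz54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# The four coding dimensions that appear in every record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse model output and index the coding records by comment ID.

    Raises ValueError when a record is missing an expected dimension,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(raw)
    index = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codings = index_codings(raw_response)
print(codings["ytc_UgxASdUqCfFgx1DZUz54AaABAg"]["policy"])  # regulate
```

Indexing by ID this way makes the "look up by comment ID" view a constant-time dictionary access rather than a scan over the raw response.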