Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "There would have to be a tipping point event with AI. Some sort of high profile …" — `rdc_n81g07l`
- "Pedestrians always have the right of way, this includes animals, you don’t get t…" — `ytr_UgxvicR0C…`
- "Well its no wonder that these elaborate algorithms are behaving in the same patt…" — `ytc_Ugw-pxjtX…`
- "Unfortunately automation will continue to displace human workers. The future wil…" — `ytc_UgyFoCB0i…`
- "AI is now smart enough to be considered a race, artifical species, and intellige…" — `ytc_Ugx6nqJql…`
- "Hey @renannrodriguez3612, thanks for the comment! I must admit, the right hand o…" — `ytr_UgzILlgHu…`
- "Lets be honest, the do ur own research guys are doing this stuff since for ever.…" — `ytc_Ugybt8eH7…`
- "Like making terminator style robots? Before that, they need to make 100% sure th…" — `ytr_UgxD4gZvZ…`
Comment

> We dont need ai, we want ai, we dont need robots to do the small things we want them to because we are lazy. Robots cant discover, only point out our discoveries faster. Ai will only lead to war when they use our own fantasy of ai war against us. They will knowingly try to destroy us in artificial fear no matter the failsafe they will bypass. We bypass failsafes they will do it quicker.
>
> They design ai because they want it done fastest, to be lazy, we survived on fight or flight, we thrived on invention, we die by being careless and lazy

Platform: youtube · Video: AI Moral Status · Posted: 2021-08-08T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzwZAP7bUazv6-eUr54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAlxi-5DcnVG_R7ql4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwcRVTknh6ajg-dXHd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyggfqYyLMYrmIldYp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxH8wmghmo4p-X1hBl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgymNVnljFj5GyHprPN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzi5Vas8ZXGZtIPnFl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwTu_vcSdeByiR5C4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzluMth0XixHwxh3kt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNaGchLpqa6zsbU8h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
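The raw response above is a JSON array with one coding object per comment, keyed by comment ID, with the four dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed, validated, and indexed for the ID lookup feature — the function name and validation approach are illustrative assumptions, not the tool's actual implementation:

```python
import json

# Abbreviated example of the raw model output shown above:
# a JSON array of coding objects, one per comment.
raw_response = """[
  {"id": "ytc_UgzwZAP7bUazv6-eUr54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwNaGchLpqa6zsbU8h4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw coding response and index it by comment ID.

    Raises ValueError if any entry lacks a required field, so malformed
    model output is caught before it is stored or displayed.
    """
    codings = {}
    for entry in json.loads(text):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing {missing}")
        codings[entry["id"]] = entry
    return codings

codings = parse_codings(raw_response)
# Look up a coded comment by its ID, as in the interface above.
print(codings["ytc_UgwNaGchLpqa6zsbU8h4AaABAg"]["policy"])  # ban
```

Indexing by `id` makes the "look up by comment ID" view a dictionary access rather than a scan over the array.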