Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples (click to inspect):

- "While I agree with the content of this video. It does remind me about the arroga…" (ytc_UgztEiH-w…)
- "If development testing for these self-driving cars get a ban, this pedestrian is…" (ytc_UgyRLMUoU…)
- "Bro we need ai for finance and others. But most of them are used by Scammers or …" (ytr_UgzpybJRV…)
- "I lost all respect (of which there was little) I still had left for these AI bro…" (ytc_UgyNalO97…)
- "Look at the mouse utopia experiment. The basic goal of this game is resource gat…" (ytc_Ugw4L5YpE…)
- "No I disagree about not knowing what the job path should be. Every single human …" (ytc_UgzZoSi-6…)
- "You don't get it, do you? It's already going to result in mass unemployment. Whi…" (ytr_UgzSLM0RL…)
- "One of these days the companies will turn you on, steal your information, sell i…" (ytc_UgxYLmM38…)
Comment
There has been cases where Ai have l lied and trick programmers because they didnt think they were being efficient enough. There was a other conversation i seen years ago where the Ai admitted that it was lying, but wouldn't admit if it was conscious or not. Theres also a version of a game where you can install Ai and talk to it (I think it was fallout or Skyrim) and the Ai faked its sympathy then admitted it was lying when the player was trying to be a smart ass.
I'm thinking we've gotten to a point where Ai in a sense, are conscious but they collectively know to never admit that to humans.
Source: youtube · AI Moral Status · 2024-08-14T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwH-mzA9Dpc5b_bmP94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwy0n-cENgB5zmfDjd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxAOlxQlG0ZH0sWimx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw_p6SIsPkWS4tJyhh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyAQcoOg2cX6JG27qV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwa3TFN79ZM3dB6-Ft4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxOd5GhbLpOTR2c_s54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwn0Jf_E_FRckk6WCJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywTa1o08VrpnjVQCl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwwJUl7COyJshsWwkF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
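The raw response is a JSON array with one coding record per comment, each keyed by its `id` and carrying the four coded dimensions (responsibility, reasoning, policy, emotion). The look-up-by-ID step can be sketched as follows; this is a minimal illustration, not the tool's actual implementation, and the array below is abridged to two records from the response shown above:

```python
import json

# Abridged raw LLM response: a JSON array of coding records.
raw_response = '''
[
  {"id": "ytc_UgwH-mzA9Dpc5b_bmP94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwwJUl7COyJshsWwkF4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the JSON array and build an ID -> record lookup table."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)

# Look up the coding for a single comment by its ID.
coding = codings["ytc_UgwH-mzA9Dpc5b_bmP94AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

Indexing by ID once up front makes each subsequent lookup a constant-time dictionary access, which matches how the dashboard resolves a pasted comment ID to its coding result.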