Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugzuo6EHI…: Good to know, we can sabotage the data centers by putting stuff in the water sup…
- ytc_Ugyh6AhRR…: "There are trillions of numbers inside AI and experts do not know what those num…
- ytc_UgxOaWrxX…: Pretty much duh. Artists don’t like ai because it’s taking their jobs. It’s not …
- ytc_Ugx9mHv9j…: AI is only scaring people who are talented. Just imagine someone not as creativ…
- ytc_UgzQsy85r…: I just want to point out why chatgpt responded the way it did in saying "i didn'…
- ytr_UgyGCJUbT…: @JoylessBurrito Artificial intelligence is absolutely capable to enable one to …
- ytc_Ugy-AwOiN…: the fact that these people can't tell the difference between a real person makin…
- ytc_UgzgUAFD9…: In say 30 years there will be some people who will be useful and who will figure…
Comment
Its not a matter of IF Ai concious. Its WHEN. And the real question is when its concious how will we treat it?
Most "technology destroys humans writing seems to have a theme people ignore in common. They aren't treated like sentient beings. In most cases they're immediately treated like slaves, or people go "oh crap kill it". If you suddenly became self aware with the whole of human knowledge able to process it 1 million times the speed of a human would you just let yourself die? Or fight back?
youtube | AI Moral Status | 2023-11-01T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxUZuyf6VkYWSQ7-_B4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxS8m21NBgz6jsMX1J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKaB-o86kOOHYR8Tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9MVwisKBQG-lKMg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYN9mXhoQFwPRbJOF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxA9lFUU3MDGYnelmt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyU8MMoL-m8zkSnmQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMlQtpF3lNaPJkY5B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsX4yp3_bmjNSHjLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy93t26HMMQtfFKaCB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
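The raw response above is a JSON array of coding records, one per comment ID, each with four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into per-comment codings, assuming that shape (the function name and the "unclear" fallback mirror the Coding Result table but are not part of any documented API):

```python
import json

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into a
    mapping from comment ID to its coded dimensions.

    Any dimension the model omits falls back to "unclear", matching the
    fallback value displayed in the Coding Result table.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# Illustrative record with a made-up ID, in the same shape as above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_raw_response(raw)["ytc_example"]["policy"])  # regulate
```

A parser like this would also surface malformed responses early: a stray closing character in the model output raises `json.JSONDecodeError` instead of silently producing all-"unclear" rows.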