Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgzFr_Obh…`: "It amazes me how people can believe in the universe is in a simulation and that …"
- `rdc_mp4iujt`: "This is a lot more accurate. The seniors and mid levels at work are REALLY effic…"
- `ytc_UgyfcsDFG…`: "Jumps across a gap on a building the police robot you can’t do that. It’s not fa…"
- `ytc_UgyPvrEmn…`: "Who cares what a super-AI could to to humans, if the 1% is already doing it to e…"
- `ytc_Ugxi2s1AU…`: "Wouldn't a really smart AI that wants to fool you just fail the touring test on …"
- `ytc_UgynOcTwt…`: "I share a similar feeling as this person, but I’m not an advocate for AI art. I …"
- `ytc_UgwcQvDuq…`: "First - LLMS are not AI. Ai does not exist and never will be built. Mind has to …"
- `ytc_Ugz2TKmkZ…`: "I thought it was the immigrants that were stealing all the jobs, another excuse …"
Comment
As SMR said.
"you're not reading between the lines".
"FSD/ Optimus" isn't the issue.
.
The issue is an AI potentially either misunderstanding it's role, or a fundamentally bad core instruction being cemented by "us" into *one* of the apparently numerous systems now in development.
.
As Steven suggested with the "paperclip" analogy, one instruction, misinterpreted by the AI (OR, more likely poorly phrased by any one developer) could lead to (just for instance) a broad goal of "saving the plant" resulting in "The" AI *reasoning* that at a fundamental level the goal requires analysis of what is the greatest *danger* to "the planet".
The (LOGICAL) answer to which could quite easily be "Humans".
.
If the AI then reasons that "reducing resource consumption by Humans" is a way to achieve the goal, the next (LOGICAL) step may be to remove the ability to pollute.
"The" AI then turns off every automated valve on every energy production plant under its control (which would be every one under computer control with a network connection...... All of them?).
.
No malice in its part, but dire consequences.
.
Your "excellent images" can wait.
Source: youtube · AI Governance · 2023-03-30T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgxBbjHdNEXCBjidVfZ4AaABAg.9nsv8z4LZEt9nsvHhBQyLi","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugyn8vRMgBLNpaTFoeJ4AaABAg.9nsuZtTfUIb9nsxmDm3iEf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwD5K5LX9k7VcdSeYZ4AaABAg.9nsrH1bRpzH9nsss5rpbvT","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugy_bymuQCOqeV6s8Sl4AaABAg.9nsqsDyTBSf9nstu0Pqwrd","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyHLe62bx7LqEPZlV54AaABAg.9nslC6fF1eg9nsmYSKG1ad","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzUvcNmWdhebmM0rq14AaABAg.9nsiiFbt9uv9nso-HFEp4z","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgyQuu1agpBnCarU-WV4AaABAg.9nshRnwXsTD9nstNXShz5G","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz_ExjirZyC-Mmo8sl4AaABAg.9nsh0_j6Fq69nskh_BrUbB","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugws9LG6oGPBF1DhUfx4AaABAg.9nsPu9YTMQy9nsmHJwbPS8","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugxuw4r-QaSU79GjGdd4AaABAg.9nsPIpCpSlP9nshLvBQVsD","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
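The coding result shown in the table is extracted from a batch response like the one above. A minimal sketch of how such a response could be parsed and validated, assuming the allowed values per dimension are those seen in the table and samples here (the real codebook may be larger, and `parse_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# This vocabulary is an assumption; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "resignation", "indifference", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Rows missing a dimension or using a value outside SCHEMA are dropped,
    so one malformed row does not poison the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in SCHEMA}
        if all(codes[dim] in allowed for dim, allowed in SCHEMA.items()):
            coded[row["id"]] = codes
    return coded

raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_batch(raw)["ytr_example"]["emotion"])  # fear
```

Validating against a fixed vocabulary matters here because an LLM can occasionally emit a category that was never in the prompt; dropping such rows keeps the "Look up by comment ID" view consistent with the codebook.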