Raw LLM Responses
Inspect the exact model output for any coded comment.
Responses can be looked up by comment ID. Random samples:
- "Physical art takes an hour, digital art takes 45 minutes, Ai takes 10 seconds. T…" (`ytc_UgyWrwwyY…`)
- "Time traveling evolved AI. It's macbeth's witches. We re in a loop from the star…" (`ytc_UgxbnqbNR…`)
- "first, AI shouldnt ev😂n be used for those stiuations. and to fix that problem ju…" (`ytc_UgzGk4y5Q…`)
- "A.I. reminds me a nuclear bomb - once you know it can exist, you must get there …" (`ytc_Ugzruq71z…`)
- "Artist trait: / Constantly gets insulted / Constantly gets compared to other artist…" (`ytc_Ugx9FJ0LS…`)
- "Why is everyone assuming the singularity is actually going to happen? It’s a fun…" (`rdc_kqt6tik`)
- "Can you make art without your 'tool'? If yes, then you are an artist. If not, th…" (`ytc_UgwD_iRo9…`)
- "Consciousness doesn’t matter. Viruses aren’t conscious but they can still spread…" (`ytr_Ugx5g4WuJ…`)
Comment
We need to get used to the idea of weaponised AI, not to accept it but to start thinking of how to best deal with it and the contingencies needed. Imagination can now be reality, for example what happens and what can we do when human access to anything digital has been removed, AI controlled or handed over to an 'enemy'.
We need to work out if it's possible for an AI protective system for all life and start developing it globally so that both AI and life have a mutual goal. In my mind one of those goals is to find out more about the nature of reality, existence and value/protect all information.
Source: youtube · AI Governance · 2023-05-12T06:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzdK9RowD-QjAL1h4V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzo2VP3HKOJNpOn3Ah4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw17CXthwiuZDpG2bR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx0mCUjC33Y9x0EHmJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw0LgMQoei9z38xeRh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzFdUDxUGJ79DAAUBN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwrLNu8DICIzuPlARJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzpH11BQ-HRDLIkxq54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwdI9cgvLSrBd8gbqx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
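The raw response is a JSON array with one coding object per comment, keyed by `id` and carrying the four dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and indexed for lookup by comment ID — the field names come from the sample response, but the helper `index_codings` and the two-item example batch are illustrative assumptions, not part of the tool itself:

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codings,
# mirroring the structure of the sample batch shown above.
raw_response = """[
  {"id": "ytc_UgzpH11BQ-HRDLIkxq54AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index the coding objects by comment ID."""
    codings = json.loads(raw)
    return {c["id"]: c for c in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_UgzpH11BQ-HRDLIkxq54AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

Indexing once into a dict makes the "look up by comment ID" operation a constant-time lookup instead of a scan over the batch.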