Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Let’s get real. Big Tech, Big Biz, etc their #1 objective for AI is to influence…" (`ytc_Ugxxk513c…`)
- "If AI works for free (not including energy and processing consumption) then the …" (`ytc_Ugx7PaKY3…`)
- "Some years back, I created a system based on the NEAT algorithm that evolves its…" (`rdc_jkhifu8`)
- "there is no separation. we are the infinite sea of consciousness- all of us. AI …" (`ytc_Ugxi8qyYh…`)
- "First thing I noticed, video 11:11 long. Next thing, screw AI with a blow torch …" (`ytc_UgxxduYuq…`)
- "This interview is an investor call not a real view of what is actually hapoening…" (`ytc_UgzPI6-DR…`)
- "Facial recognition hmph, they can't even recognize a black hand under a hand soa…" (`ytc_UgwsUEoRm…`)
- "I am screaming, ai's are calculators for creativity. Use it that way only! But d…" (`ytc_Ugws7x2GS…`)
Comment
The biggest fear I have is that the AI system of a select few companies could take over the entire AI industry. Just like how Apple and Samsung dominate the mobile phone industry.
In that scenario in 10 years 1 or 2 AI systems could control huge swathes of society. That is dangerous.
I feel like a few ground rules need to be established with AI on a global scale, I mean something akin to the Geneva convention.
The number one thing is that AI can never be in control of replicating or spreading itself into other devices. Another one could be a universal kill switch for any one AI system which would destroy all AIs of the same type across all devices.
Source: youtube · AI Governance · 2023-05-02T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzxA5z8K4xN0D3guwJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx_c_wvOn-FDrLPHKp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwDIKHwwq54eSEYZsl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwACK_4gNoCthCGzwl4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxrRpKw3RggZysiygl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyImSY3LcYKoHYpW7B4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxSwUhrHG1Y_gF3_s54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz3Uwuxiua5s2829mR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzMSMd3rCTrh4-ui3J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwPyjS5Er9lWoQyXxZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
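The raw response above is a JSON array with one coding record per comment, keyed by `id`. A minimal sketch of how such a batch response can be parsed and indexed for lookup by comment ID — the function name and the truncated sample data are illustrative, not part of the tool:

```python
import json

# Illustrative batch response: two records in the same shape as the
# raw LLM response shown above (IDs copied from that response).
raw_response = """
[
  {"id": "ytc_UgyImSY3LcYKoHYpW7B4AaABAg",
   "responsibility": "company", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzxA5z8K4xN0D3guwJ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)

# Look up the coding dimensions for one comment ID.
coding = codes["ytc_UgyImSY3LcYKoHYpW7B4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company regulate
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: one parse of the batch response, then constant-time retrieval per comment.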