Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz5HhoW-…: We can launch nukes all around the earth in low orbit that will fry every single…
- ytc_UgxCZdRU_…: This "AI getting rid of people because it does not need them" is such a stupid t…
- ytc_Ugy-kx_JL…: The middle class is at higher risk, as both the middle class and AI deal with in…
- ytc_Ugx73hvfw…: It's a written exam, the clinical exam is arguably more important and something …
- ytc_UgwqBy0Ec…: The only time I used ChatGPT was to ask if Epstein killed himself and it flagged…
- ytc_Ugzu-OXq6…: So we’re just completely forgetting how art is all about creativity and expressi…
- rdc_g4l3wwo: unless you own the platform yourself, you are subject to all the rules set by th…
- ytc_UgxMLUBFQ…: I've literally had nightmares involving being stuck inside a self-driving car. T…
Comment
Given that the whole world works on this technology and doesn't have to comply with your rules and standards, and considering its power, what is the use of even the most perfect agency, laws, and regulations you can devise when a fairly small company, not a country, could build a very big and bad, not to say evil, AI? How would you solve a problem that is not in your hands? This is far greater than the invention of the atomic bomb; it is like inventing an atomic bomb that any company could build out of seawater in a few hours. Forget regulating that: everyone would have one by tomorrow morning, and by the time you drafted a single law, every country in the world would have a whole bunch of them.
Yes, I do exaggerate a bit; the speed is not really that high. But am I all that far from the truth? Wait a few years and tell me whether I was right. I hope not.
Source: youtube · Topic: AI Governance · Posted: 2023-05-18T21:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzsvmj0bbd-HbQfzTR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZE1uKqAzRxoaNqc94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxjB8QlWoolkVgUAUx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwM63I__v3k2GDetTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMTMb52k1uWFhDbzJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzLI0fTGhFfOVgDvgp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzy0g5FX72XQ_Y4mft4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9SB6z8O0DRp3d9RZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbgaiIYd5fkpN5OCh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxrD0GnsfHJFN8P2T14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
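A raw response like the one above should be validated before the codes are stored, since an LLM can emit values outside the schema. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the samples shown on this page and may be incomplete.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}: {rec.get(dim)!r}")
    return records

# Usage with a one-record response (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(len(validate(raw)))  # 1
```

Rejecting the whole batch on one bad record keeps the coded table clean; a gentler variant could instead drop invalid records and re-queue their comment IDs.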