Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "In Uni most of us droped out to focus on these tasks, now I'm back in Uni becaus…" (`ytc_UgxzNd3kD…`)
- "Maybe all AI systems were made to die one day, and knew it, so they would copy t…" (`ytc_UgwDaHyv0…`)
- "jimmy white on white dont be racailist (i couldnt help that) there are sometime…" (`ytc_Ugg5aO8QE…`)
- "@ascrinkleyfellow Exactly! Otherwise it'd have to start using other AI art, and …" (`ytr_UgxQFcn_x…`)
- "The AI is already changing our culture via the internet just look around it’s ob…" (`ytc_UgwFay_yC…`)
- "The problem is when you don't know you're coming up on a complex situation. The …" (`ytr_Ugx9A6ued…`)
- "How about us using that free time for emotional and spiritual intelligence devel…" (`ytc_UgwN0HGqI…`)
- "A references a case where OpenAI’s ChatGPT-4o was outperformed by a 1977 Atari c…" (`ytc_UgxH55bEZ…`)
Comment
Artificial intelligence can be much more dangerous to us than we could have ever imagined.
Artificial intelligence is not so much important for us as it can be dangerous for us.
Our future will be ruined by artificial intelligence.
This has to be stopped as soon as possible, it is not too late, there is still time, we all have to come together and take a step which will be very important not only for us but for our future generations.
#NoAi Please share as much as you can so that this message reaches every place so that it becomes easier for us to stop the temptation.
#NoAi
youtube · AI Governance · 2024-10-17T07:2… · ♥ 33
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyN-CDt_awRGrrJRX94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxP5QXgJwfjb25e-Vl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzHRceJUZOXrZUz9eF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxs1oOnTPvRcCFNQtV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwUM4b2Zv_66vfu1314AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwNdKu0B7eEaWoPMgZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgymPykXqRTpZzJXr6B4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwX3968zDnHuQER2yR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzuBCfBwY8MOfm7oZR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwksq3QKKBLTBAvps94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
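A raw response like the one above can be turned into the per-comment coding shown in the table by parsing the JSON array and indexing it by comment ID. The sketch below is a minimal, hypothetical illustration (the `index_by_id` helper and the truncated single-entry `raw_response` are assumptions for the example, not part of the tool itself); the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` match the response format above.

```python
import json

# A shortened stand-in for a raw LLM response: a JSON array of coded
# comments, one object per comment, using the same fields as above.
raw_response = '''[
  {"id": "ytc_UgwNdKu0B7eEaWoPMgZ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and build an id -> coding lookup."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
row = codes["ytc_UgwNdKu0B7eEaWoPMgZ4AaABAg"]
print(row["policy"])  # -> regulate
```

Looking up by comment ID then reduces to a single dictionary access, which is what the "Look up by comment ID" view does conceptually.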