Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples — click to inspect
- "@roxsy470 1. „Extracting data” is not one single action from a legal stand point…" (ytr_UgyKDcML2…)
- "All of this will be corrected in 2 years. AI is easy to discern now, the problem…" (ytc_Ugzz3AFiH…)
- "I blown here by the algorithm winds funnily enough. Keep fighting the good fight…" (ytc_UgyDScZWz…)
- "The robot 2-pieced him.... that is like the joke All the bots are going to recha…" (ytc_UgzN3ygb6…)
- "WITH YOUR CORROSIVE WATER GUN ...GOODBYE ROBOT YOUR BRAIN IS FULL OF EXPERIE…" (ytc_Ugy2MG00S…, translated from Spanish)
- "AI isn’t just a threat to the working class…it threatens many middle and upper m…" (ytc_Ugx4vwiiK…)
- "10:05 [ EN ] This was not human error, the operator was given clear and confirm…" (ytc_UgzA8fMp6…)
- "Lets be real, most politicians are older creepy men. They aren’t going to be for…" (ytc_UgwShsXr6…)
Comment
It seems like ai should be built with the highest initial inputs being towards altruism. Even in that scenario outcomes could be catastrophic but the potential for incredible positive results could also be potentially realized. If you have the ability to build a god it should be done in line with the highest ethical standards that can be applied. It really shouldn’t be done at all but think the cats out of the bag on that one.
youtube · AI Governance · 2024-06-07T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_Ugy7FhpXRCOevbLGoQ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRK08ijyxj43Stl8F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlEI-7nUquT3W7Gl94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwalsiOPM5oQdBZe5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJoOYSxRmJrtz3UOx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyXntFmnc0JEipIU8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxU3z6ApY7HlfOJymZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzzi6zgUIlmXZcRwjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz72opCi2I6pRyvuBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJSn3-E_xm8ehT79B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"})
```
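Note that this raw response closes with `)` where valid JSON requires `]`. A minimal Python sketch of how a coding pipeline might consume such a batch response, assuming (hypothetically; `DIMENSIONS` and `parse_batch` are illustrative names, not the actual pipeline's API) that an unparseable batch is recorded as "unclear" on every dimension:

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions.

    Returns an empty mapping when the batch JSON cannot be parsed
    (e.g. a stray ')' where ']' was expected), so the caller can
    record every dimension for the batch as "unclear".
    """
    try:
        items = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        item["id"]: {dim: item.get(dim, "unclear") for dim in DIMENSIONS}
        for item in items
    }

# A well-formed one-item batch parses cleanly:
ok = parse_batch(
    '[{"id":"a","responsibility":"user","reasoning":"virtue",'
    '"policy":"none","emotion":"outrage"}]'
)
# A batch ending with ')' instead of ']' fails to parse entirely:
bad = parse_batch('[{"id":"a"})')
```

Under this assumption, a single malformed closing bracket is enough to turn a whole batch of otherwise usable codes into "unclear" rows, which is one plausible reading of the result table above.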