Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
Random samples
- "As Ai gets far more into the future. When Ai doesn't reach our goals. We hit the…" (ytc_UgzH5-zzT…)
- "He is right. A.I., even after it has developed strategies of problem-solving wel…" (ytc_UgzY5ABc2…)
- "The tech oligarchs will use AI to develop a bioweapon that will kill everyone ex…" (ytc_Ugy_HNmU_…)
- "And we wonder why ai is going to replace us. We are doing it to ourselves by cho…" (ytc_Ugw3sNUIY…)
- "i’ve tried mixing up tone and phrasing, but Winston AI always finds those little…" (ytc_Ugzssy4ux…)
- "Fully self driving cars make sense, just not any of the ways we've tried to do i…" (ytc_Ugxd_KMn0…)
- "It sounds like you're expressing some strong feelings about AI and its implicati…" (ytr_UgxJOn3yG…)
- "My dad is an incredible artist but has unfortunately fallen for liking Ai art, h…" (ytc_UgwyiEZzG…)
Comment

> How many times would you allow someone to play Russian roulette with your loved ones.. five? four? Zero.. no matter what science claims can be learned from the experience..if it's possible anyone could use it negatively then no one should be allowed to develop it.. and those who do should be severely punished and shamed. Restricting machine learning to a specific dumb "druid" level should be the international communities primary concern. To allow your own tools to, however implausible, have an opportunity to break its chains is suicide..and no one group or individual should be allowed to roll that dice.

youtube · AI Governance · 2024-03-27T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxdKFA-iUiWIpaxryZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugyla4feeTyK7Jg2h3h4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfB53RPkemwIUwNvt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxTwTTcaZs6wsamx6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1BaMLCNRUTSaaqT94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwiJipLlGN87P3NF754AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzbSFlVMvAStNtsx-l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8cTYSvX9hdfOn3SN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgysuezsAgJ5yhgsOkp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzUHqjYK0I-xNKhyop4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
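The raw response is a JSON array of per-comment coding records, so mapping a comment ID to its coded dimensions is a small parsing step. A minimal sketch in Python, assuming the raw model output is available as a string (the variable and function names here are illustrative, not part of the tool):

```python
import json

# Raw model output: a JSON array of coding records, truncated to
# two entries from the batch above for illustration.
raw_response = """
[
 {"id": "ytc_UgysuezsAgJ5yhgsOkp4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_UgzUHqjYK0I-xNKhyop4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(response_text):
    """Parse the model output and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
coding = codings["ytc_UgysuezsAgJ5yhgsOkp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Indexing by ID this way matches the "Look up by comment ID" workflow: the Coding Result table shown for the selected comment is simply the record whose `id` field matches.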