Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I don't care if you drive a Tesla or a hamburger, I don't trust ANY driver but m… (ytc_Ugydc9rwu…)
- "No AI" is the new standard of worth. Seriously. Advertising "No AI" is a winnin… (ytc_Ugy1UFltj…)
- I'm an artist myself, and I've noticed a major difference in how AI versus human… (ytc_UgySrBD8a…)
- All wars would end tomorrow if the leaders supporting it would be sent to the fr… (ytc_UgySwSm9w…)
- I see this with 3d printing all the time. These scammers sell models that look o… (ytc_UgxcqgUr4…)
- AI is not “free.” It takes tons of water. Companies that use it should be respo… (ytc_Ugz4_2mqX…)
- What's the point of company's saving money with AI if there are no consumers to … (ytc_UgwflAyGH…)
- Why didn't the parents help their son find mental health help? Instead the son t… (ytc_Ugzh_3YPl…)
Comment
as a student AI master in germany and each week we have 3 ethic courses that ever what it costs we give promise to not give the Ai the algorithms of emotions and one professor made all student to repeat this sentences ten times . we are creating a monster that we will never controll because humans are limited by biology and aging and diseases while AI nothing can stop it .its like a snow ball more it rolls more it become bigger.and the most bigger danger is making Ai emotional this will lead the end of humanity because machines will develop the instinct of survive and threat and fear of death
youtube
AI Governance
2023-05-03T00:0…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx4qnWhKg5X2PqfYcB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCNxjOQFZDny9_YG94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzemNyWh-uw1DBf1ix4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwN6mMiRgdWrXF1ZI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy8JxYaYNoo13JQgQ94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugymln3xD7yZSG2h_m94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy6Ncxl7oTA2eWaRAF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx7J87NH7_8gnhkQxx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy1vG5mQFxsfxE06Ix4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKgEz6AXeprEegMz94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
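The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions. A minimal sketch of how such a batch could be parsed and sanity-checked before use; the allowed-value sets below are inferred from the visible output, not from a documented codebook, and the record in the usage example is hypothetical:

```python
import json

# Category sets inferred from the responses shown above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer", "company", "distributed"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist", "contractualist"},
    "policy": {"unclear", "none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records.

    A record is kept if it is an object with an "id" field and every
    coding dimension holds a value from the inferred codebook.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the second record has an out-of-codebook value.
raw = (
    '[{"id":"ytc_example1","responsibility":"developer","reasoning":"mixed",'
    '"policy":"liability","emotion":"fear"},'
    '{"id":"ytc_example2","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print([rec["id"] for rec in parse_batch(raw)])
```

Dropping (or flagging) out-of-codebook records rather than coercing them keeps the "exact model output" inspectable here while preventing malformed codes from leaking into downstream tallies.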