Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_jf6velv`: This was actually, in part, Sam Altman's own prediction. He thinks that a lot o…
- `ytc_Ugzg--IrK…`: AI has no constellated core, no self-> everything is externalized -> same as a t…
- `ytc_Ugw5Rmx5G…`: AI will never be able to do things like humans do. Humans put emotions in the th…
- `ytc_UgxZw4ysS…`: "I dont think I like to think about whats going to happen" 😮wtf do you mean. We …
- `ytc_UgwIJBliZ…`: This can be achieved by training LLMs not on the data that has ability to create…
- `ytr_UgxVpUn1Z…`: With limited resources and unlimited population? It would not be possible. Thats…
- `ytc_Ugyq-BJmN…`: The whole premise here seems to be that AI "can't" write good code because it ig…
- `ytc_Ugw34Fc2D…`: "Show me how" to copyright an AI co-created song where I've written all the lyri…
Comment

> People are scared of AI because we have been negatively affected by it. Look how many people have lost their jobs because of AI. Is it really a good thing that we could decide the direction that AI could go? Is it not a double-edged sword? Majority of companies that develop AI are actually there for money. And some people are willing to do everything to make money, which means they can totally lead AI to a wrong way if they could make a huge money out of it. Not 100% of people develop AI for good purposes, and you certainly have no control over it.
Source: youtube · AI Responsibility · 2024-11-01T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxaYTcy9GuusN9kD1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTz-kffZoZxTbi6AV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyL0mLMR-nClDxxJf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwFRBlOGleNMH4r-tp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgySR7IJmC6dWCcS9-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwZ8l-xwL6TmDTvfzB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwnL7R2dfDsQHyYZld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzwE6sfWk8UaSCxA3d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx3DVa_GAUyh3Who0F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw04uiqnR3asSWfZ6Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
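The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and validated before use (the field names are taken from the response above; the `parse_coded_batch` helper and the embedded sample record are illustrative, not part of the actual pipeline):

```python
import json

# The fields every coded record is expected to carry, as seen
# in the raw LLM response above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each record is complete."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of records")
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
    return records


# Hypothetical single-record batch, mirroring the coded comment above.
raw = (
    '[{"id":"ytc_UgzwE6sfWk8UaSCxA3d4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)
batch = parse_coded_batch(raw)
print(batch[0]["emotion"])  # prints "fear"
```

Validating up front means a truncated or malformed model response fails loudly at parse time instead of silently producing partially coded rows downstream.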