Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- "Sorry creepy comment was not for you it was for something i was watching accide…" (ytc_Ugz6jRv72…)
- "One problem is the transition between, at the start self driving cars will be mo…" (ytc_UghKAohdh…)
- "whenever a robot is in human form it is the least intelligent one, most likely c…" (ytc_UgxmgPEuQ…)
- "No automations can't take your job because as like karl marx said if there is no…" (ytc_UgylRm2KM…)
- "@laurentiuvladutmanea what you said has nothing to do with anything here.txt su…" (ytr_UgxW1JP0Y…)
- "Ew ai, why people willingly make themselves dumber is beyond me. But while I no …" (ytc_UgxfWI0a0…)
- "Sounds like parents who are unwilling to accept their role in his suicide, and a…" (ytc_Ugz-t7pcd…)
- "Why on earth didn’t you immediately step in when that guy said, “Elon has no mor…" (ytc_UgzE4n2PB…)
Comment

> Wouldn’t that mean that you need all the good ideas you can get, on how to mitigate the dangers? It seems simple but I hear of no “Summits” on how to guarantee that AI is always subservient to and sensitive to, babies or humans.

youtube · AI Governance · 2023-10-23T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXr5bgs6YkuEdHp0J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYYqaUC3q9tY6zLoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMqYAhPLDrPg3oBtp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgytrEPR80wUQxcdg7V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8fKlqJQyKtJzJTeF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1HTnyPwTNnexDSb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzNpFb7ZnKUnINF_p14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgwQvN87dkOSsmuvejl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz8-iyBXUNcoHFC7xF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxjjjkEDUfFyNZumnd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
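The lookup-by-ID workflow above can be sketched in Python. This is a minimal sketch, assuming the raw LLM response is always a well-formed JSON array with the four coding dimensions shown in this sample (real model output may need validation and error handling); the `index_codings` helper and the truncated sample IDs used here are illustrative, not part of the actual tool.

```python
import json

# A trimmed example of a raw LLM response: a JSON array of per-comment
# codings. Field names and values follow the sample above.
raw_response = '''[
  {"id": "ytc_UgytrEPR80wUQxcdg7V4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzNpFb7ZnKUnINF_p14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "outrage"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgytrEPR80wUQxcdg7V4AaABAg"]["policy"])  # prints: regulate
```

With the full response loaded, the same dictionary backs both the random-sample view and the ID lookup: any comment ID resolves directly to its coded dimensions.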