Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgwwDZlcq…`: "That's a brutal but honest observation. And it actually connects directly to the…"
- `ytc_UgxI_4gaC…`: "I learned art like ai. Could you give me examples of how you learned that is dif…"
- `ytc_UgxAEowrd…`: "Looking at anything AI generated feels like looking at either stolen work, or th…"
- `ytc_UgwpGdK2O…`: "As a nurse I would love for AI to at least help out with the heavy lifting we do…"
- `ytc_Ugzr5N1lK…`: "Cresting the peak of inflated expectations and a 1/4 the way down the through of…"
- `ytc_UgxOlfBiB…`: "Silly physicists believe that consciousness arises from complex systems when the…"
- `ytc_Ugx0fZqLE…`: "This is just infuriating. This is why I personally think AI is doing more bad th…"
- `ytc_UgzvJRHps…`: "there is nothing wrong with work being wiped out by Ai, what's immoral or injust…"
Comment

> 10 years is way too optimistic.
> The newest models got the possibility to improve their own code. What possibly could go wrong, right?
> The problem is that humans think that they have an AI still in their hands...
> They will never slow down because of the fear they will lose this race with other countries. Pandora's box was opened and a genie came out of the bottle.

Source: youtube · Topic: AI Governance · Posted: 2023-07-07T09:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgySKW176UPvripbH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyjY2dXlFoeIhNacMR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxaDIhRkSCKxtWw5nB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxRbWRzLCpjC675Vs94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyFrtnsGVlYL77Hf-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx5mTfOeUxvWBJMBP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwgWQElQQR_t2y_Uo14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwlnSezJ9FGb_BLqYh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzPX3-Gh9zoltjM77V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgycQu9Gv_dnxZkCA4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
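Since the model returns one JSON array per batch, looking up the coding for a single comment ID is a plain parse-and-filter. A minimal sketch, assuming the response is valid JSON as shown above (the function name `lookup_coding` is illustrative, not part of the tool):

```python
import json

# One row from the batch response above, used as a self-contained sample.
raw_response = """[
  {"id":"ytc_UgyFrtnsGVlYL77Hf-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzPX3-Gh9zoltjM77V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a batch coding response and return the row for one comment ID,
    or None if the ID is absent from the batch."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgyFrtnsGVlYL77Hf-B4AaABAg")
print(coding["emotion"])  # fear
```

In practice the raw output should be validated before display, since a malformed batch (missing IDs, extra keys) would otherwise surface as a blank coding result in the inspector.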