Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I remember seeing Elon warning us about AI during his interview with Joe rogan f…" (ytc_Ugzjp48MU…)
- "How about: DO WHAT EVER IT TAKES TO PASS THAT FINISH LINE AND FINALLY MAKE MONEY…" (ytc_UgzI_RjaC…)
- "Here's my theory: as AI is becoming more and more prevalent in today's society a…" (ytc_Ugwjy8qOH…)
- "facial recognition is absolutely common place. In 2018, within 2 hours of an in…" (ytc_UgzKh2lmH…)
- "This is ridiculous. Emotionality is something that has been evolved over million…" (rdc_j8voz1b)
- "I had Gemini 2.5 Pro make a highlight sheet: Introduction Dr. Roman Yampolskiy…" (ytc_UgzcteC0Z…)
- "Ai going rogue is not the problem. Billionaires stripping billions of people of …" (ytc_UgwFnpN7o…)
- "This video may be old but I think Elon already knew about how AI will be changi…" (ytc_Ugw6KuR-r…)
Comment
Isn't it more likely that AGI would be a service that could be purchased? For example, I could purchase a yearly subscription to power my smart home with AI; there is no need for each appliance to have AI individually.
And given the large computational power required, it would be better to house the models in dedicated server farms for efficiency and cooling, at least in the early stages of AGI.
Even in the industrial sector, the AI would be a high-level system that oversees overall operations, while anything related to safety would be hardwired to the local machine for rapid response and to compensate for communication and processing lag.
youtube · AI Moral Status · 2019-01-20T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz5d1Q6Hspo0LkZHcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2Ria52U8rYm4o-Ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh0nN_yMz5rJNJX6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyofBn08Bm4LyCCnOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwa2yI9dUj8pVUFUbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBFHv6g4gWKYZc5cV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyJ6S4J8y7auS0JgyB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgybG6Eri3iLrYs_tgx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvU-20s4sLbbGjtNd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxH35ZKOkIzcvq6hZp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
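A raw response like the one above can be parsed and indexed by comment ID before the per-dimension values are written to a coding table. The sketch below is a minimal, hypothetical example: the `parse_codings` helper is not part of any pipeline shown here, and the allowed value sets are assumptions inferred only from the values visible in this sample, so the real coding scheme may include additional categories.

```python
import json

# Allowed values per dimension. NOTE: inferred from the sample response
# above; the actual codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"none", "industry_self", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping any entry with a missing ID or an out-of-vocabulary value."""
    records = json.loads(raw)
    valid = {}
    for rec in records:
        cid = rec.get("id")
        if cid and all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid[cid] = {dim: rec[dim] for dim in ALLOWED}
    return valid

# One entry from the sample response above, used as input.
raw = ('[{"id":"ytc_UgxBFHv6g4gWKYZc5cV4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_UgxBFHv6g4gWKYZc5cV4AaABAg"]["policy"])  # industry_self
```

Validating against a closed vocabulary catches the most common failure mode of structured LLM output: a value that is close to, but not exactly, one of the requested categories.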