Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "So Waymo is learning to drive from human drivers then? Maybe it should be called…" (ytc_Ugy9mI67K…)
- "One of the practical sides of ai is the deflection of responsibility. Like we sa…" (ytc_Ugz7zBigq…)
- "This isn't really a compelling argument. Whatever your problem is there are pro…" (ytc_Ugx5Aoyr8…)
- "There is no point in what is clearly not a 'full' self driving car, as you right…" (ytc_Ugy_81W7O…)
- "It feels like 2012 with self driving cars. It’s just around the corner they say.…" (ytc_UgzioSZUO…)
- "Like a clueless teenager casually holding humanity at gunpoint 'because he can'.…" (ytc_UgzTEfX5Z…)
- "Imagine life as a grand simulation that began with a divine creator forming huma…" (ytc_UgxT2QLDy…)
- "Stuff like this is an endless battle someone cracks the code to something we fix…" (ytr_UgzYD3bsY…)
Comment
It’s more of a philosophical question I think… would A.I want to destroy us? Would we want to destroy our creator? Imagine aliens came down tomorrow and landed on the Whitehouse lawn then they tell us that they were created on their native planet from natural evolution. But then they tell us that they created us long ago. Would we want to destroy them if we were able. I don’t think so.
youtube · AI Governance · 2024-01-29T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwR41gsbUXstR_RI7V4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzF1thujulI6wTZ21B4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6nvTNXqzgIWEcXGR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzTu1lacER3BeEoLs14AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyTTEhLCrUnVb0qGSJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxNn0hqordcNqVyOPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxCAcg5jtPAfySbqfV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy5uUz5gWgBEDWtOf54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzRTFi8pMrF5VN9XpV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5MxhPEwOywD8Ux4N4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
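The raw response above is a JSON array of per-comment codes, one object per comment ID, with the four dimensions shown in the Coding Result table. A minimal validation sketch in Python follows; the allowed value sets are inferred only from the codes visible on this page, so the real codebook likely contains additional categories (an assumption, not the tool's actual schema):

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page (assumption: the full codebook may define more categories).
SCHEMA = {
    "responsibility": {"elite", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"outrage", "fear", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[str]:
    """Return a list of problems found in a raw LLM coding response.

    An empty list means every record parsed cleanly and every
    dimension held an allowed value.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]

    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: bad {dim!r} value {value!r}")
    return problems
```

A check like this is useful as a gate before ingesting codes into the database, since LLMs occasionally emit values outside the requested categories or drop a field entirely.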