Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If Robots replace the jobs humans do - who will consume? If humans have no income who will drive the need for the services robots can do cheaply if nobody can afford the services ?
What purpose would existence of human life serve? Would animals accept robots? Can AI create rocket ships for outter space travel- what true useful need do we have for AI if not to advance the human existence in harmony with technology? At some point where does this end? Will robots eliminate humans completely? What purpose do we serve? So many questions I feel this interview so far at the 27:38 mark has yet to answer.
Source: youtube · Topic: AI Governance · Posted: 2025-09-05T18:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxldTKoPm3o2tBOhp14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-CK1Y4Jo8NloZ29l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxvNUOuaOEBZNxVrdJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDjCEV9eoOBKU8VHd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy4wnjivmSI_dVPgTl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgybKiGYQ0T-2IQUAwJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjJ0YbExQIG3VJRc14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdDtSTa2NzQsC75El4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZxjH20mnCbrJKixd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2NZRQ4VQVMeGKMCF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
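A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example: it indexes records by comment ID and validates each dimension against label sets inferred from this sample output alone — the real codebook may allow additional labels.

```python
import json

# Raw batch response as returned by the model (truncated to two records here).
raw = '''[
{"id":"ytc_UgwjJ0YbExQIG3VJRc14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxldTKoPm3o2tBOhp14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# Label sets inferred from the sample output above; assumptions, not the official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def parse_codings(text):
    """Parse a raw model response and index valid codings by comment ID."""
    by_id = {}
    for rec in json.loads(text):
        # Reject any record whose labels fall outside the known sets.
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            raise ValueError(f"{rec.get('id')}: invalid labels for {bad}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytc_UgwjJ0YbExQIG3VJRc14AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID mirrors the page's own lookup: the coding shown in the table above is simply the record whose `id` matches the inspected comment.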