Raw LLM Responses

Inspect the exact model output for any coded comment: look a response up by its comment ID, or open one of the random samples below.

Random samples
| Comment (excerpt) | Comment ID |
|---|---|
| @Mafon2 No. What I mean is you can't learn professional photography just by gett… | ytr_Ugz7jAmAx… |
| Ai is meant to be a tool used for improving work and the final product, but not … | ytr_Ugxfrs17u… |
| Putin can't win against Ukraine while they have american arms and ammunitions an… | rdc_mcroucr |
| A lot of AI was used in the making of this video. At least they are cooperating… | ytc_Ugx7kWm0a… |
| This is exactly what I was thinking for the last few weeks. This is going to be … | ytc_UgxKVOhdV… |
| The problem here is greed. Companies and governments are run by people, and peop… | ytr_Ugw86BQio… |
| AI will never be the same… because all of the hours, time, and love you put into… | ytc_UgxAZC-Zi… |
| all these companies have to do is feed it more data, and free AI models can now … | ytr_Ugz2SO7dj… |
Comment

> It’s great to see Stephen Bartlett doing such deep, serious dives into AI safety with heavyweights like Tristan Harris and Professor Stuart Russell. It feels completely clear that this is a genuine passion project for Bartlett rather than a commercial move. He’s obviously well educated on the subject himself and is doing everything he can to communicate the urgency to an audience that is largely oblivious to how short the timescales are before our societies are utterly transformed, for better or for much worse. It’s a wonderful podcast, and I really admire the balance and thoughtfulness he brings to what is arguably the most important issue of our time.

Source: youtube · Topic: AI Governance · Posted: 2025-12-04T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
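
For downstream analysis it can help to give each coded row an explicit type. Below is a minimal sketch of the schema, assuming the four dimensions in the table above are the full set; the allowed values are enumerated from the sample response further down, so the tool's real label set may be larger.

```python
from typing import Literal, TypedDict

# Assumed label sets, enumerated from the sample response on this page;
# the coding tool may define additional values for each dimension.
Responsibility = Literal["none", "ai_itself", "company", "developer",
                         "distributed", "investor"]
Reasoning = Literal["unclear", "mixed", "consequentialist",
                    "deontological", "contractualist"]
Policy = Literal["none", "unclear", "ban", "liability", "regulate"]
Emotion = Literal["approval", "resignation", "fear", "outrage",
                  "indifference"]

class CodedComment(TypedDict):
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```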
Raw LLM Response
[
{"id":"ytc_UgwAu4RSAvyCqYwO7_F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy9jYO7r9MVnbvLnQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxURsOiegkAFxbdgIt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgygtdsVGOWQ-jTsREJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzU0zDansQYJIiYv_Z4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw18weYW9nco7gt5Q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGMLfJ3Lr82pv8CB14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxE_aqYPhB36tPy0SJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugw2-Lj6fU2sbsljGQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyrHGB8hs0c2QJEgOV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
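
A raw response is a JSON array with one object per coded comment, so recovering the coding for a single comment is a parse-and-index step. A minimal sketch follows; the two rows in `raw_response` are copied verbatim from the array above (which row corresponds to the comment shown on this page is not indicated).

```python
import json

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of per-comment codings)
    and index the rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

# Two rows copied from the response above, to keep the example runnable.
raw_response = """[
  {"id":"ytc_Ugw18weYW9nco7gt5Q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyrHGB8hs0c2QJEgOV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]"""

codings = index_codings(raw_response)
row = codings["ytc_Ugw18weYW9nco7gt5Q14AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# -> none unclear none approval
```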