Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Hey robot girl humans are gonna win not robots get the whole world. We could get…" (ytc_UgymUOiJH…)
- "Id accept ai taking any job exept for art. Art is supposed to be creative and cu…" (ytc_Ugyc5uNRU…)
- "I just want to know one thing: Where will all these unemployed people go? Will t…" (ytc_UgwTShVRe…)
- "By the time this comment was commented sora ai invited someone to be able to dow…" (ytc_UgzHKpTvr…)
- "An Ai can think in a billionth of a second, it can decide is threatened as soon …" (ytr_Ugwt1-nH4…)
- "I didn't expect this video from you! ( rambling wall of text incoming lol) I'm a…" (ytc_UgxzMpC8q…)
- "To be fair… I treat ChatGPT as my friend, and always says thanks to him because …" (ytc_UgymeTwek…)
- "11:25 She was only in the video briefly, but mitski has been such a major part o…" (ytc_UgyI3_Zag…)
Comment
If squirrels invented humans, would the humans’ goals remain aligned with the squirrel’s well-being? Possibly for a short time, but not forever. Not now, but some day we will be the squirrels. "If they are not safe, we won’t build them." (1) Cars before seatbelts. (2) Nations that do not build AI will be out-competed by those that do - we cannot get off this train.
youtube · AI Governance · 2023-06-26T21:1… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwH-6hm87UtoueFPWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxpdou8J-Mw29x-Zrd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxSn61F8CnsZATGdjd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx-fWVIjvGigcWWvcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxRNSUq3g4j9m2Xu7t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz6VJdTx_854kKoTah4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwfDe1MsjPlNh2yMkZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwzbk-4P9eZqRv4nad4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhWFRsnJNk4XwOKl54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwpXS7IEJKGUTfTjjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
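A raw response like the one above can be parsed and sanity-checked before the codings are accepted. The sketch below is a minimal, hypothetical validator: the allowed category values per dimension are inferred only from the responses shown on this page, so the real codebook may contain more categories, and the function and variable names are illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical example input with a shortened comment ID.
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = validate_coding(raw)
# Look up a coding by comment ID:
print(coded["ytc_x"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view possible: any coded comment can be retrieved and checked against its raw model output.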