Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If squirrels invented humans, would the humans' goals remain aligned with the squirrels' well-being? Possibly for a short time, but not forever. Not now, but some day we will be the squirrels. "If they are not safe, we won't build them." (1) Cars before seatbelts. (2) Nations that do not build AI will be out-competed by those that do; we cannot get off this train.
Source: YouTube · AI Governance · 2023-06-26T21:1… · ♥ 6
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwH-6hm87UtoueFPWt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxpdou8J-Mw29x-Zrd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxSn61F8CnsZATGdjd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx-fWVIjvGigcWWvcx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxRNSUq3g4j9m2Xu7t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz6VJdTx_854kKoTah4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwfDe1MsjPlNh2yMkZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwzbk-4P9eZqRv4nad4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzhWFRsnJNk4XwOKl54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgwpXS7IEJKGUTfTjjZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
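The raw response above is a JSON array of per-comment codes. A minimal sketch of how such output could be parsed and validated, assuming the four coding dimensions and the categorical values are exactly those seen in the responses above (the allowed-value sets and the `parse_codes` helper are illustrative, not part of the tool):

```python
import json

# Allowed categorical values, inferred from the coded responses shown above.
ALLOWED = {
    "responsibility": {"distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    whose values fall outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_codes(raw)["ytc_x"]["policy"])  # regulate
```

Validating against a fixed code book like this catches the common failure mode where the model invents a label outside the coding scheme.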