Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This has become a trend, and almost every day there is a new guy "who knows" what will happen. In 99% of cases, we will die and robots and AI will overtake humanity. Stupid. I am 99% sure that nothing will happen, and these "I know everything" types will disappear like last year's snow. Civilization exists because of people. We created something. People have limits, so our creation also has limits. It is stupid to think that "we know" we live in a simulation, because if something so advanced had built this, we would not be able to know that we live in a "simulation". People use the word simulation because they don't understand, don't know, and don't want to accept life as it is. They make an elephant out of a fly and, of course, they are looking for something divine again.

Life is as it is. They bombard us with the symbol of artificial intelligence all the time, but we are people. What people have made has a limit, just like humans do. Unfortunately, humans cannot create anything divine because they are limited, and from this "hope" that we will create something apocalyptic: we will not. Half the planet does not have the internet. Half the world does not have money, and half the world is uninterested in developing anything similar. It is the most ordinary hype and a bubble for making money. ChatGPT is the most ordinary nonsense, an advanced internet browser that takes data from the internet and makes sentences out of it. It is automatically labeled artificial intelligence. No, that's not intelligence; it's a super browser that made some individuals millions.
YouTube · AI Governance · 2025-09-13T13:4… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzXEsRobI1mJtBJqMR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzMnbM6C9UWZmF2YF94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwdGsEpVibRy2Yx2yR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxjQWpp3oOHlsnEzw54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwN5J8548DPrdSKYMp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwdwJyNlQmrr8dIGnl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzXWyQ5dS-KPZVemv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxjPW3x_Yh0ta935ct4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyU_RUcUAjYiVTO_Kh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzA8Zn-BbtkJSlNjpd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
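The raw response above is a JSON array with one record per coded comment, keyed by comment id. A minimal sketch of looking up the coded dimensions for a single comment follows; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here, while the helper name `coding_for` is an illustrative assumption, not part of the tool.

```python
import json

def coding_for(raw_response, comment_id):
    """Return the coded dimensions for one comment id, or None if absent.

    Parses the raw LLM response (a JSON array of per-comment records)
    and strips the "id" key so only the coded dimensions remain.
    """
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return {k: v for k, v in record.items() if k != "id"}
    return None

# Example using the record for the comment shown on this page.
raw = ('[{"id":"ytc_UgzMnbM6C9UWZmF2YF94AaABAg",'
      '"responsibility":"none","reasoning":"mixed",'
      '"policy":"none","emotion":"outrage"}]')
print(coding_for(raw, "ytc_UgzMnbM6C9UWZmF2YF94AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'outrage'}
```

Matching on the `id` field rather than array position keeps the lookup robust if the model returns records out of order.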