Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
i wonder what will happen when the robots kill all the humans on the planet and take over would they create there own robotic society and the robots would life there robotic lives like we are currently cep lacking those encumbersome biological needs of corse being robots they require different things then humans but they would be better then humans i wonder what would happen if they used human bodys as materials for the construction of new robots like they take your brain and place it into this robot body since like the brain is the ultimate computer and you were like this badass robot kinda like those badass sentinels from xmen days of future past if my brain was put into a coolass robot that would be awesome
Source: youtube
Posted: 2015-07-30T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UghzZ-K8T33kwHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UggDitZp2No-4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggBUWzgbO_wAXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjGKqitLzXsyXgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgirTe4Oz4K71ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugijts-RK_hu6ngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjWmK_f0a2WkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjBaFAfZPlcj3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghpZV-KGnAD-ngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi-b3JxHJjEVngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
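A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are assumptions inferred only from the values visible in this sample, not the project's actual codebook, and `validate_batch` is an illustrative name.

```python
import json

# Assumed allowed values per dimension, inferred from this one sample batch
# (the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "none"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"mixed", "approval", "fear", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with an id and one value
        # per coding dimension, drawn from the allowed set.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # 1
```

Dropping malformed records (rather than raising) keeps a single bad line in a batch from discarding the other nine codings; rejected IDs could instead be queued for re-coding.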