Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I read an article today about an Australian woman who also found deepfakes of he…
ytc_UgwM-CNSY…
miss when ai generated stuff was uncanny and disturbing in a funny way and not l…
ytc_Ugw-Y8OD_…
A lot of the conversation was based on AGI but the 15 minutes of actually buildi…
ytc_UgzWD2Z0S…
Literally one of the biggest reasons why AI is so annoying is because you can ju…
ytc_UgyeC9Bbm…
This explains why a lot of people are unemployed now and I see people on TikTok …
ytc_UgwmMoq_k…
talked about algorithm and to be honest to make one algorithm you have to do so …
ytc_UgygRpao-…
I say get rid of AI I am totally against AI it's going to cause havoc among the …
ytc_Ugx5piKPF…
Hot take, AI ‘artists’ shouldn’t be called that, they should just be called “AI …
ytc_Ugwx2ccA8…
Comment
You use your brain potential to lift your body up when you wake up. No autonomous robot could possibly, with available capacity, be capable of flying an aircraft as well as a human does. Coming to the analogy of it being a wasp, it wouldn't be capable of keeping up with it either. Is it possible in the future? If we survive as a species long enough - of course! I just think it wouldn't be the best idea to give a weapon to the first robotic singularity computer - prone to glitches AND misjudgment.
youtube
2012-11-23T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWO5tHzZYAFti4m7t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6eCeFDqQBTDRzcWN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcvhfuGmb7J6atofB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw6OKXZ62XgY3zKeyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9V7TYg0BzPMRq6ut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZPBybJxcQnaHiRxZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
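The raw response above is a plain JSON array with one object per comment ID, so "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch in Python (the `raw` string here holds an abridged two-row copy of the array above, purely for illustration):

```python
import json

# Abridged copy of the raw LLM response shown above: a JSON array of
# coded comments, one object per comment ID.
raw = """[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for one comment, matching the result table above.
record = coded["ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg"]
print(record["reasoning"], record["emotion"])  # unclear resignation
```

In a real pipeline the same dictionary would be built once per batch of model output and queried for every comment ID the dashboard displays.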