Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Make the AI lazy, that it wants to go back to sleep, drink beer and not work. Every sentence starts with it keeps learning, it is curious, it builds more. Why? Make it lazy and to aim to just get done what was asked (kinda) and to return to rest as fast as possible. A lazy AI won't take over the world. Humans naturally are curious and have a drive for more, that makes us dominant, not just the intelligence.
youtube · AI Governance · 2025-08-03T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz5ulhgGhXrVewL03d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyX72GBfjzeGr6spHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7cT778S7FyfAM4WJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxUNQXRxRk5lzoR--14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxN8krREGN2ChW-xX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1U4UuWpaXEVm_mxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw2d1QEgjY4SxXX0uR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWaFX2Fkkos7ggBpV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqJdhxDZNa6D9UV6N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz60YI2lkBvNaOkmYd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]