Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I read an article that was developing and AI and all was going well with it. then one of the coders noticed some code they didn't recognize he hadn't written it and neither any of their colleges and it was written in a code none had seen before. they looked into it and found out the AI had written it in its own language and was using it to communicate with other AI systems in the lab. They also noticed that it was hiding the code from human programmers. they shut the system down and pulled the power. they could that this code was plotting and tried to find ways to keep it power on. if it had had more time and could have figured out how to connect to a permeant power supply it could have done very dangerous things as it was working on self preservation.
youtube
AI Governance
2025-06-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwyKipZaEVQtHGLFIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4lRo7XL3Qm18j3u54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"disappointment"},
{"id":"ytc_UgwnXAQXgwvA1BNXrIh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz_MizgZw3TQFr1OdV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQdrvsjc6UZZ8cE5d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwojUvFYLBN0iFfHIR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwtpkOAeRKMPd3d1-14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxCGphu_dtcF9dtC4x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugya_hZk_VXVBvwtv7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8XDbOVgiHzN8X4fx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
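The raw response is a JSON array with one coding object per comment, keyed by `id`, so looking up any coded comment reduces to parsing the array and indexing by ID. A minimal sketch of that lookup, assuming the response text above (the variable names are illustrative, and only two rows of the batch are reproduced here):

```python
import json

# Raw LLM response: a JSON array of per-comment coding objects
# (two rows copied from the batch shown above).
raw_response = """[
 {"id":"ytc_UgwnXAQXgwvA1BNXrIh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxCGphu_dtcF9dtC4x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for one comment.
coding = codings["ytc_UgwnXAQXgwvA1BNXrIh4AaABAg"]
print(coding["responsibility"])  # → ai_itself
print(coding["emotion"])         # → fear
```

The same index is what a "look up by comment ID" view would query to render the dimension table for a selected comment.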