I don’t feel I was too bombastic. Anthropic apparently giving the middle finger to warmongering cunts lends credence to my prediction. Nukes don’t compare to the power of AI. AI is virtually unlimited in its ability to proliferate, persistently destabilizing (no MAD equilibrium), faster at making decisions, far stealthier, and perhaps capable of pursuing goals independently. In the context of accelerating global AI competition, anything more drastic than labeling Anthropic a supply-chain risk would likely have spooked markets. Big AI seems to have more strategic leverage over civilization itself than most, if not all, governments. nostr:nevent1qqsxjkkq6rk6p4d4mmf0l9aaykvwle8mm792ddpsgqpj3jcc9tm33nspzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgnpd8ey