Stop Begging Your AI to Try Harder — Give It a Skill Instead

Source: DEV Community
I've been there. You're in the middle of something complex, and your coding assistant just... gives up. "I can't access the logs." "You'll need to do this manually." "I don't have the ability to..."

You try rephrasing. You add "try harder" to the prompt. Sometimes it works. Mostly it just produces a more confident refusal. I kept thinking: there has to be a structural fix for this, not a prayer.

The PUA Rabbit Hole

A few weeks ago I stumbled on tanweai/pua, a GitHub skill that tackles exactly this problem. The approach? Corporate Performance Improvement Plan rhetoric. It literally tells your coding tool it's on a PIP and at risk of termination if it gives up. Darkly funny. And honestly? Effective. The framing creates a kind of pressure that does get results.

But it felt wrong to me. Not ethically, just strategically. Pressure-based motivation is brittle. It creates anxiety-driven behavior: more hallucination, more desperate guessing, less careful reasoning. I've seen it in humans and