Networks are unreliable. We’ve all experienced trouble connecting to Wi-Fi, or had a phone call drop on us abruptly. The networks connecting our servers are, on average, more reliable than consumer-level last miles like cellular or home ISPs, but given enough information moving across the wire, they’re still going to fail in exotic ways. Outages, routing problems, and other intermittent failures may be statistically unusual on the whole, but at scale they’re happening all the time at some ambient background rate. To overcome this inherently unreliable environment, it’s important to design APIs and clients that will be robust in the event of failure, and that will predictably bring a complex integration to a consistent state despite those failures. Let’s take a look at a few ways to do that.
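One common way to make a client converge to a consistent state despite retries is to attach the same idempotency key to every attempt of a logical operation, so the server can deduplicate work even when a response was lost in transit. Here is a minimal sketch in Python; the `Idempotency-Key` header name is a common convention (Stripe's API uses it, for example), and the retry policy, URL, and `session` interface are illustrative assumptions, not a prescribed implementation:

```python
import time
import uuid

def post_with_retries(session, url, payload, max_attempts=5):
    """POST with retries that are safe because every attempt carries the
    same idempotency key, letting the server deduplicate duplicates."""
    idempotency_key = str(uuid.uuid4())  # one key per logical operation
    for attempt in range(max_attempts):
        try:
            resp = session.post(
                url,
                json=payload,
                headers={"Idempotency-Key": idempotency_key},
                timeout=5,
            )
            if resp.status_code < 500:
                return resp  # success, or a non-retryable client error
        except Exception:
            pass  # connection reset, timeout, DNS blip: retry
        time.sleep(2 ** attempt)  # exponential backoff between attempts
    raise RuntimeError("gave up after %d attempts" % max_attempts)
```

Because the key is generated once, outside the loop, a retried attempt looks identical to the server, which can then return the original result instead of performing the operation twice.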
Advice and lessons learned on how to get a legacy codebase under control and bring it to a new level of maturity.
The other day, the engineering manager on my team was testing some stuff and happened upon this heartbreakingly terrible interface. He pinged the designers and we obviously filed a design bug immediately to bring that page up to any kind of snuff.... | Cap Watkins, Sr. Product Design Manager at Etsy
Learn techniques for dealing with complex, unfamiliar legacy code: how to understand it and, finally, how to write Golden Master tests to guard against future changes. | Difficulty: Intermediate; Length: Medium; Tags: Web Development, PHP, Bash
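The Golden Master idea: before touching legacy code you don't understand, record its current output once as an approved "master", then fail any future run whose output differs. A minimal sketch of the technique (the `legacy_report` routine is a hypothetical stand-in for your real legacy code; the tutorial itself works in PHP and Bash, but the pattern is language-agnostic):

```python
import pathlib

def legacy_report(rows):
    """Hypothetical stand-in for the untested legacy routine whose
    behaviour we want to pin down before refactoring."""
    return "\n".join("%s: %d" % (name, total) for name, total in sorted(rows))

def golden_master_check(output, golden_path):
    """Compare current output against the recorded golden master.
    On the very first run, the output is recorded as the master;
    afterwards, any difference is flagged as a behaviour change."""
    path = pathlib.Path(golden_path)
    if not path.exists():
        path.write_text(output)  # first run: record the master
        return True
    return path.read_text() == output  # later runs: must match exactly
```

Once the master is recorded, you can refactor freely: as long as the check keeps passing, observable behaviour has not changed.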
In my previous post I argued against setter injection. Optional dependencies are one of the main objections raised in the comments and elsewhere. I did mention these with a suggestion of just makin...
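One alternative to setter injection for an optional dependency is to give it a safe default in the constructor, typically a null object that satisfies the interface but does nothing. A short sketch of that pattern (the class and method names here are hypothetical, chosen only to illustrate the idea):

```python
class NullLogger:
    """Null-object stand-in: satisfies the logger's interface but does
    nothing, so the dependency is never truly 'missing'."""
    def log(self, message):
        pass

class OrderProcessor:
    """Instead of a setter that may or may not ever be called, the optional
    dependency gets a safe default at construction time."""
    def __init__(self, logger=None):
        self.logger = logger or NullLogger()

    def process(self, order_id):
        # No 'is the logger set?' guard needed anywhere in the class.
        self.logger.log("processing %s" % order_id)
        return "processed:%s" % order_id
```

The payoff is that the object is fully usable the moment the constructor returns, and no calling code ever has to check whether the optional collaborator was wired in.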
How to access a complete terminal from inside the Chrome Developer Tools
In this article, I’ll share with you how my team has used the PHP dependency-management tool Composer to streamline our development processes and to maintain our WordPress project dependencies across the development team consistently and reliably.
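The core of that setup is a `composer.json` that pulls WordPress plugins from the WPackagist mirror and routes them into `wp-content/` via `composer/installers`. A minimal sketch, assuming the Akismet plugin as an example dependency (the version constraints here are illustrative, not recommendations):

```json
{
    "repositories": [
        { "type": "composer", "url": "https://wpackagist.org" }
    ],
    "require": {
        "composer/installers": "^2.0",
        "wpackagist-plugin/akismet": "^5.0"
    },
    "extra": {
        "installer-paths": {
            "wp-content/plugins/{$name}/": ["type:wordpress-plugin"],
            "wp-content/themes/{$name}/": ["type:wordpress-theme"]
        }
    }
}
```

With this in place, every developer runs `composer install` and gets the same plugin versions, pinned by the committed `composer.lock`.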
Whether you like it or not, there are scripts and bots out there hammering away at your sites with endless HTTP "POST" requests. POST requests are sort of…