There are thousands of airports connecting cities across countries and continents, yet with just three letters – from AAC and BBI to YYZ and ZZU – you and I and our bags can route round the world as unambiguously as practically possible. Airport codes – if you fly, you know them – are part of the planning on your tickets, trackers and tags, and even appear on the port itself as big branding.

It’s impossible not to wonder, bored on a long haul with only in-flight entertainment, about potential patterns peeking through – like all the Canadian Y airports. Why Canada? And why the Ys? How do all these codes code? Well, neighbor, to find the answer we need to divert this flight to YUL, the Canadian city that’s the capital of codes: Montreal, where IATA is headquartered.

IATA is not a governmental organization but rather an independent aviation agency for airlines, working to make airports and airplanes increasingly interoperable using humanity’s most exciting and powerful, yet oft-maligned as dull, tool: standards. One of which is the IATA airport code: three letters to identify every airport in the world, from the most connected to the least. All are coded so companies can communicate clearly and concisely complicated connections to carry their customers and their bags.
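And three letters buys a surprisingly roomy namespace. A quick back-of-the-envelope check (my arithmetic, not anything from IATA’s paperwork):

```python
# Each of the three positions can be any of 26 letters.
print(26 ** 3)  # 17576 possible codes -- comfortably more than the world's airports
```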

And actually, the code IATA had to create isn’t only for airports – rather, technically, it’s a location code for all kinds of transportation interchanges, like plane stations that connect to train stations, such as Amsterdam Schiphol, which is just so intermodally epic. Okay, let’s try not to get distracted by efficient infrastructure – easier said than done.

Here’s how the IATA code is supposed to work: one airport, one code, which is unique because airport names are not. Booking passage to Portland? Cool. That could be Oregon, or Maine, or Victoria, Australia. Ambiguity is the enemy. International flying creates communication connections between every language on Earth, so the IATA code helps when you don’t speak Greenlandic or Odia, but still need to book a flight from Bhubaneswar to Kangerlussuaq: BBI to SFJ. Much clearer – not just for you, but also for the ground crew getting the bags through.
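To make the Portland problem concrete, here’s a minimal sketch – the code-to-airport pairs are the real IATA assignments, while the dictionary-as-registry framing is just an illustration:

```python
# One code, one airport: names collide, codes don't.
PORTLANDS = {
    "PDX": "Portland International, Oregon, USA",
    "PWM": "Portland International Jetport, Maine, USA",
    "PTJ": "Portland Airport, Victoria, Australia",
}

for code, airport in PORTLANDS.items():
    print(code, "->", airport)  # each three-letter key points at exactly one airport
```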

Ideally, the IATA code comes from the first three letters of the location – like with Gibraltar, where Gibraltar Airport is given GIB. Gibraltar. So, going to Cork? It’ll be COR – Cork, Ireland. Oh, that didn’t work? Seems Cordoba, Argentina, built their airport first and got COR ahead of Cork. So, CRK for Cork? Tough noogies. CRK – Clark, Philippines. That’s an adorable town name you’ve got there, but you’re going to need to pick something else for your code. (Cork eventually settled on ORK.)

Thus, a single code collision kicks off a consistency cascade, as airports compete for clear codes. So, if your local airport has an odd three letters, there’s probably a rival port that picked previously. This is one of the major things IATA does: coordinate everyone’s code preferences, which means dealing with not just individual airports but all the aviation agencies in different countries, some with their own design desires for inter-country code consistency, such as Canada, who clearly claimed all the Ys. Thus, picking a Y code at random, at least you know roughly where you’re going to go. Oops – no, that didn’t work. YKM brought us to Yakima, Washington, USA.
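To see how a toy version of the cascade plays out, here’s a sketch – the `assign_code` helper and its fall-back-to-other-letters-from-the-name strategy are my illustration, not IATA’s actual allocation process (real fallbacks are negotiated, not computed):

```python
from itertools import combinations

def assign_code(name: str, taken: set[str]) -> str:
    """Toy model: try in-order three-letter picks from the name until one is free."""
    letters = [c for c in name.upper() if c.isalpha()]
    # The first candidate is the ideal case: the first three letters of the location.
    for i, j, k in combinations(range(len(letters)), 3):
        candidate = letters[i] + letters[j] + letters[k]
        if candidate not in taken:
            taken.add(candidate)
            return candidate
    raise ValueError(f"no free code derivable from {name!r}")

# Cordoba claimed COR, Kochi claimed COK, Clark claimed CRK...
taken = {"COR", "COK", "CRK"}
print(assign_code("Cork", taken))  # -> "ORK", Cork's real-world code
```

One collision per rival is all it takes for the oddball codes to start stacking up.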

And since we’re here, we might as well talk about the FAA in America. The Federal Aviation Administration, daughter of the Department of Transportation, is given the job of assigning all American airports their American airport codes. Yes, the FAA actually has her own set of three-letter codes, but we’re not going to talk about it, because that would mean in America there’s one airport, two codes. And for simplicity, I’m sticking to this story: one airport, one code.

Right now, the FAA has letters she’d really rather American airports not use: N, Q, W, K, Z, or Y. N is reserved for the Navy. For, OMG, aircraft carriers? No, they use an unrelated and additional system that…
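As an aside, here’s a minimal sketch of what a reserved-letter check could look like – the first-letter rule and the helper below are my assumptions for illustration, not the FAA’s actual tooling:

```python
# Hypothetical validity check for a proposed new US code, based on the letters
# the FAA keeps off-limits (N, Q, W, K, Z, Y) -- the check itself is my sketch.
RESERVED_US_FIRST_LETTERS = set("NQWKZY")

def usable_in_us(code: str) -> bool:
    """True if a proposed three-letter code avoids the reserved first letters."""
    return (
        len(code) == 3
        and code.isalpha()
        and code[0].upper() not in RESERVED_US_FIRST_LETTERS
    )

print(usable_in_us("ORD"))  # True  -- Chicago O'Hare passes
print(usable_in_us("NQA"))  # False -- N belongs to the Navy
```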