Debugging Planet Earth: Part 1
What is a "malformed payload"? And what causes it?
In software terms, a malformed payload is a "bad" data packet received by an application that, if not appropriately accounted for within the application, can cause a crash. A malformed payload error can occur for several reasons, but two of the most common are listed below:
- A developer is testing an API or database and fails to properly format the input data. In short, it is a temporary syntax error that arises (hopefully) only during testing and should be fixed prior to production release.
- The request or response from an API was too large to process within the allocated timeframe of the receiving application. In short, it is a timeout error. This can arise in a production application if query sizes (or responses to those queries) grow beyond the expectations of the original application developers, and it generally requires an emergency hotfix. Even if the system doesn't crash, the situation still needs to be handled moving forward; if the impact is limited, the issue may instead be deemed noncritical and scheduled for an upcoming release rather than a hotfix.
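The two failure modes above can be sketched in a few lines of defensive code. This is a minimal illustration, not any particular application's real handler; the function name, the timeout value, and the error markers are all hypothetical:

```python
import json

def handle_payload(raw_bytes, elapsed_seconds, timeout_seconds=5.0):
    """Process an incoming payload, guarding against both failure modes."""
    if elapsed_seconds > timeout_seconds:
        # Scenario two: the request/response exceeded the allocated timeframe.
        return {"error": "timeout"}
    try:
        return json.loads(raw_bytes)
    except json.JSONDecodeError:
        # Scenario one: the data arrived, but its syntax is malformed.
        return {"error": "malformed payload"}

handle_payload(b'{"status": "ok"}', elapsed_seconds=0.2)  # parses cleanly
handle_payload(b'{"status": "ok', elapsed_seconds=0.2)    # malformed payload
```

The point is simply that a payload only crashes an application when nothing in the code accounts for it; both branches above turn a would-be crash into a handled condition.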
Okay, but how does this relate to planet Earth? And can you use a little less jargon in your explanations? Now I'm even more confused.
Alright, alright. I'll get to the good stuff.
Let's start with scenario one from above:
Humankind regularly intervenes in planetary affairs. This is done for many different reasons, but it all amounts to wanting something from the Earth and developing a strategy to obtain that "want." In this sense, humans are the software developers of Earth's ever-changing ecosystems.
There are certainly scrupulous groups (such as the First Peoples of North America) that take only what they need and ensure that the ecosystem is not damaged by their actions. But by and large, the corporations and governments of this world care far less about future repercussions than they do about future profits (whether economic, political, military, or something else altogether). To that end, experiments are conducted (most often in a small-scale environment, and increasingly with the use of machine learning technologies) to determine a risk/reward ratio.
Risk can come in various forms when harvesting resources, such as:
- Social fallout from cultural disapproval of the harvesting (or harvesting methods).
- Physical danger to workers due to hazardous harvesting conditions.
- Uncertainty surrounding sellability of the harvested resources (especially in the case of new resources or harvesting methods).
Harvesting resources is only half of it, of course; there's also the act of processing these raw materials into usable goods. That, too, entails risk of a very similar sort.
Thus, tests are performed prior to the deployment of a new harvesting or processing facility. If, during the course of these tests, risks are identified that outweigh potential rewards (profits), then the project is scuttled. This ties back to scenario one in the software development analogy.
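To keep the software analogy going, that go/no-go decision can be caricatured in code. Everything here, the names, the risk scores, and the simple summing of risks, is purely illustrative, not a real assessment model:

```python
def should_deploy(expected_reward, risks):
    """Caricature of a project go/no-go check: proceed only if the
    expected reward outweighs the summed risks."""
    total_risk = sum(risks.values())
    return expected_reward > total_risk

# Hypothetical risks scored on an arbitrary 0-10 scale.
site_risks = {
    "social_fallout": 4,
    "worker_safety": 6,
    "market_uncertainty": 3,
}

should_deploy(expected_reward=10, risks=site_risks)  # risks win: project scuttled
should_deploy(expected_reward=20, risks=site_risks)  # rewards win: project proceeds
```

Note what the model leaves out: any term for long-term consequences. That omission is exactly the problem described in the paragraphs that follow.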
But if the rewards outweigh the risks, even only in the short-term, the project is likely to move forward.
That brings us to scenario two:
When humans deploy a new resource harvesting or processing facility, damage to the ecosystem can accrue over time, even if severe environmental impact was appropriately anticipated and prevented. This happens when the impact was more severe than anticipated, or when it was simply ignored altogether.
There is a compounding effect if these issues are left unresolved over months and years. Severe weather events are increasing in frequency and intensity year after year. Events such as torrential rainstorms and hurricanes risk causing algal blooms, killing off fish, and with them the fishing industry. Tornadoes and hurricanes can down power lines, causing fires to break out, which is especially dangerous in regions containing nuclear power plants. While not an event precipitated by humankind's actions (or inactions), the 2011 earthquake, tsunami, and nuclear meltdown at Fukushima in Japan is an example of the absolute worst-case scenario.
The "system" of our planet has not yet "crashed," but it is constantly at risk of cascade failures like those mentioned above. Planetary management (i.e. government bodies) has deemed the ongoing, and looming threat of, environmental crises to be noncritical issues and has delayed release of any "hotfixes." This is a grave miscalculation. The planet is dying a slow death, and the blame rests heavily upon our own heads.
How long until we're the next species to fall victim to our own failure to manage planetary programming?
That, my friends, is the question, isn't it? Prominent environmental scientists estimate a window of somewhere between 35 and 55 years during which we can reduce fossil fuel emissions to the point where the effects of human-induced climate change will be reduced (and possibly reversed, given time). Are we prepared to mount a response to climate change within that 55-year window, and thus keep a malformed payload from cascading into the die-off of humankind?