November 2012

For some years I've been applying the tools of a computational physicist to clean-tech, wondering all along what to call the combination. Today I think I've finally decided - it's algorithmic sustainability. Here's an attempt at a definition.

The automated optimization of scarce resources

A full definition of algorithms requires more computer science than I was ever able to stand, but Minsky is mercifully succinct, defining algorithms as "a set of rules which tell us, from moment to moment, precisely how to behave."
With regard to sustainability, I'm going to improvise on some ideas from Wikipedia, defining sustainability as the capacity to endure through renewal, optimization, and sustenance of scarce resources. I like this definition because it succinctly subsumes PV and wind (renewal), energy efficiency and recycling (optimization), and biodiversity (sustenance) [1]. Algorithms, as sets of rules, can of course contribute to all the threads of sustainability, but they're probably particularly useful for the optimizations, which will often be tedious, complicated, and frequently demanded. Of course, given the right permissions an algorithm can not only find the optimal solution but also execute it: for example, charging your Nissan Leaf according to an objective function parameterized by both the price of electricity and your expected departure time in the morning.
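To make the Leaf example concrete, here's a minimal sketch in Python, assuming hourly electricity prices and a fixed number of charging hours needed before departure. The greedy pick-the-cheapest-hours rule is a stand-in for a genuine objective function, and every name and number is hypothetical:

```python
# Hypothetical sketch of price-aware EV charging: choose the cheapest
# hours before departure that deliver the required energy. Names and
# numbers are illustrative, not any vendor's actual algorithm.

def charging_schedule(prices, hours_needed):
    """prices: list of (hour, price per kWh) from now until departure;
    hours_needed: hours of charging required for a full battery."""
    # Greedily charge during the cheapest available hours.
    cheapest = sorted(prices, key=lambda hp: hp[1])[:hours_needed]
    return sorted(hour for hour, _ in cheapest)

# Overnight prices from 22:00 until a 07:00 departure,
# with four hours of charging required.
prices = [(22, 0.14), (23, 0.11), (0, 0.08), (1, 0.07),
          (2, 0.07), (3, 0.09), (4, 0.12), (5, 0.16), (6, 0.19)]
print(charging_schedule(prices, hours_needed=4))  # -> [0, 1, 2, 3]
```

A real scheduler would of course weigh battery wear, grid constraints, and uncertainty in the departure time, but the shape of the problem - an objective function over prices and a deadline - is the same.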
So, to continue the spirit of succinct definitions, I'll define algorithmic sustainability as the automated optimization of scarce resources.
For example

EcoFactor optimizes residential HVAC loads, Spirae manages the energy generation from distributed sources, Dominion concerns itself with optimal grid voltages, Echologics keeps tabs on the distribution of water, and so the list goes on...
The algorithms

Three logical units are likely to be prominent in algorithmic sustainability systems [2]:

1. Collection - gathering data from the system (the input)
2. Optimization - computing the best course of action
3. Execution - pushing the solution back out to the system (the output)
The first and third units (loosely the input and output respectively) pose all the usual challenges of networking and big data. However, by my definition above, the meat of algorithmic sustainability will reside in the middle layer: the optimization.
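As a sketch of how these three units might hang together, here's a toy pipeline in Python. Every function name and number is a hypothetical placeholder, not a real API:

```python
# Hypothetical skeleton of the three-unit pipeline; all names are
# illustrative placeholders.

def read_sensors():
    # Unit 1 (input): gather the current state of the system,
    # e.g. meter readings, temperatures, prices.
    return {"load_kw": 3.2, "price_per_kwh": 0.11}

def optimize(state):
    # Unit 2: the middle layer, where the real work happens.
    # Here, a trivial threshold rule stands in for a genuine optimizer.
    return {"charge": state["price_per_kwh"] < 0.10}

def actuate(decision):
    # Unit 3 (output): push the chosen solution back to the hardware.
    print("charging" if decision["charge"] else "idle")

actuate(optimize(read_sensors()))
```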
The optimization algorithms themselves will be as varied as the systems they optimize. Steepest descent, Monte Carlo, simulated annealing, genetic algorithms: you name it and it'll be there. Of course optimizations are worthless - dangerous even - without an ecosystem of tools to support them, most importantly descriptive analytics and models. For a hypothetical fault-finding algorithm, the stack might look something like:
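Here's a minimal sketch of such a stack in Python: descriptive analytics at the bottom, a crude statistical model in the middle, and a fault-flagging decision layer on top. Everything here - names, thresholds, data - is invented purely for illustration:

```python
import statistics

# Hypothetical fault-finding stack, bottom to top; every name is
# invented for illustration.

def descriptive_analytics(readings):
    # Layer 1: summarize the raw time series.
    return {"mean": statistics.mean(readings),
            "stdev": statistics.stdev(readings)}

def model(stats, reading):
    # Layer 2: a model of normal behavior - here, a crude z-score.
    return abs(reading - stats["mean"]) / stats["stdev"]

def find_faults(readings, threshold=2.0):
    # Layer 3: the decision layer flags anomalous readings.
    stats = descriptive_analytics(readings)
    return [i for i, r in enumerate(readings)
            if model(stats, r) > threshold]

history = [5.1, 5.0, 4.9, 5.2, 5.0, 9.8, 5.1]
print(find_faults(history))  # -> [5]
```

In practice the model layer would be far richer than a z-score, but the layering - analytics feeding models feeding decisions - is the point.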
The authors of these algorithms will be essentially indistinguishable from those who write the algorithms that power search (Google), movie recommendations (Netflix), and quantitative finance (Wall Street). Not only will these authors' stories contain many of the same themes (time-series analysis, Monte Carlo, SVMs), they'll also be deployed on similar infrastructure (Hadoop, MapReduce, HBase), and probably even be written in the same languages (Python, Java, R).
In short, algorithmic sustainability will draw on the vocabularies of advanced numerical methods, high performance computing, and big data.

In summary

In a certain sense there's nothing special about algorithmic sustainability; it's just another twist on the methods that drive much of quantitative finance, search, and recommendation. On the other hand, there's every hope that issues of sustainability will gain ever-increasing respect, and there's no doubt that automated optimization will be a prerequisite for progress. Algorithms will be developed around opportunities and challenges that are unique to issues of sustainability, and will ultimately form a rich web of functionality that traverses wide-ranging applications. In this sense, then, algorithmic sustainability has every right to be its own discipline, and to compete with Google and Wall Street for the biggest investors and smartest graduates.

UPDATE APRIL 2013:
UPDATE SEPTEMBER 2013:
Footnotes