
I am doing away with the previous Curation Friday format for free subscribers. The engagement wasn't great, and your early survey responses indicate much more interest in deeper writing. If you haven't taken the survey yet, please do! It's a 👉 short newsletter survey I designed to take less than 10 minutes of your time, and it helps with my goal of making this one of your favorite newsletters to read.
Hello there ⛰.
When I meet people interested in systems thinking or complexity science, they often tell me the story of a paradigm shift in how they perceive the world and approach problems. There's an "aha" of appreciation for the intricate connections between things and their environment, and for the compounding effect that one small change can have over time.
I am the same way. If I trace back my interest in complexity, it takes me on a journey to the early 2000s wave of new media art. I remember Jared Tarbell's Complexification.net seeding a curiosity in me about recursion, fractals, networks, and using computers to simulate and explore the beautiful effects of randomness, elegantly ordered.
I also love Sand Dollar, another Tarbell work from 2004 (made with Processing, no less) that I look at today with renewed wonder at how simple changes to base rules create recognizable but unique iterated patterns. There's a poetry to seeing life through this lens.
This early interest in aesthetic simulation led to a model-thinking framework through which I began to see many patterns in life. The vast expanses of nature's order could be recreated simply through code and variables. I discovered Dan Shiffman's Nature of Code, which used Processing to build particle systems, cellular automata, and fractals: the models of complexity science.
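Shiffman's examples are in Processing, but the "simple rules, rich patterns" idea fits in a few lines of any language. Here's a minimal sketch in Python (my own toy, not code from Tarbell or Shiffman) of an elementary cellular automaton, where a single rule number determines the entire pattern:

```python
# A minimal sketch: an elementary cellular automaton, one of the
# simplest "simple rules -> rich patterns" models. Rule 30 is a
# classic chaotic example.

RULE = 30  # try 90 or 110 to see how one number changes everything

def step(cells: list[int]) -> list[int]:
    """Apply the rule to every cell based on its left/self/right neighbors."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, me, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (me << 1) | right  # a value 0..7
        nxt.append((RULE >> neighborhood) & 1)          # look up the rule bit
    return nxt

width, generations = 63, 32
cells = [0] * width
cells[width // 2] = 1  # a single seed cell

for _ in range(generations):
    print("".join("█" if c else " " for c in cells))
    cells = step(cells)
```

Change RULE and the same scaffolding produces a completely different texture, which is exactly the wonder Sand Dollar evokes.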
Then there is the technology used to explore these domains. Once I began my career, I found information systems and data engineering. The inspirations from my early interests carried through into work that was pseudo operations research and data modeling: sequence, process, the dynamics of change over time.
My complexity "aha," though, occurred at the NECSI Winter School program in Cambridge, MA: a two-week intensive foray into the foundations of complexity science with researchers, practitioners, and curious minds from all over the world. By its conclusion I felt I had a different, more organic approach to problem solving and observation.
I began to problem solve from the bottom up: name the parts, define the interactions, create parameters, introduce randomness, simulate, rinse, repeat.
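That recipe maps almost one-to-one onto code. Here is a rough sketch in Python of what it looks like in practice; the contagion model and every parameter value are my own illustration, not any specific project:

```python
import random

# Name the parts: agents with a single state.
class Agent:
    def __init__(self) -> None:
        self.infected = False

# Create parameters.
N_AGENTS = 200
CONTACTS_PER_STEP = 3      # interactions per infected agent per step
TRANSMISSION_PROB = 0.05   # chance an interaction spreads the state
STEPS = 50

# Introduce randomness (seeded for reproducibility).
random.seed(42)

agents = [Agent() for _ in range(N_AGENTS)]
agents[0].infected = True  # one seed agent

# Simulate: define the interactions, then repeat.
for step in range(STEPS):
    for agent in agents:
        if not agent.infected:
            continue
        # Each infected agent meets a few random others.
        for other in random.sample(agents, CONTACTS_PER_STEP):
            if random.random() < TRANSMISSION_PROB:
                other.infected = True
    count = sum(a.infected for a in agents)
    print(f"step {step:2d}: {count}/{N_AGENTS} infected")
```

Rinse and repeat: change one parameter, rerun, and watch how the macro behavior shifts.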
This is the opposite of the big data hype that encouraged executives to "just get the data!" and architect massive data lakes on the dream that deep learning algorithms will find valuable correlations if you feed the machine enough. I start from the other side of the problem: How is information being generated? What are the frequency and source of the signal? What does it mean when you record it in this format? Time and time again, delivering large-scale data science solutions for organizations, I've seen that less is more when the meaning and source of the data are deeply understood. There are a few reasons for this.
First, more data means more to manage. The incremental cost, in both capital and human attention from the engineering teams, yields diminishing returns on model outcomes that rarely justify the economics: a 1.2% improvement in model performance delivers no additional capital benefit to the business process, yet costs an additional FTE to maintain the data pipeline.
Second, just because data is captured doesn't mean it can be integrated. Some information simply wasn't designed to be combined. One of the big projects I worked on in the past five years was building a "customer journey" data asset to connect every interaction across many corporate functions. It took a lot of work and fuzzy matching algorithms (still quite imperfect) to link the customer profile in one department to the next. To fix these issues, we recommended going back to the source and defining common keys to use across the company.
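To see why fuzzy matching stays imperfect, consider this toy Python example using the standard library's difflib. The departments, names, and threshold are all invented for illustration; the real algorithms were more involved, but the trade-off is the same:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; a crude stand-in for a real fuzzy matcher."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Two departments' views of (possibly) the same customers.
sales = ["Jon A. Smith", "ACME Corp.", "Maria Garcia"]
support = ["John Smith", "Acme Corporation", "M. Garcia"]

THRESHOLD = 0.6  # picked by eye; every threshold trades misses for false matches

for s in sales:
    best = max(support, key=lambda t: similarity(s, t))
    score = similarity(s, best)
    verdict = "match?" if score >= THRESHOLD else "no match"
    print(f"{s!r:20} -> {best!r:20} score={score:.2f} ({verdict})")
```

A shared customer key defined at the source turns all of that probabilistic guessing into a plain join, which is exactly why we pushed the fix upstream.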
Third, data without meaning is nothing. Even worse, meaningless data combined with other data pollutes the entire dataset. Data costs something to capture and store, so there needs to be a very clear purpose for keeping it. And there is no purpose without meaning.
With these three principles, information management can be greatly simplified. Just as immense and beautiful structure can be created from simple rules and good aesthetics, great systems of insight can be built with simple, thoughtful information design. I apply complexity science thinking to my approach to designing technology systems, and if I trace my roots back far enough, there's art to it all.
Do you have an "aha" moment that inspired your interest in complex systems?
If so, I would love to hear about it. Either leave a comment on this post 👇
Or reply to my tweet 👇