On Strong AI & Robotics

Engineering via Search

Work Lazier Not Harder

Back in 2010, Stephen Wolfram proposed a potentially major tech disruption: migrating engineering from an iterative process to a search-based one.

In the most abstract case, an engineer could start from scratch and do an automatic search through trillions of programs in a computational space in order to find the one which exhibits the desired behavior. Knowledge of how it works internally would not be necessary.[1]

[1] Wolfram, S. (2010). “Computation and the Future of the Human Condition.” H+ Summit, Harvard University.
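As a toy illustration of that idea, here is a hedged sketch in Python: enumerate small arithmetic expressions over `x` (a stand-in for Wolfram’s “trillions of programs”) until one reproduces a desired input–output behavior. The expression grammar and test points are made up for the example; the point is that the search needs no insight into why the winning program works.

```python
# Toy search-based "engineering": enumerate small programs (arithmetic
# expressions over x) until one matches the desired behavior on test
# inputs. No knowledge of the internals of the winner is required.

LEAVES = ["x", "1", "2"]
OPS = ["+", "-", "*"]

def expressions(depth):
    """Yield all expression strings up to a given nesting depth."""
    if depth == 0:
        yield from LEAVES
        return
    yield from expressions(depth - 1)
    for op in OPS:
        for left in expressions(depth - 1):
            for right in expressions(depth - 1):
                yield f"({left} {op} {right})"

def search(target, tests, depth=2):
    """Return the first expression agreeing with `target` on all tests."""
    for expr in expressions(depth):
        if all(eval(expr, {"x": x}) == target(x) for x in tests):
            return expr
    return None

# Desired behavior: f(x) = x^2 + 1, specified only by examples.
print(search(lambda x: x * x + 1, range(-5, 6)))
```

In a real system the candidate space and the evaluator would be far richer, but the loop (generate, test, keep the first success) is the essence of the approach.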

This wasn’t a totally new concept; there had been some search-based engineering before. (Here I’m using “search” in the broader academic computer science sense, which covers many algorithms and AI techniques; indeed, AI textbooks of the past 30 years devote large sections to search.) Anyway, various new types of antennas were created using evolutionary algorithms.[2]

[2] Hornby, G., Globus, A., Linden, D.S., & Lohn, J.D. (2006). “Automated Antenna Design with Evolutionary Algorithms.” AIAA Space 2006.

Electric circuits and FPGA configurations have also been created with evolution. Sometimes a very unconstrained algorithm (which is to say, the most creative one) comes up with surprising solutions that “think outside of the box”: not something a human would come up with, and sometimes even designs that fall outside the governing theory or abstraction, for instance standard electric circuit theory. But that same freedom means the search can land on a weird solution that works only by exploiting quirks of the particular components, which can be bad because it makes the solution less robust.[3]

[3] Floreano, D., & Mattiussi, C. (2008). Bio-Inspired Artificial Intelligence: Theories, Methods, and Technologies. MIT Press.
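The core loop behind that kind of evolved design can be sketched in a few lines. This is a hypothetical minimal example (a (1+1)-style evolutionary search over a bitstring with a made-up target); real antenna or circuit work evolves geometry or netlists against a physics simulator, but the mutate–evaluate–select cycle is the same.

```python
import random

# Minimal evolutionary search: evolve a bitstring toward a target
# pattern. Fitness plays the role of "exhibits the desired behavior".
random.seed(0)

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def fitness(genome):
    """Count of positions matching the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    """Flip each bit independently with the given probability."""
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(generations=500):
    parent = [random.randint(0, 1) for _ in TARGET]
    for _ in range(generations):
        child = mutate(parent)
        if fitness(child) >= fitness(parent):  # keep the better candidate
            parent = child
    return parent

best = evolve()
print(fitness(best), "of", len(TARGET), "bits correct")
```

Note that nothing constrains *how* the solution achieves its fitness, which is exactly where the “exploiting the components” problem comes from: the fitness function is the only specification.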

Yes, even evolution can be considered a search method. But as in natural evolution, it can get stuck in local maxima. It’s as if you were trying to drive to the highest mountain in the country, happened to climb the second-highest one on the way, and, seeing only downhill roads from the top, declared it the highest. Similarly, but in more complicated multi-dimensional ways, there are various terrains in machine learning (including deep learning) that can get you stuck.
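The mountain analogy can be made concrete with a toy sketch: greedy hill climbing on a made-up one-dimensional landscape with two peaks. Started in the wrong basin, the climber halts on the lower peak because every small step from there goes downhill.

```python
# Greedy hill climbing on a bimodal landscape (all values hypothetical):
# a local maximum sits near x = -0.95, the global one near x = 1.05.

def landscape(x):
    return -(x ** 2 - 1) ** 2 + 0.3 * x

def hill_climb(x, step=0.05, iters=1000):
    """Repeatedly move to the best neighbor; stop when none is higher."""
    for _ in range(iters):
        best = max((x - step, x, x + step), key=landscape)
        if best == x:  # no neighbor is higher: a (possibly local) peak
            return x
        x = best
    return x

print(hill_climb(-2.0))  # stops on the lower peak
print(hill_climb(1.5))   # starts in the right basin, finds the higher one
```

Techniques like random restarts, larger mutations, or simulated-annealing-style acceptance of downhill moves exist precisely to escape this trap.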

On the less creative end of computer generation is Electronic Design Automation for computer chips. Since the 1980s, humans have mostly designed at the level of high-level descriptions, and computers take those and auto-generate the physical layout of a processor, which contains billions of components.

It’s easy to see why Wolfram was talking about search-based engineering in 2010, as he was still high on the fumes of his New Kind of Science (NKS) research. NKS was largely about computational irreducibility: many systems, even simple computer programs, cannot be predicted, in the sense that there’s no shortcut to find their state at a specific point in the future, so you just have to run them until you get to that point. NKS also seemed to be very into brute-force (i.e., exhaustive) searches, for instance enumerating all cellular automata of certain classes out to certain numbers of steps.
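An NKS-style exhaustive enumeration is easy to sketch. The following illustrative Python runs all 256 elementary cellular automaton rules (standard Wolfram rule numbering) from a single black cell; the width, step count, and the “census” statistic are arbitrary choices for the example. Note the irreducibility in action: to know the row at step 30, the code simulates every step in between.

```python
# Brute-force enumeration of all 256 elementary cellular automata.
WIDTH, STEPS = 63, 30

def step(cells, rule):
    """Apply an elementary CA rule to one row (fixed white boundary)."""
    padded = [0] + cells + [0]
    return [(rule >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

def run(rule, steps=STEPS):
    """Evolve a single black cell under the given rule; return all rows."""
    row = [0] * (WIDTH // 2) + [1] + [0] * (WIDTH // 2)
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

# Exhaustively enumerate every rule; record black cells in the final row.
census = {rule: sum(run(rule)[-1]) for rule in range(256)}
print("rule 30 final-row black cells:", census[30])
```

For a famously irreducible rule like 30, nothing short of running the simulation is known to give you that final count.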

Brute-force searches can be quite useful if you have enough computational power and you aren’t quite sure how to come up with a more “clever” approach (or doing so would take longer overall). I’ve used brute-force techniques, sometimes not exhaustive but covering the solution space at regular intervals, in feedback control design and in test apparatus to root out mysterious bugs.
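That “regular intervals” flavor of brute force is essentially a grid search. Here is an illustrative sketch in that spirit: sample PI controller gains on a regular grid and keep the pair with the lowest tracking error on a simulated first-order plant. The plant, cost function, and grid bounds are hypothetical stand-ins, not anything from the original text.

```python
import itertools

def cost(kp, ki, dt=0.01, steps=1000, setpoint=1.0):
    """Integrated squared error of a PI loop around the plant dx/dt = -x + u."""
    x, integral, err_sq = 0.0, 0.0, 0.0
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        u = kp * error + ki * integral   # PI control law
        x += (-x + u) * dt               # Euler step of the plant
        err_sq += error ** 2 * dt
    return err_sq

# Regular grid over the gain space: not exhaustive, but evenly covering.
grid = [round(0.5 * i, 1) for i in range(0, 21)]   # gains 0.0 .. 10.0
best = min(itertools.product(grid, grid), key=lambda g: cost(*g))
print("best (kp, ki):", best)
```

It’s crude, but with 441 candidates and a cheap simulation it finishes in well under a second, and it requires no theory about where good gains should lie.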

In recent years, automatic computer programming has become more of a thing, although to some degree search-based software engineering has been around much longer in the human-in-the-loop form of googling or searching Stack Overflow for a code snippet.

Computer-generated art has been around since at least the 1960s, for instance with the work of artist Vera Molnár (or perhaps I should say the work of her FORTRAN programs). Chat bots have been around since the 1960s as well. Both resurged in 2022 with ChatGPT and text-to-image tech from OpenAI and Stability AI.

Many have compared the uproar around generative AI art with that around code helpers like GitHub Copilot. I’m wondering if this hubbub will introduce a renaissance in search AI (whether you call it “search” or not) for other problem spaces such as design and engineering: non-software engineering, but also extending the abilities and popularity of auto-generation in software too.

There’s no shame in cranking up the creativity knob to solve a tricky technical problem—that is, until you ship the product and it fails in spectacularly unexpected ways.

And we’ll also have to be careful not to infringe on the patents or IP of others. The same issue is afoot in the copying done by generative art.[4]

[4] Somepalli, G., Singla, V., Goldblum, M., Geiping, J., & Goldstein, T. (2022). “Diffusion Art or Digital Forgery? Investigating Data Replication in Diffusion Models.” arXiv:2212.03860.