Smarter, faster algorithm cuts number of steps to solve problems

Computer scientists at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed an algorithm that exponentially speeds up computation by dramatically reducing the number of parallel steps required to reach a solution.

The algorithm, and variations of it, could be used for applications such as designing sensor arrays for medical imaging; more quickly assessing the accuracy of machine learning models; and designing clinical trials for drugs to treat Alzheimer's, multiple sclerosis, obesity, diabetes, hepatitis C, HIV and more.

According to the scientists, many optimisation problems, which seek the best solution from all possible solutions (such as mapping the fastest route from point A to point B), rely on sequential algorithms that haven't changed fundamentally since they were first described in the 1970s. These algorithms solve a problem step by step, and the number of steps grows in proportion to the size of the data. This has created a computational bottleneck, leaving lines of inquiry and areas of research that are too computationally expensive to explore.
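As an illustration of the kind of sequential algorithm the article describes, here is a minimal sketch (not from the paper) of the classic greedy approach to a coverage problem: it takes one small step per item chosen, each time scanning all candidates for the single best addition.

```python
# A classic sequential greedy algorithm for maximum coverage:
# pick k sets one at a time, each step choosing the set that
# covers the most elements not yet covered. One step per choice,
# with each step scanning the whole input.
def greedy_cover(sets, k):
    covered = set()
    chosen = []
    for _ in range(k):
        # Evaluate every candidate sequentially and keep the best.
        best = max(sets, key=lambda s: len(s - covered))
        chosen.append(best)
        covered |= best
    return chosen, covered
```

For example, `greedy_cover([{1, 2, 3}, {3, 4}, {5}], 2)` first takes `{1, 2, 3}` and then `{3, 4}`, covering `{1, 2, 3, 4}` in two steps.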

“These optimisation problems have a diminishing returns property,” said Yaron Singer, Assistant Professor of Computer Science at SEAS. “As an algorithm progresses, its relative gain from each step becomes smaller and smaller.”
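The diminishing-returns property Singer mentions can be seen concretely in a small coverage example (a sketch, not taken from the paper): each additional set adds less new coverage than the one before it.

```python
# Diminishing returns in a coverage problem: the marginal value of
# each additional set shrinks as more of the ground set is covered.
def coverage(selected):
    return len(set().union(*selected)) if selected else 0

sets = [{1, 2, 3, 4}, {3, 4, 5}, {4, 5, 6}]
gains = []
chosen = []
for s in sets:
    before = coverage(chosen)
    chosen.append(s)
    gains.append(coverage(chosen) - before)
# gains is [4, 1, 1]: the relative gain per step gets smaller.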

Singer and his colleague, Eric Balkanski, asked: what if, instead of taking hundreds or thousands of small steps to reach a solution, an algorithm could take just a few leaps?

“This algorithm and general approach allows us to dramatically speed up computation for an enormously large class of problems across many different fields, including computer vision, information retrieval, network analysis, computational biology, auction design, and many others,” said Prof Singer. “We can now perform computations in just a few seconds that would have previously taken weeks or months.”

Traditionally, algorithms for optimisation problems narrow down the search space for the best solution one step at a time. In contrast, this new algorithm samples a variety of directions in parallel. Based on that sample, the algorithm discards low-value directions from its search space and chooses the most valuable directions to progress towards a solution.
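The sample-and-discard idea can be sketched as follows. This is a simplified illustration of one filtering round, not the authors' actual algorithm; the function names and the threshold rule are hypothetical.

```python
# One round of parallel-style filtering (illustrative sketch):
# instead of committing to a single element per step, score the
# marginal value of every remaining candidate at once, then discard
# the low-value ones so the search space shrinks each round.
def filter_round(candidates, marginal_value, threshold):
    # Scoring each candidate is independent, so in a real system
    # this loop could run across many processors in parallel.
    scores = {c: marginal_value(c) for c in candidates}
    return [c for c in candidates if scores[c] >= threshold]
```

For instance, `filter_round([1, 2, 3, 4], lambda x: x, 3)` keeps only `[3, 4]`, so a few such rounds can replace many sequential steps.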

This process of active learning is key to the algorithm's ability to make the right decision at each step, and it is what overcomes the problem of diminishing returns.

The scientists are continuing to work with practitioners on implementing the algorithm.