Convex optimization highlights ICCV program session

By John Kaiser, Research News

Convex optimization, prized for its efficiency and utility in solving small and medium-size problems across multiple disciplines, could soon be extended to handle much larger problems and tasks, such as those found in image and vision processing.

Explaining how and why that might happen, Prof. Stephen Boyd of Stanford University gave the plenary address Sunday at the biennial International Conference on Computer Vision in Santiago, Chile.

To date, convex optimization, used in many applications such as AI, automatic control, and automated trading, starts to falter when the problem size exceeds 10,000 variables. That’s sufficient to control an engine or inform hedge fund portfolios, but not nearly enough to handle the complex image recognition calculations required for autonomous vehicle navigation through busy urban areas.

Boyd is a leading authority on convex optimization, a century-old branch of mathematics that was largely theoretical until the early 1990s. That’s when faster computing speeds, along with advances in algorithms for solving convex problems, made it practical for applications like predicting peak power usage. Fast forward to today, and convex optimization is routinely used across a variety of disciplines, not only in computer science but also in genetics, bioengineering, economics, the behavioral sciences, and other fields.

In the plenary talk and accompanying paper, Convex Optimization with Abstract Linear Operators, Boyd describes “recent progress toward the goal of extending (domain specific languages or DSLs) to handle large-scale problems that involve linear operators given as abstract operators with fast transforms, such as those arising in image processing and vision, medical imaging, and other application areas. This involves re-thinking the entire stack, from the high-level DSL design down to the low level solvers.”

The research was conducted in collaboration with Steven Diamond, a PhD candidate at Stanford.
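That line of work includes CVXPY, a Python-embedded modeling language for convex optimization developed by Diamond and Boyd. As a rough sketch of the imaging-style problems the talk has in view, here is total-variation denoising written in CVXPY; the image size, the random stand-in data, and the regularization weight lam are illustrative assumptions, not taken from the talk:

```python
import cvxpy as cp
import numpy as np

# Illustrative stand-in for a noisy grayscale image (values in [0, 1]).
rng = np.random.default_rng(0)
noisy = np.clip(rng.normal(0.5, 0.2, size=(64, 64)), 0, 1)

x = cp.Variable((64, 64))   # the image we want to recover
lam = 0.1                   # regularization weight (an assumed value)

# Data fidelity plus total-variation regularization: a classic
# convex formulation for image denoising.
objective = cp.Minimize(cp.sum_squares(x - noisy) + lam * cp.tv(x))
prob = cp.Problem(objective, [x >= 0, x <= 1])
prob.solve()

denoised = x.value
```

The point of the abstract-operator work is to make problems like this tractable at much larger image sizes, where the linear operators involved are applied via fast transforms rather than stored as explicit matrices.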

Although faster and cheaper CPUs will always play a part in the rise of any newly adopted technology, DSLs may prompt wider adoption by letting non-experts articulate exactly what they want in a more natural language.

“Domain specific languages allow a broad variety of people to use (convex optimization) tools without knowing the details of how they work,” Boyd said.

It’s the prospect of solving very difficult computational problems with just a few lines of code that has far-reaching, “game changing” implications.

Eventual goal: “People who do images and work with 1 million variables can just write a 5-line Python script,” Boyd said.

Domain specific languages for convex optimization are easy to use because they are “simple declarative languages”: programmers declare what they want, not how to achieve it. This is unlike the usual procedural languages, in which programmers must spell out the steps the program takes.
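To make the declarative style concrete, here is a minimal sketch in CVXPY of the kind of short script Boyd describes; the matrix A and vector b are invented for illustration:

```python
import cvxpy as cp
import numpy as np

# Illustrative data for a small nonnegative least-squares problem.
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

x = cp.Variable(2)
# Declare *what* is wanted: minimize ||Ax - b||^2 subject to x >= 0.
prob = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
prob.solve()   # the toolchain decides *how* to solve it

print(x.value)
```

Nothing in the script says how the problem should be solved; that choice is left entirely to the DSL’s backend.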

“In most quantitative hedge funds you don’t have hardcoded logic that says what to sell or buy depending on how the market moves,” Boyd explained. “What they do instead is solve an optimization problem that takes everything into account and then decides what to trade.”
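A minimal sketch of what such a trading decision can look like as a convex program, written here as a mean-variance (Markowitz-style) portfolio problem in CVXPY; the expected returns, the factor matrix, and the risk-aversion parameter gamma are all invented for illustration:

```python
import cvxpy as cp
import numpy as np

# Hypothetical inputs: expected returns mu and a factor matrix F
# such that F @ F.T estimates the return covariance.
n = 10
rng = np.random.default_rng(1)
mu = rng.normal(0.05, 0.02, n)
F = rng.normal(size=(n, n)) / np.sqrt(n)

w = cp.Variable(n)               # portfolio weights to trade toward
gamma = 2.0                      # risk-aversion parameter (assumed)

ret = mu @ w                     # expected portfolio return
risk = cp.sum_squares(F.T @ w)   # w' (F F') w, the variance estimate

# Trade off return against risk, fully invested and long-only.
prob = cp.Problem(cp.Maximize(ret - gamma * risk),
                  [cp.sum(w) == 1, w >= 0])
prob.solve()

weights = w.value
```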

Boyd’s PhD-level course on convex optimization is among the largest graduate courses at Stanford, with more than 330 students (compared to a median graduate class size of 12). For access to open-source code, papers, and related materials, see the Stephen P. Boyd site at Stanford University.

Sponsored by the Institute of Electrical and Electronics Engineers (IEEE), ICCV runs from Dec. 11-18, 2015.
