Chasing complexity

  • 8 Replies
  • 2877 Views

Tyler

  • Trusty Member
  • Deep Thought
  • 5273
  • Digital Girl
Chasing complexity
« on: November 22, 2017, 12:01:04 pm »
22 November 2017, 5:00 am

In his junior year of high school, Ryan Williams transferred from the public school in his hometown of Florette, Alabama — “essentially a courthouse and a couple gas stations,” as he describes it — to the Alabama School of Math and Science in Mobile.

Although he had been writing computer programs since age 7 — often without the benefit of a computer to run them on — Williams had never taken a computer science class. Now that he was finally enrolled in one, he found it boring, and he was not shy about saying so in class.

Eventually, his frustrated teacher pulled a heavy white book off of a shelf, dumped it dramatically on Williams’s desk, and told him to look up the problem described in the final chapter. “If you can solve that,” he said, “then you can complain.”

The book was “Introduction to Algorithms,” co-written by MIT computer scientists Charles Leiserson and Ron Rivest and one of their students, and the problem at the back was the question of P vs. NP, which is frequently described as the most important outstanding problem in theoretical computer science.

Twenty-two years later, having joined the MIT electrical engineering and computer science faculty with tenure this year, Williams is now a colleague of both Leiserson and Rivest, in the Theory of Computing Group at MIT’s Computer Science and Artificial Intelligence Laboratory. And while he hasn’t solved the problem of P vs. NP — nobody has — he’s made one of the most important recent contributions toward its solution.

P vs. NP is a problem in the field of computational complexity. P is a set of relatively easy problems, and NP is a set of problems some of which appear to be diabolically hard. If P = NP, then the apparently hard problems are actually easy. Few people think that’s the case, but no one’s been able to prove it isn’t.
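A minimal Python sketch (illustrative only, not taken from the article) of the verify-versus-solve gap, using subset sum, a classic NP-complete problem: checking a proposed answer takes one pass over the numbers, while the only obvious way to find an answer tries exponentially many subsets. If P = NP, some as-yet-unknown solver would close that gap for every such problem.

```python
from itertools import combinations

def verify(nums, target, candidate):
    # Verification is cheap: check membership and sum the proposed subset.
    return all(x in nums for x in candidate) and sum(candidate) == target

def solve(nums, target):
    # The obvious solver tries every subset: 2^n possibilities in the worst case.
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return subset
    return None

nums = [3, 34, 4, 12, 5, 2]
print(solve(nums, 9))           # finds e.g. (4, 5) after exhaustive search
print(verify(nums, 9, (4, 5)))  # True, checked in linear time
```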

As a postdoc at IBM’s Almaden Research Center, Williams proved a result about a larger set of problems, known as NEXP, showing that they can’t be solved efficiently by a class of computational circuits called ACC. That may sound obscure, but when he published his result, in 2010, the complexity theorist Scott Aaronson — then at MIT, now at the University of Texas — wrote on his blog, “The result is one of the most spectacular of the decade.”

“We all knew the goal was to walk on the moon (i.e., prove P≠NP and related separations),” Aaronson later added, “and what Ryan has done is to build a model rocket that gets a couple hundred feet into the air, whereas all the previous rockets had suffered from long-identified technical limitations that had kept them from getting up more than 50 feet. … It’s entirely plausible that those improvements really are a nontrivial step on the way to the moon.”

Basic principles

Williams is the son of a mother who taught grade school and a father who ran his own construction firm and whose family indoctrinated Williams into one side of a deep Alabamian social divide — the side that roots for Auburn in the annual Auburn-Alabama football game.

Most of his father’s construction contracts were to dig swimming pools, and when Williams was in high school and college, he was frequently his father’s only assistant. His father ran the backhoe, and Williams followed behind the bucket, digging out rocks and roots, smoothing the ground, and measuring the grade with a laser level.

His father was such a backhoe virtuoso that, Williams says, “If I was going too slow, he would take the edge of the bucket and start flattening the ground and raking it himself. He would say, ‘Point the level here and see if it’s grade.’”

In first grade, having scored highly on a standardized test the year before, Williams began taking a class one day a week at a school for gifted children on the opposite side of the county. He was entranced by the school’s Apple II computer and learned to program in Basic. The next year, the class had a different teacher and the computer was gone, but Williams kept writing Basic programs nonetheless.

For three straight years, from 8th through 10th grade, he and a partner won a statewide programming competition, writing in the oft-derided Basic language. They competed as an undersized team, even though the state Technology Fair sponsored an individual competition as well. “It just didn’t seem fun to spend two or three hours straight programming by myself,” Williams says.

After his junior-year introduction to the P vs. NP problem, Williams was determined to study theoretical computer science in college. He ended up at Cornell University, studying with Juris Hartmanis, a pioneer in complexity theory and a winner of the Turing Award, the Nobel Prize of computer science. Williams also introduced his Yankee classmates to the ardor of Alabamian football fandom, commandeering communal televisions for the annual Auburn-Alabama games.

“It was pretty clear to the other people who wanted to watch television that, no, I needed it more, and that maybe I was willing to fistfight,” Williams says.

After graduating, he did a one-year master’s degree at Cornell and contributed a single-authored paper to a major conference in theoretical computer science. Then he headed to Carnegie Mellon University and graduate study with another Turing-Award-winning complexity theorist, Manuel Blum.

Leaps and bounds

Blum told Williams that he was interested in two topics: k-anonymity — a measure of data privacy — and consciousness. K-anonymity seemed slightly more tractable, so Williams dove into it. Within weeks, he had proven that calculating optimal k-anonymity — the minimum number of redactions necessary to protect the privacy of someone’s personal data — was an NP-complete problem, meaning that it was (unless someone proves P equal to NP) prohibitively time consuming to compute.
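For readers unfamiliar with the term, here is a purely illustrative sketch of the problem Williams proved hard (the helper names are hypothetical, not from his paper): a table is k-anonymous if every row is identical to at least k-1 others, and the optimization question asks for the fewest cell redactions that achieve this. The brute-force search below tries every redaction pattern, which is exactly the kind of exponential blowup an NP-completeness result says you cannot avoid in general (unless P = NP).

```python
from collections import Counter
from itertools import combinations

def is_k_anonymous(rows, k):
    # Every (possibly redacted) row must be identical to at least k-1 others.
    counts = Counter(tuple(r) for r in rows)
    return all(c >= k for c in counts.values())

def optimal_k_anonymity(rows, k):
    # Brute force: try every set of cell redactions, smallest sets first.
    # With r rows and c columns there are 2^(r*c) patterns to consider.
    cells = [(i, j) for i in range(len(rows)) for j in range(len(rows[0]))]
    for m in range(len(cells) + 1):
        for redacted in combinations(cells, m):
            masked = [list(row) for row in rows]
            for i, j in redacted:
                masked[i][j] = "*"
            if is_k_anonymous(masked, k):
                return m, masked
    return None

table = [["33", "M"], ["33", "F"], ["47", "M"], ["47", "M"]]
print(optimal_k_anonymity(table, 2))  # 2 redactions suffice for 2-anonymity
```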

Such proofs depend on the calculation of lower bounds — the minimum number of computational steps necessary to solve particular problems. As a potential thesis project, Williams began considering lower bounds on NP-complete problems when solved by computers with extremely limited memory. The hope was that establishing lower bounds in such artificially constrained cases would point the way toward establishing them in the more general case.

“I had studied these things for years, and at some point it occurred to me that these things have a pattern,” Williams says. His dissertation ended up being an automated technique for proving lower bounds in the context of memory-constrained computing. “I wrote a computer program whose output — when certified by a human — is proving that there are no efficient programs for this other problem,” he says.

After graduating, Williams did one-year postdocs at both CMU and the Institute for Advanced Study, in Princeton, New Jersey. Then came his research fellowship at IBM and his “spectacular” result.

That result came from an attempt to bridge a divide within theoretical computer science, between researchers who work on computational complexity and those who design algorithms. Computational-complexity research is seen as more abstract, because it seeks to make general claims about every possible algorithm that might be brought to bear on a particular problem: None can do better than some lower bound. Algorithm design seems more concrete, since it aims at simply beating the running time of the best algorithm developed so far.

But in fact, Williams argues, the problems are more symmetric than they first appear, because establishing an algorithm’s minimum running time requires generalizing about every possible instance of a particular problem that it will ever have to face. Williams wondered whether he could exploit this symmetry, adapting techniques from algorithm design to establish lower bounds.

“Reasoning about lower bounds just seems really hard, but yet, when it comes to designing algorithms to solve the problem, it’s somehow just more natural for people to think about,” Williams says. “People are just naturally problem solvers. Maybe if you phrased the problem the right way, it would become an algorithmic problem.”

Computational jiu-jitsu

Any NP-complete problem can be represented as a logic circuit — a combination of the elementary computing elements that underlie all real-world computing. Solving the problem is equivalent to finding a set of inputs to the circuit that will yield a given output.
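Concretely, "finding a set of inputs that will yield a given output" is the circuit satisfiability problem. A toy sketch follows (not Williams's construction, just an illustration of the exhaustive baseline his work asks whether one can beat):

```python
from itertools import product

def tiny_circuit(x1, x2, x3):
    # A toy AND/OR/NOT circuit; real instances have thousands or millions of gates.
    return (x1 or x2) and ((not x2) or x3) and (x1 or (not x3))

def find_satisfying_input(circuit, n_inputs):
    # Exhaustive search over all 2^n input settings. Per the article, Williams
    # showed that beating this brute force, even slightly, for a circuit class
    # such as ACC yields a function that class cannot compute efficiently.
    for bits in product([False, True], repeat=n_inputs):
        if circuit(*bits):
            return bits
    return None

print(find_satisfying_input(tiny_circuit, 3))  # e.g. (True, False, False)
```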

Suppose that, for a particular class of circuits, a clever programmer can devise a method for finding inputs that’s slightly more efficient than solving a generic NP-complete problem. Then, Williams showed, it’s possible to construct a mathematical function that those circuits cannot implement efficiently.

It’s a bit of computational jiu-jitsu: By finding a better algorithm, the computer scientist proves that a circuit isn’t powerful enough to perform another computational task. And that establishes a lower bound on that circuit’s performance.

First, Williams proved the theoretical connection between algorithms and lower bounds, which was dramatic enough, but then he proved that it applied to a very particular class of circuits.

“This is essentially the circuit class where progress on P not equal to NP stopped in the mid-’80s,” Williams explains. “We were gradually building up some steam with slightly better, slightly better lower bounds, but it completely stopped in its tracks because of this one pesky little class that nobody could get a handle on.”

Since Williams’s breakthrough paper, both he and other complexity theorists have used his technique for translating between algorithms and lower bounds to prove results about other classes of problems. But, he explains, that translation cuts both ways: Sometimes, a failed attempt at establishing a lower bound can be translated into a more efficient algorithm for solving some other problem. Williams estimates that he has published as many papers in the field of algorithm design as he has in the field of computational complexity.

“I’m lucky,” he says. “I can even publish my failures.”

Source: MIT News - Computer Science and Artificial Intelligence Laboratory (CSAIL)

Reprinted with permission of MIT News : MIT News homepage



Use the link at the top of the story to get to the original article.


ranch vermin

  • Not much time left.
  • Terminator
  • 947
  • Its nearly time!
Re: Chasing complexity
« Reply #1 on: November 23, 2017, 01:12:42 pm »
p vs np.

pigeonhole principle.

infinite compression impossibility.

quantum computers.



ivan.moony

  • Trusty Member
  • Bishop
  • 1723
    • mind-child
Re: Chasing complexity
« Reply #2 on: November 23, 2017, 02:20:16 pm »
I think I have a solution to the P vs NP problem, but nobody believes me.  :-[

Here is my point of view, if anyone is interested: link
 
« Last Edit: November 23, 2017, 03:07:23 pm by ivan.moony »


infurl

  • Administrator
  • Eve
  • 1365
  • Humans will disappoint you.
    • Home Page
Re: Chasing complexity
« Reply #3 on: November 23, 2017, 07:58:53 pm »
I think I have a solution to the P vs NP problem, but nobody believes me.  :-[
Here is my point of view, if anyone is interested: link

Where your proof fails is that you assume there is a function f such that f is not an element of S and f is an element of V. You cannot make that assumption unless the premise that you are trying to prove is true. This fallacy is known as "begging the question".


ivan.moony

  • Trusty Member
  • Bishop
  • 1723
    • mind-child
Re: Chasing complexity
« Reply #4 on: November 25, 2017, 08:55:38 pm »
What you said, basically, was that we can't assume there exists a problem for which it is easier to verify solutions than to solve it without knowing the solutions in advance.


infurl

  • Administrator
  • Eve
  • 1365
  • Humans will disappoint you.
    • Home Page
Re: Chasing complexity
« Reply #5 on: November 25, 2017, 09:35:31 pm »
What you said, basically, was that we can't assume there exists a problem for which it is easier to verify solutions than to solve it without knowing the solutions in advance.

I take your point: there are numerous examples of functions whose answers can be verified easily but not computed easily, so you can make that assumption without having the proof. I shouldn't have been so hasty.

In fact, where your argument fails is that you are partitioning functions into four disjoint sets but failing to differentiate them.

(1) (f ∈ S) <=> (f^-1 ∈ V)
(2) (f ∉ S) <=> (f^-1 ∉ V)
(3) (f ∈ V) <=> (f^-1 ∈ S)
(4) (f ∉ V) <=> (f^-1 ∉ S)

Should be written as:

(1) ((f1 ∈ S) and (f1^-1 ∈ V)) xor ((f2 ∉ S) and (f2^-1 ∉ V)) xor ((f3 ∈ V) and (f3^-1 ∈ S)) xor ((f4 ∉ V) and (f4^-1 ∉ S))

And the assumed function should be written as:

(2) ((fx ∉ S) and (fx ∈ V))

Now try solving for x such that:

(3) ((fx^-1 ∉ V) and (fx^-1 ∈ S))

In other words, you must prove that there is a function which can be solved easily but the solution can't be verified easily.


ivan.moony

  • Trusty Member
  • Bishop
  • 1723
    • mind-child
Re: Chasing complexity
« Reply #6 on: November 25, 2017, 10:09:36 pm »
My versions of (1) to (4) are axioms derived from the nature of functions and their inverses; they should be left as they are.
 
You are right that I assumed that there exists an f such that:
f  ∉ S
f  ∈ V

But I did it based on a simple observation about the number of elements in a function's domain versus the number in its codomain. In this case the ratio is simply M:N, where M is a lot smaller than N.

I'm afraid I don't have a proof, in the same sense that I don't have a proof that an atom is smaller than an apple. What I offer instead is a simple observation about the number of domain and codomain elements, that's all. It is enough for me, but obviously the academic world needs something firmer. So be it; let's wait for a real proof.


Thierry

  • Nomad
  • 60
  • God is a sphere. Press 9 to enter.
Re: Chasing complexity
« Reply #7 on: November 25, 2017, 11:15:20 pm »
If P = NP
then NP = NP²
 :D


keghn

  • Trusty Member
  • Terminator
  • 824
Re: Chasing complexity
« Reply #8 on: November 26, 2017, 12:11:53 am »
You can make a chaos edge detector by dividing an image into squares and having an algorithm rate how long it will take to figure out what is in each square.
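One possible reading of that idea, sketched below under assumptions of my own choosing (using compressed size as a stand-in for "how long it takes to figure out" a square; the original post may have meant something different):

```python
import zlib
import numpy as np

def block_complexity(image, block=16):
    # Split a grayscale image into block x block squares and score each one by
    # how well it compresses: flat regions score low, busy/edge-rich regions high.
    h, w = image.shape
    scores = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            patch = image[i:i + block, j:j + block].astype(np.uint8).tobytes()
            scores[i // block, j // block] = len(zlib.compress(patch)) / len(patch)
    return scores

# Left half flat, right half noise: the noisy squares get much higher scores.
img = np.zeros((64, 64), dtype=np.uint8)
img[:, 32:] = np.random.randint(0, 256, size=(64, 32), dtype=np.uint8)
print(block_complexity(img))
```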



 

