Big Data Computational Complexity
The volume and velocity of big data pose significant challenges to computing. Classical computational complexity theory classifies problems as tractable if they are solvable in polynomial time and as intractable otherwise. In big data computation, however, algorithms with polynomial or even linear time complexity have become unacceptable in practical applications, in effect rendering problems that classical theory deems tractable intractable in practice. Characterizing the complexity of tractable problems in big data computing therefore calls for fundamental changes to classical computational complexity theory.
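To make the scale argument concrete, the following is a minimal, purely illustrative Python sketch (not from the source; the dataset, function names, and sample size are assumptions). It contrasts an exact linear-time scan with a sublinear sampling-based estimate, the kind of trade-off that becomes necessary when even an O(n) pass over the data is too slow:

```python
import random

def exact_mean(data):
    """Exact mean via a full O(n) scan. At petabyte scale even one
    linear pass can take hours, which is the sense in which
    classically 'tractable' problems become intractable in practice."""
    return sum(data) / len(data)

def sampled_mean(data, k=10_000):
    """Sublinear estimate: inspect only k randomly chosen elements,
    trading exactness for O(k) work independent of len(data)."""
    sample = random.sample(range(len(data)), min(k, len(data)))
    return sum(data[i] for i in sample) / len(sample)

if __name__ == "__main__":
    data = [random.random() for _ in range(1_000_000)]  # stand-in for a huge dataset
    print(f"exact:   {exact_mean(data):.4f}")
    print(f"sampled: {sampled_mean(data):.4f}")  # close to exact with high probability
```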
Research Areas
A core concern of big data computing is that computational complexity and algorithm theory must be re-examined to strike a balance between data volume and computational effectiveness. The definition of tractability put forward by traditional computational complexity theory is not suitable for big data computation because it ignores the influence of algorithmic effectiveness on tractability. Big data computational complexity theory seeks to identify the problems that remain tractable in big data computing, playing a role analogous to that of the P versus NP question in the classical theory, and has been described by the international computing community as a "foundation for solving the computational complexity of big data".
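One concrete direction explored in the literature redefines tractability around query workloads: a problem counts as tractable for big data if, after a one-time polynomial-time preprocessing pass over the data, each individual query can be answered in time sublinear in the data size. The Python sketch below is a hypothetical illustration of that pattern (the function names and the membership-query workload are assumptions, not the theory's formal definition):

```python
import bisect

def preprocess(data):
    """One-time preprocessing, here O(n log n): build a sorted index.
    Its cost is amortized over the many queries that follow."""
    return sorted(data)

def query(index, x):
    """Each membership query runs in O(log n) via binary search,
    sublinear in the data size, so per-query cost stays feasible
    as the data grows."""
    i = bisect.bisect_left(index, x)
    return i < len(index) and index[i] == x

if __name__ == "__main__":
    index = preprocess([7, 3, 42, 19, 3])
    print(query(index, 42))  # True
    print(query(index, 8))   # False
```

The design choice this sketch illustrates is the shift of the tractability boundary: instead of asking whether the whole computation is polynomial, it asks whether the recurring per-query work can be made sublinear once a polynomial setup cost has been paid.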