Many experienced software engineers seem to regard asymptotic analysis (“big-O”) as having little relevance to their day-to-day work. Yet it remains a cornerstone of academic computer science programs, forming a major part of standard algorithms and data structures courses. Is there still value in asymptotic analysis, or would the field be better off moving away from it?
