Introduction to Asymptotic Notations (made easy)

If you are not from a Computer Science background, you may find this concept a little more complex than usual. In simple words, asymptotic analysis tells us how good an algorithm is when compared with another algorithm.

We cannot directly compare two algorithms side by side simply by running them, because the results depend heavily on the tools and the hardware used for the comparison, such as the operating system, CPU model, and processor generation. Even if we measure the time and space of two algorithms on the same system, the measurements may be affected by subtle changes in the system environment. Therefore, we use asymptotic analysis to compare space and time complexity: it analyzes two algorithms based on how their performance changes as the input size increases or decreases.

Primarily, there are three types of asymptotic notations: Big-O (O), Big-Omega (Ω), and Big-Theta (Θ). The Big-O notation was introduced in 1894 by Paul Bachmann.
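To make the idea concrete, here is a minimal Python sketch (the function names are illustrative, not from any library) that counts the comparisons made by a linear search versus a binary search as the input size doubles. This is the kind of machine-independent comparison asymptotic analysis formalizes: the step counts grow like n and like log n, regardless of the hardware.

```python
def linear_search_steps(data, target):
    """Count comparisons a linear scan makes before finding target."""
    steps = 0
    for value in data:
        steps += 1
        if value == target:
            break
    return steps

def binary_search_steps(data, target):
    """Count comparisons a binary search makes on sorted data."""
    steps = 0
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            break
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

# Double the input size and watch how each step count grows:
# linear search doubles with n, binary search grows by about one step.
for n in (1_000, 2_000, 4_000, 8_000):
    data = list(range(n))
    target = n - 1  # worst case for linear search
    print(n, linear_search_steps(data, target), binary_search_steps(data, target))
```

Running this shows the linear count tracking n exactly while the binary count stays near log2(n), which is why we describe the algorithms as O(n) and O(log n) rather than quoting wall-clock times.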