# Thread: Need help with function..

1. Also, the best complexity to detect whether something is sorted is O(n). It's a single pass through the list, verifying that every element is in the correct order with respect to its predecessor. Again, it cannot possibly be faster, because you have to look at each element at least once.
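That single-pass check can be sketched in a few lines of Python (a minimal illustration, not from the thread itself):

```python
def is_sorted(data):
    """Return True if data is in non-decreasing order. O(n): one pass,
    comparing each element with its predecessor."""
    for i in range(1, len(data)):
        if data[i] < data[i - 1]:
            return False  # found an inversion; no need to look further
    return True
```

On already-sorted input it inspects every pair once; on unsorted input it can bail out at the first inversion, but the worst case is still a full pass.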

2. Originally Posted by master5001
What is debatable is what can most rapidly confirm that the data is already sorted. You have the web at your disposal, my friend; I am not going to search for things you are capable of looking up on your own. You were given a satisfactory explanation of the logic. Even if you have multiple CPUs each reading part of a pile of data, there is always going to be a point where things are only processed on an individual basis. No matter what. I applaud that you are so boldly trying to make something "better."
Multiprocessing or multithreading will not change the fundamentals of big-O notation for describing an algorithm's complexity - it will just (potentially, assuming the algorithm can actually be parallelised) speed up the actual execution, which is still proportional to the number of elements in some way. If it's O(n), then twice as many elements will take twice as long. Even with four processors, doubling the number of elements still doubles the running time; parallelism divides the constant factor, not the growth rate.
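To make that concrete, here is a hedged sketch of a parallel sortedness check: the list is split into chunks, each worker scans its own slice, and then the chunk boundaries are compared. The total work is still O(n) - the threads only divide it up. (Under CPython's GIL the threads here illustrate the division of work rather than delivering a real speedup; the point is the complexity, not the wall-clock time.)

```python
from concurrent.futures import ThreadPoolExecutor

def is_sorted_parallel(data, workers=4):
    """Check non-decreasing order with several workers.
    Total comparisons are still O(n); parallelism only splits them up."""
    n = len(data)
    if n < 2:
        return True
    # Split into roughly equal chunks; each worker scans its own slice.
    size = (n + workers - 1) // workers
    ranges = [(i, min(i + size, n)) for i in range(0, n, size)]

    def chunk_sorted(lo_hi):
        lo, hi = lo_hi
        return all(data[i - 1] <= data[i] for i in range(lo + 1, hi))

    with ThreadPoolExecutor(max_workers=workers) as pool:
        if not all(pool.map(chunk_sorted, ranges)):
            return False
    # The chunk scans never compare across a split point, so check
    # each boundary between adjacent chunks as well.
    return all(data[hi - 1] <= data[hi] for _, hi in ranges[:-1])
```

Note that the boundary comparisons are the "individual basis" step the quoted post alludes to: no matter how the work is divided, every adjacent pair must be compared by someone.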

3. That was my point. The lower bound of the problem is fixed, so there is no way of improving on the absolute best case - which was my entire point to begin with. I am just not entirely sure where there is any room for cutting corners, short of not actually checking every element.

If you didn't need everything fully sorted, you could theoretically make something that does O(n/x) work, where x denotes a constant factor by which you skip elements of the data. The objective would simply be to check the data probabilistically and skip elements that are, in all probability, already in order. However, again, if the data is truly randomized, the probability of any given element already being in place is low.
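A sketch of that skip-sampling idea (hypothetical, with x as the stride): spot-check only every x-th element, doing n/x comparisons instead of n. The trade-off is exactly the one described - an inversion hiding between two sampled elements goes undetected.

```python
def probably_sorted(data, x=10):
    """Spot-check every x-th element: about n/x comparisons.
    Can report True even when inversions exist between the
    skipped elements, so this is only a probabilistic check."""
    samples = data[::x]  # stride x through the list
    return all(samples[i - 1] <= samples[i] for i in range(1, len(samples)))
```

For example, swapping two adjacent elements inside a skipped span leaves every sampled element in order, so `probably_sorted` still returns True - the "half-assed" behaviour the next post describes.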

In other words, you can make something with less than O(n) complexity only if it does a half-assed job of checking the data.