# Thread: Elements repeated thrice except one

1. > Originally Posted by **Mario F.**
> No. It hasn't. That's what we are trying to tell you. I have no idea how I can convince you of this more than I have. I suspect we will be going around in circles if I have to start repeating my arguments.

If the system doesn't do the sorting and your program doesn't do the sorting, who does? The vacuum?

> Originally Posted by **Mario F.**
> But in any case, even if that were a requirement of this particular problem, the immediate answer to the OP was that there was no solution for this exercise. As soon as you make sort an integral part of this algorithm, this problem cannot be solved in O(N), unless you are willing to consider only best cases... which I will object to strongly.

There probably is a solution. The point, however, is that the solution cannot use an algorithm that requires the data to be sorted first, because the data isn't sorted and sorting it is prohibited by the complexity requirement.

> Originally Posted by **Mario F.**
> There is always a "system". An algorithm doesn't exist in a vacuum. The system holds the data and the state. The algorithm processes the data, alters the state or not, and produces an output or not.

Okay, let's consider this simple example. A device that measures some variable, say temperature, writes the current value to a file ten times a second. Every day, the data is collected and fed to a program for analysis. The program needs to perform an operation on the data that requires the data to be sorted by value. But the data is sorted chronologically, not by value. So the program does the sorting.

> Originally Posted by **Mario F.**
> At this point I'm not sure what you are talking about. No one is changing the rules here. In fact this debate happens because you are trying to introduce a new rule where previously there was none -- and your rule is: an algorithm that requires sorting needs to make the sort a part of its complexity description.

I'm not saying that. An algorithm that has a prerequisite cannot be used directly when that prerequisite isn't met. The algorithm then becomes just the second part of a larger algorithm (the program), where the first part is meeting the prerequisites for the second part. The complexity of the whole program is the sum of the complexities of all its parts.

> Originally Posted by **Mario F.**
> I suggest that if you want to strip binary search of its current O(log n) status, write a paper about it and present your case with better argumentation than you have here. Because you'll have a lot of explaining to do to the scientific and computing communities.

If you have unsorted data, your program cannot consist only of the binary search algorithm. Your program then becomes sorting plus binary search: O(n log n + log n) = O(n log n). That's my whole point. Then there is obviously no point in using binary search if you only need to do it a constant number of times per sort (unless that constant is very big). Your program cannot have any prerequisites that the input doesn't satisfy.

2. > Originally Posted by **maxorator**
> Okay, let's consider this simple example. A device that measures some variable, say temperature, writes the current value to a file ten times a second. Every day, the data is collected and fed to a program for analysis. The program needs to perform an operation on the data that requires the data to be sorted by value. But the data is sorted chronologically, not by value. So the program does the sorting.
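The "sorting plus binary search" composition can be made concrete with a minimal C sketch using the standard library's `qsort` and `bsearch` (function name `contains` is my own, for illustration): the search itself is O(log n), but the whole routine is dominated by the O(n log n) sort it needs first.

```c
#include <stdlib.h>

/* Comparison function shared by qsort and bsearch. */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Binary search over UNSORTED input: the prerequisite (sorted data)
   must be met inside the program, so the total cost is
   O(n log n) + O(log n) = O(n log n). */
int contains(int *data, size_t n, int key) {
    qsort(data, n, sizeof *data, cmp_int);                         /* O(n log n) */
    return bsearch(&key, data, n, sizeof *data, cmp_int) != NULL;  /* O(log n)   */
}
```

The O(log n) figure for binary search describes only the second call; the program as a whole cannot claim it.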
Think about it for a moment. And then tell me what requirements you delineated for that algorithm.

As for the rest of your post, I'm sorry, max. I won't be repeating myself. Think as you want.

3. If the data is sorted, can't we find the unique element in logarithmic time (look at three middle values, thereby determining whether the needle is in the first or the second half)?

But anyway, the problem statement quite clearly says that the whole program needs to be O(n) and that the only assumptions about the data are as listed.

There is an array of N elements. All but one are repeated thrice; that element occurs only once. Find the element in O(N) time.

4. Question: in general, when there are outside sources of complexity, do they count towards the complexity of an algorithm? Say I use a container that doesn't have constant access time: does that container's complexity affect the complexity of my algorithm? Theoretically speaking, of course.

5. > Originally Posted by **User Name**
> Question: in general, when there are outside sources of complexity, do they count towards the complexity of an algorithm? Say I use a container that doesn't have constant access time: does that container's complexity affect the complexity of my algorithm? Theoretically speaking, of course.
The one-word answer is yes, they all count.

The long answer is they count if you want them to. You can always postulate the existence of some oracle that performs some step of the algorithm in constant time (or O(n), or whatever you want). You could then go on to prove various properties of your algorithm, contingent on the existence somewhere of something that could implement the oracle. "Provided somebody can build a machine that does XYZ with O(n) complexity, I can show the following..."

But if you're using something specific, such as a sort which takes O(n log n) time, then that is as much a part of your algorithm as any other part.

6. If there is no restriction on memory, then how about the following...
1) Allocate space for memory[], sized to encompass all possible values.
For every element, do (2) and (3):
2) number = array[i]
3) If memory[number] is zero, add number to used[]. Then memory[number]++.
4) When done, go through used[] to index into memory[] and look for 1 == memory[used[i]] % 3.

The reason for the used[] array is so that one doesn't have to traverse the entire memory[] array, whose span may be billions of elements.
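The steps above can be sketched in C as follows, a minimal version assuming nonnegative elements bounded by a known `max_value` (the function name `find_unique` and the bound parameter are my own additions): one O(N) counting pass, then a scan over only the distinct values that actually appeared.

```c
#include <stdlib.h>

/* Counting sketch of the approach in post #6: O(N) time, but memory
   proportional to the value range. Returns -1 if no singleton found. */
int find_unique(const int *array, int n, int max_value) {
    int *memory = calloc((size_t)max_value + 1, sizeof *memory); /* step 1 */
    int *used   = malloc((size_t)n * sizeof *used);
    int n_used = 0, result = -1;

    for (int i = 0; i < n; i++) {          /* steps 2-3: count occurrences */
        int number = array[i];
        if (memory[number] == 0)
            used[n_used++] = number;       /* remember each value's first sighting */
        memory[number]++;
    }
    for (int i = 0; i < n_used; i++)       /* step 4: scan only values seen */
        if (memory[used[i]] % 3 == 1)
            result = used[i];

    free(memory);
    free(used);
    return result;
}
```

Both passes are linear in N (the second visits at most N distinct values), so the O(N) time bound holds; the trade-off is the O(max_value) allocation, which the "no restriction on memory" premise permits.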