# Thread: prime numbers and time consuming

1. ## prime numbers and time consuming

Hi.

I spent all morning thinking of an algorithm that could tell whether a number K is prime or not.

I came up with the following formula:

(1)    Kz = -y(1+z)

where 0 < z < K-1 and 0 > y > 1-K, and z and y are even.
If there is a pair (z, y) that satisfies (1), then K is not prime; otherwise, it is prime.
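For instance, K = 15 is correctly flagged as not prime by the pair (z, y) = (2, -10): both are even and within the stated ranges, and 15*2 = 30 = -(-10)(1+2).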

I timed the following numbers:

| number | prime | time |
| ------ | ----- | ---- |
| 23451  | NO    | 3 min 3 sec 432 ms |
| 5737   | YES   | 0 min 10 sec 172 ms |

The question is: does anyone have a routine that times this?
I'm using APL as my programming language. I would like to see if C is faster.
(APL is an interpreted language.)
Thanks.
Gustavo.

2. That is very slow. For example, using the following simple algorithm (pseudocode):

Code:
```
for i from 2 to sqrt(x):
    if (x % i == 0):
        return False
return True
```
Python interpreter reports a runtime of 0.000s for each of your test cases.
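
Since you asked whether C is faster and for a routine that times it: here is a rough, untested C sketch of the same trial division, timed with clock() from <time.h> (the function and variable names are just my choices, and clock()'s resolution is coarse, but it makes the point):

```c
#include <stdio.h>
#include <time.h>

/* Trial division: test every integer i from 2 while i*i <= x.
   Using i * i <= x instead of i <= sqrt(x) avoids floating point. */
static int is_prime(unsigned long x)
{
    if (x < 2)
        return 0;
    for (unsigned long i = 2; i * i <= x; i++)
        if (x % i == 0)
            return 0;
    return 1;
}

int main(void)
{
    unsigned long tests[] = { 23451, 5737 };  /* the OP's two cases */
    for (int k = 0; k < 2; k++) {
        clock_t start = clock();
        int prime = is_prime(tests[k]);
        double ms = 1000.0 * (clock() - start) / CLOCKS_PER_SEC;
        printf("%lu %s %.3f ms\n", tests[k], prime ? "YES" : "NO", ms);
    }
    return 0;
}
```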

You also haven't provided an algorithm.

3. Originally Posted by AntP
```
for i from 2 to sqrt(x):
    if (x % i == 0):
        return False
return True
```
You mean squareroot(x)? But what if this is not an integer?
I don't use C anymore; that's why I didn't provide an algorithm.
I have the algorithm in APL, which is quite different.

What's the step of i?

4. Originally Posted by Gustaff
I spent all morning thinking of an algorithm that could tell whether a number K is prime or not.
I can admire your dedication, but sometimes it is just easier to search the Web.

Originally Posted by Gustaff
You mean squareroot(x)? But what if this is not an integer?
The result can be truncated to be an integer.

Originally Posted by Gustaff
What's the step of i?
AntP's suggestion is trial division. Strictly, you only need to trial divide by prime numbers, but the example was simplified so that every integer from 2 up to and including the truncated square root is tested, i.e., the step is 1.
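
If you want a cheap step toward prime-only divisors without building a sieve, here is a sketch (mine, not from AntP's post) that handles 2 once and then tests only odd candidates, halving the work:

```c
/* Trial division by 2 and then odd numbers only: every even number
   greater than 2 is composite, so even divisors need not be tested. */
int is_prime_odd(unsigned long x)
{
    if (x < 2)
        return 0;
    if (x % 2 == 0)
        return x == 2;               /* 2 is the only even prime */
    for (unsigned long i = 3; i * i <= x; i += 2)
        if (x % i == 0)
            return 0;
    return 1;
}
```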

C is one of the faster languages (perhaps the fastest, depending on the program and compiler). It *is* going to be considerably faster than an interpreted language - any interpreted language.

6. Depends. In the majority of problems (this one included), computational complexity will have a much greater effect on running time than things like caching and other such compiler optimizations. Higher-level, interpreted languages tend to use very well-optimized algorithms in their standard libraries, for procedures that C may not have at all, thereby requiring the programmer to come up with the algorithm. Designing a good algorithm is often extremely unintuitive and, no matter how good a programmer you are, the complexity of the algorithm in the standard libraries is almost guaranteed to at least match yours; most likely, it will be several orders of magnitude better.

So, in practice, it is often the case that the interpreted language will outperform the lower-level compiled language.

7. Originally Posted by AntP
Depends. In the majority of problems (this one included), computational complexity will have a much greater effect on running time than things like caching and other such compiler optimizations. Higher-level, interpreted languages tend to use very well-optimized algorithms in their standard libraries, for procedures that C may not have at all, thereby requiring the programmer to come up with the algorithm. Designing a good algorithm is often extremely unintuitive and, no matter how good a programmer you are, the complexity of the algorithm in the standard libraries is almost guaranteed to at least match yours; most likely, it will be several orders of magnitude better.

So, in practice, it is often the case that the interpreted language will outperform the lower-level compiled language.
Uhh, no! Algorithms are language-independent by nature, so what ends up being crucial is the way in which you translate them from natural language into binary code, which is the only thing your computer cares about.

8. Originally Posted by claudiu
Uhh, no! Algorithms are language-independent by nature, so what ends up being crucial is the way in which you translate them from natural language into binary code, which is the only thing your computer cares about.
Uhh, no! Read my post again until you understand it. Standard libraries in higher level languages will implement extremely efficient algorithms - much better than the ones you can come up with yourself, which you will have to do if there is no built-in procedure.

Let's say you have task x, for which interpreted language A has a procedure that implements an O(log n) algorithm. But you are using compiled language B, which has no built-in procedure for said task. You go ahead and write your own procedure, which happens to implement an O(n) algorithm (because you are not aware of a better one). Suddenly, language A is outperforming language B (potentially by a very big margin). Unless you are a computer science genius, this is a very common scenario.

Because of this, there are many situations in which C will be outperformed by a higher-level interpreted language, purely because the algorithms designed by the C programmer are hugely inferior to those already implemented by the higher-level libraries of the interpreted language. Knowing that you are inherently using optimal algorithms for higher-level tasks is one of the biggest benefits - if not the biggest benefit - of working with a higher-level language.

Algorithms are language-independent, but that is irrelevant if the algorithms themselves are different.
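
To make that concrete with a toy C sketch of my own: searching a sorted array of ten million elements with the standard library's bsearch() is O(log n) - roughly 23 comparisons - while the hand-rolled linear scan most people would write unaided is O(n), up to ten million comparisons. That complexity gap dwarfs any constant-factor difference between languages.

```c
#include <stdio.h>
#include <stdlib.h>

/* Comparator for bsearch(): orders two ints. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Hand-rolled O(n) linear scan. */
static const int *linear_find(const int *arr, size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (arr[i] == key)
            return &arr[i];
    return NULL;
}

int main(void)
{
    enum { N = 10000000 };
    static int arr[N];
    for (int i = 0; i < N; i++)
        arr[i] = 2 * i;              /* sorted: 0, 2, 4, ... */
    int key = 2 * (N - 1);           /* worst case for the linear scan */

    const int *a = bsearch(&key, arr, N, sizeof *arr, cmp_int);  /* O(log n) */
    const int *b = linear_find(arr, N, key);                     /* O(n) */
    printf("bsearch: %s, linear: %s\n", a ? "found" : "miss",
           b ? "found" : "miss");
    return 0;
}
```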

9. Originally Posted by AntP
Let's say you have task x, for which interpreted language A has a procedure that implements an O(log n) algorithm. But you are using compiled language B, which has no built-in procedure for said task. You go ahead and write your own procedure, which happens to implement an O(n) algorithm (because you are not aware of a better one). Suddenly, language A is outperforming language B (potentially by a very big margin). Unless you are a computer science genius, this is a very common scenario.

Because of this, there are many situations in which C will be outperformed by a higher-level interpreted language, purely because the algorithms designed by the C programmer are hugely inferior to those already implemented by the higher-level libraries of the interpreted language. Knowing that you are inherently using optimal algorithms for higher-level tasks is one of the biggest benefits - if not the biggest benefit - of working with a higher-level language.

Algorithms are language-independent, but that is irrelevant if the algorithms themselves are different.
There are libraries for C that you can download from the internet that implement at least as many standard algorithms as high-level language libraries do. Point proven.

Anyway, people who are used to pulling standard algorithms from libraries - whether for a low-level or a high-level language - are prone to missing the point that most real problems are special cases of a standard problem, which means there is probably no pre-built algorithm that does exactly that, and writing one yourself can give you better asymptotic complexity. This is the fact that most youth programming contests are built on: there is usually a very simple solution you can write yourself that is inefficient, a medium-complexity solution that is usually a mix of simple custom code and standard algorithms, and a completely custom algorithm that has the best complexity for the given task and doesn't rely on any standard algorithms. Go on and try some of those tasks, then come and try to tell me standard algorithms can solve every problem in the best way.

Besides, once you've taken a few CS classes, the standard libraries don't seem so "magic" anymore.

That's a myth, AntP - in real tests, it (almost) never happens. There are TONS of excellent algorithmic resources for fast compiled languages like C, C++, OCaml (and Fortran for math).

I'm not saying you can't beat standard C functions like qsort(). I'm saying you cannot out-sort C with an interpreted language - ANY interpreted language. Same with (almost) anything else.

An exception is string handling - C doesn't have the best string handling, and languages like Delphi, which do it smarter, can smoke it - but Delphi is a compiled language, and C will smoke Delphi across a broad range of algorithms.

Google "The Computer Language Benchmarks Game". It's a bit skewed because the site scores for things other than speed, and only uses programs that are submitted, but you'll get the idea - the aforementioned languages (and a newcomer derived from OCaml called ATS) kick interpreted languages' asses.

11. Originally Posted by maxorator
There are libraries for C that you can download from the internet that implement at least as many standard algorithms as high-level language libraries do. Point proven.

Go on and try some of those tasks, then come and try to tell me standard algorithms can solve every problem in the best way.

Besides, once you've taken a few CS classes, the standard libraries don't seem so "magic" anymore.
Strawman argument. Of course, computation-by-computation, compiled languages are always going to be faster. That's a given. In practical terms, however, unless you are willing to put in a lot of time and effort in maximizing efficiency by analysing every algorithm you write and ensuring there isn't a library that can do it better (which is often an unwarranted amount of work), you will often get better performance by working with a higher-level language, interpreted or not.

I'm not sure how the rest of your post is even relevant, because I never once implied that standard algorithms can "solve every problem in the best way", nor anything of the sort. I am responding to the assertion that C "will always outperform any interpreted language", which, in practice, is often not the case (even if it potentially could be).

12. Originally Posted by AntP
Strawman argument. Of course, computation-by-computation, compiled languages are always going to be faster. That's a given. In practical terms, however, unless you are willing to put in a lot of time and effort in maximizing efficiency by analysing every algorithm you write and ensuring there isn't a library that can do it better (which is often an unwarranted amount of work), you will often get better performance by working with a higher-level language, interpreted or not.

I'm not sure how the rest of your post is even relevant, because I never once implied that standard algorithms can "solve every problem in the best way", nor anything of the sort. I am responding to the assertion that C "will always outperform any interpreted language", which, in practice, is often not the case (even if it potentially could be).
Yes it is the case. The best argument for that is that most system development for *Nix platforms is still done in C/C++, because at that level, no interpreted language will be fast enough for your OS real-time requirements. Also, your library argument is complete BS since the library for C and especially C++ is extremely complex, more so than that of most interpreted languages.

An interpreted language is just that: INTERPRETED. Do you think you are going to communicate faster when talking to someone who doesn't speak English by having someone interpret everything you say, as opposed to speaking in a language both of you understand? No, simple as that.

This doesn't mean that interpreted languages are doomed to being slow (which is a relative thing anyway), but it does mean that compiled languages will always have the edge in terms of execution speed (although perhaps not in terms of development speed).

13. Originally Posted by claudiu
Yes it is the case. The best argument for that is that most system development for *Nix platforms is still done in C/C++, because at that level, no interpreted language will be fast enough for your OS real-time requirements. Also, your library argument is complete BS since the library for C and especially C++ is extremely complex, more so than that of most interpreted languages.
I'm sorry, but "the library for C and especially C++ is extremely complex, more so than that of most interpreted languages" quite clearly demonstrates that you have still completely failed to understand what I am saying.

Again, I am talking about algorithms, and the fact that higher-level languages have built-in procedures (using efficient algorithms) for things that lower-level languages like C often have no built-in procedures for at all (hence the term "high-level"). As a result, the programmer will design their own algorithm (unless they are both aware that a more efficient method may exist and willing to spend time researching it), and that algorithm will often be inferior to the efficient one implemented by the standard library of the higher-level language.

My point is that asymptotic complexity is a much bigger factor in performance than the distinction between compilation and interpretation.

Please refrain from aggressive responses to my posts until you have actually read (and understood) them properly.

14. Give me a "practical" example, AntP. Because it sounds like you're saying:

"If you run Python and do a Quicksort algorithm on an array, it will be faster than sorting with a Bubble sort, using C".

I'll give Java and Python and Ruby, and even QuickBasic, their due, but I've seen them run chess programs, and sudoku programs - where speed is king - and they suck, compared to C.

I absolutely agree that it will take longer to code up a good C program, and it may not be worth the time and extra work, for a given program.

But when you say that C runs slower than interpreted languages in practical applications, I have to ask "have you lost your mind?".

15. Again, this isn't what I'm saying at all. At least, it wasn't before people blew my point way out of proportion and twisted what I was trying to say into "C iz slow ruby is bettar lolll."

My point was simply that it's not a tautology that C will be quicker for a given task. Take your string handling example, for instance. There's no reason why C couldn't outperform Python for string matching/manipulation tasks, but given the sheer amount of work and research that it would take to implement algorithms of equivalent intelligence in C (to those that are already standard in Python), it's not wrong to assert that - in practical terms - Python will outperform C on this front.
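
A toy C sketch of my own to illustrate: the naive substring scan below is O(n*m) and is roughly what one writes unaided, while a good C library's strstr() - like the search built into Python's string type - typically implements a much cleverer, effectively linear-time algorithm. Getting that for free is exactly the point.

```c
#include <stdio.h>
#include <string.h>

/* Naive O(n*m) substring search - what most of us would write unaided. */
static const char *naive_find(const char *hay, const char *needle)
{
    size_t n = strlen(hay), m = strlen(needle);
    if (m == 0)
        return hay;
    for (size_t i = 0; i + m <= n; i++)
        if (memcmp(hay + i, needle, m) == 0)
            return hay + i;
    return NULL;
}

int main(void)
{
    const char *hay = "the quick brown fox";
    printf("library: %s\n", strstr(hay, "brown"));    /* smart algorithm */
    printf("naive:   %s\n", naive_find(hay, "brown"));
    return 0;
}
```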

Of course it would be possible to do marginally better with C, but the blanket statement that "C will always outperform an interpreted language" is extremely misleading, as someone who didn't know better could infer from it that "writing my own function in C would be quicker than using this built-in one in Python." You have to bear in mind that many people (and many programmers) aren't familiar with the notion of asymptotic time complexity and thus would see nothing wrong with making this inference.