++Eureka!!!

This is a discussion on ++Eureka!!! within the General Discussions forums, part of the Community Boards category; Originally Posted by laserlight There is a difference between "just about everything in mathematics" and "the foundations of mathematics". That ...

  1. #16
    Epy
    Fortran lover Epy's Avatar
    Join Date
    Sep 2009
    Location
    California, USA
    Posts
    960
    Quote Originally Posted by laserlight View Post
    There is a difference between "just about everything in mathematics" and "the foundations of mathematics". That the former "had already been figured out by the 1920s" is either a gross exaggeration or a statement made in sheer ignorance, whereas one might reasonably agree that the latter "were certainly done before the 1900s".
    To me there isn't. Just about every major part of mathematics had been worked out before that time. There was some good stuff between the 20s and the 70s but after that period things started to decline.

    What major discovery after the 1920s makes you think I'm wrong? As a person who spent every waking moment concentrating on math from ages 12 to 19, I'm pretty confident of my answer. The only counterexample I could think of is the FFT algorithm in 1965. The inner workings of the DFT were discovered in 1822 though.

    I can clarify my stance with this statement:
If you think you've discovered something new and unpublished in mathematics, you're probably wrong: there's a good 95% chance it was already discovered, published, and perfected before the 1920s.

    ANYWAY

    OP should:
    1. Research and find as many prime number theorems as possible to make sure his isn't a reformulation of any of them.
    2. Publish, as DavidP suggested.

  2. #17
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,438
    Quote Originally Posted by Epy View Post
    from ages 12 to 19, I'm pretty confident of my answer.
    Now, c'mon, Epy!

You have to pause on those words and admit that you are merely echoing a teenager's expected behavior.
    The programmer’s wife tells him: “Run to the store and pick up a loaf of bread. If they have eggs, get a dozen.”
    The programmer comes home with 12 loaves of bread.


    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  3. #18
    Guest Sebastiani's Avatar
    Join Date
    Aug 2001
    Location
    Waterloo, Texas
    Posts
    5,699
    Quote Originally Posted by Epy View Post
    To me there isn't. Just about every major part of mathematics had been worked out before that time. There was some good stuff between the 20s and the 70s but after that period things started to decline.

    What major discovery after the 1920s makes you think I'm wrong? As a person who spent every waking moment concentrating on math from ages 12 to 19, I'm pretty confident of my answer. The only counterexample I could think of is the FFT algorithm in 1965. The inner workings of the DFT were discovered in 1822 though.

    I can clarify my stance with this statement:
If you think you've discovered something new and unpublished in mathematics, you're probably wrong: there's a good 95% chance it was already discovered, published, and perfected before the 1920s.

    ANYWAY

    OP should:
    1. Research and find as many prime number theorems as possible to make sure his isn't a reformulation of any of them.
    2. Publish, as DavidP suggested.
    Well, first of all, I am not formally trained in mathematics. That being the case, there is a large body of material that I'm sure I may have overlooked. Which is the main reason why I have posted my ideas here and elsewhere - in the hope that someone with more training might spot something obvious! In fact, this is precisely what has happened; as it turns out, conjecture #1 does already exist - it's known as Giuga's conjecture (which remains unproven). At any rate, I've always felt that the best work is done in collaboration; the idea of "keeping to myself" just goes against my philosophical principles, really.

    Second, I find your assertion that "just about everything in mathematics had already been figured out by the 1920s" quite silly, frankly (and remarkably reminiscent of Albert Michelson's statement in 1894 that "the more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote...our future discoveries must be looked for in the sixth place of decimals"). I have yet to come up with a formal proof for the infinitude of "revolutionary" mathematical interconnections, but I believe that it's nonetheless a pretty good heuristic!
    Last edited by Sebastiani; 07-29-2010 at 10:03 AM.
    Code:
    #include <cmath>    // std::atan, std::exp
    #include <complex>  // std::complex, std::pow

    bool fun(bool value)
    {
        // Evaluates e^(i*pi) or e^(i*2*pi): Euler's formula in disguise.
        return std::pow(std::exp(1), std::complex<float>(0, 1)
               * std::complex<float>(std::atan(1) * (1 << (value + 2))))
               .real() > 0;
    }

  4. #19
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,438
    Quote Originally Posted by Sebastiani View Post
    Second, I find your assertion that "just about everything in mathematics had already been figured out by the 1920s" quite silly, frankly (and remarkably reminiscent of Albert Michelson's statement in 1894 that "the more important fundamental laws and facts of physical science have all been discovered [..]"
    It's a concrete example of the more general notion that we live in the Modern Age.
Thing is, if we read the writings of each era, we quickly learn that people have been living in a "modern age" (verbatim) since Ancient Egypt.

    oops.

  5. #20
    Epy
    Fortran lover Epy's Avatar
    Join Date
    Sep 2009
    Location
    California, USA
    Posts
    960
    Quote Originally Posted by Mario F. View Post
    Now, c'mon, Epy!

You have to pause on those words and admit that you are merely echoing a teenager's expected behavior.
    It does sound that way, except that I would listen and maybe change my stance if someone would provide me with a counterexample. I really expected that to happen too since everyone was so ready to shoot down my generalization.

  6. #21
    Epy
    Fortran lover Epy's Avatar
    Join Date
    Sep 2009
    Location
    California, USA
    Posts
    960
    Quote Originally Posted by Sebastiani View Post
    Well, first of all, I am not formally trained in mathematics. That being the case, there is a large body of material that I'm sure I may have overlooked.
Even if you were, there would still be a large body of material to overlook; so much has already been discovered. That's why it's practically required to do as DavidP suggested, i.e. collaborate with mathematics professors to make sure your idea is unique.

  7. #22
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    21,636
    Quote Originally Posted by Epy
    It does sound that way, except that I would listen and maybe change my stance if someone would provide me with a counterexample. I really expected that to happen too since everyone was so ready to shoot down my generalization.
    I do not disagree with what you mean to say (i.e., it is unlikely that there will be any further major developments in mathematics, so if one has something groundbreaking, it would be wise to ensure that one is guaranteed attribution); I disagree with what you actually said (i.e., "just about everything in mathematics had already been figured out by the 1920s").
    C + C++ Compiler: MinGW port of GCC
    Version Control System: Bazaar

    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  8. #23
    Anti-Poster
    Join Date
    Feb 2002
    Posts
    1,399
    Quote Originally Posted by Epy View Post
    It does sound that way, except that I would listen and maybe change my stance if someone would provide me with a counterexample. I really expected that to happen too since everyone was so ready to shoot down my generalization.
    No, it's not our responsibility to educate you. Even a cursory search into the history of mathematics will provide you with plenty of counterexamples. The only possible argument you might have left is that you do not view these as 'major' contributions, but that's merely a matter of opinion.
    If I did your homework for you, then you might pass your class without learning how to write a program like this. Then you might graduate and get your degree without learning how to write a program like this. You might become a professional programmer without knowing how to write a program like this. Someday you might work on a project with me without knowing how to write a program like this. Then I would have to do you serious bodily harm. - Jack Klein

  9. #24
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,438
    Another interesting read may be Future of Mathematics.

Plus, the possibility that mathematics cannot fully describe a system without sacrificing provability, or the other way around (as per Gödel's incompleteness theorems, which we discussed some time ago on this forum), poses a great challenge to your claim, Epy. Or so I think.

Finally, there's the problem of logic. You didn't prove that "just about everything in mathematics had already been figured out by the 1920s". You have nothing more than a hypothesis, and an untestable one at that. Yet you are brandishing it as if it had been proven.

During your vast seven-year mathematical studies, I'm surprised you didn't learn anything about rigor.

  10. #25
    Epy
    Fortran lover Epy's Avatar
    Join Date
    Sep 2009
    Location
    California, USA
    Posts
    960
    Quote Originally Posted by pianorain View Post
    No, it's not our responsibility to educate you.
    Right, because you just want to say I'm wrong without giving a reason why. That makes sense. Might as well say "LOL MORON".

    From the link you posted:

    One of the more colorful figures in 20th century mathematics was Srinivasa Aiyangar Ramanujan (1887–1920), an Indian autodidact who conjectured or proved over 3000 theorems, including properties of highly composite numbers, the partition function and its asymptotics, and mock theta functions. He also made major investigations in the areas of gamma functions, modular forms, divergent series, hypergeometric series and prime number theory.
    Total pimp.

    Andrew Wiles, building on the work of others, proved Fermat's Last Theorem in 1995.
    Would've accepted this as a counterexample.

  11. #26
    Epy
    Fortran lover Epy's Avatar
    Join Date
    Sep 2009
    Location
    California, USA
    Posts
    960
    Quote Originally Posted by Mario F. View Post
    Finally, there's the problem of logic. You didn't prove "just about everything in mathematics had already been figured out by the 1920s". You have nothing more than an hypothesis. And an impossible one to test, at that. Yet, you are brandishing it as proof had been made.

    During you vast 7 year-long mathematical studies, I'm surprised you hadn't learned anything about strictness.
    Seeing as how "just about" can't provide any sort of definite quantity or proportion, of course it can't be proven. It's called a generalization.

    I already clarified my position:
    I can clarify my stance with this statement:
If you think you've discovered something new and unpublished in mathematics, you're probably wrong: there's a good 95% chance it was already discovered, published, and perfected before the 1920s.

