Thread: Rant: internet gets faster, then slower

  1. #16
    Registered User MutantJohn's Avatar
    Join Date
    Feb 2013
    Posts
    2,665
    Dude, the Node community is almost literally allergic to writing their own code.

    What's worse is, companies that use something like Node are _not_ coding for maintainability at all. The concept of writing code that ages well seems altogether foreign to them. The problem is external dependencies galore; standardization is this weird alien idea in the JS community. And this is problematic.

    Maybe someday... someday JS will have a stand-alone, sufficient set of standardized modules, but that largely seems like a pipe dream. Basically, I can always get a job, no matter what, cleaning up the crap code that was left behind. The good news, though, is that I think the scripting fad is slowly fading as C++11 becomes more popular. I think with the recent rise of C++ HTTP frameworks (cpprestsdk looks the most promising) and task-oriented programming becoming a goal of C++, we should be good. C++ also interops with Node pretty well.
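    For what it's worth, here's a minimal sketch of what a GET request looks like with cpprestsdk (the C++ REST SDK), using its pplx task continuations for the task-oriented style mentioned above. The host and path here are just placeholders and error handling is omitted; this is only meant to show the shape of the API.

    Code:
    #include <cpprest/http_client.h>
    #include <cpprest/asyncrt_utils.h>
    #include <iostream>

    using namespace web::http;          // http_response, methods
    using namespace web::http::client;  // http_client

    int main()
    {
        // Client bound to a placeholder host; U() picks the platform string type
        http_client client(U("http://example.com"));

        // Each .then() is a pplx task continuation (the "task-oriented" style)
        client.request(methods::GET, U("/api/items"))
            .then([](http_response response) {
                return response.extract_string();    // yields a task of the body text
            })
            .then([](utility::string_t body) {
                std::cout << utility::conversions::to_utf8string(body) << std::endl;
            })
            .wait();                                 // block until the chain finishes
    }

    You would build that by linking against the cpprest library; the same continuation style is what the rest of its API is built around.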

    Node: the best thing to happen to the front-end, the worst to happen to the back-end.

  2. #17
    Lurking whiteflags's Avatar
    Join Date
    Apr 2006
    Location
    United States
    Posts
    9,613
    Sorry for the slow reply. I wanted to gather my thoughts, and I have a few moments now anyway.

    Quote Originally Posted by Mario F.
    In the context of web design, the word "product" takes many forms. It can be a product in a shopping website, or an article in a news website. Content Farms in particular are great adepts of AJAX, because it facilitates the setup of the type of short-lived information they live (some would say, prey) on.
    This was a very important correction for me, so thank you.

    Most news websites today are content farmers, including otherwise respectable newspapers as soon as they move online. And the content farmers' model demands that articles be made available for relatively short periods of time, quickly replaced by new ones and then discarded and forever lost. For them, information is a consumable product, like staples or printing paper, and the business is all about selling as many staples as possible. Storing links to news articles is becoming a thing of the past, and collectively we are all losing the web's ability to remain a historical document of our world. Places like Wikipedia are today riddled with links to highly active news sources that nonetheless return 404 after 404 for news articles not even a year old. And AJAX excels at facilitating this state of affairs, where content is dynamically served and removed.
    *facepalm*

    Ugh. I'm upset with myself that I didn't think about that. I hate it when Wikipedia links to a local paper somewhere and the article is nowhere to be found months later. Like you, I'm also concerned that there is no real historical record online. I would say that you could pay for the archive at the news site, but that's not really a solution either, the more you think about it. Historians will likely not have access to privately funded archives, unless those same papers are taking especially good care of their hard drives. It's a shame that there isn't a digital library keeping the articles around for public use either. The whole situation is kind of a mess right now.

    Anyway, consider me convinced.

  3. #18
    Make Fortran great again
    Join Date
    Sep 2009
    Posts
    1,413
    Quote Originally Posted by MutantJohn View Post
    Web development attracts a lot of non-programmers.
    Been the problem since day 1.

    There are some really nice-looking, fast, and impressively made websites out there. The people who do that work need to keep rocking, and everyone else needs to ........ off, quite frankly.

    Same with desktop. I am so sick of the everyday bugs I find in software. It's like testing is taboo or something.

  4. #19
    Make Fortran great again
    Join Date
    Sep 2009
    Posts
    1,413
    Relevant to this thread, one of the main reasons I hate asynchronous loading of things is that it allows webpages to be purposely designed to trick you into clicking on an ad. Example: an async-loading ad with a list of links below it in a column. The list of links loads instantly, and by about the time the ad loads, my finger is already moving toward my phone to tap a link in that list. The ad loads, the list of links shifts down, and I click the ad instead. And if I choose to wait for everything to load, it's anywhere from 5-15 seconds before the page is fully loaded. Lazy-ass people not thinking about this usability concern and not specifying the size of the ad's div ahead of time.

  5. #20
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Ireland
    Posts
    8,446
    Quote Originally Posted by MutantJohn View Post
    Maybe someday... someday JS will have a stand-alone, sufficient set of standardized modules, but that largely seems like a pipe dream.
    Indeed. The problem is that every new framework is an incentive for another. The landscape is so crowded and confusing that web developers have lost sight of it. Most of those frameworks are simply variations on an existing framework, created because of one of the most damaging aspects of free programming: the beguiling but harmful idea that I can do better. And as the number of frameworks grows, so grows the complexity of choice, which leads to the less beguiling but equally harmful idea that I should just create my own.

    It's a cycle that feeds on itself and has become unstoppable. Why it has happened to JS and not to other languages is not because of the role the browser plays on the internet. The separation of client code from server code is one factor that increases the options for websites to display interesting behaviour without stuffing the internet pipeline, and that opens up a world of options. But JS is no different from any other client code in that respect, and we don't witness this type of jungle in most programming languages.

    Instead, I think the real factor is web design itself. It is best defined by its spartan compartmentalization, which promotes a library to do A and another library to do B, then a new library that does A + B (which is cheaper than using both libraries), and another that does A + Subset(B), and... you get the idea. The combinations quickly become endless as you start adding feature sets C, D, E...

    The thing is, I think this is being imposed on designers and does not really reflect something fundamentally wrong with the notion of web design. Nor does it reflect something inherent to JS. Instead, it is a consequence of the underlying technology itself, of how browsers have been designed to handle client-side processing for security reasons and resource optimization. Which leads me to the next point...

    Quote Originally Posted by MutantJohn View Post
    The good news, though, is that I think the scripting fad is slowly fading as C++11 becomes more popular. I think with the recent rise of C++ HTTP frameworks (cpprestsdk looks the most promising) and task-oriented programming becoming a goal of C++, we should be good. C++ also interops with Node pretty well.
    While I agree that C++ is an extremely appealing choice for client-side scripting, I don't think C++11 brought anything new that couldn't have been done before. But even if we saw a rise of C++-based frameworks, you must realize that it would only be expanding on the already existing ecosystem. Even admitting that some C++ framework displaces some JS framework, you would still have to contend with the fact that you are left with the same total number of available frameworks. And assuming the more likely scenario, that C++ adds to (rather than replaces) the existing JS ecosystem, C++ is just going to make the existing problem worse.

    Because in the end we still have to face the fact that the insecure web is the reason why web browsers had to design their client-side processing infrastructure the way they did. And this browser infrastructure is the reason why you have so many frameworks.
    Last edited by Mario F.; 10-17-2016 at 03:46 AM.
    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  6. #21
    Make Fortran great again
    Join Date
    Sep 2009
    Posts
    1,413
    As a follow-up to this, it turns out Adblock Plus was to blame. Faster with ads than without.

  7. #22
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Try uBlock instead.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  8. #23
    Make Fortran great again
    Join Date
    Sep 2009
    Posts
    1,413
    Thanks, downloaded/installed.

  9. #24
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Ireland
    Posts
    8,446
    Quote Originally Posted by Epy View Post
    Thanks, downloaded/installed.
    uBlock Origin, not plain uBlock. If you downloaded the one that's just "uBlock", get rid of that crap immediately. Origin is the one from the original author and the one you want. (And it's awesome!)
    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  10. #25
    Make Fortran great again
    Join Date
    Sep 2009
    Posts
    1,413
    Quote Originally Posted by Mario F. View Post
    uBlock Origin, not plain uBlock. If you downloaded the one that's just "uBlock", get rid of that crap immediately. Origin is the one from the original author and the one you want. (And it's awesome!)
    Also thanks, I had the crap version. Going to try it out on the netbook later, that'll be the real test.


