
Why not just make red software?

This is a discussion on Why not just make red software? within the General Discussions forums, part of the Community Boards category.

  1. #1
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,403

    Why not just make red software?

    Coding Horror

    The problem with software is us. Or rather, the people who write stuff like this.
    The programmer’s wife tells him: “Run to the store and pick up a loaf of bread. If they have eggs, get a dozen.”
    The programmer comes home with 12 loaves of bread.


    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  2. #2
    Unregistered User Yarin's Avatar
    Join Date
    Jul 2007
    Posts
    1,591
    Were you thinking of a particular post, or just the whole thing?

  3. #3
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    Stack Overflow is a pretty great place. I almost never contribute to it or "hang around", but it's definitely ranked #1 on my list of sites that supply useful information turned up via google (most of my programming questions aren't C/C++ related).

    I have never paid much attention to Coding Horror, but in a quest to discover what has got Mario's goat, I learned about "hellbanning":

    Coding Horror: Suspension, Ban or Hellban?

    ROTFL I suddenly had this vision involving transgalactic2 (if anyone remembers): still a member, still ranting away and creating threads where the same question is asked over and over, but the only people aware of and participating in those threads are other hellbanned users. I bet it would make for some great reading.

    Of course, there's no way to know if that's true. If it were, I imagine the mods can't let the cat out of the bag.
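    The mechanic above could be sketched as a simple visibility filter. This is purely a hypothetical illustration (the struct and function names are made up; no claim about how any real forum implements it):

    ```c
    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical sketch of the hellban mechanic: posts by hellbanned
     * authors are hidden from everyone -- except other hellbanned users,
     * who see them and keep the conversation going among themselves. */
    struct post {
        const char *text;
        bool author_hellbanned;
    };

    static bool visible(const struct post *p, bool viewer_hellbanned)
    {
        return !p->author_hellbanned || viewer_hellbanned;
    }

    int main(void)
    {
        struct post rant = { "same question, asked yet again", true };

        printf("normal user sees it:     %s\n", visible(&rant, false) ? "yes" : "no");
        printf("hellbanned user sees it: %s\n", visible(&rant, true)  ? "yes" : "no");
        return 0;
    }
    ```

    From a normal account the rant simply doesn't exist; from a hellbanned one, the thread looks perfectly alive.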

    Hey hold it -- maybe I've been hellbanned. A lot of you do seem overly aggressive too much of the time. I remember all these nice people here OUAT -- where'd they all go? When did this happen? WHAT HAVE I DONE FOR ........ SAKE LEMME OUT OF HERE YOU ........ ........S !!!

    Last edited by MK27; 06-27-2011 at 01:59 PM.
    C programming resources:
    GNU C Function and Macro Index -- glibc reference manual
    The C Book -- nice online learner guide
    Current ISO draft standard
    CCAN -- new CPAN like open source library repository
    3 (different) GNU debugger tutorials: #1 -- #2 -- #3
    cpwiki -- our wiki on sourceforge

  4. #4
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,403
    Jeff's someone I admire for his work with StackExchange, but someone I don't tend to want to listen to. We are almost always on a collision course when it comes to personal opinions regarding technology.

    That article...

    It builds on the premise that a page load delay in the range of tenths of a second reduces user satisfaction. That is, I'll be less satisfied if a page takes 0.9 seconds instead of 0.5 seconds, or 1.8 seconds instead of 1.4 seconds. Performance is thus a competitive advantage concerning user satisfaction. Bollocks!

    What about the site's ability to generate income? Not from users giving up, but from the fact that the faster a page loads, the faster an ad impression registers. When you have millions of daily hits, those tenths of a second count...

    Or do they? Do I have fewer page impressions a day because a page loads some tenths of a second slower, or do I have fewer impressions a day because fewer users visited my site, or clicked less on my site? If a user needs 1.8 seconds to hit my page, does that mean he will load more pages if I make it 1.4 seconds, or will he load the same pages because his web activity is entirely independent of my website's performance? That is, if Cboard takes 1.8 seconds instead of 1.4 seconds, do I read the same number of posts that interest me, or do I read fewer?

    ...
    Now, it's a well-known fact that red websites load faster because of the way our GPUs process the color red. Ideally one should implement RGB(255,0,0) on as many pixels as possible. Every reduction of the red component imposes some GPU cycle delays in the Achromatic Removal algorithm. This algorithm processes each color component in succession, first setting it to 255 and then decrementing it to the requested value, unless the request is zero:

    Code:
    for (each of the three color components)
        if (requested != 0)
            for (i = 255; i >= requested; i--)
                set component = i
            end for
        end if
    end for
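    Playing along, the Achromatic Removal pass might be rendered as runnable C (function and variable names invented for the joke), with each nonzero channel decremented from 255 down to its requested value:

    ```c
    #include <stdio.h>

    /* The satirical "Achromatic Removal" pass: each nonzero channel is
     * set to 255 and then decremented, one wasted step at a time, down
     * to the value actually requested.  Pure red costs nothing extra. */
    static void achromatic_removal(int rgb[3])
    {
        for (int c = 0; c < 3; c++) {
            int requested = rgb[c];
            if (requested != 0)
                for (int i = 255; i >= requested; i--)
                    rgb[c] = i;             /* burn one "GPU cycle" per step */
        }
    }

    int main(void)
    {
        int fast[3] = { 255, 0, 0 };        /* RGB(255,0,0): zero extra cycles */
        int slow[3] = { 30, 144, 255 };     /* anything else: pay up */

        achromatic_removal(fast);
        achromatic_removal(slow);
        printf("%d,%d,%d and %d,%d,%d\n",
               fast[0], fast[1], fast[2], slow[0], slow[1], slow[2]);
        return 0;
    }
    ```

    Note that the whole thing is an expensive identity function: the output always equals the input, which is rather the point.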
    I dunno... just making further valuable suggestions.


    More seriously,
    Website performance should matter, but not how it was painted by that article (as something producing a continuous gain). At some point performance gains will only scale to bandwidth demands. Meeting these demands is a matter of working within a range of acceptable values that generate equal results. That is, if I have 1M daily impressions and I shave a mere tenth of a second off my page load times, I will not magically gain more impressions, nor will I contribute to user satisfaction (they won't even notice). But as my users increase I will slowly creep my way out of my current safe zone and will eventually be forced to optimize to guarantee the same level of performance as before.
    Last edited by Mario F.; 06-27-2011 at 07:35 PM.
    The programmer’s wife tells him: “Run to the store and pick up a loaf of bread. If they have eggs, get a dozen.”
    The programmer comes home with 12 loaves of bread.


    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  5. #5
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    Quote Originally Posted by Mario F. View Post
    That article...

    It builds on the premise that a page load delay in the range of tenths of a second reduces user satisfaction. That is, I'll be less satisfied if a page takes 0.9 seconds instead of 0.5 seconds, or 1.8 seconds instead of 1.4 seconds. Performance is thus a competitive advantage concerning user satisfaction. Bollocks!
    Okay, this is a great point. It certainly is a common practice to jump on statistical studies about user behaviour or performance and say: let this be our guiding light, statistics don't lie.

    I guess statistics don't lie, but their use value is not always so obvious. Just because the study cited in the article is true does not mean that by reducing load times you are going to drastically enhance the value of your product. For starters, unless you have a 100% success rate on some level (as in, 100% of users who complete a page load buy something, or sign up for a service, or whatever), this study is far from complete: of the people who decide to give up during that extra 0.5-second load time, how many were sufficiently interested to begin with? Versus, of the people who would have waited anyway, how interested are they?

    I guess the nature of your content is going to be a big factor here. E.g., something tells me this is going to be much more meaningful for Pepsi than Wikipedia. The fact that google has taken it so seriously demonstrates that some people at google favour catering to a lowest common denominator, which is not terribly surprising.

    Now, it's a well-known fact that red websites load faster because of the way our GPUs process the color red. Ideally one should implement RGB(255,0,0) on as many pixels as possible. Every reduction of the red component imposes some GPU cycle delays in the Achromatic Removal algorithm. This algorithm processes each color component in succession, first setting it to 255 and then decrementing it to the requested value, unless the request is zero:
    I have not bothered to check if this is true, Mario, but either way, it is a nice piece of satire.

    It reminds me a bit of the justification for fixed-width pages, which drive me nuts. The justification was a study commissioned by The London Times five or ten years ago that proved that, statistically, people have an easier time reading a line with 10-15 words on it. No doubt, but most people actually do not read very much, and the fact that children will prefer books with large block lettering does not prove that, scientifically, the ideal typeface for human consumption is large block lettering. However, this quickly morphed into apocryphal stories about how "scientifically, your eye finds it easier to read a short 10-15 word line". No, it does not, and this is not something a statistical survey would prove. This is pure conjecture based on the outcome of such a survey and could easily be explained in a number of other ways.

    So the question really was: what benefit is there in catering to the semi-literate? There might be a lot. IMO, a more important factor was: fixed-width pages are very appealing from a design perspective because they make it easier for the designer to dictate and control the form of the page. I don't think this enhances the user experience very much*, but it certainly enhances the designer experience and makes a lot of insignificant things seem significant. Making insignificant things seem significant can be a legitimate and sincere act of self-aggrandizement, if those things are within your realm of expertise.

    The article smells a little like that.

    * And as evidence that "making material easier to read" really was just an excuse for obsessive-compulsive web design, I'd observe that it is a common practice on fixed-width pages to also use SMALL fixed point sizes.
    Last edited by MK27; 06-28-2011 at 04:35 AM.
    C programming resources:
    GNU C Function and Macro Index -- glibc reference manual
    The C Book -- nice online learner guide
    Current ISO draft standard
    CCAN -- new CPAN like open source library repository
    3 (different) GNU debugger tutorials: #1 -- #2 -- #3
    cpwiki -- our wiki on sourceforge

  6. #6
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    21,310
    Quote Originally Posted by MK27
    It certainly is a common practice to jump on statistical studies about user behaviour or performance and say: let this be our guiding light, statistics don't lie.

    I guess statistics don't lie, but their use value is not always so obvious.
    This reminds me of the book: How to Lie with Statistics
    C + C++ Compiler: MinGW port of GCC
    Version Control System: Bazaar

    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  7. #7
    Registered User
    Join Date
    Jun 2005
    Posts
    6,166
    There is also an old curse: "lies, damned lies, and statistics!"

    Statistics don't lie, but humans do. So statistics can be misinterpreted, either accidentally or deliberately, or manipulated (for example, by filtering out data that deviates too much from a defined outcome).
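    A toy illustration of that kind of filtering, with made-up numbers: drop every measurement that strays from the outcome you want, and the average obligingly lands on it.

    ```c
    #include <math.h>
    #include <stdio.h>

    static double mean(const double *v, int n)
    {
        double sum = 0.0;
        for (int i = 0; i < n; i++)
            sum += v[i];
        return sum / n;
    }

    int main(void)
    {
        /* Made-up measurements; one point deviates from the "defined outcome". */
        double data[] = { 1.0, 1.1, 0.9, 1.0, 5.0 };
        double kept[5];
        int n = 0;

        /* "Filter out data that deviates too much": drop anything far from 1.0. */
        for (int i = 0; i < 5; i++)
            if (fabs(data[i] - 1.0) < 1.0)
                kept[n++] = data[i];

        printf("honest mean:   %.2f\n", mean(data, 5));  /* 1.80 */
        printf("filtered mean: %.2f\n", mean(kept, n));  /* 1.00 */
        return 0;
    }
    ```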

    But, then again, I may be lying too
    anduril462 likes this.
    Right 98% of the time, and don't care about the other 3%.
