IgnorantGuru's Blog

Linux software, news, and tips

The Dangers of Software Evangelism

From The Sporkbox Blog, a review of the dangers of software evangelism and how it applies to the current situation with systemd adoption, with some devel mailing list quotes.

In May 2011 Lennart Poettering proposed systemd as a dependency for further releases of GNOME. As systemd is available only on Linux, the proposal led to discussion of possibly dropping support for other platforms in future GNOME releases. While some people responded to the proposal with criticism, others suggested the idea of a GNOME Operating System on top of the Linux kernel.

Basically this comes down to: Are Linux users going to allow corporations to take over their OS and change it in unfriendly ways? Because that’s what’s happening.

One of my concerns here is how poorly these developers maintain their projects. I just came across an easily reproduced GTK3 bug affecting SpaceFM, which was reported almost a year ago – with no response yet. That’s one thing you can look forward to when these corporate developers control everything: Microsoft-quality responsiveness and attention to detail.

February 14, 2013 - Posted by | News, reviews

4 Comments

  1. I’ll tell you something. I used to contribute a lot to GNOME a couple of years ago, and even now nothing has changed. They are the same pricks as before. They like to infect everything, blow a lot of hot air, want everything to be GNOME, and on and on…

    And the reality is what we have today with GNOME 3. It reminds me of one of those “what the community thinks we do, what mom and dad think we do, what the government thinks we do – AND what we really do” memes.

    Pöttering with his PötterKits is another one of these high-ego kinds of people. Exactly the same shit is happening right now, with all those MS Red Hat crawlers choking the cock of Linus Torvalds to get this crap MS signature shit into the kernel (read it on some news site). The same way Linus reacted to the crap that happened with udev some months ago.

    Linus is the last bastion protecting us from all this crap. If he falls then everything is lost.

    Comment by Samson | February 26, 2013

    • One could argue that’s already happening on a smaller scale: since so many things are tying into dbus, people are pushing for dbus to be standardized and put in the kernel instead of relying on simpler IPC mechanisms that work just fine if you build your data structures and IPC “protocols” properly (a minimal sketch of that idea follows this comment). A wise man once said, “Get your data structures right and the program practically writes itself.” It’s a case of “we want to be lazy programmers, let us be lazy!” This mindset is exactly what causes the speed of software to steadily go down despite hardware growing in power at an alarming rate. To make the most of the hardware, we need sane, efficient code. Sad to say, the majority of programmers just don’t care about efficiency the way programmers in the 70s and 80s did. The reason? The hardware isn’t forcing them to.

      IMO it’s only graphics and scientific developers that are really getting the most out of today’s machines. And even that can be improved.

      Sorry for the tangent, haha. I do agree that Linus is the last “fortification” out there for the Linux-powered world. Should he leave the FOSS scene, I’ll probably move to BSD or something else… since if he leaves, it likely means that he’s either died or got tired of the BS. And that means the kernel is greased.

      Thanks for the link, IG. Given the date, it’s not a tongue-in-cheek Valentine’s gift, is it? :P!

      Comment by sporkbox | February 27, 2013
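
      A minimal sketch of the kind of “simpler IPC mechanism” referred to above: two related processes exchanging a fixed-layout message over a Unix socketpair. The struct msg layout and the message itself are purely illustrative, not taken from dbus or any real project.

      /* Two processes exchange one fixed-layout message over a Unix
       * socketpair – no daemon, no marshalling framework, just a data
       * structure both sides agree on. Illustrative sketch only. */
      #include <stdio.h>
      #include <sys/socket.h>
      #include <sys/types.h>
      #include <sys/wait.h>
      #include <unistd.h>

      struct msg {              /* the agreed-upon "protocol":  */
          int  type;            /* a message type...            */
          char payload[56];     /* ...and a small payload       */
      };

      int main(void)
      {
          int sv[2];
          if (socketpair(AF_UNIX, SOCK_SEQPACKET, 0, sv) == -1) {
              perror("socketpair");
              return 1;
          }

          pid_t pid = fork();
          if (pid < 0) {
              perror("fork");
              return 1;
          }
          if (pid == 0) {                          /* child: the "client" */
              close(sv[0]);
              struct msg m = { .type = 1 };
              snprintf(m.payload, sizeof m.payload, "hello from %d",
                       (int)getpid());
              if (write(sv[1], &m, sizeof m) != (ssize_t)sizeof m)
                  perror("write");                 /* one message, one write */
              close(sv[1]);
              return 0;
          }

          close(sv[1]);                            /* parent: the "server" */
          struct msg m;
          if (read(sv[0], &m, sizeof m) == (ssize_t)sizeof m)
              printf("type=%d payload=%s\n", m.type, m.payload);
          close(sv[0]);
          wait(NULL);
          return 0;
      }

      The point is not that this replaces dbus, only that for many producer/consumer pairs a fixed, well-designed message struct over a plain socket is all the “protocol” that is needed.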

      • Happy Valentine! ;)

        > This mindset is exactly what causes the speed of software to steadily go down despite hardware growing in power at an alarming rate.

        Indeed, despite decades of huge hardware advances, most software is slower to use than it ever was. Further, it’s buggier than ever, with bugs and a lack of intelligent self-diagnosis/-correction becoming an accepted norm. I don’t believe the user should ever have to wait for the computer, even for a small UI delay, except when waiting for a specific large task to finish, such as copying a large file. And even in that case the UI should remain ready for use while the task runs in the background. It’s a computer – why should I wait for it?? Exploring why orders of magnitude of hardware speed improvement have had the reverse effect on this problem is enlightening, but not very encouraging.

        Then there are all the cute little animations in many systems, which ultimately slow the UI down even more (personally, I avoid them). I partly believe humans are so offended by the intrinsic precision, speed and reliability of computers that they deliberately cripple them and turn them into bug-ridden, slow, unreliable tools. Either that, or someone doesn’t want the computer to evolve into the much more capable tool it can be.

        Linux bucks this trend in many CLI programs, or it did. But the norm for anything GUI-related seems to be ‘slow and crippled’. Nor are the developers the only cause, because users also seem to obsess over minor GUI issues and appearances – shallowness – rather than concerning themselves with functionality and performance. Many people want to be entertained by new and regularly changing bells and whistles rather than learning to use a more capable tool, and many developers respond by devoting unreasonable resources to frills and change for its own sake rather than to functionality.

        I’m reminded of the movie Idiocracy – the whole idea that things improve with time is highly questionable. They can improve, but only if that is the direction taken. They can also devolve. To me, Linux and software in general seems to be devolving, with some exceptions.

        Comment by IgnorantGuru | March 5, 2013

        • Well said. You’re right; the problem is more than just developer mindsets. If developers produce efficient, powerful programs, they still need users to help debug them, provide more varied use-cases, and develop a ‘mindshare’ of sorts where the project gains recognition. That doesn’t happen if users don’t value what the software brings to the table. This focus on shallowness and computers-as-toys instead of computers-as-powerful-tools may be another aspect that’s dragging down efficiency.

          Programming languages, like natural languages, also influence the way we think. If the developer culture behind a language is not focused on efficiency (alongside readability – the primary reason a new language is made), then how can we expect that language and its developers to contribute positively to efficiency? That said, I think even some modern languages are focused on performance. Rust, Haskell, and, to a lesser extent, Python all value efficiency and speed in addition to their more-or-less improved legibility.

          I’m not sure if compilers have much to do with it, since they’re being developed all the time. If there were regressions in that arena, wouldn’t we notice them? Some compilers can optimize certain codepaths better than a human… or so I’ve been told. At any rate, squeezing efficiency from today’s hardware is an interesting problem to me, and it will probably be a relevant problem as we reach the limits of Moore’s Law.

          Comment by sporkbox | March 17, 2013

