
My opinion on Linus and Luke's Linux challenge

On 10/2/2021 at 1:49 PM, Jimzamjimmyy said:

I tried to comment this on YouTube, but it keeps deleting my comments, so I have to post here:

 

As a (Fedora) Linux user, I would say pick RHEL and Fedora (or Manjaro or PopOS).

 

My personal choice is Fedora, but I see plenty of reasons to use PopOS or Manjaro. EDIT: after writing this "comment" I watched the rest of the WAN Show, and I trust Anthony to pick the best distros for them. 🙂

 

I really would like you to try Red Hat or SUSE technical support, since I haven't heard of anyone trying them. It would be extremely interesting to hear some comments. Also, an average user does not have Anthony at home, so this support-as-a-service would partially replace him. I picked RHEL because it is more widely known, but SUSE is a great distro too, with a support team available.

 

Fedora is a great choice because it has really fresh packages, so you can get the latest features of, for example, KVM there. There is a huge community behind it, similar to Ubuntu's, and you can find solutions to almost any problem on their forums. Another reason to use Fedora is its GNOME desktop: it is extremely close to stock GNOME (think of it like clean Android with no manufacturer customization, which is what you would find in PopOS and Ubuntu). Fedora is also the choice of the Linux creator, Linus Torvalds, as well as seemingly many LTT community Linux users. Fedora is one of the most polished and easy-to-use distros, and it still has plenty of user customizability (in my opinion). @Conan Kudo mentioned plenty of other reasons, some of which I haven't tried.

 

PopOS is a great option because it is Ubuntu-based and, to my knowledge, has quite new packages; not as fresh as Fedora's, but still. System76 also customizes GNOME quite heavily, which is useful, but I personally do not like it.

 

I do not have personal experience with Manjaro, but I expect it has some of Arch's greatness and also the risk of breaking things easily. It would certainly be the closest comparison to the Steam Deck.

 

I would not recommend Arch or Gentoo, because you have to figure so many things out before using them, and your experience would not be representative of what most new Linux users have. You can customize other distros extremely well too, and it will be more interesting to see you using a distro which an average ex-Windows user would choose. The ability to pick and build your own distro is great, but it isn't useful to viewers whose experience would be extremely different, and again, an average person doesn't have Anthony at home helping them fix their problems.

 

I would not pick Mint. Yes, it is a fine distro, but it is based on Ubuntu, so there is little difference from other Ubuntu-based distros, and Mint has the old-school GNOME experience. I do not see a reason to learn that, because GNOME 41 is awesome, though I am admittedly very subjective here. Use Mint if you have a specific reason to do so.

 

I would not pick Ubuntu either, because it is pushing Snap packages while Flatpak is used on almost every other distro. You can install Snaps on almost any distro, but the distro defaults do matter.

Sorry for the off-topic question, but from what I've read, Fedora might be a good starting point for a beginner?

Background: I've used Linux in the past, including Xubuntu, Mint XFCE, Ubuntu, and maybe something else, mainly for web browsing. Right now I am looking for a Linux distro because I am learning to program (Python), and I want something easy to use.


6 hours ago, Emzijs said:

Sorry for the off-topic question, but from what I've read, Fedora might be a good starting point for a beginner?

Background: I've used Linux in the past, including Xubuntu, Mint XFCE, Ubuntu, and maybe something else, mainly for web browsing. Right now I am looking for a Linux distro because I am learning to program (Python), and I want something easy to use.

It's not off topic, I don't think. The best way to learn Linux is honestly the way Linus and Luke are doing it.. Dive in head first and have no safety net.. Only way you'll learn to swim. And you will. If you have access to Windows or MacOS you may go there to solve your task instead of figuring out how to solve it on Linux, and it will rob you of the learning experience. You can do everything anyone else can do on Linux today.. You may just do it a different way.

Fedora is a fine distro on the surface. I personally tend to avoid it because I disagree with design choices Red Hat makes; at a sysadmin level they make it needlessly complex. At a user level though.. Fedora, Ubuntu, OpenSuSE, Debian (and the respin distros based on them: Pop OS, Mint, Manjaro, etc.) are all fine and popular, and ALL of them have problems you'll have to solve.. there is no escaping this. (Windows and MacOS have problems too.)

 

Once you get your feet under you and understand basic Linux concepts, if you'd really like to learn and get good.. it might be time to try a more difficult distro like Gentoo. If you can merely complete the Gentoo install walkthrough guide and get a running system, you will have a fairly wide range of understanding. My personal favorites are FreeBSD, Gentoo and Alpine due to their design choices and flexibility.. And I use Ubuntu also for those times when I just don't care to mess with it but want it to basically work and be mostly hassle-free. (Though I tend to use MacOS.. aka commercial Unix.. as my daily driver anymore.)

"Only proprietary software vendors want proprietary software." - Dexter's Law


2 hours ago, jde3 said:

It's not off topic, I don't think. The best way to learn Linux is honestly the way Linus and Luke are doing it.. Dive in head first and have no safety net.. Only way you'll learn to swim. And you will. If you have access to Windows or MacOS you may go there to solve your task instead of figuring out how to solve it on Linux, and it will rob you of the learning experience. You can do everything anyone else can do on Linux today.. You may just do it a different way.

Fedora is a fine distro on the surface. I personally tend to avoid it because I disagree with design choices Red Hat makes; at a sysadmin level they make it needlessly complex. At a user level though.. Fedora, Ubuntu, OpenSuSE, Debian (and the respin distros based on them: Pop OS, Mint, Manjaro, etc.) are all fine and popular, and ALL of them have problems you'll have to solve.. there is no escaping this. (Windows and MacOS have problems too.)

 

Once you get your feet under you and understand basic Linux concepts, if you'd really like to learn and get good.. it might be time to try a more difficult distro like Gentoo. If you can merely complete the Gentoo install walkthrough guide and get a running system, you will have a fairly wide range of understanding. My personal favorites are FreeBSD, Gentoo and Alpine due to their design choices and flexibility.. And I use Ubuntu also for those times when I just don't care to mess with it but want it to basically work and be mostly hassle-free. (Though I tend to use MacOS.. aka commercial Unix.. as my daily driver anymore.)

Alright, it seems that the only logical solution is to go the Ubuntu route and just modify it a little (like another user did) to make it look like a Mac, so it looks the way I like. Thank you for your answer! 🙂


45 minutes ago, Emzijs said:

Alright, it seems that the only logical solution is to go the Ubuntu route and just modify it a little (like another user did) to make it look like a Mac, so it looks the way I like. Thank you for your answer! 🙂

I'd actually just start with vanilla Gnome and try to figure it out.. it will start to make sense. Once you "get it" you can start adding extensions to customize its behavior. -- It's an oddity to some people why it doesn't have an app menu or a taskbar, but honestly the reason it doesn't is because it does not need them.

I find it to have a very good workflow and to be a very "focus on the task at hand" DE, but you can't fight it and try to make it something it's not, because it will suck if you do.

I've had to learn dozens of strange OS's through history.. (not just Windows, Mac and Linux.. try PrimeOS, OpenVMS or Novell NetWare.) One piece of advice I can give you is don't get upset when something is different than you expect. Sometimes, in order to make things better, you have to change its behavior. It's a very common reaction to get upset when a computer doesn't react the way you expected, but try to view it from the perspective of a user with no computer knowledge at all.

For example: there is a very good reason there is no "C:" drive.. and as you learn more you will start to think.. ok, yeah, a C: drive is actually some stupid DOS thing that never should have existed.. let alone still exist in 2021. Different isn't bad, but it might be frustrating when you go looking for the C: drive and find an (at the time) confusing structure where everything is mounted off root.
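If you're curious what that structure looks like on a live system, here's a minimal Python sketch (my own illustration, Linux-only since it reads /proc/mounts) that prints where every filesystem attaches to the single root tree:

    # Print every mounted filesystem and where it hangs off the root tree.
    # Linux-specific: the kernel exposes the mount table at /proc/mounts.
    with open("/proc/mounts") as mounts:
        for line in mounts:
            device, mountpoint, fstype = line.split()[:3]
            print(f"{mountpoint:<30} {fstype:<12} {device}")

Run it and you'll see /, /home, /boot and friends all hanging off one tree, regardless of which physical device backs them.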

In my time now.. thinking back on it.. I think that the hierarchy structure of Windows/DOS was always complete shit, and this is probably what led people to the behavior of searching for every file out of one large bin today. Windows today does everything it can to hide the fact that the garbage DOS hierarchy is there.. but it is there.. lurking.. waiting to unleash its suck and inefficiency on the world yet again.

"Only proprietary software vendors want proprietary software." - Dexter's Law


12 hours ago, Emzijs said:

Sorry for the off-topic question, but from what I've read, Fedora might be a good starting point for a beginner?

Background: I've used Linux in the past, including Xubuntu, Mint XFCE, Ubuntu, and maybe something else, mainly for web browsing. Right now I am looking for a Linux distro because I am learning to program (Python), and I want something easy to use.

Fedora is a perfectly suitable starting point. Moreover, Fedora is hugely into Python, and there's a dedicated Fedora Workstation variant preloaded with tools for helping people explore the Python programming language. Even without the variant, the standard Fedora Workstation is a great place to get started with a basic Linux experience and then work with Python.


2 hours ago, Conan Kudo said:

Fedora is a perfectly suitable starting point. Moreover, Fedora is hugely into Python, and there's a dedicated Fedora Workstation variant preloaded with tools for helping people explore the Python programming language. Even without the variant, the standard Fedora Workstation is a great place to get started with a basic Linux experience and then work with Python.

They do indeed use Python for everything at Red Hat. And that's not a positive thing. 

 

Python is one of the least energy-efficient programming languages, and it is what new programmers most often use, producing huge amounts of Python code. The internet alone is projected to consume more energy by 2030 than China or the US, which is absurd.

 

At the current rate of growth, AI will demand huge amounts of energy in a few years' time. So what I mean is that the current generation of brilliant programmers is really just killing the planet in record time.

 

Julia is often more productive than Python and a more versatile programming language. That's why I would recommend that young programmers ditch Python and use Julia instead for scripting, the web, data science, deep learning, and AI.

 

Here are some informative articles about Julia:

https://benchmarksgame-team.pages.debian.net/benchmarksgame/fastest/julia-python3.html

https://towardsdatascience.com/5-spectacular-features-from-julia-i-wish-were-in-python-b55d66d25d7b

https://github.com/ninjaaron/administrative-scripting-with-julia


9 minutes ago, Alexander Pushkin said:

They do indeed use Python for everything at Red Hat. And that's not a positive thing. 

Red Hat writes more C and Go than Python by far. Between the low level Linux ecosystem projects and big products like OpenShift, there's way more of that than Python.


23 minutes ago, Conan Kudo said:

Red Hat writes more C and Go than Python by far. Between the low level Linux ecosystem projects and big products like OpenShift, there's way more of that than Python.

Python is fine'ish.. It's kind of like programmer duct tape and can be good in places, but.. it probably is inefficient, and I have concerns with any scripted language in some places.

Also.. Gentoo probably uses more Python than anyone, because the entire Portage build system is written in Python, and it's massive and complex. You can't use Gentoo without it. But ANY Linux distro supports it just fine, so no problems. I don't see why it's even being discussed, really..

 

"Only proprietary software vendors want proprietary software." - Dexter's Law


15 hours ago, Conan Kudo said:

Red Hat writes more C and Go than Python by far. Between the low level Linux ecosystem projects and big products like OpenShift, there's way more of that than Python.

You'd be surprised how much Python is used in projects that are largely developed by Red Hat.

 

One of the nice stats that GitHub shows is what percentage of a project's total code is written in each programming language.

For example, take a look at how much Python is used in the following large projects:

 

https://github.com/ansible/ansible

Ansible is one of the rare projects that has over 5000 contributors on GitHub.

It is very energy inefficient due to the programming languages used.

 

https://github.com/virt-manager/virt-bootstrap

https://github.com/spacewalkproject/spacewalk

https://github.com/rpm-software-management/yum

https://github.com/cobbler/cobbler

https://github.com/koji-project/koji

https://github.com/Anaconda-Platform/anaconda-client

https://github.com/cloud-bulldozer/browbeat

https://github.com/ibus/ibus-tmpl

https://github.com/virt-manager/virt-manager

https://github.com/openstack

 

Deltacloud was initiated by Red Hat:

https://github.com/deltacloud

 

Ruby is one of the only programming languages that consumes as much energy as Python, and shell is also one of the inefficient languages.


15 hours ago, jde3 said:

Python is fine'ish.. It's kind of like programmer ducktape and can be good in places but.. it probably is inefficient and I have concerns with any scripted language in places.

Also.. Gentoo probably uses more python than anyone because the entire portage build system is written in python and it's massive and complex. You can't use Gentoo without it. But ANY Linux distro support it just fine so no problems. I don't see why it's even being discussed really..

 

Because overuse of Python in rapidly increasing AI adoption has the potential to kill the planet.

 

AI Can Do Great Things—if It Doesn't Burn the Planet https://www.wired.com/story/ai-great-things-burn-planet/

 

For example, using mainly Python for data science has a double effect:

 

1) The data center will draw much more power than if the data science were done in Julia instead of Python.
2) Because the data center draws much more power (due to Python's inefficiency), much more energy is needed to keep the data center cool (huge amounts of energy are used to cool data centers); the sketch below puts rough numbers on this compounding effect.
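A rough back-of-the-envelope sketch of that compounding effect (my own illustration: the 30x energy ratio echoes benchmark figures cited later in this thread, and the PUE value of 1.5 is an assumed data-center overhead, not a measurement):

    # Hypothetical numbers, for illustration only.
    it_energy_julia = 1.0   # energy units the job costs in Julia
    energy_ratio = 30.0     # assumed Python-to-Julia energy ratio
    pue = 1.5               # assumed Power Usage Effectiveness (total / IT energy)

    it_energy_python = it_energy_julia * energy_ratio
    total_julia = it_energy_julia * pue    # IT load plus cooling and other overhead
    total_python = it_energy_python * pue

    print(f"Julia:  {total_julia:.1f} units total")
    print(f"Python: {total_python:.1f} units total")
    # Cooling overhead scales with the IT load, so an inefficient language
    # is paid for twice: once in compute and again in cooling.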


40 minutes ago, Alexander Pushkin said:

Because overuse of Python in rapidly increasing AI adoption has the potential to kill the planet.

 

AI Can Do Great Things—if It Doesn't Burn the Planet https://www.wired.com/story/ai-great-things-burn-planet/

 

For example, using mainly Python for data science has a double effect:

 

1) The data center will draw much more power than if the data science were done in Julia instead of Python.
2) Because the data center draws much more power (due to Python's inefficiency), much more energy is needed to keep the data center cool (huge amounts of energy are used to cool data centers).

This is very flawed reasoning, because Python in data science is done with libraries that pull all the real computation logic out of Python and into C++ (with support for OpenCL or CUDA when working with GPU-accelerated compute).
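For a concrete sense of what "pulling the computation out of Python" means, here's a minimal sketch (assuming NumPy is installed; timings vary by machine) comparing a pure-Python loop with the equivalent vectorized call, which runs in compiled code:

    import time
    import numpy as np

    data = list(range(1_000_000))
    arr = np.array(data, dtype=np.int64)

    t0 = time.perf_counter()
    total_py = sum(x * x for x in data)  # every iteration runs in the interpreter
    t1 = time.perf_counter()
    total_np = int(np.dot(arr, arr))     # one call; the loop runs in compiled code
    t2 = time.perf_counter()

    assert total_py == total_np
    print(f"pure Python: {t1 - t0:.3f}s   NumPy: {t2 - t1:.3f}s")

The work is identical; only the first version pays interpreter overhead on every element.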

 

Moreover, the compilation processes of languages like Rust, Go, Julia, etc. could be argued to be worse than Python's, because Python's AOT and JIT processes can result in more optimal high-level code paths while also providing better programmer efficiency, which reduces the overall carbon footprint. I agree that Julia is pretty good for data science stuff (that's what it was made for, after all), but energy and computational efficiency is very complex.

 

None of this stuff is straightforward at all, and it's not worth simplifying it down to "Python burns the planet".


2 hours ago, Conan Kudo said:

This is very flawed reasoning, because Python in data science is done with libraries that pull all the real computation logic out of Python and into C++ (with support for OpenCL or CUDA when working with GPU-accelerated compute).

Julia can also use the most efficient Python libraries (which are written in C, for example), and you can easily call and integrate C code and C libraries in Julia. So that doesn't technically make any difference between the two languages. What does make the difference are the millions of lines of Python code that are starting to pile up in more and more apps; all of that code would run much more efficiently if it were written in Julia.
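The mechanism being described is ordinary C interop, which most high-level languages offer in some form. For comparison, this is what calling straight into a C library looks like from Python using the standard-library ctypes module (a minimal sketch; it assumes a Unix-like system where find_library can locate libc):

    import ctypes
    import ctypes.util

    # Load the C standard library already present on the system.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # Declare the C signature: size_t strlen(const char *s)
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t

    print(libc.strlen(b"hello from C"))  # prints 12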

Quote

Moreover, the compilation processes of languages like Rust, Go, Julia, etc. could be argued to be worse than Python's, because Python's AOT and JIT processes can result in more optimal high-level code paths while also providing better programmer efficiency, which reduces the overall carbon footprint.

A Julia program that you run for the first time can indeed take more time than a Python program because of the JIT compilation. But on all subsequent runs, the Python program will be (much) slower. So no, Python does not lower the overall carbon footprint, because Python is only 'possibly more efficient' for scripts that run once. And those kinds of simple scripts usually use very little energy, so the carbon footprint Python saves there is completely insignificant.

Quote

I agree that Julia is pretty good for data science stuff (that's what it was made for, after all), but energy and computational efficiency is very complex.

It is clear from this statement that you have no idea why Julia was made. It is explained in the following article:

Entrepreneur Viral B Shah talks about starting up in the open-source computing industry https://yourstory.com/2021/09/entrepreneur-viral-b-shah-starting-up-julia-computing/amp

 

“We were all using Python, C, and Java to execute our ideas or run simulations, but the library dependency of these languages made them terrible. The world back then had accepted that you could not have high-level language and easy-for-production language together. With Julia, we wanted to solve this problem and we created something that was way faster than any other language,” explains Viral.

 

Julia is not specifically more suitable for data science than for, e.g., deep learning, administrative scripts, AI, or desktop apps. It is based on Lisp, the most powerful programming language available, so you can use Julia for any purpose. The reason Julia is currently used more in data science and less for, e.g., system administration applications like Ansible is purely coincidental: Julia's community isn't that big, so they can't target all domains and develop the best libraries for each of them. But the language itself is suitable for most purposes in any domain, because it is a very versatile programming language.


The statement that Python uses a lot more energy seems unsubstantiated to me. It is very difficult to determine this in real applications. Let's assume that the same process in Julia does indeed take a lot fewer CPU cycles than when performed in Python (I'll take your word for that); that still does not automatically mean a corresponding amount of energy usage. Obviously there is a connection, but there are a lot more factors at play.
For data science I assume the link is stronger. In data science it is a lot more common to max out CPUs than in many other applications, especially compared with Python's use as the scripting language of choice for systems, where it is basically a bit of glue.
If Julia needs its better energy efficiency to convince organisations or Linux distributions to switch, it really needs to make a dent in real life, and I have not seen that. Switching has a lot of consequences, where not only language features are important, but also things like available libraries, experience among your staff, available frameworks, and installed base.
Companies such as Instagram and Netflix, which rely heavily on Python, have the scale that if it really mattered, it would make sense for them to switch. Reducing the power bill would make such a dent for them that the cost of refactoring their software in a different language would be sensible. And they have the talented staff to pull that off. I have not seen them do that.

 

Don't get me wrong, Julia seems interesting to me. One day it may overtake Python, if it is really faster in real life and as accessible a language as Python, although currently my money is on Go. But IMHO the whole ecosystem needs a lot more, especially outside of the data science/AI world, before it will get adoption rates close to what Python now has.


On 10/17/2021 at 2:14 PM, OldTweaker said:

The statement that Python uses a lot more energy seems unsubstantiated to me. It is very difficult to determine this in real applications.

A study was conducted in this regard in 2017: https://greenlab.di.uminho.pt/wp-content/uploads/2017/10/sleFinal.pdf

Ruby, Python, and Perl used the most energy of all programming languages. Only Perl scored worse than Python (and it was close).

Quote

If Julia needs its better energy efficiency to convince organisations or Linux distributions to switch, it really needs to make a dent in real life, and I have not seen that. Switching has a lot of consequences, where not only language features are important, but also things like available libraries, experience among your staff, available frameworks, and installed base.

Today's universities mainly teach students Python for AI and data science, although it is the most problematic programming language of all. Google doesn't want to switch to Julia, because Python is a programming language they developed; Guido van Rossum was a Google employee. Apple isn't going to switch to Julia either, because they have Swift, which they want to make the norm, though Swift is a total flop compared to Julia. I personally think it is the job of universities to teach students to solve problems instead of creating problems.

 

Incidentally, energy consumption is not the only advantage of Julia. Python is often used to build a prototype of an idea; if the prototype turns out to be successful, it is rewritten in, e.g., C or C++ for production. Think of software for 3D printers, for example. With Julia you only have to write the code once, so it's a faster production process, which makes a world of difference for startups.

Quote

Reducing the power bill for them really makes such a dent that the cost of refactoring their software in a different language is sensible. And they have the talented staff to pull that of. I have not seen them do that.

You forget that primates often show copy-paste behavior. In the past, this was necessary for survival: if you used a different tactic or behaved differently from the rest of the group during the hunt, it could endanger the entire group. That is why people with deviant behavior are instinctively rejected. But that primitive copy-paste behavior is likely to be man's downfall.

 

The light computer was apparently always the better idea, but the computer industry has kept heading down a dead-end street: https://www.heise.de/hintergrund/Der-Lichtcomputer-Wie-sich-mit-Licht-schnell-rechnen-laesst-6165686.html Here you see an example of how copy-paste behavior causes people to end up at a dead end.

 

There is a very reliable study that says that capitalism is going to destroy the world around 2050. That trend has been going on for a while, and we are indeed getting close to the end. The study mentions that electric cars solve little or nothing, and that people will have to cycle more and fly less. Too much meat is eaten as well. Then there's the fact that the development of process nodes is slowing down: we're going to see 3 nm soon, and then maybe 2 nm or 1 nm, but it seems that after that there are few possibilities to shrink the process even further.

 

By 2050, we could retain high levels of GDP, at the price of a world wracked by mineral and material shortages, catastrophic climate change, and a stuttering clean energy transition, paving the way for a slowly crumbling civilization.

 

The economy will hit the limits of mineral and material production needed to sustain this electric transition in just three decades, and this is even with high levels of minerals recycling: mineral depletion takes place even with "a very high increase in recycling rates" in a continuing GDP-growth scenario. The conclusion corroborates the findings of other studies, which estimate an expected bottleneck for lithium by 2042-2045 and for manganese by 2038-2050. Actual bottlenecks could come even earlier, because existing studies, including the MEDEAS model, don't account for the material requirements of internal wiring, the EV motor, EV chargers, building and maintaining the grid to connect and charge EV batteries, and the catenaries to electrify the railways, as well as inherent difficulties in recycling metals.

 

Humanity is in very deep trouble, because if ICT continues to grow as it has in recent years, it's going to have a huge ecological footprint. Then the world will certainly be largely destroyed by 2050. That is the principle of ecological footprint: ecological overshoot means exceeding the earth's regenerative capacity, and it occurs when humanity's demand on nature exceeds what the biosphere can supply. Humankind's demands have exceeded the world's biocapacity since 1980. Population growth, economic growth, and world trade are literally consuming the living fabric of the earth.

 

The Breakthrough National Centre For Climate Restoration, a think tank in Melbourne, Australia, released a report suggesting that there was a "high likelihood of human civilisation coming to an end" within a 30-year timeframe.


First off, I commend you for caring about the planet. And I'm fully aware that mankind, or our current society, can be stupid enough to be on a self-destructive path. I'm also not naive about the fact that companies don't care enough about the environment. However, companies of the scale of Netflix and Instagram do optimize heavily, because it makes sense to them money-wise. Given the amount of power they consume, they can afford the engineering effort to change languages if that makes for a significant improvement. Reducing power is not just about the power bill: data centers care about density. If a language is significantly more efficient in real-life settings, it also means they can do a lot more with fewer systems.
The paper you linked to does not at all show such a significant difference. First off, it doesn't even mention Julia. It also concludes that being faster doesn't necessarily mean less energy consumption, although given queueing theory, which professional performance engineers use, you would indeed expect that. It is also a fully synthetic test that does its best to emulate real-world loads, but remains a theoretical test. Perfect for some comparisons, but not enough to draw conclusions about real-world scenarios. Reality can be a bitch..
Furthermore, as others have pointed out, Python in the real world heavily relies on libraries written in the, indeed, more efficient languages for the heavy lifting, whereas this paper uses pure Python code. So it really says very little about apps written in Python that use those libraries.

 

Then you make some more statements that need some correcting:

  1. Python was not developed by Google. Van Rossum joined their ranks when Google was already using Python, and Python existed before Google did. Also, Google actually does switch; they are, however, moving towards Go, which is developed by them.
  2. Apple and Swift: your statement that Swift is a total flop is not backed up. I couldn't care less about Swift, but to make such a bold statement you need to back it up. I deal a lot with programmers in different languages. They often think the other language is crap, does not scale, is a memory hog, and many more myths. The really good programmers don't do that; they know each language has its strengths and weaknesses.
  3. You claim that Python is the most problematic language of them all. Yes, of course, all those people are wrong.
  4. Python just for a prototype, then rewritten in C or C++? Quick, go to Instagram and tell them they are doing it wrong... There will definitely be cases where prototypes written in Python are rewritten, whether or not for legitimate reasons, but C or C++ is not often the language of choice; usually it is Java. C/C++ is these days usually reserved for when you deal with hardware or embedded, so 3D printers would indeed be a good example, although Rust is quickly becoming the language of choice for those use cases.

I don't mind your enthusiasm for Julia. It has piqued my interest, but your anti-Python (and anti-Swift) statements are unjustified.

 

However, I think this thread is deviating a bit too much, and I doubt everyone here is interested in discussions on programming languages, so I'll leave it at that.


22 hours ago, OldTweaker said:

The paper you linked to does not at all show such a significant difference. First off, it doesn't even mention Julia. It also concludes that being faster doesn't necessarily mean less energy consumption, although given queueing theory, which professional performance engineers use, you would indeed expect that. It is also a fully synthetic test that does its best to emulate real-world loads, but remains a theoretical test. Perfect for some comparisons, but not enough to draw conclusions about real-world scenarios. Reality can be a bitch..

Even though Julia has to do more work in the benchmark than the other compiled languages (its timings include compilation), its energy results are fairly close to those of Java. So if you remove compilation, it probably ends up closer to the Rust/C/Fortran category.

In fannkuch-redux, Python uses 39x more energy than Julia.

In fasta, Julia also uses 30 times less energy than Python.

That's not much? Are you serious?
Did you know that data centers use much more energy because of Python? The higher energy consumption of the programming language leads to much higher energy consumption for cooling the data center. Moreover, more cooling causes heavier thermal pollution of the rivers.

Python makes the rivers much more polluted by data centers.

 

Quote
  1. Python was not developed by Google. Van Rossum joined their ranks when Google was already using Python, and Python existed before Google did. Also, Google actually does switch; they are, however, moving towards Go, which is developed by them.

Google hired Guido van Rossum, the language's creator, as a full-time engineer to ensure that he could focus on Python's long-term goals and its future development. And if an employee of your company develops a programming language at your company, on your payroll, you can still say as a company that you have partially developed that programming language.

Quote
  1. You claim that Python is the most problematic language of them all. Yes, of course, all those people are wrong.

Python is a Bad Programming Language https://medium.com/nerd-for-tech/python-is-a-bad-programming-language-2ab73b0bda5

I’ll say it again: Python is a bad programming language, and the only reason it’s so popular today is because Google pushed it so hard in the first decade of the 2000s.

 

 

Quote
  1. Apple and Swift: your statement that Swift is a total flop is not backed up. I couldn't care less about Swift, but to make such a bold statement you need to back it up. I deal a lot with programmers in different languages. They often think the other language is crap, does not scale, is a memory hog, and many more myths. The really good programmers don't do that; they know each language has its strengths and weaknesses.

Starting to dig into Swift... it sucks. https://forums.macrumors.com/threads/starting-to-dig-into-swift-it-sucks.1934555/

Swift SUCKS https://devrant.com/rants/1044530/swift-sucks-why-because-of-its-absolutely-useless-complexity-a-total-simple-thin

Four Years On, Developers Ponder The Real Purpose of Apple's Swift Programming Language https://apple.slashdot.org/story/18/06/11/1438259/four-years-on-developers-ponder-the-real-purpose-of-apples-swift-programming-language

Swift is a Bit of a Mess https://kakubei.github.io/2014/11/15/swift-is-a-mess/

iOS Development: Why XCode Sucks [Swift 2021] https://www.youtube.com/watch?v=kW7KkoR1ZcU

 

Quote
  1. Python just for a prototype, then rewritten in C or C++? Quick, go to Instagram and tell them they are doing it wrong... There will definitely be cases where prototypes written in Python are rewritten, whether or not for legitimate reasons, but C or C++ is not often the language of choice; usually it is Java. C/C++ is these days usually reserved for when you deal with hardware or embedded, so 3D printers would indeed be a good example, although Rust is quickly becoming the language of choice for those use cases.

You don't seem to deny what I've written. For certain software and drivers, the production version is often written in C++ while the prototype is developed in another programming language. With Julia you have less work finding bugs in C++, and you can use the same language for the prototype, which is very time-saving and can make the difference between success and failure in a startup.


20 hours ago, Alexander Pushkin said:

Even though Julia has to do more work in the benchmark than the other compiled languages (its timings include compilation), its energy results are fairly close to those of Java. So if you remove compilation, it probably ends up closer to the Rust/C/Fortran category.

In fannkuch-redux, Python uses 39x more energy than Julia.

In fasta, Julia also uses 30 times less energy than Python.

That's not much? Are you serious?
Did you know that data centers use much more energy because of Python? The higher energy consumption of the programming language leads to much higher energy consumption for cooling the data center. Moreover, more cooling causes heavier thermal pollution of the rivers.

Python makes the rivers much more polluted by data centers.

 

Google hired Guido van Rossum, the language's creator, as a full-time engineer to ensure that he could focus on Python's long-term goals and its future development. And if an employee of your company develops a programming language at your company, on your payroll, you can still say as a company that you have partially developed that programming language.

Python is a Bad Programming Language https://medium.com/nerd-for-tech/python-is-a-bad-programming-language-2ab73b0bda5

I’ll say it again: Python is a bad programming language, and the only reason it’s so popular today is because Google pushed it so hard in the first decade of the 2000s.

 

 

Starting to dig into Swift... it sucks. https://forums.macrumors.com/threads/starting-to-dig-into-swift-it-sucks.1934555/

Swift SUCKS https://devrant.com/rants/1044530/swift-sucks-why-because-of-its-absolutely-useless-complexity-a-total-simple-thin

Four Years On, Developers Ponder The Real Purpose of Apple's Swift Programming Language https://apple.slashdot.org/story/18/06/11/1438259/four-years-on-developers-ponder-the-real-purpose-of-apples-swift-programming-language

Swift is a Bit of a Mess https://kakubei.github.io/2014/11/15/swift-is-a-mess/

iOS Development: Why XCode Sucks [Swift 2021] https://www.youtube.com/watch?v=kW7KkoR1ZcU

 

You don't seem to deny what I've written. For certain software and drivers, the production version is often written in C++ while the prototype is developed in another programming language. With Julia you have less work finding bugs in C++, and you can use the same language for the prototype, which is very time-saving and can make the difference between success and failure in a startup.

Python is a GREAT programming language for one simple reason: it saves developers' time. Even a non-computer-science graduate can develop in a day something that an experienced, educated C developer would need a week for. C code will obviously run faster, but machine time is now cheap and developers' time is expensive. And that's the reason why it is popular, not because Google or whoever else "pushed" it.
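As a small illustration of that time saving (my own example, not from the post): counting the ten most common words in a file is a handful of lines of standard-library Python, where a C version would need hand-written file handling, tokenizing, and a hash table:

    from collections import Counter
    import sys

    # Usage: python wordcount.py somefile.txt
    with open(sys.argv[1]) as f:
        words = f.read().lower().split()

    for word, count in Counter(words).most_common(10):
        print(f"{count:6}  {word}")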

 

As for environmental concerns - well, once electricity per kWh becomes so expensive that hiring a good C developer becomes profitable, companies will switch. But you need a push from the governments for that.


On 10/15/2021 at 8:30 AM, jde3 said:

The best way to learn Linux is honestly the way Linus and Luke are doing it.. Dive in head first and have no safety net

 

On 10/15/2021 at 8:30 AM, jde3 said:

Once you get your feet under you and understand basic Linux concepts, if you'd really like to learn and get good.. it might be time to try a more difficult distro like Gentoo. If you can merely complete the Gentoo install walkthrough guide and get a running system, you will have a fairly wide range of understanding.


tl;dr: this challenge isn't really about learning Linux so much as testing its readiness for naive but Windows-experienced and hardware-knowledgeable power users; but an actual ‘learn Linux challenge’ based on ‘install Gentoo lul’ can be genuinely instructive without being gratuitously difficult if you control for hardware compatibility.

For people who are already technical or power users like Linus and Luke, setting up an old-fashioned Linux setup which has no defaults (LFS, Gentoo, Arch Linux, in descending order of usefulness and difficulty) is probably the best approach to quickly learning everything you need to feel totally at home on any Linux distro you might want to use in the future.

If you control the complexity of the process by limiting the complexity of the hardware setup rather than trying to do so by choosing an ‘easier’ distro, you avoid wasting time stumbling over obscure difficulties but you get walked through the fundamentals in a straightforward way, instead of skipping over them in favor of the search for an install wizard that gives you a magical outcome (a search which teaches you nothing and does nothing to empower you).

For LMG, that approach is especially easy, since they can set up VMs with known configurations on a host OS of their choosing, and they have the budget and connections for easy access to known-good hardware (e.g., a 3-year-old System76 desktop). Learning how to read and follow the docs required for basic setup and troubleshooting on systems where you know there are no hardware bugs is not difficult, meme status of ‘hard’ distros aside. The point of that is not to struggle to overcome difficult quirks or learn to appreciate all of the integration work that distribution developers do, but to teach you the anatomy of a working Linux system and a handful of basic skills for inspecting and repairing them. It's not about being ‘hardcore’, it's not about doing everything yourself, it's not about avoiding asking for help, it's not about optimizing everything, and it's not about using something that's ‘better’ than other distros. The idea is to work through something that is manual but predictable and well-documented.

Then you move on to a setup that inverts what is difficult and what is easy about that introductory one: you take a batteries-included desktop distro of your choice and install it on your pre-existing hardware, on that secondary SSD on your old Windows box or whatever. Instead of going through manual steps that you know will work for every piece of hardware, you let the distro get you 95% of the way there for unknown hardware ‘automagically’. Now you're only looking at manual intervention for setting up 5% of your system instead of 100% of it, but the 5% you're looking at will likely not be covered in some single, pretty, unified manual somewhere.

At that point you're free to choose any Linux distribution and make it work, instead of playing distro roulette until all your shit happens to work for a few months and praying that nothing changes.

That would be a totally different series than what I think LMG is going for right now, though. It's also probably too much a young man's game to be enjoyable for Linus (it was fine for me as a teenager, but I might be too old and fussy to really relish exploring and struggling in that way now). But it could work as a format for a future challenge, like ‘Can Luke become a DESKTOP LINUX SUPER EXPERT in JUST TWO MONTHS??’, where Luke walks through the process outlined above before he and Anthony are each handed an identical PC and told they have to get everything working with two distros unknown to both of them.

Anyway, it'll be interesting to see what approaches Linus and Luke model for their viewers, and where they fall on the spectrum from ‘any fix I use without fully understanding it is no fix at all’ to ‘I will perform any sequence of rituals to make this work as long as I don't have to actually read a manual’, and whether any such differences in approach ‘pay off’ for either of them in the challenge.

Of course it's fine to say ‘I already have enough hobbies; I might use Linux but mastering it won't become a new hobby’, and I get that ultimately LTT is about guiding consumer tech purchases and related decisions. But YouTube and the web are full of content erroneously teaching people that choice of distro deeply matters, that ‘idk the actual issues you're experiencing, but using $DISTRO_X gave me good vibes and mostly worked for me once’ is good or even meaningful advice, that some distros are ‘only for beginners’, that it takes some kind of supergenius to ‘install Arch’, that choice of distro is some kind of status symbol, that people who've never written a startup script or unit file, let alone a line of code, have interesting or useful opinions about the inherent evil or virtue of systemd, etc. So it would be pretty cool for a channel with as much reach as LTT to show that actually, if you print out an installation guide on paper and throw your smartphone in a drawer while you read that manual from front to back (just once!) before you attempt an install, you can find your footing on an ‘elite’ distro in a weekend, and over the course of a few weeks or months, develop the skills required to mostly not give a shit about what distro you're on for the rest of your life.

So here's hoping the current project is fun for Luke and Linus, and generates enough content and attracts enough viewers for LMG to think that one or more follow-up challenges are worth doing!


23 hours ago, Alexeygridnev1993 said:

Python is a GREAT programming language for one simple reason: it saves developers' time. Even a non-computer-science graduate can develop in a day something that an experienced, educated C developer would need a week for. C code will obviously run faster, but machine time is now cheap and developers' time is expensive. And that's the reason why it is popular, not because Google or whoever else "pushed" it.

Your ratio is not quite correct. Programming something in C is usually going to take more time, but not seven times longer than in Python: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.1831&rep=rep1&type=pdf

 

The thing is, it is now 2021. We have many high-level languages that are as simple as Python, and some are much faster. Lots of people say they find Go, Fortran, Visual Basic .NET, Visual Basic 6, C#, Elixir, Ruby, Smalltalk, Groovy, Lua, or Julia as simple as or simpler than Python.

 

Let's make an objective comparison of whether Python or Lisp is the more productive programming language.

Let’s say we have a simple roulette program. In Python, it would be: 

[image: the Python version of the roulette program; not preserved]

 

Whereas the Lisp (Scheme) version would be:

[image: the Scheme version of the roulette program; not preserved]

Given this simple program as a representative sample, we see that the Scheme version has only two more parentheses than the Python version, while Python has the commas, the colons, and the significant indentation.

From the point of view of reading and writing, a large program written in Scheme is made of so few syntactical structures that, so long as the variable names are clear and intuitive enough, the whole program's logic stands out on its own, whereas in Python it will be relatively drenched in syntax.
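Since the two embedded programs did not survive, here is a hypothetical Python snippet of the sort being compared (my reconstruction, not the original; a Scheme version would express the same logic with define and cond in place of the colons and indentation):

    import random

    def spin():
        # A European roulette wheel: zero plus the numbers 1 through 36.
        return random.randint(0, 36)

    def play(bet):
        # A straight-up bet pays 35 to 1; otherwise the stake is lost.
        result = spin()
        if result == bet:
            return 35
        return -1

    print(play(17))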


On 10/11/2021 at 4:23 PM, Alexander Pushkin said:

Jails (much more secure than the popular Docker containers)

Can you ELI5 why FreeBSD jails are supposed to be more secure than Docker?


3 hours ago, Patrick C. said:

Can you ELI5 why FreeBSD jails are supposed to be more secure than Docker?

The whole point of Docker is shared resources, and bundling application dependencies and the application itself together. Docker is an alternative to just running the binary plus its dependencies. Docker does have a lot of security features and makes good use of Linux kernel security features like namespaces and cgroups.
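As a small illustration of those kernel features being exercised, here is a sketch using the Docker SDK for Python (it assumes Docker and the docker package are installed; the image and command are arbitrary examples). You can ask Docker to drop every Linux capability and mount the container's root filesystem read-only, both of which are enforced by the kernel rather than by Docker itself:

    import docker

    # Talk to the local Docker daemon.
    client = docker.from_env()

    # Run a throwaway container with all capabilities dropped and a
    # read-only root filesystem; the kernel enforces both restrictions.
    output = client.containers.run(
        "alpine",
        "echo hello from a locked-down container",
        cap_drop=["ALL"],
        read_only=True,
        remove=True,
    )
    print(output.decode().strip())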

 

But FreeBSD jails are a virtualization environment, and that's just a completely different security architecture. They do share the same kernel, like Docker containers, but FreeBSD jails don't share any other libraries or dependencies between them and are designed to be totally isolated from one another.

 

Running an application in a virtualized environment is always going to be more secure than some alternative way of running it, but that's not always the most important problem you're trying to solve. Because if it were, you wouldn't use FreeBSD jails either, as the shared kernel increases the attack surface versus traditional full-fat virtual machines.


On 10/19/2021 at 9:35 PM, Alexander Pushkin said:

Starting to dig into Swift... it sucks. https://forums.macrumors.com/threads/starting-to-dig-into-swift-it-sucks.1934555/

Swift SUCKS https://devrant.com/rants/1044530/swift-sucks-why-because-of-its-absolutely-useless-complexity-a-total-simple-thin

Four Years On, Developers Ponder The Real Purpose of Apple's Swift Programming Language https://apple.slashdot.org/story/18/06/11/1438259/four-years-on-developers-ponder-the-real-purpose-of-apples-swift-programming-language

Swift is a Bit of a Mess https://kakubei.github.io/2014/11/15/swift-is-a-mess/

iOS Development: Why XCode Sucks [Swift 2021] https://www.youtube.com/watch?v=kW7KkoR1ZcU

😂 


I'm keen to hear any news about their Linux transition, and I often perk up whenever it's mentioned on the WAN Show. Having personally transitioned to a primary Linux machine for web, general use, and light Steam streaming, I found it quite difficult to get my hardware to a state where power management was stable and the AMD GPU worked properly. @LinusTech's comments about having to spend time getting the hardware to work properly really brought gravitas to the argument, and named an expected pain point of the transition, as this was my own personal concern and a source of my frustration at the time.

 

The transition originally came about because I wanted to set up an ITX home console to sit in the living room, but those plans fell through for lack of space to store the machine. I wanted to cheap out on getting an OS to run, and wanted to give Pop!_OS a run for its money. It worked hardware-wise, with the exception of my R9 390X, which often ran hot from machine boot. It's an expected issue with these Hawaii-based GPUs. I knew about the issue, and knew of CoreCtrl as monitoring and fan-control software, a Linux drop-in substitute for the Windows AMD Radeon control panel. Getting it to work required changing some Linux boot parameters, and thankfully there was some documentation on activating the flags. Something that tinkerers know about and are comfortable adjusting.

 

Sadly, as stated, the machine never got a proper run for its money, and sat idle for about 8 months before being pulled out again as a replacement for my day-to-day machine, my main rig, which is soon to become my gaming-only PC once I decide on the space. That machine is Windows-based, and I hope to dive into VR, for which Windows is currently the best OS. Whilst the machine is capable, I'm personally frustrated by onboard audio driver issues (the motherboard hasn't received updates since 2018), which don't play nice with Windows 1903 and beyond: stutters and hitches. It even caused A/V desync in my web browsers. Add the paranoia of Windows locking down the OS and running telemetry on everything, and I'd just had enough.

 

I am running Manjaro on my ITX build, as it fits my needs by supporting a community-maintained application collection, with a back catalogue of Arch repository apps too. Snap- and Flatpak-packaged apps are also supported, but I don't want to dive into those at present.

After the install was completed, with some extra applications installed to get my GPU fans under control and to support DPI adjustments for my mouse (using Piper), I ran into issues with the machine going into sleep or hibernation. In either case, resuming would break the desktop environment: the window server would crash and fail to launch, or, if it managed to restore properly, I would not be able to shut down the computer without doing a hard power-off with the power button. Not good.

 

After seeking some support from the Manjaro forums, it came down to two issues:

  • Hibernation failed because of a misconfigured swap partition size; I had a non-recommended setup, since I wanted to use LVM rather than normal partitions for some hard-drive flexibility. Coming from a programming background, it was a risk I could take, but to solve it I had to reinstall. This is something most people SHOULD NOT attempt, myself included, as I didn't know what I was doing.
  • The sleep/suspend issue was due to an older BIOS image flashed to the motherboard. I had not updated it since I purchased the board, and it's not something I normally think to attempt. Thankfully, updating it resolved the issue.

There was also a final side issue with the GPU. Whilst I had the fan control working, I was unable to launch games, particularly a Unity game. Having an R9 390X, it was possible that I was not using the newer AMDGPU driver bundled with the Linux kernel and was somehow still using the older Radeon driver. A couple of boot configuration changes later, the games launched without an issue, and I had some extra monitoring metrics to boot.
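For the curious: on Hawaii-era cards like the R9 390X, the commonly documented approach (on the Arch wiki, for example) is a pair of kernel parameters that stop the legacy radeon module from claiming the card and let amdgpu take over. I can't say these were the exact flags used here, and details vary by kernel version, but the usual form appended to the kernel command line is:

    radeon.cik_support=0 amdgpu.cik_support=1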

 

Circling back to @LinusTech's point about Linux's struggle to go mainstream: it is issues like these that I experienced, even before considering his edge-case machine, which we are yet to learn about. The high majority of 'end users' are not coders or tinkerers. They have no confidence to debug a Linux machine. I think (and this extends to myself) we often expect the system to be smart enough to identify our hardware and be properly configured to work out of the box. I honestly feel that having something that let me explore and modify my boot parameters to get hardware working would get me up to speed quickly, but right now it comes with risks, because if it breaks, I have to "hack" my way back in.

 

So I am interested to hear Linus's and Luke's comments on their journey.

