Essentially, @SWA is right; however, I would like to clarify a couple of things here:
- Energy efficiency, and other efficiencies, are not features of a language; they are features of a language implementation. A programming language is just a bunch of rules for generating valid strings (syntax) and assigning them meaning (type systems and inference rules). The physical world only becomes relevant when we start talking about how the language is actually implemented, i.e., how well the compilers and interpreters are written. There is nothing about Python (especially with type hints) that prevents it from being compiled, and there is nothing about C that prevents it from being interpreted in an even less efficient manner than Python. There are many Python compilers out there (Cython, Numba, PyPy's JIT), and while they aren't as efficient as Rust or C, they significantly close the gap.
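You can see the "compiled vs. interpreted is an implementation detail" point without installing anything: even plain CPython compiles your source to bytecode before running it. A minimal sketch using only the standard library `dis` module (the exact opcode names vary between CPython versions):

```python
import dis

# Even "interpreted" CPython compiles source to bytecode first; whether
# that code is then interpreted (CPython), JIT-compiled (PyPy), or
# compiled ahead of time (Cython/Numba for similar code) is a choice
# made by the implementation, not a property of the Python language.
def add(a: int, b: int) -> int:
    return a + b

ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)
```

The type hints don't block any of this; they just sit in `add.__annotations__`, where an ahead-of-time compiler is free to use them.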
- Most data science libraries are just wrappers around C. NumPy's core is C, pandas is built on NumPy, and scikit-learn, PyTorch, and TensorFlow (along with their dependencies) are all written on top of C/C++. There are other libraries built on top of Julia and Rust. When you are executing Python code in data science, you are almost always executing C code. There is going to be overhead from using Python as an interface, but this is nowhere even in the same universe as using pure Python. If energy efficiency is a concern, then it just makes sense to write the core, compute-intensive routines in a well-implemented, compiled language and then write an interface in some higher-level interpreted language like Python.
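That wrapper pattern is easy to demonstrate with nothing but the standard library. A minimal sketch using `ctypes`, assuming a Unix-like system where `find_library` can locate the C math library (`libm`):

```python
import ctypes
import ctypes.util

# Locate and load the C math library (assumes a Unix-like system;
# ctypes.util.find_library may return None on other platforms).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double sqrt(double)
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

# The Python call is a thin shim; the actual computation runs as
# compiled C. NumPy and friends do the same thing at a larger scale.
print(libm.sqrt(2.0))
```

The per-call marshalling is the "Python as an interface" overhead mentioned above; libraries like NumPy amortize it by pushing whole arrays across the boundary in one call instead of one number at a time.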
So the debate over the “efficiency of languages” is almost always a waste of time and often nonsensical. Very, very, very few large systems are going to be written exclusively in one language. Even if a developer writes an entire application for a business in Python, it will almost certainly have non-Python dependencies lurking somewhere.
Furthermore, we are leaving out the biggest issue with using languages like Rust, C, C++, etc.: how easy it is to fuck up and create inefficiencies, leaks, and so forth. C would be unsafe even for God himself to program in, but no one is more dangerous than an enraged programmer who has to write less intuitive, low-level code that he has to build over and over and over again. These benchmark comparisons between well-implemented algorithms don’t even reflect reality. Who’s to say that a programmer with 3 years’ experience in Rust is going to write more efficient code than someone with 20 years in Python? What about the pressures of bringing products to market and never fixing shitty codebases because there are always other things to do? No one is going to redesign everything, or even a significant chunk of things, in Rust, so Rust programs are going to have to interface with other codebases, meaning that in many use cases Rust may never touch the actual bottleneck, even when we expect it to.
You know what Python has that most other languages don’t? Really, really fast prototyping. Not having a compiler does increase debugging time (the compiler’s bitching gives you extra hints), but I can try out ideas extremely fast in Python. That’s the core sell of interpreted languages. Yet no one talks about the energy that is saved by the soft factors. Instead, people talk about the hard factors like they’re playing a trading card game with attack points and defense points. Yes, the hard factors like speed and power efficiency are important; however, the soft factors (ease of use, flexibility, size of the standard library, development community, documentation, etc.) are what matter most in creating a productive workflow and generating new ideas. We can only guess how much energy is saved there.