Looks very interesting!
I remember chatting with one of the creators of PyPy (not the author of TFA) a number of years ago at HPI. He had just given a talk about how RPython was used in PyPy development, and I was fascinated.
To me, it seemed completely obvious that RPython itself was a really interesting standalone language, but he would have none of it.
Whenever I suggested that RPython might have advantages over PyPy he insisted that PyPy was better and, more strangely, just as fast. Which was sort of puzzling, because the reason given for RPython was speed. When I then suggested that they could (after bootstrap) just use PyPy without the need for RPython, he insisted that PyPy was too slow for that to be feasible.
The fact that both of these statements could not really be true at the same time did not register.
I have asked about using RPython as a generic standalone language before. I think the official statement is that it was never intended to become one: it's a very minimal subset of Python (basically no existing Python code will run; it would require heavy refactoring or a complete rewrite), it covers only the specific features they currently need, it may be a moving target, and they don't want to give guarantees on the stability of the language, etc.
Once you consider that you need to write a very different kind of code for RPython anyway, maybe just using Nim or some other language is a better idea?
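For a flavor of the kind of restriction involved (a hedged illustration in plain Python, not actual RPython tooling): RPython's annotator has to infer one consistent type per variable, so idiomatic CPython code that rebinds a variable to a different type needs refactoring:

```python
# Fine in CPython: `result` has a different type depending on the branch.
# A restricted subset like RPython cannot unify int and str here and
# would reject the function, forcing a rewrite (e.g. always return str).
def describe(flag):
    result = 0            # inferred as int here...
    if flag:
        result = "yes"    # ...but rebound to str here
    return result

print(describe(True))   # yes
print(describe(False))  # 0
```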
Neat idea! The author's ideas about different subsets of Python are worth the price of admission. What you can express in the type system, what performs well under a JIT, and what's basically sane and reasonable may not be precisely specified, but they are still useful and distinct ideas.
Common Lisp also allows you to redefine everything at runtime but doesn't suffer from the same performance issues that Python has, does it?
Does anyone have insight into this?
Common Lisp is not a runtime, it's a specification. Implementations are free to compile everything to fast native code, or to interpret everything; the available implementations do that and everything in between. That said, SBCL and the commercial implementations can be extremely fast, especially if you specify types on tight loops. SBCL comes with a disassembler that shows you, right in the REPL, the assembly a function compiles to, so you can even get close to C performance.
Common Lisp doesn't use (expensive) CLOS dispatch in the core language, e.g. to add two numbers or find the right equality operator. That's one known pain point: because CLOS was "bolted on" rather than part of the original language, the divide between internal dispatch (using typecase and the like) and external dispatch (generic functions) is pretty ugly, and it gave us the eql/equal/equalp/etc. hell.
Thing is, you need a complex JIT like Julia's, or machinery like https://github.com/marcoheisig/fast-generic-functions, to offset the cost of constant dynamic dispatch.
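For readers who know Python better than CLOS: `functools.singledispatch` is a rough single-dispatch analogue of a generic function, and every call pays a runtime type lookup, which is the dispatch cost being discussed here:

```python
from functools import singledispatch

@singledispatch
def stringify(x):
    # Default method: used when no more specific registration matches.
    return str(x)

@stringify.register
def _(x: list):
    # Specialization chosen at call time by inspecting type(x).
    return "[" + ", ".join(stringify(v) for v in x) + "]"

print(stringify(7))              # 7
print(stringify([1, "a", [2]]))  # [1, a, [2]]
```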
I actually had such a conversation on that comparison earlier this year: https://lwn.net/Articles/1032617/
What is CLOS in this context?
The Common Lisp Object System: https://en.wikipedia.org/wiki/Common_Lisp_Object_System
Common Lisp Object System: the language's amazing take on what OOP can be.
Just like Smalltalk and SELF, also Lisp Machines and Interlisp-D.
It usually comes down to the urban myth that Python is special and that there was no other dynamic language before it came to be.
The JIT research on those platforms is what gave us the leading JIT capabilities on modern runtimes: OpenJDK HotSpot traces back to Smalltalk and StrongTalk, while V8 traces back to SELF.
Especially in Smalltalk and SELF, you can change anything at any time across the whole image, and have the JIT pick up on that and re-optimize.
Granted, what messes up Python, or rather the CPython implementation, is that C extensions are allowed to mess with its internals, invalidating many optimizations that would otherwise be available.
That's a reason why the JVM, CLR, V8, and ART use handles and marshaling layers that don't allow native extensions such liberties.
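To make the "change anything at any time" point concrete in plain Python: a method redefined on a live class is immediately visible through existing instances, which is exactly what forces a JIT that inlined the old method to deoptimize and recompile.

```python
class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
assert g.greet() == "hello"

# Redefine the method on the live class: every existing instance and
# call site sees the new behavior on the next call. A JIT that inlined
# the old `greet` must throw that machine code away and re-optimize.
Greeter.greet = lambda self: "hi"
assert g.greet() == "hi"
```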
Great explanation. Five years ago I did the genealogical work to discover that StrongTalk begat HotSpot (by virtue of having some of the same authors). It was quite a joy to discover!
The problem with this is that the main value of Python is its ecosystem. SPy aims to be able to import Python libraries, but also not to implement all Python features. If you are not 100% compatible, how can you reliably import libraries?
SPy seems more likely to be appealing as a more Pythonic alternative to Cython than as a Python replacement.
Hello, author of the blog post and author of SPy here.
> how can you reliably import libraries?
The blog post specifies it, but probably not in great detail. Calling Python libs from SPy will go through libpython.so (so essentially we will embed CPython). CPython will import the library, and there will be a SPy<=>CPython interop layer to convert/proxy objects between the two worlds.
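As a rough illustration of what such an interop layer can look like (a purely hypothetical sketch in plain Python, not SPy's actual design), a proxy can forward attribute access to a wrapped CPython object and hand the results back across the boundary:

```python
import math

class CPythonProxy:
    """Hypothetical proxy: forwards attribute lookups to a wrapped object."""
    def __init__(self, obj):
        object.__setattr__(self, "_obj", obj)

    def __getattr__(self, name):
        # Called only when normal lookup fails; delegate to the wrapped
        # object. A real interop layer would also convert/wrap results.
        return getattr(object.__getattribute__(self, "_obj"), name)

proxy = CPythonProxy(math)   # pretend `math` lives on the CPython side
print(proxy.sqrt(16.0))      # 4.0
```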
I did a similar project, a typed Perl: cperl. I could import most of the modules, and did add types to some of the important ones; e.g. testing was 2x faster. I needed typing patches for about 10% of CPAN packages.
A type is a contract, not a hint!
This seems to be going for a somewhat similar goal to Mojo [0] - anyone here who used both and is willing to offer a comparison?
[0] https://www.modular.com/mojo
Time for me to remind everyone of the Shedskin Python compiler.
https://shedskin.github.io/
Based on my understanding, Mojo aims to make number-crunching computation faster (GPU), while SPy aims to make generic Python application logic faster. Very similar, but different sweet spots and use cases.
While the GPU is a focus of Mojo, it is also planned to become a general systems programming language similar to C++ and Rust.
> SPy is [...] a compiler
> SPy is not a "compiler for Python"
I think it's funny how confusing it is from the first paragraph.
Reading the next sentence clears the confusion:
> SPy is not a "compiler for Python". There are features of the Python language which will never be supported by SPy by design. Don't expect to compile Django or FastAPI with SPy.
Yeah, but then don't say that SPy is an (interpreter and) compiler in the first place? Just say it's an interpreter.
It is a compiler. It is not a compiler for Python, because there are valid Python programs it can't compile and isn't intended to compile.
To make it more confusing: SPy is not spyware (at least, I hope)
I like the idea of a compiled language that takes the look and ethos of Python (or at least the "looks like pseudocode, but runs" ethos).
I don't think the article gives much of an impression on how SPy is on that front.
This is what F# provides. It has a similar whitespace syntax to Python, but is statically typed and can be compiled AoT.
Bubble sort Python:

    mylist = [64, 34, 25, 12, 22, 11, 90, 5]
    n = len(mylist)
    for i in range(n - 1):
        for j in range(n - i - 1):
            if mylist[j] > mylist[j + 1]:
                mylist[j], mylist[j + 1] = mylist[j + 1], mylist[j]
    print(mylist)

Bubble sort F#:

    let mylist = ResizeArray [ 64; 34; 25; 12; 22; 11; 90; 5 ]
    let n = Seq.length mylist
    for i = 0 to n - 2 do
        for j = 0 to n - i - 2 do
            if mylist[j] > mylist[j + 1] then
                let temp = mylist[j]
                mylist[j] <- mylist[j + 1]
                mylist[j + 1] <- temp
    printfn "%A" mylist

I believe that Python is as popular and widely used as it is because it's old enough to have an expansive ecosystem of libraries. It's easy enough to implement one in pure Python and possible to optimize it later (Pydantic is a great recent-ish example, switching to a Rust core for 2.0). That same combination of Python + (choose a compiled language) makes it quite difficult for any new language to tap into the main strength of Python.
You can have that today with Nim.
Nim feels like a really amazing language. There were some minor things that I wanted to do with it, like trying to solve a Codeforces question out of mere curiosity, to build something on top of it.
Although it's similar to Python, you can't underestimate Python's standard library, which I felt was lacking in Nim. I'm not sure if that was a skill issue. Yes, they're similar languages, but I'd still say that I really welcome a language like SPy too.
The funny thing is that I ended up architecting a really complicated solution to a simple problem in Nim, and I was proud of it. Then I asked ChatGPT, thinking there was no way it could find anything simpler in Nim, and it produced something that worked in 7-10 or 12 lines, and my jaw dropped, lol. Maybe ChatGPT could be decent for learning Nim, or reading some Nim books for sure, but the packages, environment, etc. felt really brittle as well.
I think there are good features in both Nim and SPy, and I personally welcome both.
This looks like a very interesting approach, bringing comptime to a static version of Python. This comptime can then be used to define new types in the same way Zig does it.
I absolutely hate the terminology, though: red/blue, redshifting, etc. Why do blue functions disappear when redshifting? If you redshift blue light, it goes down in frequency, so you might get green or red. Perhaps my physics brain is just overthinking it!
> The other fundamental concept in SPy is redshifting.
> Each expression is given a color:
> blue expressions are those which can safely be evaluated ahead of time, because they don't have side effects and all operands are statically known.
> red expressions are those which need to be evaluated at runtime.
> During redshifting we eagerly evaluate all the blue parts of the code: it's a form of partial evaluation. This process plays very well with the freezing that we discussed above, because a lot of operations on frozen data become automatically blue: for example, if we statically know the type of an object, the logic to look up a method inside the frozen class hierarchy is a blue operation and it's optimized away, leaving just a direct call as a result.
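Redshifting is a form of partial evaluation; as a loose analogy in plain Python (nothing to do with SPy's actual machinery), the standard `ast` module can fold the "blue" constant subexpressions of an expression ahead of time, leaving the "red" parts for runtime:

```python
import ast

def fold(expr: str) -> str:
    """Tiny partial evaluator: binary operations whose operands are
    statically known ("blue") are computed ahead of time; anything
    involving free variables ("red") is left for runtime."""
    class Folder(ast.NodeTransformer):
        def visit_BinOp(self, node):
            self.generic_visit(node)  # fold children first
            if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
                # Both operands statically known: evaluate now.
                wrapped = ast.fix_missing_locations(ast.Expression(body=node))
                value = eval(compile(wrapped, "<fold>", "eval"))
                return ast.copy_location(ast.Constant(value), node)
            return node

    return ast.unparse(Folder().visit(ast.parse(expr, mode="eval")))

print(fold("x * (2 + 3) + 4 * 5"))  # x * 5 + 20
```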
Please just rename it comptime; then at least people who have learned Zig will know what it means immediately.
In FORTH these would have been called IMMEDIATE words, namely functions which run at "compile" time rather than run time.
If you want different parts of your code to be a statically typed Python lookalike, Cython is a mature option.
Yes, it's mature, but you (and your potential audience) basically need to learn a new language with a lot of quirks and "weird" (I'd even say counter-intuitive) nuances, and it's significantly less readable than strict, typed Python. Even its modern syntax doesn't click immediately (performance-wise, the new syntax was also somehow a bit slower in my tests).
Good level of detail (for me) to understand (some things).
I can't view the site on my mobile without accepting cookies.
No cookie notice at all for me using Firefox on Android with the "I Still Don't Care About Cookies" extension.
Specifically Google Analytics cookies, but I found you can uncheck the box.
It's still pretty confusing: unchecking the box doesn't seem to do much (is it actually unchecked when you click it? There's still a checkmark); you still have to click Accept to see the text; and what are you accepting?
In any case, pre-checked boxes are not valid consent under GDPR (“Planet49”).