Re: What's Perl good at or better than Python.

by Marshall (Canon)
on Oct 17, 2021 at 00:07 UTC


in reply to What's Perl good at or better than Python.

I am just starting to learn Python.

Python will be much less efficient than Perl in multi-threaded applications due to the GIL (Global Interpreter Lock). To get around that, Python provides the multiprocessing module to run multiple instances of the Python interpreter on separate cores. State can be shared by way of shared memory or server processes, and data can be passed between process instances via queues or pipes. You still have to manage state manually between the processes. Plus, there's no small amount of overhead involved in starting multiple instances of Python and passing objects among them.

Perl multi-threading can be quite efficient. This is best seen in a "number crunching" task. I had one job with 20K items to process. Processing each item required a lot of CPU. When I split the job onto 4 cores, the overall task completed 3.7-3.8x faster. You can't get a full 4x because there is some OS and inter-thread overhead. I haven't coded that application in Python, but if I did, I wouldn't expect such excellent results.
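Here is a minimal sketch of that pattern, not the actual application: split the items over 4 ithreads, let each thread crunch its share, then join the per-thread results. The crunch() sub and the item list are made-up placeholders.

    use strict;
    use warnings;
    use threads;

    sub crunch {                       # stand-in for the expensive per-item work
        my ($item) = @_;
        my $x = 0;
        $x += sqrt($_ * $item) for 1 .. 50_000;
        return $x;
    }

    my @items = (1 .. 20_000);
    my $n_thr = 4;

    # Hand each thread roughly a quarter of the items.
    my @workers;
    for my $t (0 .. $n_thr - 1) {
        my @slice = @items[ grep { $_ % $n_thr == $t } 0 .. $#items ];
        push @workers, threads->create(
            sub { my $sum = 0; $sum += crunch($_) for @_; return $sum; },
            @slice,
        );
    }

    # Collect and combine the per-thread results.
    my $total = 0;
    for my $thr (@workers) {
        my ($sum) = $thr->join();      # each thread returns a single number
        $total += $sum;
    }
    printf "total = %.2f\n", $total;

The only coordination is the final join(), which is why this kind of embarrassingly parallel job can get close to a 4x speedup on 4 cores.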

Python requires obscure statements to increase speed. There is a concept called "list comprehension": basically, if you squeeze the loops into a single statement, it will run faster. That is not true with Perl. Perl allows easier-to-understand code that runs just as fast as shorter, more obtuse code. Compare:

Python:

    def print2D(table):   # print LoL
        print('\n'.join([' '.join([str(score) for score in row]) for row in table]))

Perl:

    sub print2D {
        my $table_ref = shift;
        foreach my $row_ref (@$table_ref) {
            print "@$row_ref\n";
        }
    }
Python is now the most popular programming language - there is little doubt about that, because it is being taught in essentially all CS curricula worldwide.

If you are processing some very large file with a very small amount of processing per line, multi-threading won't help at all if you are reading the file in a serial, linear fashion. The processing time will be dominated by the time it takes to read the file from disk! Even doing something complicated and "tricky", say giving 1/4 of the file to each of 4 threads, won't help, because at the end of the day how fast you can read the bits off the disk will be the limit (assuming, again, that the processing per line is relatively "cheap").
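That scenario boils down to the plain serial loop below; a quick sketch where the file name and the per-line work are made up for illustration.

    use strict;
    use warnings;

    open my $fh, '<', 'big_input.log' or die "big_input.log: $!";

    my $count = 0;
    while (my $line = <$fh>) {
        $count++ if $line =~ /ERROR/;   # trivially cheap per-line work
    }
    close $fh;

    print "$count matching lines\n";

This loop is already limited by how fast the disk can deliver the lines, so spreading it across threads would only add coordination overhead.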

Update: And is Perl faster than Python, especially when it comes to creating reports from text and putting them into CSV or Excel? Perl has C-backed libraries for CSV work - they run very quickly. I suppose that Python has those also. I am not sure that it makes much difference. We would have to write some benchmark code.
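For what it's worth, here is a minimal sketch using Text::CSV_XS, one of the C-backed CSV modules on CPAN; the field names and output file are made up for illustration.

    use strict;
    use warnings;
    use Text::CSV_XS;

    my $csv = Text::CSV_XS->new({ binary => 1, eol => "\n" })
        or die "Cannot use Text::CSV_XS: " . Text::CSV_XS->error_diag();

    my @rows = (
        [ 'name',  'score' ],
        [ 'alpha', 42      ],
        [ 'beta',  17      ],
    );

    open my $fh, '>', 'report.csv' or die "report.csv: $!";
    $csv->print($fh, $_) for @rows;    # quoting and escaping are done in XS/C
    close $fh or die "report.csv: $!";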

Update2: Some years ago, I talked with a guy who wrote code for the high-volume commodity day-trading market in Chicago. They used Perl to parse the incoming trades and then C to crunch the data for the actual trading algorithms. I would have thought that C would have been used for all of it, but evidently that was not true (at least at the time). Perl is quite good at the right job.
