Well, high-traffic websites such as Slashdot, Amazon, and many others do quite well with Perl every day. As many developers discover over time, the primary issue with performance and scalability is whether or not your tech people are competent. If their competence is in question, it's easy to build slow, hard-to-maintain websites in any language. In fact, Perl may be worse than others for this, because less experienced developers don't know how to harness its power.
However, if they're competent, Perl can be a fantastic choice because of how easy and fast development is. Much of performance comes from things outside of the programming language. Do you have competent DBAs? Decent load balancing? Are you dynamically regenerating pages whose content is essentially static? Do you understand how your site is going to be used and know what to target for profiling? How's your network set up? Do you have media servers separate from database and app servers? Heck, if you feel you don't need stuff like that, you probably won't have enough traffic or usage (not the same thing) to worry about whether or not Perl is a bottleneck.
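To make the "static content generated dynamically" point concrete, here's a minimal sketch of a file-based fragment cache: the first request pays the cost of generating the page, and repeat requests within the TTL are served from disk. The cache directory, key names, and `cached_fragment` helper are illustrative assumptions, not any particular framework's API.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Illustrative cache directory (a real site would use a fixed path or memcached)
my $cache_dir = tempdir(CLEANUP => 1);

sub cached_fragment {
    my ($key, $ttl, $generate) = @_;    # $generate is a code ref
    my $file = "$cache_dir/$key.html";

    # Serve from cache if the file exists and is still fresh
    if (-e $file && (time - (stat $file)[9]) < $ttl) {
        open my $fh, '<', $file or die "read $file: $!";
        local $/;                       # slurp the whole file
        return scalar <$fh>;
    }

    # Otherwise regenerate and store for the next request
    my $html = $generate->();
    open my $fh, '>', $file or die "write $file: $!";
    print {$fh} $html;
    close $fh;
    return $html;
}

my $calls  = 0;
my $first  = cached_fragment('front_page', 300,
                             sub { $calls++; "<h1>expensive page</h1>" });
my $second = cached_fragment('front_page', 300,
                             sub { $calls++; "should never run" });
```

On the second call the generator never fires; that's the whole trick, and it works regardless of how slow the generation code is.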
In short, performance and scalability depend on many factors, not just Perl. (Side note: I've done a lot of work with "enterprise" level Perl in high-demand environments. I have no problems with it)
Guarav:
For the most part, I'd imagine that your website would be limited not by the processing speed of Perl, but by the time spent in general-purpose activities (reading/writing to sockets, querying the database, etc.). I've found that today's computers are *so fast* that CPU is rarely the bottleneck in my applications. It's nearly always I/O speed or database latency.
So long as you choose good algorithms, you should be fine.
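A quick sketch of what "good algorithms" means in practice: the same membership test done two ways. A linear `grep` is O(n) per lookup; a hash is O(1) amortized. That choice matters long before the language's raw speed does. The `@allowed` data and helper names are just for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @allowed = map { "user$_" } 1 .. 10_000;

# O(n) per query: scans the whole list every time
sub slow_member {
    my ($name) = @_;
    return scalar grep { $_ eq $name } @allowed;
}

# O(1) per query: build the hash once up front, then look up
my %allowed = map { $_ => 1 } @allowed;
sub fast_member { exists $allowed{ $_[0] } }
```

Both give the same answer; only the cost per call differs, and that gap grows with the size of the list.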
Also, I wouldn't worry a whole lot about speed at first. Instead, I'd worry about getting it running nicely. (Sounds like you're just about there now.) Then use a tool to generate a good heavy load on the system. Then examine system usage (I/O time, system time, etc.) and profile the code to see where your hotspots are. Then fix those hotspots. It's often said that 90% of the time is spent in 10% of the code--and I believe it. I'd code in whatever I like, and if it turns out that you *have* a CPU issue that you need to address...then find it and choose a different algorithm, and if that doesn't work, then code that chunk up in C.
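The "measure first" step above can be sketched with the core Benchmark module, which times candidate implementations head to head; for whole-application hotspots, Devel::NYTProf (`perl -d:NYTProf script.pl` then `nytprofhtml`) gives per-line profiles. The two `min` implementations below are made-up examples of the kind of rewrite you'd only do after the profiler points at it.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw(timethese);

# Two ways to find a minimum: sort-and-take-first is O(n log n),
# a single pass is O(n). Same answer, different cost.
sub min_by_sort  { (sort { $a <=> $b } @_)[0] }
sub min_one_pass {
    my $min = shift;
    $min = $_ < $min ? $_ : $min for @_;
    return $min;
}

my @data = map { int rand 1000 } 1 .. 1_000;

# Run each candidate 1,000 times and print the timings
timethese(1_000, {
    min_by_sort  => sub { min_by_sort(@data) },
    min_one_pass => sub { min_one_pass(@data) },
});
```

The point isn't this particular example; it's that you let measured numbers, not hunches, decide which 10% of the code deserves the rewrite.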
One last thing: I worked on a project a few years back where we spent buckets of cash on getting *BIG* systems and optimized the software to work as efficiently as possible. I don't know how many million$ were spent on the system. But I wrote the financial reports, and I *do* know how many customers it had before it folded. The high point was 26. Yep ... 26 people paying $20/month to use our system. At least they didn't have to wait on the system!
...roboticus
http://vox.com is a very large application with lots of users running on Catalyst.
Catalyst is very scalable.
However, your application may still require scaling effort, depending on what it actually does.