Re: Mathematics eq CompSci

by BrowserUk (Patriarch)
on May 02, 2005 at 07:15 UTC ( [id://453154] )


in reply to Mathematics eq CompSci

I think that the mathematical symbolism used in the description, exploration and characterisation of algorithms from a formal CS perspective is frequently unhelpful.

Besides the fact that characterising one algorithm as O(log n) and another as O(n) often completely obscures the fact that the former must be implemented at a higher, less efficient level than the latter, such characterisations often carry unwritten (or small-print) riders like "in the general case", "given equal-speed memory access", or "ignoring cache coherency".
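For a concrete, if contrived, illustration of the first point, here is a sketch using the core Benchmark module. The data size, the pack-it-into-a-string trick and every name below are my own assumptions rather than anything from the node, and which entry wins will depend on n and on the machine; that is rather the point, since the Big-O label alone doesn't settle it.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my @sorted = map { $_ * 2 } 1 .. 1_000;        # sorted, even numbers
    my $packed = ',' . join(',', @sorted) . ',';   # the same data as one flat string
    my $want   = 1_500;                            # present, near the far end

    # O(log n), but every comparison runs at the Perl op-code level
    sub bsearch {
        my( $aref, $x ) = @_;
        my( $lo, $hi ) = ( 0, $#$aref );
        while( $lo <= $hi ) {
            my $mid = int( ( $lo + $hi ) / 2 );
            if   ( $aref->[ $mid ] < $x ) { $lo = $mid + 1 }
            elsif( $aref->[ $mid ] > $x ) { $hi = $mid - 1 }
            else                          { return 1 }
        }
        return 0;
    }

    cmpthese( -1, {
        'O(log n) in Perl' => sub { bsearch( \@sorted, $want ) },
        'O(n) in C'        => sub { index( $packed, ",$want," ) >= 0 },
    } );

The binary search does an order of magnitude fewer comparisons, but each one costs dozens of Perl ops; index() walks the whole string, but does so in C.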

They also almost always assume unbounded, constant-performance memory; generalised (often properly random) datasets; and idealised, uniform, single-tasking operating environments. The real-world constraints often include bounded RAM with much slower secondary storage for sourcing and spillage; biased datasets with reducible, non-influencing commonalities; and pre-emptive, multi-tasking (and multiplicitous) operating environments with unquantifiable coexisting & competing demands.

Whilst some of the more CS-favoured algorithms (sorts, shuffles, searches etc.) have been studied in depth for the influence of things like cache coherency, these studies again often make the assumption that the hardware running the algorithm is entirely dedicated to running it. Once you place that algorithm onto hardware that can, at any given point in the algorithm, completely destroy the cache coherency by task-switching to an entirely different process, most of the benefits extractable by tailoring the algorithm to maximise cache coherency go out of the window.

Even when programming at a level where using the machine-level "built-in" routines rather than high-level user-coded algorithms does not mean a 10s- or 100s-to-1 performance advantage for the former over the latter (as it does in Perl), the percentage of code written that actually fits into that set of well-analysed algorithms is very small, maybe 10% to 20% at best.
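In Perl that gap is easy to see. A quick sketch, again with the core Benchmark module; the data size and the pure-Perl merge sort below are my own illustration, not anything from the node, and the exact ratio will vary from machine to machine:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my @data = map { int rand 1e6 } 1 .. 5_000;

    # A straightforward pure-Perl merge sort
    sub msort {
        my @a = @_;
        return @a if @a < 2;
        my $mid   = int( @a / 2 );
        my @left  = msort( @a[ 0 .. $mid - 1 ] );
        my @right = msort( @a[ $mid .. $#a ] );
        my @out;
        while( @left && @right ) {
            push @out, $left[0] <= $right[0] ? shift @left : shift @right;
        }
        return @out, @left, @right;
    }

    cmpthese( -1, {
        'builtin sort'    => sub { my @s = sort { $a <=> $b } @data },
        'pure-Perl msort' => sub { my @s = msort( @data ) },
    } );

The built-in will normally win by one to two orders of magnitude, which is the kind of ratio meant above.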

Most real-world code is dominated by interactions with external events: user inputs, shared devices and databases, chaotic networks, and the omnipresent contention for CPU and other resources. Whilst we all benefit from highly tuned sorts and tree-traversal algorithms when we need them, the benefits derived from their tuning, in the reality of our tasks spending 50%, 70% or even 90% of their time task-swapped or waiting on IO, are usually much less than those ascribed to them via intensive studies performed under idealised conditions.
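The gap between elapsed time and the CPU time actually consumed is easy to put a number on. A minimal sketch; the URL is only a placeholder, and HTTP::Tiny (core since 5.14) stands in for whatever your real task spends its time waiting on:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw(time);
    use HTTP::Tiny;

    my $t0   = time;
    my $res  = HTTP::Tiny->new->get( 'http://example.com/' );
    my $wall = time - $t0;

    # CPU seconds this process actually used
    # (times() includes compile time, so this understates the wait, if anything)
    my( $user, $system ) = times;

    printf "status %s; wall %.3fs, cpu %.3fs; %.0f%% of the elapsed time was spent waiting\n",
        $res->{status}, $wall, $user + $system,
        $wall > 0 ? 100 * ( $wall - $user - $system ) / $wall : 0;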

And to answer your question: yes, it is possible to describe algorithms without formal mathematical notation, though these tend to require much more careful, wordful construction. One of the benefits of such wordful, less specialist construction is that more minds are likely to see the flaws. The problem with formal symbolisms is that they tend to become more and more specialist as they evolve to deal with higher and higher levels of abstraction.

Whilst many thousands, maybe hundreds of thousands, have read much of Knuth's tomes, the percentage of those that really understood his notations is much smaller, and many of those that did are only writing code in academia.

Just as there are only a handful of people in the world who will ever see the flaw in Wiles' proof (if there is one), it falls to a pretty select group of people to find the limits, flaws and constraints on Knuth's work. Beyond the typos that he pays a piece rate for, I wouldn't mind betting that he personally has made more corrections to his own work than all the informal users of it combined.


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
"Science is about questioning the status quo. Questioning authority".
The "good enough" may be good enough for the now, and perfection may be unobtainable, but that should not preclude us from striving for perfection, when time, circumstance or desire allow.

Re^2: Mathematics eq CompSci
by fergal (Chaplain) on May 02, 2005 at 12:58 UTC
    I've seen plenty of mathsy comp sci that was very much rooted in reality. A good example was a search algorithm for graphs which require tens of gigabytes to store (DNA matching is one practical example). This algorithm took into account disk speed, memory speed, cache speed etc.
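    A rough sketch of the general shape of such a disk-aware approach (not the algorithm described above; the file name, the "src dst" line format and the start node are all my own assumptions): keep the per-vertex state in memory, leave the huge edge list on disk, and stream it sequentially once per BFS level rather than seeking randomly.

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $edge_file = 'edges.txt';    # one "src dst" pair per line; far too big for RAM
        my $start     = 0;

        my %dist     = ( $start => 0 ); # per-vertex state stays in memory...
        my %frontier = ( $start => 1 ); # ...only the edges stay on disk
        my $level    = 0;

        while( %frontier ) {
            my %next;
            open my $fh, '<', $edge_file or die "open $edge_file: $!";
            while( <$fh> ) {            # one cheap, sequential pass per BFS level
                my( $src, $dst ) = split;
                next unless exists $frontier{ $src };
                next if     exists $dist{ $dst };
                $dist{ $dst } = $level + 1;
                $next{ $dst } = 1;
            }
            close $fh;
            %frontier = %next;
            $level++;
        }

        printf "reached %d vertices; maximum distance %d\n",
            scalar keys %dist, $level - 1;

    The win comes from trading random access for a handful of sequential scans, which is exactly the kind of disk-speed versus memory-speed trade-off such algorithms are built around.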

    Many researchers developing these algorithms are also using them. They can't ignore reality.

    As for using symbolism: it's quite like the difference between talking about writing a complex program and actually writing one. It all seems so simple when you start, but when you get right down to the details of what do I need to pass to this function and where will I get it, you start to hit problems that you didn't see before. Similarly, analysing an algorithm mathematically and symbolically (hopefully) prevents you from skipping any details, so although fewer people will understand it, those that do will be able to poke holes and find mistakes much more easily than if they had to perform all the analysis independently themselves.

    Of course it's also a good idea to describe the algorithm well in a natural language, but we'd all be sitting in the dark if Maxwell had just said "we have these waves, electric and magnetic, and they go up and down and they're always perpendicular to each other..."

      Two comments.

    • Algorithms coming out of active, front-line research are a quite different animal to the "classical" algorithms taught in CS classes at BSc. and even MSc. level, especially where the CS is just one component of a combined-disciplines degree.
    • Of course, formal symbolic descriptions have their place and are extremely useful for those exploring the field from that perspective, but given the need to implement an algorithm, which do you prefer to work from?

      the formal (http://www.nist.gov/dads/HTML/avltree.html) or the informal description?


Re^2: Mathematics eq CompSci
by Anonymous Monk on Jun 20, 2005 at 17:31 UTC
    Most real-world code is dominated by interactions with external events.
    We're painting with pretty broad strokes there, aren't we? Maybe that would be better phrased as "In my limited experience, I find a lot of code is dominated by interactions with external events. Although I've heard rumors of old fogies using C and Fortran and running compute-intensive simulations for hours and days and months. But it's just a rumor, so you can safely discount it."

      Hmmm. Let's see, software that deals with external devices and events:

      Browsers, compilers, operating systems, databases, communications systems, radar systems, weapons systems, guidance systems, MP3 software, wind-tunnel software, engine management systems, video games, avionics, disk/tape/display/CD/DVD/USB/printer/network-card/etc. device drivers, camera software, picture-editing software, spreadsheets, editors, interpreters, genome analysis, web servers, ftp, network OS, viruses, trojans, XML, stock control, calculators, phone software, microwaves, washing machines, accounting software, central heating controllers, tills, ATMs, petrol pumps, clocks, satellites ....

      Although I've heard rumors of old fogies using C and Fortran and running compute-intensive simulations for hours and days and months. But it's just a rumor, so you can safely discount it.

      Counterpoint?


        Hmmm. I was thinking of people who use computers to actually, you know, compute things. People like mathematicians, scientists, engineers, etc., working on problems like circuit analysis, 3D electromagnetic field solvers, place-and-route algorithms, fluid dynamics, cryptography, signal processing, image/voice recognition, theorem provers, natural language processing, program analysis, structural analysis, expert systems (chess playing, credit risk analysis, ...), vehicle routing, drug chemistry, particle physics simulations, seismic modeling, weather prediction...
