How about parallel computing with finance

Robert G. Brown rgb at phy.duke.edu
Thu Dec 7 06:56:16 PST 2000


On Thu, 7 Dec 2000, Horatio B. Bogbindero wrote:

>
>
> i went to the closest bookstore here in our area and i could not find a
> single book about neural networks. this is the drawback of being in a
> third world country like the philippines. hehehe. anyway, i have been
> surfing the net and am beginning to find it interesting. thanks for the
> information.

Try amazon.com or barnesandnoble.com.  I'm sure both have lots of
titles.  I don't know much about shipping costs overseas, though.

   rgb

>
> On Wed, 6 Dec 2000, Robert G. Brown wrote:
>
> > On Wed, 6 Dec 2000, Terrence E. Brown wrote:
> >
> > > I am also interested in the business and managerial application as well as other
> > > industrial apps.
> > >
> > > I would certainly like to talk with another with similar thoughts. I have even
> > > started an org dedicated to that objective.
> > >
> > > Terrence
> > >
> > > "Horatio_B._Bogbindero_ (Horatio B. Bogbindero )" wrote:
> > >
> > > > i would just like to know about building neural networks on clusters.
> > > > i am not into neural networks but some people here in the university
> > > > may be interested. however, we do not know where to start. i would like
> > > > to know where i can get some sample NN code, maybe for something trivial.
> >
> > Hmmm, I don't know how much of such a discussion should occur on
> > this list.  The following (up to <shameless marketing>) is probably
> > reasonable.
> >
> > Neural networks (and the genetic algorithms that underlie a really good
> > one in a problem with high dimensionality) are certainly fascinating
> > things.  They are even in some sense fundamentally parallel, as
> > their processing capabilities result from a tiered composition of
> > relatively simple (but nonlinear) transfer functions.  A general
> > discussion of NN's and how they work is clearly not appropriate for this
> > list though.  There are some particular issues that are.
> >
> > In practice, most of the parallelization issues of NN's are a small
> > part of the overall problem UNLESS you are interested in constructing
> > custom hardware or building NN ASIC's or the like.  This is because
> > computers generally run neural network SIMULATORS and use what amounts
> > to relatively small-scale linear algebra (transmogrified through,
> > e.g., a logistic function) to do a net evaluation.  Since this is so
> > small that it will often fit into even L1 (and almost certainly into
> > L2), there is no way that it can profitably be distributed in parallel
> > except via (embarrassingly parallel) task division.
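> >
> > For concreteness, here is a minimal sketch of such an evaluation (the
> > names and sizes are hypothetical, just to fix ideas): one hidden layer
> > of logistic units feeding a single logistic output.  The whole working
> > set is a few hundred doubles, which is why it sits in cache:
> >
> >   #include <math.h>
> >
> >   #define NI 8    /* inputs */
> >   #define NH 16   /* hidden units */
> >
> >   double logistic(double x) { return 1.0/(1.0 + exp(-x)); }
> >
> >   /* two tiny matrix-vector products pushed through a nonlinearity */
> >   double nn_eval(const double in[NI],
> >                  const double w1[NH][NI], const double b1[NH],
> >                  const double w2[NH], double b2)
> >   {
> >       double out = b2;
> >       for (int i = 0; i < NH; i++) {
> >           double s = b1[i];
> >           for (int j = 0; j < NI; j++)
> >               s += w1[i][j]*in[j];
> >           out += w2[i]*logistic(s);
> >       }
> >       return logistic(out);
> >   }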
> >
> > Evaluation of networks' values applied to training/trial set data makes
> > up the bulk of the numerical effort in building a network and is at the
> > heart of the other tasks (e.g. regression or conjugate gradient
> > improvement of the weights).  For large training/trial sets and "big"
> > networks, this can be split up (and my experiences splitting it up are
> > recorded in one of the talks available on the brahma website).  For
> > small ones, the ratio of the time spent doing parallel work to the
> > time spent doing parallel communication isn't favorable and one's
> > parallel scaling sucks; even two nodes may take longer to complete
> > than one working alone.
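> >
> > A hedged sketch of that split, reusing the nn_eval() above (the MPI
> > calls are real, everything else is illustrative): each node scores
> > the current weights against a cyclic slice of the training set, and
> > one small reduction sums the squared errors.  For small sets that
> > single reduction can cost as much as the work it collects:
> >
> >   #include <mpi.h>
> >
> >   double training_error(int ntrain, double in[][NI],
> >                         const double *target,
> >                         const double w1[NH][NI], const double b1[NH],
> >                         const double w2[NH], double b2)
> >   {
> >       int rank, size;
> >       MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> >       MPI_Comm_size(MPI_COMM_WORLD, &size);
> >
> >       double local = 0.0, total;
> >       for (int n = rank; n < ntrain; n += size) {  /* cyclic split */
> >           double e = nn_eval(in[n], w1, b1, w2, b2) - target[n];
> >           local += e*e;
> >       }
> >       MPI_Allreduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM,
> >                     MPI_COMM_WORLD);
> >       return total;
> >   }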
> >
> > I'm working on an improved algorithm that splits up NN
> > construction/training in a way that is more functionally coherent.  That
> > way one or two of the distinct tasks can be parallelized very
> > efficiently and thoroughly and the results fed back into a mostly serial
> > or entirely serial step further down the pipeline.  I expect that this
> > will permit a very nice master/slave implementation of a neural network
> > constructor where nodes are slaves that can be working on any of a
> > number of parallelized tasks according to the directions of the master
> > (quite possibly with internode IPC's, though), and all the serial work
> > can be done on the master.
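> >
> > The shape of that master/slave scheme, as a hedged sketch (the tags,
> > the single-double "fitness" result, and the assumption that there are
> > at least as many tasks as slaves are all illustrative, not the actual
> > algorithm under construction):
> >
> >   #include <mpi.h>
> >
> >   #define TAG_TASK 1
> >   #define TAG_STOP 2
> >
> >   /* rank 0: hand out task ids, collect results, do the serial work */
> >   void master(int ntasks, int nslaves)
> >   {
> >       MPI_Status st;
> >       int next = 0, busy = 0, stop = 0;
> >       double fitness;
> >
> >       for (int r = 1; r <= nslaves; r++) {   /* prime each slave */
> >           MPI_Send(&next, 1, MPI_INT, r, TAG_TASK, MPI_COMM_WORLD);
> >           next++; busy++;
> >       }
> >       while (busy > 0) {
> >           MPI_Recv(&fitness, 1, MPI_DOUBLE, MPI_ANY_SOURCE,
> >                    MPI_ANY_TAG, MPI_COMM_WORLD, &st);
> >           busy--;
> >           /* ...serial step: fold the result back in here (a real
> >              version would carry the task id with the result)... */
> >           if (next < ntasks) {
> >               MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_TASK,
> >                        MPI_COMM_WORLD);
> >               next++; busy++;
> >           } else {
> >               MPI_Send(&stop, 1, MPI_INT, st.MPI_SOURCE, TAG_STOP,
> >                        MPI_COMM_WORLD);
> >           }
> >       }
> >   }
> >
> >   /* everyone else: receive a task id, evaluate, return a fitness */
> >   void slave(void)
> >   {
> >       MPI_Status st;
> >       int task;
> >       for (;;) {
> >           MPI_Recv(&task, 1, MPI_INT, 0, MPI_ANY_TAG,
> >                    MPI_COMM_WORLD, &st);
> >           if (st.MPI_TAG == TAG_STOP) return;
> >           double fitness = 0.0;  /* evaluate candidate net 'task' */
> >           MPI_Send(&fitness, 1, MPI_DOUBLE, 0, TAG_TASK,
> >                    MPI_COMM_WORLD);
> >       }
> >   }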
> >
> > <shameless marketing>
> > NN's (parallelized or not) are, as one might expect, incredibly useful
> > and potentially profitable.  After all, a successful predictive model
> > "tells the future", at least probabilistically, by construction, and
> > does even better than a delphic oracle ever did, in that it can often
> > provide a quantitative (although probabilistic) answer to "what if"
> > questions as well.  In ancient times the words of the oracle were just
> > fate and nothing you could do would change them.  In business, one would
> > like to predict what is likely to happen if you follow plan A instead of
> > plan B. Just about any business manager has a list of questions about
> > the future (what if or otherwise) they would love to have the answers
> > to.  That's one of Market Driven's foci -- providing answers and
> > expertise in business optimization.
> > </shameless marketing>
> >
> > Anyway, let me know if you're interested in more discussion of this (or
> > how NN's work or how they and predictive modeling in general can be
> > applied in business and managerial situations) offline.
> >
> >    rgb
> >
> > --
> > Robert G. Brown	                       http://www.phy.duke.edu/~rgb/
> > Duke University Dept. of Physics, Box 90305
> > Durham, N.C. 27708-0305
> > Phone: 1-919-660-2567  Fax: 919-660-2525     email:rgb at phy.duke.edu
> >
> >
> >
>
>
> ---------------------
> william.s.yu at ieee.org
>
> "I am, therefore I am."
> -- Akira
>
>

-- 
Robert G. Brown	                       http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567  Fax: 919-660-2525     email:rgb at phy.duke.edu






