[Beowulf] SC13 wrapup, please post your own
tjrc at sanger.ac.uk
Tue Nov 26 08:26:29 PST 2013
OK, yes, but that's still counting papers, just with a more sensible scoring matrix. It has exactly the same problem when it comes to demonstrating, in a grant proposal or budget justification, just what the ROI is going to be in those terms.
It's quite funny watching vendors flounder whenever they claim they can offer great ROI and you ask them to actually demonstrate it. Usually their crude spreadsheets end up telling you that you're going to spend $millions more using their solution.
Dr Tim Cutts
Acting Head of Scientific Computing
Wellcome Trust Sanger Institute
On 26 Nov 2013, at 14:54, "Peter St. John" <peter.st.john at gmail.com> wrote:
> Oh, there is another metric besides number of papers published: Citation Indexing and Impact Factor, the predecessors of Google's PageRank.
> Instead of counting papers, or counting citations to papers, you count citations weighted by their own citations, recursively.
> So one year Andrew Wiles publishes two papers. Those two papers are read by maybe six specialists in arithmetic algebraic geometry. But those six guys are read by many more, etc (the recursion converges rapidly, which is why Page Ranking is so effective), so Wiles' Impact Factor is extravagant.
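The recursive weighting Peter describes can be sketched as a small power-iteration loop over a citation graph. This is a toy illustration only: the graph, the paper names, and the damping factor are made up here, and real Impact Factor calculations differ in detail from PageRank.

```python
def citation_rank(cites, damping=0.85, iters=50):
    """Recursive citation weighting, PageRank-style.

    cites maps each paper to the list of papers it cites.
    A paper's score is a damped sum of the scores of the papers
    citing it, each share divided by the citing paper's reference count.
    """
    papers = set(cites)
    for refs in cites.values():
        papers.update(refs)
    n = len(papers)
    rank = {p: 1.0 / n for p in papers}

    for _ in range(iters):
        # baseline share every paper receives regardless of citations
        new = {p: (1.0 - damping) / n for p in papers}
        for citing, refs in cites.items():
            if refs:
                share = damping * rank[citing] / len(refs)
                for cited in refs:
                    new[cited] += share
        # papers that cite nothing spread their weight uniformly,
        # so the total score stays constant across iterations
        for p in papers:
            if not cites.get(p):
                for q in papers:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

if __name__ == "__main__":
    # hypothetical graph: two heavily-cited papers read by a few
    # specialists, whose surveys are in turn cited by others
    graph = {
        "survey_a": ["wiles_1", "wiles_2"],
        "survey_b": ["wiles_1"],
        "followup": ["survey_a", "survey_b"],
        "wiles_1": [],
        "wiles_2": [],
    }
    for paper, score in sorted(citation_rank(graph).items(),
                               key=lambda kv: -kv[1]):
        print(f"{paper:10s} {score:.3f}")
```

Because the weight a paper passes on depends on its own (recursively computed) weight, a paper cited by a handful of well-cited surveys can outrank one cited many times by obscure work, which is exactly why the two Wiles papers end up with an extravagant score.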
> On Tue, Nov 26, 2013 at 9:39 AM, Tim Cutts <tjrc at sanger.ac.uk> wrote:
> On 25 Nov 2013, at 23:03, Prentice Bisbal <prentice.bisbal at rutgers.edu> wrote:
>> 4. I went to a BoF on ROI on HPC investment. All the presentations in
>> the BoF frustrated me. Not because they were poorly done, but because
>> they tried to measure the value of a cluster by number of papers
>> published that used that HPC resource. I think that's a crappy, crappy
>> metric, but haven't been able to come up with a better one myself yet. I
>> was very vocal with my comments and criticisms of the presentations, so
>> if any of the presenters are reading this now, I apologize for
>> hi-jacking your BoF. Getting good ROI on a cluster is close to my heart,
>> but is also difficult to quantify and measure. I hope I can be part of
>> the discussion next year.
> I can't think of another metric either. At the top of my organisation, publications are *the* key metric that all scientists are judged on. Publications are *the* product of any scientific institution. We don't sell anything, so we can't measure revenue. All we can measure are papers published per unit time. The problem is that the publication of the paper is very distant from the building of your compute infrastructure, so it's very hard to put a sensible number on ROI for this stuff.
>> 7. The cover band 'London Calling' played the IBM Platform
>> Computing/Intel party again. Despite calling themselves 'London Calling'
>> they still do not play any Clash songs. They are a good cover band, but
>> it's starting to get boring seeing the same band play the same set year
>> after year.
> It did at least have more atmosphere than either of the parties I went to earlier in the week, which were pretty much like drinking in a morgue. I won't name them, but we probably all know which ones. Both were at establishments on the 16th Street Mall. I had a lot more fun and useful conversation in bars after abandoning ship…
> -- The Wellcome Trust Sanger Institute is operated by Genome Research Limited, a charity registered in England with number 1021457 and a company registered in England with number 2742969, whose registered office is 215 Euston Road, London, NW1 2BE.
> Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing
> To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf