[Beowulf] Accelerator for data compressing
Nifty niftyompi Mitch
niftyompi at niftyegg.com
Thu Oct 2 15:07:16 PDT 2008
On Thu, Oct 02, 2008 at 04:09:36PM -0400, Xu, Jerry wrote:
> Currently I generate nearly one TB data every few days and I need to pass it
> along enterprise network to the storage center attached to my HPC system, I am
> thinking about compressing it (most tiff format image data) as much as I can, as
> fast as I can before I send it crossing network ... So, I am wondering whether
> anyone is familiar with any hardware based accelerator, which can dramatically
> improve the compressing procedure.. suggestion for any file system architecture
> will be appreciated too.. I have couple of contacts from some vendors but not
> sure whether it works as I expected, so if anyone has experience about it and
> want to share, it will be really appreciated !
If I recall correctly, TIFF files are often already compressed internally,
so it is hard to squeeze much more out of them.
Linux has a handful of compression tools -- how much compression are
you able to get on your data with each of these tools, and with each
set of command-line options?
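A quick way to answer that is to time each candidate tool on one representative file and compare sizes. This is only a sketch: the file name is a placeholder, and if no such file exists it generates a dummy one of zeros, whose compression ratio is NOT representative of real image data.

```shell
#!/bin/sh
# Benchmark sketch: size and wall time per compressor on one sample file.
# "sample.tif" is a placeholder -- point it at a real representative file.
f=sample.tif
[ -f "$f" ] || dd if=/dev/zero of="$f" bs=1024 count=512 2>/dev/null  # dummy stand-in
orig=$(wc -c < "$f")
for tool in gzip bzip2; do          # extend the list (p7zip etc.) as available
    start=$(date +%s)
    "$tool" -c "$f" > "$f.$tool"    # -c writes to stdout, leaving the input intact
    end=$(date +%s)
    comp=$(wc -c < "$f.$tool")
    echo "$tool: $comp of $orig bytes in $((end - start)) s"
done
```

Run it against a handful of real TIFFs before trusting any numbers; image data that is already compressed will show very different ratios than the dummy file.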
My guess is that the best and most cost-effective hardware solution
you will find is a hot Opteron or Intel box with not too many cores and
a good chunk of fast DRAM in it.
You might find that, unlike much of the rest of Linux, compression code
benefits enough from a good optimizing compiler (PGI, Pathscale,
Intel...) to matter, so consider rebuilding things like bzip2, gzip,
and p7zip and benchmarking each tool for speed and correctness.
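Rebuilding is usually just a matter of overriding the compiler and flags in the package's Makefile. The version number and flags below are illustrative only, not tested recommendations:

```shell
# Build-recipe sketch: rebuild bzip2 from source with aggressive flags.
# Tarball version, compiler, and flags are placeholders -- substitute
# PGI/Pathscale/Intel and tune for your own CPU.
tar xzf bzip2-1.0.5.tar.gz
cd bzip2-1.0.5
make CC=gcc CFLAGS="-O3 -march=native"
```

Then rerun your compression benchmark against the rebuilt binary and verify a round trip (compress, decompress, compare checksums) before trusting it with real data.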
And when you find the best compression program for your data, you might
look for some good DSP cards to run your compression on. My bet is that
a new hot box will win. Since this is a cluster mailing list, just bang
the compression out as a cluster job.
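On a single node the same idea works with plain xargs -P, one compressor process per core; on the cluster you would hand each node its own slice of the file list through your scheduler. The directory, file names, and core count below are placeholders for demonstration:

```shell
#!/bin/sh
# Per-node parallel sketch: compress every TIFF in a directory with up to
# four bzip2 processes at once via xargs -P. The demo directory and dummy
# files are placeholders; point find at your real data instead.
dir=demo_tiffs
mkdir -p "$dir"
for i in 1 2 3 4; do
    head -c 65536 /dev/zero > "$dir/img$i.tif"   # dummy data for the demo
done
# -print0/-0 keep odd file names safe; -n 1 gives each process one file.
find "$dir" -name '*.tif' -print0 | xargs -0 -P 4 -n 1 bzip2 -f
ls "$dir"
```

Note that bzip2 replaces each input with a .bz2 file by default; add -k if you want to keep the originals until the transfer is verified.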
T o m M i t c h e l l
Found me a new hat, now what?