Cluster computing, to my knowledge, is when a program splits the same task across multiple devices so the work gets done faster. "I'll mow the lawn, and you clean the bathrooms." You're limited by the speed of the network connecting them and by how smart the distribution software is, but otherwise the computing power adds up roughly additively.
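For a rough picture of that "split one job across helpers" idea, here's a minimal Python sketch. The workers here are local processes rather than networked machines (a real cluster would use something like MPI or Dask over the network, which is where the network-speed limit comes in), and mow_section is just a made-up stand-in for one slice of real work.

```python
from concurrent.futures import ProcessPoolExecutor

def mow_section(section):
    # Stand-in for one chunk of real work, e.g. one slice of a big computation.
    return sum(i * i for i in range(section * 1_000_000, (section + 1) * 1_000_000))

if __name__ == "__main__":
    sections = range(8)  # the whole "lawn", cut into 8 pieces
    with ProcessPoolExecutor(max_workers=4) as pool:
        # Each worker grabs a piece; total time shrinks roughly in proportion
        # to the number of workers, minus coordination overhead.
        results = list(pool.map(mow_section, sections))
    print(sum(results))
```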
If that's true, then why isn't clustering more common? Practical considerations aside, wouldn't it be worlds cheaper to frankenstein together a bunch of older hardware from a surplus store instead of buying shinier parts?
I'm just now learning how clustering works, and I had this fantasy of duct-taping a bunch of Raspberry Pi units to my tower instead of dropping five grand on a graphics card. I'm sure it isn't that easy, but I'm curious about what's possible.
Submitted December 30, 2019 at 07:43AM by Syagrius https://ift.tt/39ppoIz