Last year, the FileCatalyst team was approached with an interesting proposition.

A PhD researcher from Anhalt University, Dmitry Kachan, wanted to test transfer solutions from competing vendors as part of his work. He would take a copy of our software, run tests to see how it performed under various network conditions, and lay down an objective comparison of the products.

Part of me wanted to say “No, thank you” – the risks are huge for a company like ours as performance is our bread and butter.

If the paper’s findings are off (a misconfigured system at 10 Gbps can behave sluggishly) and those figures are published, it could harm our reputation. And if the paper’s author had ulterior motives, the test would be an easy opportunity to set up our product to appear in a negative light.

From speaking with customers who have evaluated multiple products, we knew indirectly how our technology stacked up to those of competitors. However, these findings had never been published and the methodology never peer reviewed.

It’s one thing to claim to be the best. It’s quite another to show up to the fight in a neutral ring.

To add to the risks, the paper’s co-author was the ex-CEO of one of our competitors. Another concern was whether the results would be an accurate reflection of our product’s performance.

Although these risks were perhaps too much for a few of our competitors to take, part of me was singing a different tune. It’s the voice all software developers hear after they build a product where the code runs. Fast. Part of you wants to scream out how good it is and share it with the rest of the world. We had spent months working on and refining our algorithm. I knew what our software was capable of doing and just how good our solution to the problem was.

Our customers know. But many of them view us as a competitive advantage, so our achievements are not always widely published.

So we said “Yes”.

To keep it fair and impartial, we maintained a hands-off approach to the test. We were contacted on a few occasions in July and walked the researcher through our protocol and setup, much like we do with a customer needing a high-speed transfer solution (more care is required when you perform 1+ Gbps transfers). The 2-4 hour process normally involves talking through the tuning options in our protocol, and how to make it scream.

Then silence.

The fall development routine started to kick in, and the project more or less drifted out of mind as the daily activities kept me busy at work.

Last week, we got word that the results are being published (ICNS 2013: The Ninth International Conference on Networking and Services), and my fears proved to be unfounded.

Out of the 5 vendors tested, only 3 could push data past 100 Mbps under even a light packet loss scenario (0.1%).

Of the three left standing… well… the data speaks for itself.


I honestly could not be more proud of the work our team puts into our product, and of the quality research and development behind our FileCatalyst platform.

To Chris, John, Elton, John E, Marcel, Jack, Greg, and the entire testing and support team, you guys rock!

Congratulations also to Dmitry Kachan and Eduard Siemens on the publication of their paper, Comparison of Contemporary Solutions for High Speed Data Transport on WAN 10 Gbit/s Connections. Your research was well done, fair, and deserves a round of applause!