Bigger does not always mean better. Dan Woods writes for Forbes about how all the emphasis on big data these days is at least partially misplaced. Rather than developing methods to handle any enormous data set, Woods believes we should instead focus on identifying the specific data sets, internal and external, that affect the supply chain at various points. These specific sets should in turn be processed so that they are immediately made available to the parties they affect, allowing appropriate responses to be planned. This, he argues, is what constitutes a distributed data supply chain. Woods identifies the following forces as driving the transformation toward distributed data:
- Big data analysis will become a product.
- Quality of data is more important than quantity.
- Having more data that matters outweighs the sheer size of any one data set.
- The number and value of external data sets will rise.
- Focusing your efforts is crucial.
- Broadening use of data is the key value creator.
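The routing idea behind these points can be illustrated with a minimal publish/subscribe sketch. This is purely a hypothetical illustration, not anything from the article: the class name `DataSupplyChain` and the data set name `supplier_lead_times` are invented for the example. Named data sets are delivered immediately to only the parties that subscribed to them, rather than pooled into one central store.

```python
from collections import defaultdict


class DataSupplyChain:
    """Hypothetical sketch: route specific, named data sets directly
    to the parties they affect, instead of pooling everything."""

    def __init__(self):
        # Maps a data set name to the callbacks of affected parties.
        self._subscribers = defaultdict(list)

    def subscribe(self, dataset_name, callback):
        """Register an affected party's handler for one data set."""
        self._subscribers[dataset_name].append(callback)

    def publish(self, dataset_name, records):
        """Deliver new records immediately to every affected party."""
        for callback in self._subscribers[dataset_name]:
            callback(records)


# Usage: a logistics team subscribes only to supplier lead-time updates.
chain = DataSupplyChain()
received = []
chain.subscribe("supplier_lead_times", received.extend)
chain.publish("supplier_lead_times", [{"supplier": "Acme", "days": 12}])
chain.publish("marketing_clicks", [{"campaign": "q3"}])  # no subscriber; ignored
```

The point of the sketch is the narrowing: each party sees only the focused data sets that matter to it, echoing Woods's emphasis on quality and focus over sheer volume.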
He goes on to list a variety of companies whose products are suited to building a distributed data supply chain, including Actian, Splunk, and Apigee Insights. Consult the full article for more details on how to resist the temptation of big data and follow a more deliberately crafted path to success.