The following Supply Chain Matters guest blog contribution comes from Josh Miramant, CEO of data science agency Blue Orange Digital.

Cloud-based data science solutions are getting increased attention across industries, and the supply chain industry is no exception. Smart industrial equipment, the low price of data-collecting sensors, and the increased availability of IoT devices make Supply Chain 4.0 particularly inviting for data-driven solutions in the cloud.

What makes data-driven applications feasible for companies of all sizes?

  1. Data processing in the cloud is more accessible and affordable than ever.

In order to make sense of data, companies used to require dedicated infrastructure, in-house teams of IT professionals, and data science expertise. That is the old model, in which access to data science and analytics was restricted to a select few organizations with the appropriate resources. For a non-technical company that handles a lot of data, simply storing it (let alone collecting, analyzing, and reporting on it) would require storage capabilities beyond its reach.

Nowadays, storage and computational power can easily be rented in the cloud. The more that is needed, the more can be rented, since the cloud motto is “pay for what you use”. Similarly, algorithms and tools for data analysis are readily available from service providers. Data science professionals who have already mastered such tools can be hired, together with their expertise from similar data transformation projects. They can tackle anything from the smallest data processing challenge to the most advanced predictive analytics project. In such a context, data science applications are accessible to companies of all sizes.

The competition among the top cloud service providers (Amazon, Microsoft, and Google) brings a huge price benefit to data science experts (and to the companies that hire them). In their attempts to offer better services and improve the quality of their tools, these providers constantly lower the cost of their cloud service suites. Storage, computational power, and ready-to-use data analysis tools keep getting cheaper.

  2. The security of data in the cloud is strictly enforced.

Many companies are collecting, storing, and processing their data in the cloud. This gives service providers the responsibility to ensure the highest data security standards. While organizations can choose where and how to store their data (cloud, on-premise, or hybrid), several factors make the cloud a safer choice.

A service provider's reputation and profitability are directly affected by its ability to handle customer data securely. For this reason, providers hire large teams of cybersecurity professionals who are in charge of data security standards and protocols in the cloud. Since the provider's core mission is to deliver a secure environment for handling data, data scientists can set security worries aside and simply focus on data processing and analysis.

Even if the data is stored in the cloud, data engineers still have access to a wide range of cybersecurity tools and access controls. This enables them to configure layered data access schemes according to their own needs. For example, an organization can define which employees have access to which data, according to their role and identity. In this way, organizations gain fine-grained control over their data and stay protected against both internal and external threats.
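As a rough sketch, role-based access can be as simple as a permission table checked before every read. The roles, datasets, and permissions below are hypothetical examples, not any particular provider's API:

```python
# Hypothetical role-based access scheme (illustrative only, not a provider's API).
ROLE_PERMISSIONS = {
    "warehouse_analyst": {"inventory_levels", "equipment_logs"},
    "demand_planner": {"inventory_levels", "sales_history", "market_trends"},
    "executive": {"inventory_levels", "sales_history", "market_trends", "supplier_costs"},
}

def can_access(role: str, dataset: str) -> bool:
    """Return True if the given role is allowed to read the given dataset."""
    return dataset in ROLE_PERMISSIONS.get(role, set())

print(can_access("warehouse_analyst", "supplier_costs"))  # False
print(can_access("demand_planner", "market_trends"))      # True
```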

 

  3. Data analysis pipelines are robust and easy to implement.

Data analysis pipelines have been developed for all kinds of organizations and data. The process is now well established and data scientists have a clear path to follow, all the way from data collection to business insights.

At first, operational data is collected, aggregated, and put through initial pre-processing steps (such as cleanup, filtering, and transformation). Then the data is analyzed using machine learning algorithms and, where appropriate, neural network models. Results can be visualized and reports generated via BI software. Based on the outcomes, data-driven decisions are made and internal processes are optimized. The entire cycle is then repeated.
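To make the cycle concrete, here is a minimal sketch of those stages in Python using pandas and scikit-learn. The file name, column names, and the lead-time prediction task are assumptions made up for the illustration:

```python
# Minimal sketch of the collect -> pre-process -> analyze -> report cycle.
# The file name, columns, and prediction task are invented for the example.
import pandas as pd
from sklearn.linear_model import LinearRegression

# 1. Collect / aggregate: load raw operational data (here, a hypothetical CSV export).
raw = pd.read_csv("warehouse_orders.csv")

# 2. Pre-process: clean up, filter, transform.
clean = (
    raw.dropna(subset=["units_ordered", "lead_time_days"])
       .query("units_ordered > 0")
)

# 3. Analyze: fit a simple model, e.g. predict lead time from order size.
model = LinearRegression()
model.fit(clean[["units_ordered"]], clean["lead_time_days"])

# 4. Report: surface figures a BI dashboard could display.
clean = clean.assign(predicted_lead_time=model.predict(clean[["units_ordered"]]))
print(clean[["units_ordered", "lead_time_days", "predicted_lead_time"]].head())
```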

The recipe is clear, and organizations apply it again and again because it works across many types of data. Data transformation strategies are now ubiquitous thanks to increasingly available processing tools and modern data storage solutions.

For example, traditional data storage meant that each department's data sat in an isolated silo. Analysis, interpretation, and decisions were based on department-specific, isolated data that represented only a partial truth. With modern storage and data collection, it is now possible to gather all relevant data into a single, unified source. Both internal (e.g. demand, sales and customer data, equipment performance logs) and external (e.g. market trends, competitor prices) data sources can be combined for analysis and interpretation. In this way, organizations gain a 360-degree view of their data.
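A minimal sketch of that unification, assuming two hypothetical tables (internal sales and external competitor prices) keyed by product and week, could look like this:

```python
# Combining internal and external data into one unified view (illustrative tables).
import pandas as pd

internal_sales = pd.DataFrame({
    "product": ["A", "A", "B"],
    "week": [1, 2, 1],
    "units_sold": [120, 95, 60],
})

external_prices = pd.DataFrame({
    "product": ["A", "A", "B"],
    "week": [1, 2, 1],
    "competitor_price": [9.99, 10.49, 14.00],
})

# One merged table gives every stakeholder the same single source of truth.
unified = internal_sales.merge(external_prices, on=["product", "week"], how="left")
print(unified)
```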

Similarly, forecasts used to be run weekly or monthly; the modern approach is to run analytics in near real-time. This is only possible thanks to modern data processing pipelines and analytics services in the cloud. Furthermore, data analysis tools and algorithms have become so ubiquitous that cloud service providers even offer ready-made solutions for the most common applications (Amazon, for example, offers out-of-the-box predictive maintenance solutions for IoT manufacturing equipment).
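For intuition, near real-time analytics can be as simple as checking each incoming sensor reading against a rolling baseline the moment it arrives. The sketch below is a generic, hypothetical example rather than any provider's managed service:

```python
# Generic near real-time check: compare each new reading to a rolling average.
# Readings, window size, and threshold are invented for illustration.
from collections import deque

WINDOW_SIZE = 20         # number of recent readings to keep
THRESHOLD_FACTOR = 1.5   # flag readings 50% above the rolling average

recent = deque(maxlen=WINDOW_SIZE)

def process(reading: float) -> bool:
    """Return True if the new reading looks anomalous versus the rolling average."""
    anomalous = bool(recent) and reading > THRESHOLD_FACTOR * (sum(recent) / len(recent))
    recent.append(reading)
    return anomalous

# Simulated stream: steady vibration values with one spike.
for value in [1.0, 1.1, 0.9, 1.0, 1.2, 3.4, 1.1]:
    if process(value):
        print(f"Alert: reading {value} is well above the recent average")
```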

Benefits and potential of cloud-based data science in the Supply Chain Industry

Companies pursue data-based solutions because they have repeatedly proven to have a direct impact on cost optimization and productivity. Below are some of the most popular ways in which data science in the cloud can impact the supply chain industry:

 

  • Improve operational efficiency, since software, hardware, and sensor infrastructures can work 24/7
  • Reduce operational costs, since algorithms can find solutions to the most complex supply chain problems
  • Increase competitive advantage by modelling and learning from the intricate dependencies among the various supply chain parameters
  • Provide accurate, data-based insights and enable informed decision making, since all stakeholders have access to a single source of truth
  • Enable new applications across the phases of supply chain management, from production and inventory to transportation and planning

 

An example of these technologies being used every day comes from IoT supply chain company Clearblade. In building IoT sensor solutions for an airplane manufacturer, they describe how the application of analytics, AI, and machine learning not only brings visibility across the supply chain but also amplifies the productivity of workers.

Instead of Fred, who’s been working in the machine shop for 50 years, putting his hand on a compressor and knowing when it’s going to fail. Now I take Fred’s knowledge and I put it in this rule that runs, streaming the data off the machine. And now I can take Fred and multiply him times a hundred thousand across all my devices.

-Clearblade
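In code terms, putting Fred's knowledge into a rule can be as simple as codifying the thresholds he would judge by feel and evaluating them against every reading streamed off every machine. The values below are hypothetical and are not Clearblade's actual rule engine:

```python
# Codifying an expert's rule of thumb so it can run against every machine's data stream.
# Thresholds and readings are hypothetical; this is not Clearblade's rule engine.

def fred_rule(vibration_mm_s: float, temperature_c: float) -> bool:
    """Fred's know-how as a rule: flag a compressor that looks likely to fail soon."""
    return vibration_mm_s > 7.0 and temperature_c > 85.0

# The same rule applied across thousands of devices streaming readings.
readings = [
    {"device": "compressor-001", "vibration_mm_s": 3.2, "temperature_c": 70.0},
    {"device": "compressor-002", "vibration_mm_s": 8.1, "temperature_c": 91.5},
]
for r in readings:
    if fred_rule(r["vibration_mm_s"], r["temperature_c"]):
        print(f"{r['device']}: schedule maintenance")
```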

 

The hardest part of reaping the benefits of these advanced technologies is connecting the devices out in the field. Companies such as Clearblade have developed efficient methods of deployment and implementation that have driven these costs down considerably. With the lower cost of computing and implementation, these technologies are more accessible and affordable than ever before.

You’ve got an instrument to get the data, and I think that’s harder than people realize, but it’s getting the sensors out in the field in a cost effective way of getting the devices and streaming all to a common database platform. And it’s one of the things we at Clearblade are very good at is connecting to the devices, streaming the data to wherever it needs to go. There’s lots of IOT products or solutions out there that I call science experiments because companies have to spend a year or two years building something we already have. We’ve got something out of box, that connects with existing systems and easily works with what’s been there for decades. You can’t just throw out what you have.

-Clearblade

 

All in all, the current landscape of data science tools and expertise available to companies makes it more affordable than ever to pursue a successful data transformation strategy.

Companies now have the chance to keep their data from going to waste and instead transform it into a useful asset that drives strategy and optimizes operations.

 

About The Author: Josh Miramant is the CEO of New York City-based data science and analytics firm Blue Orange Digital. Blue Orange specializes in data transformation, data visualization, and predictive analytics.