Big Data and GD&T Together?

Perhaps it should be stated this way: the new Big Data is the measured points collected on product parts, and the new analytics is the software that can answer what-if questions about design intent and manufacturing capability. Today that data lives in many silos, which makes it hard for companies to realize its value. Product Lifecycle Management (PLM) is the backbone of most manufacturers’ information systems, yet PLM systems do not capture this data today.

In Oleg Shilovitsky’s blog post over at Beyond PLM, titled Big Data Opportunity and PLM Intelligence, he writes:

What is my conclusion? Data will be a key element of a future paradigm shift. Ability to collect, share and communicate around data will become an essential prerequisite of future PLM intelligent applications. How to get the data? How to share right data in a company? How to get information about what product is actually customer using. All these questions are hard to answer with today PLM infrastructure and tools focusing on control of data and business processes. So, leave the process of crushing organization silos to companies and business teams. Focus on how to break data silos and deliver right data to users.

Oleg also notes that it is software, not hardware, that will catapult Big Data.

Ultimately, data is held by CAD, PDM, PLM, ERP, and a variety of other legacy applications.

Measured Data is the New Big Data

We focus on product tolerances, and we build solutions that help break down those data silos so users can easily define, share, verify, and analyze product tolerances, assuring that part specifications and manufacturing capabilities are aligned.
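As a rough illustration of what "verify" means for a single characteristic, the sketch below fits a reference plane to simulated inspection points and compares the resulting flatness estimate to a tolerance. It is a minimal Python example with made-up point data and a hypothetical tolerance value, and it uses a least-squares fit rather than the minimum-zone evaluation a metrology package such as EVOLVE would typically apply.

    import numpy as np

    def flatness_from_points(points):
        """Estimate flatness as the spread of point residuals about a
        least-squares reference plane (an approximation of the GD&T
        minimum-zone definition).

        points: (N, 3) array of measured x, y, z coordinates.
        """
        points = np.asarray(points, dtype=float)
        # Fit the plane z = a*x + b*y + c by linear least squares.
        A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
        coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
        residuals = points[:, 2] - A @ coeffs
        return residuals.max() - residuals.min()

    # Simulated inspection points on a nominally flat face (hypothetical data).
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 50, size=(200, 2))            # probe locations in mm
    z = 0.002 * xy[:, 0] + rng.normal(0, 0.003, 200)  # slight tilt plus measurement noise
    measured = np.column_stack([xy, z])

    flatness = flatness_from_points(measured)
    tolerance = 0.025  # hypothetical flatness tolerance in mm
    print(f"flatness = {flatness:.4f} mm -> {'PASS' if flatness <= tolerance else 'FAIL'}")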

KOTEM’s EVOLVE Suite is a complete GD&T solution that spans design and verification by using all the measured points gathered during inspection. Engineers can now ask better questions and see possible outcomes when setting tolerances and understanding machine capabilities. In the past, this work fell to engineers with years of experience using various tools that did not always allow integration and processing of all the available data.
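The kind of what-if question described above can be made concrete with a small capability calculation: given the measured values for one feature, how does the predicted fallout change if the tolerance is tightened or relaxed? The sketch below is a generic Python illustration, not EVOLVE's actual method; the measurement data, tolerance limits, and the assumption of a normal process distribution are all hypothetical.

    import numpy as np
    from scipy.stats import norm

    def capability(measurements, lower, upper):
        """Process capability (Cpk) and predicted fallout for a candidate
        tolerance, from a set of measured feature values."""
        m = np.asarray(measurements, dtype=float)
        mu, sigma = m.mean(), m.std(ddof=1)
        cpk = min(upper - mu, mu - lower) / (3 * sigma)
        # Expected fraction of parts outside the limits, assuming normality.
        fallout = norm.cdf(lower, mu, sigma) + norm.sf(upper, mu, sigma)
        return cpk, fallout

    # Simulated hole-diameter measurements from inspection reports (mm).
    rng = np.random.default_rng(1)
    diameters = rng.normal(10.003, 0.004, size=500)

    # What-if: compare a tight and a relaxed tolerance on the same data.
    for lo, hi in [(9.995, 10.005), (9.990, 10.010)]:
        cpk, fallout = capability(diameters, lo, hi)
        print(f"tolerance {lo}-{hi}: Cpk = {cpk:.2f}, predicted fallout = {fallout:.2%}")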

Therese Corrigan-Bastuk