How engineering and IT can work together to leverage test data
Competition, market forces, and innovation require companies to evaluate the people, processes, and technologies used to develop products and services. For test and measurement companies, that evaluation is being driven by the emergence of the Big Analog Data problem: collecting and analysing the raw data from the physical world around us.
Unlike the big data typically associated with traditional IT sources such as social media and enterprise applications, Big Analog Data represents a vastly untapped well of information and insight that test and measurement companies can use to identify and create competitive advantages in data-centric engineering. This is no small feat considering that IDC estimates only 5 percent of the data collected today is even being analysed.
In this push to better acquire, store, and leverage Big Analog Data, specifically test data, as it is known in automated test, engineers must start by recognising the role that IT plays in managing it. At present, the sheer volume of data generated by engineering departments is opening a chasm between IT and engineering. Unless these groups work together to develop tools and methods to better use the data, the chasm will only grow deeper.
The first step to cohesion is understanding how big data is classified: structured, unstructured, or semi-structured. Historically, most big data solutions have focused on structured data, which conforms to a schema defined in advance: a user enters discrete values (name, birthday, address) into known fields. Unstructured data, by contrast, carries no metadata, schema, or other pre-assigned organisation.
The third category, semi-structured, is shaped by the dramatic increase in the amount of test data being collected. As more test systems are deployed for 24/7 test data collection, the volume of test data will soon surpass that of human-generated data. Because test data yields so much information, assigning a full structured schema to every byte is impractical. Instead, creating hierarchies of data provides structure and makes the data easier to mine after capture. This semi-structured test data is typically marked with a timestamp and then analysed across a set time period or for a set stimulus/response event.
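As a minimal sketch of what such semi-structured test data can look like, the snippet below uses hypothetical field names (`unit_id`, `measurements`, and the nested groups) to show a timestamped record with a measurement hierarchy, filtered over a set analysis window:

```python
from datetime import datetime, timedelta

# Hypothetical semi-structured test records: each carries a timestamp plus a
# nested measurement hierarchy rather than a fixed relational schema.
records = [
    {
        "timestamp": datetime(2024, 5, 1, 12, 0, 0),
        "unit_id": "UUT-001",  # hypothetical device under test
        "measurements": {
            "power": {"voltage_v": 3.31, "current_a": 0.42},
            "thermal": {"temp_c": 41.7},
        },
    },
    {
        "timestamp": datetime(2024, 5, 1, 12, 0, 5),
        "unit_id": "UUT-001",
        "measurements": {
            "power": {"voltage_v": 3.29, "current_a": 0.45},
            "thermal": {"temp_c": 42.1},
        },
    },
]

def in_window(record, start, length):
    """Keep records whose timestamp falls inside a set analysis window."""
    return start <= record["timestamp"] < start + length

window_start = datetime(2024, 5, 1, 12, 0, 0)
subset = [r for r in records if in_window(r, window_start, timedelta(seconds=10))]
print(len(subset))  # both records fall inside the 10-second window
```

The point is not the exact layout but that the timestamp, not a rigid schema, is the key that makes the data minable after capture.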
Most companies implement their test data solutions suboptimally because, at implementation time, they haven't anticipated how valuable correlating the information gleaned from this stimulus/response data might be. The most effective way to combat this issue is to combine test data analytics with traditional IT tools, but this architecture requires new approaches to data integration and management, including new infrastructure and skills to store, mine, and analyse the information-rich data collected. These solutions need to be designed to analyse new data sources and integrate with existing data stores.
Though IT departments haven't traditionally included test data from engineers and scientists in their overall objectives, they now see the compelling business value in applying analytics and algorithms to exploit this mountain of data and drive new business opportunities. But making this challenging transition requires asking the following questions:
• Is more than half of your analysis manual?
• Does your team spend more than five hours a week searching for data trends?
• How much data are you actually analysing? Is it less than 80 percent of the data you're collecting?
• Do you have a streamlined process across departments? Or are different teams using different tools?

Form a cross-functional data management team
To transform effectively into a test data-centric organisation, a cross-functional team should jointly test solutions and ensure compatibility. This team should include a representative from IT, an engineer tasked with data collection, a data scientist, and a manager with a high-level view of how new solutions will roll out to other departments. In addition, an executive should have a vested interest in the outcome of introducing test data analytics, to ensure key members of the cross-functional team are held accountable for progress.
Don’t expect results immediately
Many companies make the mistake of expecting a full data analytics solution in an unreasonably short time. Underestimating the effort required to align multiple teams while overhauling existing workflows usually leads teams to propose solutions without understanding their true data needs. The result is an unusable solution that end users don’t adopt.
A full data analytics solution for test data is built in smaller, incremental steps that build momentum among end users, IT professionals, and business leaders. Best-in-class companies often run an internal pilot within a single department before documenting data analytics requirements. This pilot typically includes integrating a data storage mechanism specifically for test data into the existing IT infrastructure, testing multiple data analytics software packages, and defining a process to analyse data from start to finish.
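A pilot's start-to-finish process can be sketched at its simplest as ingest, store, and analyse in one place. The example below is only illustrative: SQLite stands in for the test-data storage mechanism, and the table layout, unit IDs, and limits are all assumptions:

```python
import sqlite3

# Store: SQLite stands in here for whatever test-data storage mechanism
# the pilot integrates into the existing IT infrastructure.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE test_results (unit_id TEXT, test_name TEXT, value REAL, limit_hi REAL)"
)

# Ingest: raw results from a test station (illustrative values).
rows = [
    ("UUT-001", "leakage_ua", 1.8, 2.0),
    ("UUT-002", "leakage_ua", 2.4, 2.0),
    ("UUT-003", "leakage_ua", 1.1, 2.0),
]
conn.executemany("INSERT INTO test_results VALUES (?, ?, ?, ?)", rows)

# Analyse: flag units whose measured value exceeds the specification limit.
failures = conn.execute(
    "SELECT unit_id FROM test_results WHERE value > limit_hi"
).fetchall()
print([u for (u,) in failures])  # ['UUT-002']
```

Even a toy pipeline like this lets stakeholders walk the flow of collected data end to end and spot where the bottlenecks would appear at scale.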
This allows key stakeholders to understand the flow of the collected data and identify data bottlenecks. It also gives the IT department time to learn the differences between traditional big data and test data and to pinpoint strategies for adopting the different tools a successful test data solution requires. Addressing bottlenecks also improves yield, quality, and time to market, and prevents inadequate products from shipping by catching more errors and out-of-specification tests. These benefits increase the company’s overall profit.
Architect for expansion
Companies need to keep the big picture in mind when starting pilot programs in test automation. They need to remember that solutions architected for a single group won’t scale when test data analytics rolls out to other departments. In addition, companies can send their engineering and design teams weekly reports that identify key trends for avoiding failures or tightening margins. This can jump-start a redesign process before those failure scenarios reach production.
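A weekly trend report of this kind can be sketched very simply. In the snippet below, the spec limit, the weekly means, and the 0.4 watch threshold are all illustrative assumptions; the technique is just tracking the shrinking margin to the limit week over week:

```python
# Hypothetical upper specification limit for the measured value.
SPEC_LIMIT = 5.0

# Illustrative weekly means of the measurement, trending upward.
weekly_means = {"wk18": 4.2, "wk19": 4.5, "wk20": 4.8}

report = []
for week, mean in weekly_means.items():
    margin = SPEC_LIMIT - mean          # remaining headroom to the limit
    status = "WATCH" if margin < 0.4 else "OK"  # assumed alert threshold
    report.append((week, round(margin, 2), status))

for line in report:
    print(line)
# The margin shrinks week over week, flagging wk20 before failures occur.
```

Catching a tightening margin in a report like this is what lets a design team start the redesign before units actually fall out of specification.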
By prioritising a long-term vision when designing a test data analytics architecture, companies can set tangible goals for expansion, and IT can plan accordingly, adding servers as the solution is rolled out across multiple departments.
Invest now for enormous payoffs
Implementing a test data solution can add tremendous value to an organisation by enabling a more productive workforce while lowering costs and increasing profit. Companies that make the shift to data-centric organisations can become market leaders, with access to up to 95 percent more data than competitors, which can make them 20 percent more cost-efficient. Forming a strong relationship between IT and engineering to meet these test data challenges is critical for extracting not only the analytics but also the knowledge in large test data sets.
Stephanie Amrite, Product Marketing Manager, National Instruments