Big analog data and data acquisition

May 14, 2013 12:44 pm

An outlook by National Instruments highlighting some of the most pressing trends and challenges facing engineers who build data acquisition systems
Differentiation is no longer about who can collect the most data; it’s about who can quickly make sense of the data they collect. There was once a time when hardware sampling rates, limited by the speed at which analog-to-digital (A/D) conversion took place, physically restricted how much data was acquired. Today, however, hardware vendors have pushed acquisition rates and resolutions through those barriers so rapidly that they have triggered a new wave of data consequences. Simply put, hardware is no longer the limiting factor in acquisition applications; the management of acquired data is the challenge of the future.
Advancements in computing technology — including increasing microprocessor speed and hard drive storage capacity — combined with decreasing costs for hardware and software have provoked an explosion of data coming in at a blistering pace. In test and measurement applications in particular, engineers and scientists can collect vast amounts of data every second. For every second that the Large Hadron Collider at CERN runs an experiment, the instrument generates 40 terabytes of data. For every 30 minutes that a Boeing jet engine runs, the system creates 10 terabytes of operations information. For a single journey across the Atlantic Ocean, a four-engine jumbo jet can create 640 terabytes of data. Multiply that by the more than 25,000 flights flown each day, and you can understand the enormous amount of data being generated.
The technology research firm IDC recently performed a study on digital data, which includes the world’s measurement files, videos, music files and so on. The study estimates that the amount of data available is doubling every 2 years, a growth rate that mirrors one of electronics’ most famous laws: Moore’s law. In 1965, Gordon Moore stated that the number of transistors on an IC doubled approximately every 2 years, and he expected the trend to continue for at least 10 years. Forty-eight years later, Moore’s law still influences many aspects of IT and electronics. If the production of digital data continues to mimic Moore’s law, success as an organisation will hinge on the speed at which acquired data can be turned into useful knowledge.
The big data phenomenon adds new challenges to data analysis, search, integration, reporting, and system maintenance that must be met to keep pace with the exponential growth of data. The sources of data are many. However, among the most interesting to the engineer and scientist is data derived from the physical world. This is analog data that is captured and digitised; thus, it can be called “Big Analog Data”. It is collected from measurements of vibration, RF signals, temperature, pressure, sound, image, light, magnetism, voltage and so on. Challenges unique to “Big Analog Data” have provoked three technology trends in data acquisition.
Contextual data mining
The physical characteristics of some real-world phenomena prevent information from being gleaned unless acquisition rates are high enough, which makes small data sets an impossibility. Even when the characteristics of the measured phenomena would allow less data to be gathered, small data sets often limit the accuracy of conclusions and predictions. Consider a gold mine where only 20 per cent of the gold is visible; the remaining 80 per cent is in the dirt where you can’t see it. Mining is required to realise the full value of the contents of the mine. This leads to the term “digital dirt,” meaning digitised data can have concealed value. Hence, data analytics and data mining are required to achieve insights that have never before been seen.
Data mining is the practice of using the contextual information saved along with data to search through and pare down large data sets into more manageable, applicable volumes. When raw data is stored alongside its original context, or metadata, it becomes easier to accumulate, locate and, later, manipulate and understand. For example, examine a series of seemingly random integers: 5126838937. At first glance, it is impossible to make sense of this raw information. However, when given context — (512) 683-8937 — the data is much easier to recognise and interpret as a phone number. Descriptive information about measurement data context provides the same benefits and can detail anything from sensor type, manufacturer, or calibration date for a given measurement channel to revision, designer, or model number for an overall component under test. In fact, the more context that is stored with raw data, the more effectively that data can be traced throughout the design life cycle, searched for or located, and correlated with other measurements in the future by dedicated data post-processing software.
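To make the idea concrete, here is a minimal Python sketch, not tied to any particular vendor’s file format, that stores raw samples together with their descriptive context and then pares the archive down by searching that context; the channel names and metadata fields are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Measurement:
    """Raw samples kept alongside the context needed to interpret them later."""
    samples: list
    metadata: dict = field(default_factory=dict)  # sensor type, calibration date, unit under test, ...

# Hypothetical archive of acquired channels
archive = [
    Measurement([0.012, 0.018, 0.015],
                {"channel": "vib_front_left", "sensor": "accelerometer",
                 "calibrated": "2013-01-15", "uut": "gearbox_rev_B"}),
    Measurement([22.1, 22.3, 22.2],
                {"channel": "temp_bearing", "sensor": "thermocouple",
                 "calibrated": "2012-11-02", "uut": "gearbox_rev_B"}),
]

def find(records, **criteria):
    """Pare a large data set down to the records whose metadata matches every criterion."""
    return [m for m in records
            if all(m.metadata.get(k) == v for k, v in criteria.items())]

# Locate every accelerometer measurement taken on this unit under test
for m in find(archive, sensor="accelerometer", uut="gearbox_rev_B"):
    print(m.metadata["channel"], m.samples)
```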
Intelligent DAQ nodes
Data acquisition applications are incredibly diverse, but across a variety of industries and applications, data is rarely acquired simply for the sake of acquiring it. Engineers and scientists invest critical resources into building advanced acquisition systems, but the raw data produced by those systems is not the end game; instead, raw data is collected so that it can be used as an input to analysis or processing algorithms that lead to the actual results system designers seek. For example, automotive crash tests can collect gigabytes of data in a few tenths of a second that represent speeds, temperatures, forces of impact and acceleration. But one of the key pieces of pertinent knowledge that can be computed from this raw data is the Head Injury Criterion (HIC) — a single calculated scalar value representing the likelihood that a crash test dummy would sustain a head injury in the crash.
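As an illustration of turning raw crash data into that single figure, the sketch below implements the standard HIC formulation, the maximum over all windows [t1, t2] of (t2 − t1) multiplied by the window’s mean acceleration (in g) raised to the power 2.5; the 15 ms window limit, the sample rate and the synthetic pulse are assumptions made for the example.

```python
import numpy as np

def hic(accel_g, dt, max_window_s=0.015):
    """Head Injury Criterion: max over windows [t1, t2] of
    (t2 - t1) * (mean acceleration over the window, in g) ** 2.5."""
    n = len(accel_g)
    # Running integral of a(t) dt so any window average costs O(1)
    cum = np.concatenate(([0.0], np.cumsum(accel_g) * dt))
    max_len = int(round(max_window_s / dt))
    worst = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_len, n) + 1):
            duration = (j - i) * dt
            avg = (cum[j] - cum[i]) / duration
            if avg > 0:
                worst = max(worst, duration * avg ** 2.5)
    return worst

# Hypothetical 10 kHz excerpt of a crash pulse: a half-sine spike peaking near 80 g
dt = 1e-4
t = np.arange(0.0, 0.05, dt)
accel_g = 80.0 * np.sin(np.pi * t / 0.05)
print(f"HIC15 = {hic(accel_g, dt):.1f}")
```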
Additionally, some applications — particularly in the environmental, structural or machine condition monitoring spaces — lend themselves to periodic, slow acquisition rates that can be drastically increased in bursts when a noteworthy condition is detected, as sketched below. This technique keeps acquisition speeds low and logged data to a minimum while still allowing sampling rates adequate for high-speed waveforms when necessary.
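A generic way to picture such condition-based bursting (a sketch only; the trigger threshold, rates and simulated sensor are assumptions, not a specific driver API):

```python
import random

SLOW_PERIOD_S = 1.0        # routine monitoring interval (assumed)
BURST_SAMPLES = 2000       # length of a high-speed burst (assumed)
VIBRATION_LIMIT = 0.95     # trigger threshold in arbitrary units (assumed)

def read_sensor():
    """Stand-in for a real driver call returning one vibration reading."""
    return random.random()

def capture_burst(n):
    """Stand-in for switching the hardware to its high sampling rate for n samples."""
    return [read_sensor() for _ in range(n)]

log = []
for _ in range(60):                      # monitoring loop, bounded here for the example
    reading = read_sensor()              # one slow, periodic sample keeps logged data small
    log.append(reading)
    if reading > VIBRATION_LIMIT:        # noteworthy condition detected
        waveform = capture_burst(BURST_SAMPLES)
        log.append(("burst_peak", max(waveform)))  # keep the burst, or just a summary of it
    # in a live system, wait SLOW_PERIOD_S before the next reading

print(f"{len(log)} records logged")
```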
To incorporate tactics such as processing raw data into results or adjusting measurement details when certain criteria are met, one must integrate intelligence into the DAQ system.
Although it’s common to stream test data to a host PC (the intelligence) over standard buses like USB and Ethernet, high-channel-count measurements with fast sampling rates can easily overload the communication bus. An alternative approach is to store data locally and transfer files for post-processing after a test is run, but this increases the time it takes to realise valuable results. To overcome these challenges, the latest measurement systems integrate leading technology from ARM, Intel and Xilinx to offer increased performance and processing capabilities as well as off-the-shelf storage components that provide high-throughput streaming to disk. With onboard processors, the intelligence of measurement systems has become more decentralised, with processing elements closer to the sensor and the measurement itself. Modern data acquisition hardware includes high-performance multicore processors that can run acquisition software and processing-intensive analysis algorithms in-line with the measurements. These intelligent measurement systems can analyse and deliver results more quickly without waiting for large amounts of data to transfer — or without having to log it in the first place — which optimises the system to use disk space more efficiently.
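A rough sketch of the payoff of such in-line analysis (generic Python, not any particular product’s pipeline; the block size and the RMS reduction are assumptions): reducing each acquired block to a scalar result next to the measurement shrinks what must be transferred or logged by several orders of magnitude.

```python
import numpy as np

BLOCK = 100_000            # samples acquired per block (assumed)
CHANNEL_NOISE = 0.1        # synthetic signal level for the example

def acquire_block():
    """Stand-in for a driver read returning one block of raw samples."""
    return np.random.normal(scale=CHANNEL_NOISE, size=BLOCK)

raw_bytes = sent_bytes = 0
results = []
for _ in range(10):                              # ten blocks of acquired data
    block = acquire_block()
    raw_bytes += block.nbytes                    # cost of streaming the raw samples off the node
    rms = float(np.sqrt(np.mean(block ** 2)))    # analysis runs in-line, next to the measurement
    results.append(rms)
    sent_bytes += 8                              # only the scalar result leaves the node

print(f"raw data: {raw_bytes / 1e6:.1f} MB, transmitted results: {sent_bytes} bytes")
```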