Compression of an Array of Similar Crash Test Simulation Results
Stefan Peter Muller
Big data applications thrive on extracting knowledge from large numbers of data sets. But how can such methods be applied when a single data set is several gigabytes in size?
The data compression techniques developed theoretically and implemented practically in this work draw on machine learning and modeling with Bayesian networks to reduce these huge volumes of data to a manageable size. By eliminating redundancies in space, in time, and across simulation results, reductions to less than 1 % of the original size are achievable. The method is a promising approach whose applicability extends far beyond the crash test simulations chosen here as an application example.
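The description names the central idea: redundancy between successive time steps of a simulation (and between similar simulations) can be removed before compression. As a hedged illustration only, and not the book's actual algorithm, the following Python sketch on synthetic data shows how delta-encoding the differences between consecutive time steps can make a general-purpose compressor far more effective than compressing the raw states:

```python
import struct
import zlib

# Synthetic stand-in for simulation output: positions of 1000 nodes over
# 50 time steps, where each step differs only slightly from the previous
# one -- the kind of temporal redundancy a crash simulation exhibits.
def make_states(n_nodes=1000, n_steps=50):
    states = []
    pos = [float(i) for i in range(n_nodes)]
    for t in range(n_steps):
        pos = [p + 0.001 * t for p in pos]  # small per-step motion
        states.append(list(pos))
    return states

def pack(values):
    # Serialize a list of floats as raw 64-bit doubles.
    return struct.pack(f"{len(values)}d", *values)

def compress_raw(states):
    # Baseline: compress all time steps concatenated as-is.
    return zlib.compress(b"".join(pack(s) for s in states), 9)

def compress_delta(states):
    # Store the first step in full, then only the differences to the
    # previous step; these deltas are highly repetitive and compress well.
    chunks = [pack(states[0])]
    for prev, cur in zip(states, states[1:]):
        chunks.append(pack([c - p for p, c in zip(prev, cur)]))
    return zlib.compress(b"".join(chunks), 9)

states = make_states()
print(len(compress_raw(states)), len(compress_delta(states)))
```

On this synthetic example the delta-encoded stream compresses to a small fraction of the baseline, because the per-step differences are nearly identical across nodes; the techniques described in the book exploit the same kind of structure far more systematically.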