How big is big data? For NASA missions, hundreds of terabytes are
gathered every hour. Just one terabyte is equivalent to the information
printed on 50,000 trees' worth of paper.
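For a rough sense of scale, here is a quick back-of-the-envelope calculation. The 300 TB/hour figure is only an illustrative stand-in for "hundreds of terabytes"; the 50,000-trees-per-terabyte comparison is the one from the article.

```python
# Rough scale check for the figures above.
# ASSUMPTION: "hundreds of terabytes every hour" is taken as 300 TB/hour
# purely for illustration; it is not an official NASA number.
TB_PER_HOUR = 300
TREES_PER_TB = 50_000  # paper equivalent of one terabyte, per the article

tb_per_day = TB_PER_HOUR * 24
trees_per_day = tb_per_day * TREES_PER_TB

print(f"~{tb_per_day:,} TB gathered per day")                # ~7,200 TB
print(f"~{trees_per_day:,} trees' worth of paper per day")   # ~360,000,000 trees
```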
"Scientists use big data for everything from predicting weather on Earth to monitoring ice caps on Mars to searching for distant galaxies," said Eric De Jong of JPL, principal investigator for NASA's Solar System Visualization project, which converts NASA mission science into visualization products that researchers can use. "We are the keepers of the data, and the users are the astronomers and scientists who need images, mosaics, maps and movies to find patterns and verify theories.
JPL is involved in archiving the torrents of images from the planned Square Kilometre Array radio telescope: 700 terabytes of data are expected to rush in every day. That's equivalent to all the data flowing on the Internet every two days. Rather than build more hardware, engineers are busy developing creative software tools to better store the information, such as "cloud computing" techniques and automated programs for extracting data.
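The article doesn't name the specific cloud platform or extraction tools involved, but as a minimal sketch of what a "cloud computing" storage step can look like, here is an upload of a single data product to an object store using the boto3 S3 client (the bucket, file name, and layout are hypothetical).

```python
# pip install boto3  -- AWS SDK for Python, shown only as a generic example
# of object storage; the article does not say which platform JPL uses.
import boto3

s3 = boto3.client("s3")  # credentials are read from the environment/AWS config

# Hypothetical bucket and object names for one data product.
s3.upload_file(
    Filename="observation_20130101.fits",        # local file to archive
    Bucket="example-mission-archive",             # hypothetical bucket
    Key="raw/2013/01/observation_20130101.fits",  # path within the bucket
)
print("upload complete")
```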
"We don't need to reinvent the wheel," said Chris Mattmann, a principal investigator for JPL's big-data initiative. "We can modify open-source computer codes to create faster, cheaper solutions." Software that is shared and free for all to build upon is called open source or open code. JPL has been increasingly bringing open-source software into its fold, creating improved data processing tools for space missions. The JPL tools then go back out into the world for others to use for different applications.
- More Here
"Scientists use big data for everything from predicting weather on Earth to monitoring ice caps on Mars to searching for distant galaxies," said Eric De Jong of JPL, principal investigator for NASA's Solar System Visualization project, which converts NASA mission science into visualization products that researchers can use. "We are the keepers of the data, and the users are the astronomers and scientists who need images, mosaics, maps and movies to find patterns and verify theories.
JPL is involved with archiving the array's torrents of images: 700 terabytes of data are expected to rush in every day. That's equivalent to all the data flowing on the Internet every two days. Rather than build more hardware, engineers are busy developing creative software tools to better store the information, such as "cloud computing" techniques and automated programs for extracting data.
"We don't need to reinvent the wheel," said Chris Mattmann, a principal investigator for JPL's big-data initiative. "We can modify open-source computer codes to create faster, cheaper solutions." Software that is shared and free for all to build upon is called open source or open code. JPL has been increasingly bringing open-source software into its fold, creating improved data processing tools for space missions. The JPL tools then go back out into the world for others to use for different applications.
- More Here
No comments:
Post a Comment