Quantcast, an internet audience measurement and ad targeting service, processes over 20 petabytes of data per day using Apache Hadoop and its own custom file system called Quantcast File System (QFS).
Add Symantec to the rapidly growing list of tech vendors aiming to groom Apache Hadoop for the enterprise. The company today announced Symantec Enterprise for Hadoop, an add-on with which companies ...
Big data can mean big threats to security, thanks to the tempting volumes of information that may sit waiting for hackers to peruse. BlueTalon hopes to tackle that problem with what it calls the first ...
Cloud computing is a technology that grew out of distributed computing, parallel computing, grid computing, and other computing paradigms. In cloud computing, data storage and computation are ...
Hadoop accomplishes this by applying more efficient formats and file systems to large datasets that would normally have been out of the reach of standard analytics solutions. For more articles on this ...
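One of the formats in question is Hadoop's own SequenceFile, a splittable container of binary key-value pairs. The sketch below is a minimal illustration, not taken from the article: it assumes a standard Hadoop client classpath and a hypothetical /tmp/events.seq output path, and shows how records might be written with the org.apache.hadoop.io.SequenceFile API.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileWriteDemo {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Hypothetical output location; on a cluster this resolves against HDFS.
        Path path = new Path("/tmp/events.seq");

        SequenceFile.Writer writer = null;
        try {
            // Open a writer for Text keys and IntWritable values.
            writer = SequenceFile.createWriter(conf,
                    SequenceFile.Writer.file(path),
                    SequenceFile.Writer.keyClass(Text.class),
                    SequenceFile.Writer.valueClass(IntWritable.class));

            // Append a few key-value records.
            for (int i = 0; i < 100; i++) {
                writer.append(new Text("event-" + i), new IntWritable(i));
            }
        } finally {
            IOUtils.closeStream(writer);
        }
    }
}
```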
Like many others, I continue to be impressed by the open-source Apache Hadoop software framework. With aplomb, it manages Big Data, images, documents, web logs, and tons of other unstructured and ...
With the new release of its Hadoop distribution, Cloudera has radically expanded the set of supporting tools for the data processing framework. “What we saw was that most organizations deploy quite a ...
In this whitepaper, Yahoo engineers Konstantin Shvachko, Hairong Kuang, Sanjay Radia, and Robert Chansler look at HDFS, the file system component of Hadoop. While the interface to HDFS is patterned ...
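In practice, that interface is exercised through the org.apache.hadoop.fs.FileSystem Java API rather than by talking to the NameNode and DataNodes directly. The following is a minimal sketch under assumed conditions: a cluster reachable through the fs.defaultFS setting on the classpath and a hypothetical /user/demo directory.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClientDemo {
    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS from core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path dir = new Path("/user/demo");      // hypothetical directory
        Path file = new Path(dir, "hello.txt");

        fs.mkdirs(dir);                         // analogous to mkdir -p

        // Write a small file, then read it back.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("hello, hdfs\n".getBytes(StandardCharsets.UTF_8));
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(fs.open(file), StandardCharsets.UTF_8))) {
            System.out.println(in.readLine());
        }

        // List the directory contents, much like ls -l.
        for (FileStatus status : fs.listStatus(dir)) {
            System.out.println(status.getLen() + "\t" + status.getPath());
        }
    }
}
```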
When it comes to big data, Microsoft has more in the works than just Windows Azure and Windows Server versions of the Hadoop big-data framework. The company is working on a number of supplementary ...
When it emerged from stealth in 2011, MapR was an outlier in the Hadoop community. At the time, Hadoop was defined largely by two projects adapted from Google research: MapReduce, which introduced ...
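The programming model behind MapReduce is easiest to see in the canonical word-count job. The sketch below is a generic example written against the standard org.apache.hadoop.mapreduce API, not MapR's implementation; the input and output paths are placeholders supplied on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: emit (word, 1) for every token in a line of input.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: sum the counts emitted for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // placeholder input path
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // placeholder output path
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```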