Transformation of Communication Processes: Data Journalism

Reading time: 5 minutes

📝 Abstract

This article gives a brief review of the history of data journalism, as well as the prerequisites for its appearance. The authors describe the advantages of employing data-journalism skills in the newsrooms. Finally, the article provides a review of data-driven journalism projects all over the world, state of art 2014.

📄 Content

Introduction

Data journalism can be mistaken for a new buzzword for infographics. This is true, but only to some extent: visualizing data to tell stories began long before the Internet came into being. What makes an infographic compelling is not great design but, more importantly, the insight it conveys. John Snow's map of the cholera outbreak in nineteenth-century London is the classic reference. The English physician mapped cholera deaths in the Soho district of London at a time when the very notion of germs did not exist. Once mapped, the plain data yielded an insight: the cases clustered around a single water pump. This is just one of many examples of maps and charts that give another perspective on the same dataset.

Another important part of data journalism, or, more precisely, data-driven journalism, is the use of computers, mathematics and statistical analysis. Computer-assisted reporting was born in the American newsroom in the late 1960s, where technological advances met the social sciences. Journalists first used a computer during the US national elections in 1952 to predict the outcome of the vote, and the machine got it right. Fifteen years later, Philip Meyer, a journalist working at the Detroit Free Press, used an IBM 360 to cover the Detroit riots of 1967. Through a machine-processed survey, he was able to investigate the dataset and sketch a profile of the rioters. This Pulitzer-winning story was the first attempt by a journalist to apply analytical methods from sociology, behavioral science and similar domains in the context of a newsroom. Meyer himself called this "precision journalism." The increasing size of datasets was another step on the way to data-driven journalism.
In 2006, Adrian Holovaty, an American web developer, journalist and entrepreneur, wrote a blog post that turned out to be a manifesto for data-driven journalism.1 His main point was that "newspapers need to stop the story-centric worldview." What is required from the media is "structured information: the type of information that can be sliced-and-diced, in an automated fashion, by computers." As Holovaty wrote, the media of the future "have to build an infrastructure that turns them into reliable data hubs, able to analyze even very large and complex datasets internally and to build stories on their insights."

What changed in the late 2000s was that the Internet gave access to an unprecedented wealth of information and computing power. Data from public bodies and corporations are becoming increasingly available, in a movement known as open data. The vast amount of information calls for new methods to find and convey meaning in the original data. From the other end, the tools to handle data allow this evolution in the newsroom to happen. What used to be the exclusive domain of computer scientists can now be done by any journalist. Free software allows anyone to manage, analyze and visualize data; open-source alternatives abound for analyzing and visualizing geographic data; and solutions for scraping, collecting and storing vast amounts of data are starting to come to the market.

Newsrooms meet data journalism

The first major news organization to adopt the term was The Guardian, which launched its Datablog in March 2009. Its editor Simon Rogers and his team regularly published stories based on a dataset, which they also made public for others to use.

1 See <http://chicago.everyblock.com/crime/>, an example of Holovaty's implementation of his own manifesto.

(Authors: N. Kayser-Bril, Journalism++; A. Valeeva, University of Siegen; I. Radchenko, ITMO University.)
At the same time in France, Le Post (now Le Huffington Post), a property of Le Monde, commissioned an interactive ranking of French members of Parliament (MPs) summarizing the number of elective mandates they held besides their position as MPs. The work involved scraping, visualization and investigation and was done by a team of external contractors,2 in-house developers and journalists. What would today be called data journalism was then done with no clear framework and no descriptive term.

These first experiments created the foundations of European data journalism for three reasons. The first is technical, the second political, the third circumstantial.

In 2007, Apple launched the first version of its iPhone. It did not allow Adobe's Flash software to run on the device, mostly as a way to save power. Flash had been the ubiquitous tool for interactive content for the previous ten years. Apple's move meant that content produced with Flash would not be visible on the iPhone and, later, the iPad. It dramatically hastened the demise of Flash and the rise of JavaScript-based, browser-rendered interactives. Without delving into technical details,

2 Disclosure: Nicolas Kayser-Bril was one of them.
