Distant Reading the History of Swedish Film Politics—in 4,500 Governmental SOU Reports

Together with my PhD student Fredrik Norén, I am currently writing an article for a forthcoming thematic issue on ‘archives’ in the Journal of Scandinavian Cinema. Our article offers a new archival perspective on the history of Swedish film politics and policy making, by so-called distant reading and topic modeling of one particular collection: 4,500 digitised Swedish Governmental Official Reports (from 1922 to 1991). Basically, we argue that by using different computational methods, digitised collections (and archives) can today be scrutinised by scholars in their entirety. In the current version, the introduction to the article reads as follows:

Archives are never neutral—through them bits and pieces from the past are studied. At best, scholars glean representative segments and traces from them. At worst, only specific samples are used, stressing the particular: “the anecdotal nature of evidence”, as digital humanist Andrew Piper has polemically characterised the traditional ways of doing humanistic research (Piper 2016). The ‘anecdotal nature’ of humanistic inquiries and scholarly work has, however, been due to the analogue nature of archives and collections. Since all archives are (and have always been) epistemic grounds from which history is written, their structure and organisation have determined what type of research could be executed. Then digitisation happened—and scholarly work could suddenly be drastically reconfigured.

The digitisation of historical films and related film historical material has increased enormously during the past two decades. Different forms and formats of humanistic infrastructures have gradually developed, which film and media historians have benefited (and will increasingly benefit) from. In Sweden, the National Library (together with the Swedish Film Institute) is step by step increasing the number of films available at filmarkivet.se, and at the sister site, filmarkivforskning.se—which one of us authors (Snickars) has been responsible for—a tremendous amount of contextual film and media historical material has been digitised, amassed and made available. Arguably, the ways in which Swedish film history and cinema culture are researched and understood are gradually bound to change as a result. If the so-called New Film History turned out to be a reconceptualisation of the history of moving images in the late 1980s and 1990s—and to some extent Media Archaeology a decade later—at present, digitally inclined scholars are witnessing a similar reorganisation of (and around) the practices of doing film and media historical research, due to digital technology and the humanistic infrastructures built around it. The growing amount of digital databases and material in the cultural heritage domain is, as Bernhard Rieder and Theo Röhle have put it, simply “begging to be analysed” with new digital methods (Rieder & Röhle 2012).

Following David Bordwell, the role models of New Film History scholars were young historians “hunched over microfilm machines cranking through day after day of Moving Picture World or sitting in archives paging through studio memos” (Klenotic 1994). Today, the ideal counterpart—at least as she is envisioned within the digital humanities—is a scholar involved in cross-disciplinary projects using digital methods and tools like data mining, visualisations, topic modeling or GIS analyses. During the last decade, digital technology has played an increasingly important role in numerous film historical projects and platforms, ranging from early cinema programs collected within The Siegen Cinema Databases (Ross & Garncarz 2006) to statistical data about film editing at Cinemetrics (Tsivian 2009) and the radiant Timeline of Historical Film Colors (Flückiger 2011) to the so-called New Cinema History, with its computational focus on the circulation and consumption of films (Verhoeven 2012)—not to mention the highly elaborate platform, Lantern, used for searching, exploring and visualising the vast collections of the Media History Digital Library (Hoyt & Hagenmaier 2011). What these projects (and others) have in common is the way in which the computational analysis and quantitative research afforded by databases, visualisations and data mining have allowed film historical researchers to gather new and different information about film style and the aesthetics of the medium as well as the history of cinema exhibition and reception. Such information would previously have been impossible to gather—or far too labour-intensive to undertake. A common denominator is also that digitised archives have (via different computational methods) frequently been scrutinised by scholars in their entirety.

Then again, what precisely does computation allow one “to claim that has not been seen before or that was uncertain in the world of anecdotalism?” (Piper 2016). This article sets out to give a new archival perspective on the history of Swedish film politics and policy making by distant reading and topic modeling 4,500 digitised Governmental Official Reports (from 1922 to 1991). Our article examines different probabilistic topics related to film (and media) that the algorithm within our topic modeling software (Mallet) extracted from the immense text corpus of all these reports. Initially, the article describes the work performed by governmental commissions and the SOU genre and its relation to film politics, policy making and scholarly film research. Foremost, however, our text methodologically recounts and analyses novel ways to understand and situate the history of Swedish film politics and policy through topic modeling a massive SOU corpus. Basically, the article captures a number of film discourses and trends in the joint corpus that are otherwise more or less impossible to detect and apprehend through a traditional archival investigation. Topic modeling is a methodological approach to studying themes in texts by accentuating words that tend to co-occur and together create a topic. The most apparent filmic topic within our corpus, for example, included the following ten terms in strict succession: “film”, “cinema”, “screening”, “producer”, “production”, “entertainment tax”, “film production”, “short film”, “video” and “censor agency”—stressing, for example, that a general theme of Swedish film politics between 1920 and 1990 was devoted to economic issues. Within topic modeling, a term or a word may also be part of several topics with different degrees of probability. In addition, the article therefore constructs and traces a broader context of Swedish film politics over time, especially in relation to other media formats and institutional actors.
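To give a concrete sense of the procedure, the following minimal sketch shows how a topic model of this kind could be trained on a folder of plain-text SOU files. Our own analysis was carried out with Mallet; the sketch instead uses the gensim library’s standard LDA implementation as a stand-in, and the file paths, stop word list and parameter values are merely illustrative assumptions:

```python
# Minimal, illustrative LDA topic modeling of a folder of plain-text SOU reports.
# The article's analysis used Mallet; this stand-in uses gensim, and the paths,
# stop word list and parameters below are hypothetical.
import glob
from gensim import corpora, models

def tokenize(text, stopwords):
    # crude tokenisation: lowercase, alphabetic tokens, stop words removed
    return [w for w in text.lower().split() if w.isalpha() and w not in stopwords]

stopwords = set(open("swedish_stopwords.txt", encoding="utf-8").read().split())

documents = []
for path in glob.glob("sou_txt/*.txt"):        # one plain-text file per report
    with open(path, encoding="utf-8") as f:
        documents.append(tokenize(f.read(), stopwords))

dictionary = corpora.Dictionary(documents)
dictionary.filter_extremes(no_below=5, no_above=0.5)   # drop very rare/common terms
bow_corpus = [dictionary.doc2bow(doc) for doc in documents]

lda = models.LdaModel(bow_corpus, id2word=dictionary,
                      num_topics=100, passes=10, random_state=42)

# Print the ten highest-ranked terms per topic, analogous to the
# "film, cinema, screening, ..." topic discussed above.
for topic_id, terms in lda.show_topics(num_topics=100, num_words=10, formatted=False):
    print(topic_id, [term for term, weight in terms])
```

Each topic then amounts to a ranked list of co-occurring terms of the kind quoted above, and each report can in turn be described as a mixture of such topics.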


Governmental Commissions & the SOU Genre
Before submitting a proposal for new legislation, the Swedish Government regularly examines alternatives, a task carried out by an appointed Committee or Commission of Inquiry. Basically, the governmental committee process is a way of accessing knowledge around particular issues of concern. These might range from major policy decisions affecting Swedish society as a whole, to small and technically complex issues. Arguably, governmental commissions have during the last 80 years—as both a cause and effect of the establishment of the Swedish welfare state—developed into an effective instrument of majority parliamentarism. Decision making via governmental commissions, however, has a long history in Sweden. The national legislative process providing the Government—and prior to that, the King—with proper decision support dates back to the 17th century. Over the years governmental commissions took on the most diverse tasks of policy making; the preparatory system was dynamic, inexpensive and hence survived for centuries.

From 1922 and onwards, the work of governmental commissions (usually) resulted in a written report. It was published in a series known as the Swedish Government Official Reports, Statens offentliga utredningar (in Swedish, abbreviated as SOU). Since then, all such reports have been published with a distinct numeral: the first SOU devoted to film, for example, has the number SOU 1930:26. In essence, SOU reports and the work performed within governmental committees had the task of preparing the State for apt and rational decision making. After 1945 “the range of subjects covered by governmental committees has expanded to include virtually every area of the Swedish welfare state”, according to political scientist Rune Premfors. Stressing the importance of the work executed by governmental committees, he has stated that some 40 percent of all legislation in Sweden around 1970 was based on commission proposals (Premfors 1983). During the 1960s, certain policy issues also started to be investigated internally within governmental Ministries, resulting in reports in the so-called Ds series [Departementsserien]. The SOU series, however, with its external investigations performed by a commission staffed with some four to five members and usually running for a number of years, is arguably the more important historical source on Swedish governmental policy making—not least since governmental commissions were modeled on ideas about scientific inquiry, with SOU reports often engaged in ambitious efforts to (re)write history.

Today, some 200 SOU inquiries are usually in progress at the same time. After a governmental commission has submitted a report to the responsible Minister within the Government, it is also dispatched for consideration to relevant authorities, advocacy groups and the public. “They are given an opportunity to express their views on the conclusions of an inquiry before the Government formulates a legislative proposal” (Government Offices of Sweden 2016). Consequently, it has historically been argued that this particular legislative feedback system is a “distinctive working method” within Swedish democracy itself (Meijer 1956: 8). Naturally, governmental commissions have occasionally been accused of being biased. When the Government appoints a commission, it also provides a set of guidelines—instructions that might have a political tendency. Guidelines prescribe what specific issue a commission should examine, what problems need to be solved, as well as the date when the final report should be completed. As a consequence, SOU reports usually put forward various committee proposals, sometimes even including how these ought to be financed.

As Jan Johansson has argued in his dissertation, Det statliga kommittéväsendet [The Governmental Committee System], the two most essential features of the governmental commission system have been (and still are) the focus on expert knowledge as well as an urge to reach some form of compromise or consensus (Johansson 1992). Governmental commissions have, in short, been an arena for the exchange of factual arguments among experts (including academics), situated within a rationally oriented Swedish style of policy making, ultimately geared towards reaching an agreement. Committee work has, in fact, often been organised in a similar manner, involving different actors and persons—who nevertheless need to agree. One recurring step has usually been to give a dedicated name to the committee, such as the “Data Archiving Commission” or the “Film Commission 1968”. The orientation towards compromise and consensus has, hence, in various ways been “structurally fixed”, where committee members simply “have to arrive at a final report” (Veit 2009:186).

Regarding the medium of film, one of the first governmental commissions devoted its work to drafting a new subsidy system for the production of Swedish feature films, Statligt stöd åt svensk filmproduktion (SOU 1942:36). Heading that commission was legal adviser Carl Romberg, from the Swedish Department of Justice, and committee members also included the head of Swedish Radio, Carl Anders Dymling, and well-known film directors such as Victor Sjöström and Arne Bornebusch. The mixture of persons, some with filmic hands-on experience, can be seen as representative of how governmental commissions on film were usually staffed. One of the most important governmental commissions on film, the “Film Commission 1968”, whose work resulted in the SOU Samhället och filmen [Society and Cinema] from 1970, included both the Head of the Swedish Film Institute, Harry Schein, and the directors Jan Troell and Kjell Grede. In addition, the director turned film historian Gösta Werner from Stockholm University contributed a lengthy history of “Swedish film during 75 years”—in nineteen chapters. The work performed by the “Film Commission 1968” in many ways testifies to the textual significance of the SOU genre. Reports were often of book length, and some were even published in a number of volumes. Commission work usually went on for years. In the case of the SOU Samhället och filmen, it took five years and appeared in four volumes, together spanning some 750 pages.

The SOU genre thus indeed bears textual witness to, and gives evidence of, contemporary societal conceptions—around film and film policy making, in this case—not least since committee members often disagreed. According to law, governmental commission work had to be archived, and preserved papers from the “Film Commission 1968”—in 29 volumes at the Swedish National Archives (Riksarkivet)—attest for example to Schein being displeased with Werner’s film historical survey. In short, he wanted less film aesthetics and more focus on cinema audiences and film production (Schein 1973). Since governmental commissions relied on expert knowledge, they can also shed light on the role of academics and the involvement of research. In general, governmental commissions relied on scientific research; sometimes separate research anthologies were published in conjunction with SOU reports. One of us authors (Snickars) has elsewhere argued that media studies in Sweden in fact arose at the intersection between the media industry and its public, the needs of media and cultural policy—filtered through governmental reports (and other official inquiries)—as well as academia’s newfound interest in media (at both social science and humanities faculties). In particular, the 57 media-related SOU reports (between 1960 and 1980) were an important engine for the development and institutionalisation of media research in Sweden (Hyvönen, Snickars & Vesterlund 2017). The same holds true for the medium of film, especially during the 1960s and 1970s. Gösta Werner has been mentioned, and in addition, the later professor of film studies at Stockholm University, Leif Furhammar, took part in the so-called “Film Censorship Commission” from 1964 to 1969. After 1980, however, it seems to have become more rare to include academics in governmental commission work on film. Media historian Mats Björkin (then at Stockholm University) was in all likelihood the last academic to be involved, as a secretary, in the governmental commission on the preservation of the documentary film heritage, SOU 1999:41. Suffice it to say, SOU reports on film have over the years been frequently used in film historical research in Sweden; in Furhammar’s classic study, Filmen i Sverige (1991), some 15 SOUs on film are, for example, discussed and referenced.

Lecture at a Cultural Heritage Conference in Umeå

Tomorrow I will give the opening lecture at the cultural heritage conference Framtiden knackar på – från historiska kartor till hardcorescenen. En konferens om arkivens komplexa uppdrag [The future comes knocking – from historical maps to the hardcore scene. A conference on the complex mission of the archives]. The organisers are Sveriges depåbibliotek och lånecentral together with Forskningsarkivet at Umeå University Library (on whose board I sit). I will talk about different ways of digitising and analysing cultural heritage, taking the project Digitala Modeller as my point of departure. My 85 slides are now finished – and can be downloaded as a PDF. Unfortunately, they do not include most of the animations, which are perhaps the most interesting part in this context: snickars_digitala_modeller_umea.

Det moderna genombrottets medier – workshop in Trondheim

This week I will take part as an invited speaker in a workshop at NTNU in Trondheim on “Det moderna genombrottets medier” [The media of the modern breakthrough]. I will speak under the title “Digitala lägg. Om modernitetens medier filtrerade genom svensk dagspress 1850 till 1900” [Digital newspaper files. On the media of modernity filtered through the Swedish daily press, 1850 to 1900], and the idea is to sum up what my colleague Johan Jarlbrink and I have been working on within this project. The point of departure for the workshop is that several researchers at NTNU are preparing a major grant application with the working title “Mapping the Rise of Contemporary Media Culture”. My lecture will therefore concentrate on material as well as method, along with a couple of tips regarding application design. And it will – as always – be fantastic to see Nidarosdomen again.

Spotify & Digital Methods – forthcoming thematic section in the journal Culture Unbound

Within our research project on streaming media and Spotify, we are currently putting together a special thematic section for the journal Culture Unbound. Six articles will be included, three long ones and three short ones, and we are currently wrapping up all the writing. Together with Rasmus Fleischer, I am myself finishing the introduction to the thematic section – which in the present version reads as follows:

With a user base now officially reaching 100 million—including 40 million paying subscribers—the music streaming platform Spotify is today widely recognised as the solution to problems caused by the last decades of digital disruption within the music and media industries. Spotify resembles Netflix, YouTube—and lately Apple Music—as an epitome of streaming’s digital zeitgeist, envisioned to shape our future. Industry interviews, trade papers, academic books, and the daily press reiterate innumerable versions of this “technological solutionism” (Morozov 2014) in almost as many variations.

This thematic section of Culture Unbound is broadly concerned with the music service Spotify, and with novel ways to approach and do academic research around streaming media. Approached through various forms of digital methods, Spotify serves as the object of study (though basically any other streaming media service could be studied in similar ways). The six articles presented—three long ones and three short ones—emanate from a cross-disciplinary research project entitled “Streaming Heritage: Following Files in Digital Music Distribution”. It was initially conceived at the National Library of Sweden (hence the heritage connection), but the project has predominantly been located at the digital humanities hub, Humlab at Umeå University, where the research group has continuously worked with the lab’s programmers. The project involves four researchers and one PhD student, and is funded by the Swedish Research Council between 2014 and 2018.

While most previous scholarship on Spotify has primarily focused on the music industry, the digital music economy, intermediation or piracy (Wikström 2013; Wikström & DeFilippi 2016; Allen Anderson 2015; Galuszka 2015; Andersson Schwarz 2013), our project takes a software studies and digital humanities approach towards streaming media. The project broadly engages in reverse engineering Spotify’s algorithms, aggregation procedures, metadata, and valuation strategies, all in order to study the platform’s underlying norms and structures. In short, reverse engineering starts with the final product (the music service Spotify in our case) and tries to take it apart—backwards, step by step. Basically, our scholarly purpose is thus to draw a more holistic picture by using Spotify as a lens to explore the social, technical, and economic processes associated with digital media distribution. The key research idea within the project is to ‘follow files’—rather than the people making, using, or collecting them—on their distributive journey through the streaming ecosystem, and hence to take empirical advantage of the inherent data flows at media platforms (such as Spotify).

Over the last ten years, within the extensive field of media and internet studies, different types of digital methods have been taken up as key instruments for developing novel ways to analyse and understand ‘the digital’, ‘the internet’, as well as digital media production, distribution and consumption. Following the catchphrase “the system is the method” (Bruhn Jensen 2011), digital methodologies are increasingly deployed for performing broad social science or humanistic inquiries on, for example, big data and black-boxed media platforms (such as Spotify) that today increasingly serve as key delivery mechanisms for cultural materials (Ruppert, Law & Savage 2013). As a research practice, digital methods “strive to follow the evolving methods of the medium” (Rogers 2013:1). Furthermore, the issue of data of, about, and around the Internet, as Klaus Bruhn Jensen has eloquently stated, “highlights the common distinction between research evidence that is either ‘found’ or ‘made’”. Stripped of complexities, basically all the ‘evidence’ needed for Internet or digital studies is hence already at hand. When listening to music on Spotify, for example, user data is constantly being produced. Such data is “documented in and of the system”, and “with a little help from network administrators and service providers” it can be utilised as the empirical base for doing research (Bruhn Jensen 2011:52).

To researchers seeking to take empirical advantage of data flows at contemporary media platforms, however, it quickly becomes clear “that such platforms do not present us with raw data, but rather with specially formatted information” (Marres & Gerlitz 2015). Data, in short, is often biased (in one way or the other). Twitter, for example, determines what data are available and how they can be accessed, and researchers often have a hard time knowing what relevant data might be missing. Nevertheless, the major academic problem confronting media scholars seeking to work with digital methods is the lack of access to data. In our research project, the difficulty in working with Spotify is in essence that the company does not share any data. As a consequence, user data has to be acquired and compiled through other means—for example by deploying bots as research informants, or by recording and aggregating self-produced music and sounds. Building on the tradition of ‘breaching experiments’ in ethnomethodology, our project has as a consequence tried (in different ways), via repeated and modified interventions and experiments, to ‘break into’ the hidden infrastructures of digital music distribution. On the one hand we have been interested in broadly studying different data patterns and media processes at Spotify, but we have also been keen on producing and obtaining research data, for example by using bots as virtual listeners, by documenting (and tracing) Spotify’s history through constantly changing interfaces, or by tracking and archiving advertisement flows (through software such as Ghostery).

Localising Spotify
Departing from the interventionist and experimental approaches we have used in our research project—which both metaphorically and practically tries to track and follow the transformation of audio files into streamed experiences, in the simple way a postman would follow the route of a parcel (from packaging to delivery)—the notion of localisation has become salient. Following files is more or less a technical impossibility in a streaming media context, yet approaching and circumscribing Spotify—both as a company and as a service—has also proven to be hard. Within our research project, we have in fact repeatedly asked the insidiously simple question: where and when is Spotify? It might seem naive, but during the research process it has become increasingly difficult for us to understand and grasp our object of study.

A Google search for “What is Spotify?” generates the answer: “a Swedish commercial music streaming, podcast and video service that provides digital rights management-protected content from record labels and media companies” (Google 2016). But such an answer hides more than it shows, and can easily be problematised. Is Spotify, for example, a content platform, a distribution service or a media company? Furthermore, music naturally lies at the heart of Spotify (even if podcasts and video seem increasingly important), but what kind of content is actually accepted—i.e. how is ‘music’ defined? And what about the ‘Swedishness’ of Spotify—where is the company actually located? The headquarters are still to be found in central Stockholm at Birger Jarlsgatan 61, but the service is now available in some 60 countries, not to mention the digital variety of desktop and mobile versions (which all differ slightly). In addition, how should Spotify be situated commercially and financially—i.e. how much money is Spotify actually making (or losing), and how should its economic impact be measured?

As is apparent from the four issues above—and one could easily have inserted more of them—localising Spotify is easier said than done. Starting, however, with the question of whether Spotify is a tech or a media company: for a number of years it was obvious that Spotify foremost offered a technological solution for record companies struggling with piracy. In a private conversation a couple of years ago, one of us (Snickars) asked Sophia Bendz (at the time Head of Spotify marketing) what kind of company Spotify actually was. Without hesitating for a second, Bendz stated that Spotify was a tech company, only distributing content produced by others. The tech identity, however, was dubious even back then in 2012, and has become increasingly harder to sustain. Advertising serves as a case in point. In endless discussions with record labels (around rights management) Spotify took the stance that a freemium version (Spotify Free) with recurrent advertising would in the long run be the best solution, foremost as an incentive to scale the business and attract global listeners. Spotify’s classification as a strict ‘tech company’, hence, misses the fact that a core part of its business has been to provide content to audiences and to sell those audiences to advertisers. Other music services decided otherwise, and Spotify has consequently had to struggle—and increasingly become more of a media company—all in order to keep to its business plan with Free and Premium. Arguably, the music industry still sees Spotify as the top streaming service around, “but the company has done little to address the lack of new music from a large collection of major artists when their albums are released” (Singleton 2016). That is, in a digital environment where streaming music becomes the default, focusing on tech and distribution only will result in missed business opportunities. True, Spotify has not really entered into content production (like Netflix), even though some self-made videos are provided, such as interviews with artists as well as other content (like pop-ups that explain lyrics). Hence, stating that Spotify is only a tech company (in the form of a streaming service) fails to see other defining characteristics of the enterprise.

Lecture for Svensk förening för informationsspecialister

Last night I gave a lecture at the KTH Library for Svensk förening för informationsspecialister on the theme: “Digital humanities – what is it?” I talked partly about the research field of digital humanities, partly about digital methods – and, in addition, about a few current research projects of mine (which tied into the theme). For those interested, my presentation can be downloaded as a PDF here: snickars_dh_kthb.

What is a library?

Yesterday I gave a lecture for the library and information science students in Umeå within the course “Bokens och bibliotekets historia och samhällsroll” [The history and societal role of the book and the library]. It dealt, among other things, with digitisation, Kungliga biblioteket (the National Library of Sweden), the national library strategy and the role of research libraries. Most interesting, however, were the discussions about what a library actually is nowadays – is it an institution or a function? There is hardly any simple answer to that question. In any case, for those interested, my lecture slides can be downloaded here: snickars_bib_infovetenskap_2016.

Radio Looping

Together with my colleague Rasmus Fleischer, I am editing a special thematic section on “Spotify and digital methods” for a forthcoming issue of the journal Culture Unbound. In all, six to seven articles will be included (we hope); Rasmus is currently finishing a piece on the commodification of online music, another article will deal with streaming music and gender, and a third will have a look at the #backaspotify campaign. The thematic issue will in all likelihood be out next year. In addition, we are currently writing another article together, on so-called radio looping: a critical discussion of Spotify Radio, its history and functionality, and of ‘looping’ as a media theoretical category. The tentative introduction to our article currently reads as follows, and sort of gives a hint of what we aim to address.

Sometime during the winter of 2014 someone posted the following question on Quora: why does “my Spotify radio sounds so repetitive? I feel I am getting a few artists repeated in my Spotify radio.” Well, the Finnish ‘infojunkie’ Heikki Hietala swiftly replied (some time later, in March) that this is because “the radio functionality in Spotify is very crude.” At the time, Spotify Radio had been around for more than two years, yet users seemed somewhat disappointed. Maybe Spotify will “come up with something soon”, Hietala remarked, as for now “it’s very annoying”. Hietala had apparently had the same experience of repetition of songs played on Spotify Radio, and instead recommended the music streaming service Pandora. According to him, the latter had more successfully “chopped the music up into tiny pieces of metadata, and [Pandora] are able to deliver a truly mesmerising radio function due to the vast amount of information they have on the music” (Hietala 2014).

Quora is a so-called ‘question-and-answer’ site. Questions are posted—and subsequently answered, edited and organised by the community of users on the same site. Quite a number of questions on Quora revolve around tech—which is hardly surprising since the company was co-founded by two former Facebook employees and is based in Mountain View, California (also home to Google’s headquarters). Quora also seems to be a site frequented by tech employees themselves. Tech Lead at Spotify, Erik Bernhardsson, has for example published almost 30 posts, some adjacent to the discussions around Spotify Radio. A couple of months after Hietala’s post, a similar disapproval reappeared on Quora. In fact, almost identical questions around the dodgy functionality of Spotify Radio kept being posted: “How do I get Spotify to stop playing the same few songs for every artists?”; “How do I teach a Spotify radio station to play a wider array of songs?”; “Why does my Spotify Radio play the same artists over and over for me?” (our italics).

The last of these questions was asked by web designer Bas Leijder Havenstroom, who in a re-entry specified in more detail what he was puzzled about: “I re-asked this one because this frustrates me as well. Even if I start a radio station based on a playlist with many many artists, I find that some (specific) artists keep coming back. I have the feeling that this all has to do with commercial reasons. I believe record labels pay Spotify to have their artists to show up in radio stations and random functions more often” (Leijder Havenstroom 2015). Apparently, the algorithms running Spotify Radio are identical whether one uses the Free or the Premium service (the only difference is that advertisements play in the Free version, which also cannot stream at higher audio quality). The issue, however, of whether track repetition on Spotify Radio has commercial causes remains obscure. Basically, the same uncertainty and unpredictability applies to trying to research the different algorithms regulating music recommendations on Spotify Radio. In addition, these vary and have naturally been altered and improved since the release of the radio functionality in 2012, the year when Spotify began updating its desktop software with several new features, including a Pandora-like radio station. “Spotify to Take On Pandora With Radio service”, it was subsequently announced online (Hachman 2012). The commercial whiff was hard to hide; an online radio offering “would advance Spotify’s strategy of attracting users with free, ad-supported services who can be converted later into paying subscribers”, Bloomberg reported (Fixmer 2012).

Today, Spotify Radio is one of the standard features of the music service, available on all platforms. Spotify Radio “lets you sit back and listen to music you love. The more you personalize the stations to match your tastes the better they get”, the company announces online (Spotify Radio 2016). Spotify Radio is arguably a popular service. The functionality allows people (or rather various algorithms) to discover new music within the vast back-catalogue of Spotify, offering a potentially infinite avenue of discovery. Then again, following the different conversations on Quora and elsewhere, on the Spotify community web for example, the service has also been disliked. In fact, it has repeatedly aroused disappointment, and even substantial criticism. This was definitely the case when the service was launched in 2012, or as a user stated on the community web: “better radio algorithms … there are too many repetitions” (lehwark 2012). Yet, as is evident from the later discussions on Quora and the Spotify community web, the sometimes quite devastating critique has remained: “the terrible radio algorithm repeats the same songs over and over (see [the linked] thread, which has been going for 2+ years)”, user ‘tellure’ groaned in late 2015. “Need to update the algorithims for Radio”, the user ‘zaliad’ lamented a couple of months later, “the repetitions are SAD at this point within 1 hour I can easily hear the same song three times” (zaliad 2016).

Within our research project on Spotify we therefore decided to set up an experiment that would explore Spotify Radio, and ultimately the limitations and restraints found within the ‘infinite archives’ of music streaming services. Our hypothesis was that many streaming services’ radio functions (like Spotify’s) appear to consist of a series of tracks that are played over and over again. Spotify Radio claims to be never-ending, yet there seems to be a loop pattern. If our hypothesis proved to be true, what would such loop patterns look like? Are Spotify Radio’s music loops finite—or infinite (given that ‘the algorithm’ can after all choose between 30 million songs)? How many tracks (or steps) does a ‘normal’ loop consist of? Importantly, how is the size of a ‘music loop’ on Spotify Radio affected by user interaction in the form of skips, likes and dislikes? Does, for example, a given number of ‘likes’ expand the music loop in terms of novel songs and artists? In short, how much preprogrammed imagination do streaming platforms really display?

In order to answer these research questions, we set up an experiment at the digital humanities hub, Humlab (Umeå University), in the form of a reverse-engineered radio loop with 40 bot ‘listeners’. Essentially, reverse engineering starts with the final and implemented product—in our case the Spotify Radio application within the streaming service’s desktop client—and takes it apart, “seeking clues as to why it was put together in the way it was and how it fits into an overall architecture” (Gehl 2014:10). Our bots were Spotify Free users—with literally no track record; they had ‘heard’ no music before they were put into action. They were programmed to start Spotify Radio based on Abba’s “Dancing Queen”, document all subsequent tracks played in the loop, and (inter)act within the Spotify web client as an ‘obedient’ listener, a ‘skipper’, a ‘liker’ and a ‘disliker’. On the one hand this article recounts, analyses and discusses the intervention we set up. On the other hand, the article also describes the background and establishment of the ‘radio functionality’ at streaming services, as well as trying to ponder and reflect on the media theoretical concept of looping per se.
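As a rough illustration of how such an intervention can be structured, the sketch below outlines the logic of the bot experiment in Python. It is not our actual implementation: the calls that drive the Spotify client (start_radio, current_track, play_on, skip, like, dislike) are placeholders for the routines that actually steer the web client, and the number of tracks per session as well as the exact distribution of behaviour profiles are assumptions made for the example:

```python
# Schematic sketch of the bot experiment: each bot starts a radio session from
# the same seed track, logs every track it encounters, and reacts according to
# one of four behaviour profiles. The client calls are placeholders, and all
# names, probabilities and session lengths below are hypothetical.
import csv
import random

BEHAVIOURS = ["obedient", "skipper", "liker", "disliker"]
SEED_TRACK = "Abba - Dancing Queen"

class RadioBot:
    def __init__(self, bot_id, behaviour):
        self.bot_id = bot_id
        self.behaviour = behaviour
        self.log = []                       # (position, track, action)

    def run(self, client, n_tracks=500):
        client.start_radio(SEED_TRACK)      # placeholder call into the client
        for position in range(n_tracks):
            track = client.current_track()  # placeholder: title of the track now playing
            action = self.decide()
            self.log.append((position, track, action))
            getattr(client, action)()       # placeholder: play_on / skip / like / dislike

    def decide(self):
        # behaviour profiles corresponding to the 'obedient' listener,
        # the 'skipper', the 'liker' and the 'disliker'
        if self.behaviour == "obedient":
            return "play_on"
        if self.behaviour == "skipper":
            return "skip"
        if self.behaviour == "liker":
            return "like" if random.random() < 0.3 else "play_on"
        return "dislike" if random.random() < 0.3 else "play_on"

def write_log(bot):
    # one CSV per bot, later used to look for recurring loop patterns
    with open(f"bot_{bot.bot_id}_{bot.behaviour}.csv", "w", newline="") as f:
        csv.writer(f).writerows(bot.log)

# 40 bots spread over the four behaviour profiles
# (the even distribution is an assumption for the sketch).
bots = [RadioBot(i, BEHAVIOURS[i % 4]) for i in range(40)]
```

Comparing the resulting track logs across bots and behaviours is then what makes it possible to ask how long the loops are and how skips, likes and dislikes affect them.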

Digital humanities – a status report published in Respons

Today I published a status report on digital humanities in the journal Respons. The standfirst gives an idea of what the article is about: “Digital humanities is a vital research field that presupposes technical know-how and dialogue with programmers. There is no shortage of critics of digital humanities, but from a methodological perspective the development is indispensable, and it is not necessary to pit classical humanities against the digital. So argues Pelle Snickars, professor of media and communication studies at Umeå University, who visited the DH field’s largest conference to take the temperature of a research field currently in vogue.” The text can also be downloaded as a PDF: PS Reportage DH.

Meta-Academic Reflections from Inside the Swedish Public Service Broadcasting Commission

I am currently writing an article for the upcoming RIPE@2016 conference, Public Service Media in a Networked Society?, taking place in Antwerp in a couple of weeks. The article needs to be finished before the conference, and my take is a kind of meta-academic reflection from inside the Swedish Public Service Broadcasting Commission, which I worked with during the last year. In short, the article will describe the work we did in the Broadcasting Commission, as well as the debates surrounding it. Yet, I will also argue that academics acting as lobbyists cannot be avoided when it comes to issues regarding the role that public service should play in a digitised and globally networked media landscape. The topic is timely – and in fact the start of my article uses the “open letter” to Alice Bah Kuhnke (Swedish Minister of Culture and Democracy) that the board chair of Mittmedia, Jan Friedman, published in Dagens Nyheter’s print edition today (it was available online already yesterday). Thus, my article starts in the following way:

During August 2016, the major news story dominating all Swedish media—about other Swedish media—was inside information from a recent board meeting of the media group Mittmedia at Arlanda airport in Stockholm. According to the rumours, Mittmedia, one of Sweden’s largest media groups with almost 30 local newspapers in the middle of the country, would in the coming year conceivably cut 75 percent of its editorial services—and as a consequence leave almost 500 local journalists unemployed. In a small country like Sweden, the news was outrageous, potentially affecting almost a million local readers. The scoop was published by the leading daily newspaper, Dagens Nyheter; instantly, appalled and critical comments filled social media as well as other news outlets (Delin 2017). As was to be expected, the Swedish Minister of Culture and Democracy, Alice Bah Kuhnke (from the Green Party), was interviewed. Ultimately, she does bear the utmost responsibility for national media politics. The Minister stated that Swedish media, and especially local newspapers, seemed to be moving towards “the edge of a cliff”. I did “not sleep well at all”, she asserted when the reporter talked to her the day after the Mittmedia news broke. In fact, she was utterly concerned, arguing that these dreadful revelations did not “only mean something for individual journalists or media in general. It meant something unheard of for Swedish democracy” (Jones 2016). Due to the effects of digitisation and globally networked media, the present (and mostly grim) media situation in Sweden naturally resembles similar ones in other Western European countries. Yet, what made the news of Mittmedia’s downfall so dismaying to many was that this particular media enterprise had been a digital pioneer among (at least) local media groups in Sweden. Then again, money was being lost at a fast rate, and the quite bleak scenario was hence put forward by the board, all in order to try to steer Mittmedia’s strategies (and finances) away from a fruitless digital business model.

Leading the board of Mittmedia is Jan Friedman, a businessman and professional chairman of a number of Swedish corporate boards. After reading about the sleepless Swedish Minister of Culture and Democracy, and her concern about Mittmedia, he decided to send her an “open letter”. It was published a few days later in Dagens Nyheter as well. “Dear Alice”, Friedman began—within the private media sector most curves have been pointing in the wrong direction for years. “Subscribers are diminishing and aging, while advertisers prefer other channels [than local newspapers], especially foreign-owned ones such as Google and Facebook. Overall, most local media companies display diminished margins”, he lamented. Nevertheless, he continued by explicitly addressing the Minister: you could “help us in different ways by forging an imprint that makes it easier for us and others to succeed.” You are the Minister responsible for our national media after all, and you “own yourself a toolbox”. In it, Friedman stated, are “four tools that would really make a difference.” He then divided his letter into four sections—one for each tool—where the last three of these basically had to do with press subsidies and taxes. Friedman, for example, urged the Minister to reconstruct the contemporary Swedish press subsidy system, and sincerely hoped this would be the case when the ongoing Swedish Media Inquiry—initiated by the government in May 2015—presented its long-awaited proposal later in November 2016. Friedman furthermore argued for the abolishment of the so-called Swedish advertisement tax (a truly peculiar tax on print media advertising revenues only), as well as the strikingly similar printing VAT, both causing a lot of financial trouble for local newspapers.

The point I want to make with this introduction, however, is that Friedman’s major concern was Swedish Public Service—and particularly so at the regional and local level. It was, in fact, the number one tool for the Minister to reconsider. Traditionally, both Swedish Television (SVT) and Swedish Radio (SR)—which are linked entities, but remain two separate institutions—have been present across Sweden at various regional locations. Local public service, Friedman hence told the Minister, has sometimes triggered stimulating competition, but more often it has proven “counterproductive”. Above all, Friedman stated, it is indeed “commercially challenging to meet part of SVT’s and SR’s free online services, while trying to make editorial investments and create tempting conditions for reader paid news activities in a digital world.” If even local news is free on svt.se or sr.se—why should users in the middle of Sweden pay Mittmedia for the same news? Naturally, Friedman told the Minister, he was aware that the present Media Inquiry was investigating this hotly debated question. Yet, he urged the Minister to really pay attention to the matter, and not let go of the issue of “public service increased negative market impact.” One concrete way, he suggested, to deal with the issue at the local level “could be to increase collaboration between private and public service companies … and even consider to privately outsource the kind of local media production that regional and local public service companies carry out today” (Friedman 2016).

Jan Friedman’s open letter to Minister Alice Bah Kuhnke—both friendly and sarcastic in tone—is but the latest news item in a seemingly never-ending media discussion around Swedish public service. The issue has naturally and historically been debated before. But during the last decade the discussion has become far more intense, as testified by a rapidly increasing number of articles in the Swedish Media Retriever database. In 2005 a search for “public service” generated 1,100 articles; ten years later the number was 2,700 (Media Retriever 2016).

Spotify, academic interventions & music methods

After seventeen years, I am back at my favourite library – the now rebuilt Stabi Ost in Berlin. The new (well, newish) reading room is properly German, and very pleasant to work in. I am currently writing an article for a book on digital methods that Christopher Kullenberg at the University of Gothenburg is putting together. It deals partly with overarching questions on the topic (which I have debated before), and partly with the project I lead on Spotify. The book will probably be finished at the beginning of next year – and parts of the text currently read as follows:

The media philosopher John Durham Peters has argued that we now seem predetermined – indeed, almost predestined – by the data we produce, consume and, not least, constantly share with one another. Digital media, he points out in his book The Marvelous Clouds, “point to a fundamental task of order and maintenance, the ways in which data ground our being, and the techniques that lie at the heart of human dwelling on earth” (Durham Peters 2015:8). Today’s code-based media forms therefore differ from the mass media of the twentieth century, Durham Peters argues. To put it pointedly, media today – in the form of a kind of integrated data flows – are less and less about content, programming or opinions, and more and more about organisation and positioning, metadata and hypermediality, collection and measurement, calculation and analysis. It is a characterisation that fits well with what a company like Spotify is engaged in.

Durham Peters is a humanistically oriented media scholar; that his perspective differs from the more social-science-oriented media research (which I referred to above) is therefore hardly surprising. In the research discipline in which I myself am primarily active, digital humanities, this kind of outlook is entirely standard. Digital humanities is partly about the development of new digital methods for humanistic research, methods which presuppose technical know-how, interdisciplinary collaboration, and that programmers enter the research process early on and drive it forward in recurring dialogue (Snickars 2016). The debate about which methods should be used is intense, but the use of the term digital methods is fully established (Clement 2016; Moretti 2013; Jockers 2013). Within the more social-science-oriented media research, the question of a kind of methodological shift has naturally also been discussed with regard to media measurement and data collection – in Sweden as well. My colleague, the media scholar Jonas Andersson Schwarz, has for example worked with collected Twitter data in a number of publications, and carried out empirical studies of Swedish politics on Twitter (Andersson Schwarz et al 2015a; Andersson Schwarz 2015b). This kind of data collection opens up entirely new research perspectives, but Andersson Schwarz has also warned that when tools and methods allow the analyst to close-read data flows “in real time and on a continuous basis, the risks of exaggeration and misinterpretation are obvious” (Andersson Schwarz et al 2016).

Another media researcher with a similar perspective, Klaus Bruhn Jensen, has in a number of articles discussed the methodological difference that arises in an online context between scholarly work with found and made data – that is, for example, collecting Twitter data via the hashtag #svpol compared with asking people in a survey how they use social media. That the latter kind of data is made (and at times downright fabricated) becomes obvious if one briefly considers how a typical SOM survey question is phrased: “How often have you used the internet during the last 12 months?” On the one hand, there are obvious methodological problems here in that the question itself regulates the content of the answer, in line with the proverb that the answer you get depends on the question you ask. On the other hand, this kind of subjective self-assessment (which the question triggers) always involves a kind of introspective difficulty: how honest should (or can) one be? Made data thus rests on a doubly dubious epistemological ground. It is this discrepancy that Bruhn Jensen argues characterises the difference between made and found data, a difference which is especially conspicuous in an online context. “The issue of data of, about, and around the Internet highlights the common distinction between research evidence that is either ‘found’ or ‘made’”, he writes – adding that all the evidence the internet researcher needs is already at hand: “all the evidence needed for Internet studies is already there, documented in and of the system, with a little help from network administrators and service providers.” In this way, Bruhn Jensen argues, the system itself – that is, the internet – becomes a kind of method per se: “the system is the method” (Bruhn Jensen 2011:52). It is a view worth keeping in mind in what follows. In a later article, Bruhn Jensen has also emphasised the difference (compared with earlier media forms) that is now always present in digital network contexts, since all online use leaves traces: “digital networks, in and of their operation, document aspects of technologically mediated communication that were mostly lost in previous media – meta-data that can be found and that invite further research” (Bruhn Jensen 2012:436).

Getting a handle on the kind of metadata Bruhn Jensen writes about is in several ways the focus of the research project on Spotify that I lead, “Strömmande kulturarv. Filförföljelse i digital musikdistribution” [Streaming Heritage: Following Files in Digital Music Distribution]. Using digital methods and digital ethnography, my (and my research group’s) ambition has been to observe the journey of music files through the digital ecosystem that constitutes the black box of streaming media culture. Within the project we have often used a postman metaphor to describe what we are trying to accomplish – that is, following a parcel’s (a music file’s) route from creation to recipient. Admittedly, in purely technical terms music files do not ‘move’ through Spotify’s music ecosystem in any such way at all. The technology is far more complicated, and includes among other things a standalone Spotify client (a separate program), which also exists in a web-based version as well as for a range of mobile platforms. The music streaming service itself, moreover, rests on file-sharing technology, where different users’ connections are used to send (and cache) music between one another in order to minimise bandwidth. To this one should add the so-called “musical brain” that the company EchoNest has built up from “1.2 billion small logged pieces of information that say something about the world’s music”, user data stored in a database that Spotify paid 800 million kronor for in 2014 (Larsson 2016).

The concept of file following is, in other words, strictly speaking a metaphor. Still, the basic idea of our research project is that the digitisation of media objects has changed how they ought to be conceptualised, analysed and understood, precisely on the basis of the traces of information and the considerable amounts of metadata that (music) files incessantly leave behind in different networks; as noted, the system is the method. From an academic perspective, it is a matter of shifting focus entirely: from the study of static music artefacts to an increased scholarly focus on dynamically active files, with a kind of inherent information about, for example, broadband infrastructure, file distribution and so-called aggregation, user practices, click frequency, social playlists, sharing and repetition – “all the evidence needed for Internet studies is already there, documented in and of the system”, as Bruhn Jensen has stressed. In other words, this is not a research project about music listening, but rather about how streaming services like Spotify regulate and package listening, and about what data can be extracted when music ‘travels’ through Spotify’s networks.