An innovative investigation of the inner workings of Spotify that traces the transformation of audio files into streamed experience. Spotify provides a streaming service that has been welcomed as disrupting the world of music. Yet such disruption always comes at a price. Spotify Teardown contests the tired claim that digital culture thrives on disruption. Borrowing the notion of “teardown” from reverse-engineering processes, in this book a team of five researchers has playfully disassembled Spotify's product and the way it is commonly understood. Spotify has been hailed as the solution to illicit downloading, but it began as a partly illicit enterprise that grew out of the Swedish file-sharing community. Spotify was originally praised as an innovative digital platform but increasingly resembles a media company in need of regulation, raising questions about the ways in which such cultural content as songs, books, and films is now typically made available online. Spotify Teardown combines interviews, participant observations, and other analyses of Spotify's “front end” with experimental, covert investigations of its “back end.” The authors engaged in a series of interventions, which included establishing a record label for research purposes, intercepting network traffic with packet sniffers, and web-scraping corporate materials. The authors' innovative digital methods earned them a stern letter from Spotify accusing them of violating its terms of use; the company later threatened their research funding. Thus, the book itself became an intervention into the ethics and legal frameworks of corporate behavior.
The fortieth-anniversary edition of a classic and prescient work on death and dying. Much of today's literature on end-of-life issues overlooks the importance of 1970s social movements in shaping our understanding of death, dying, and the dead body. This anniversary edition of Lyn Lofland's The Craft of Dying begins to repair this omission. Lofland identifies, critiques, and theorizes 1970s death movements, including the Death Acceptance Movement, the Death with Dignity Movement, and the Natural Death Movement. All these groups attempted to transform death into a “positive experience,” anticipating much of today's death and dying activism. Lofland turns a sociologist's eye on the era's increased interest in death, considering, among other things, the components of the modern “face of death” and the “craft of dying,” the construction of a dying role or identity by those who are dying, and the constraints on their freedom to do this. Lofland wrote just before the AIDS epidemic transformed the landscape of death and dying in the West; many of the trends she identified became the building blocks of AIDS activism in the 1980s and 1990s. The Craft of Dying will help readers understand contemporary death social movements' historical relationships to questions of race, class, gender, and sexuality and is a book that everyone interested in end-of-life politics should read.
A new theory about the origins of consciousness that finds learning to be the driving force in the evolutionary transition to basic consciousness. What marked the evolutionary transition from organisms that lacked consciousness to those with consciousness—to minimal subjective experiencing, or, as Aristotle described it, “the sensitive soul”? In this book, Simona Ginsburg and Eva Jablonka propose a new theory about the origin of consciousness that finds learning to be the driving force in the transition to basic consciousness. Using a methodology similar to that used by scientists when they identified the transition from non-life to life, Ginsburg and Jablonka suggest a set of criteria, identify a marker for the transition to minimal consciousness, and explore the far-reaching biological, psychological, and philosophical implications. After presenting the historical, neurobiological, and philosophical foundations of their analysis, Ginsburg and Jablonka propose that the evolutionary marker of basic or minimal consciousness is a complex form of associative learning, which they term unlimited associative learning (UAL). UAL enables an organism to ascribe motivational value to a novel, compound, non-reflex-inducing stimulus or action, and use it as the basis for future learning. Associative learning, Ginsburg and Jablonka argue, drove the Cambrian explosion and its massive diversification of organisms. Finally, Ginsburg and Jablonka propose symbolic language as a similar type of marker for the evolutionary transition to human rationality—to Aristotle's “rational soul.”
An introduction to awe-inspiring ideas at the brink of paradox: infinities of different sizes, time travel, probability and measure theory, and computability theory. This book introduces the reader to awe-inspiring issues at the intersection of philosophy and mathematics. It explores ideas at the brink of paradox: infinities of different sizes, time travel, probability and measure theory, computability theory, the Grandfather Paradox, Newcomb's Problem, the Principle of Countable Additivity. The goal is to present some exceptionally beautiful ideas in enough detail to enable readers to understand the ideas themselves (rather than watered-down approximations), but without supplying so much detail that they abandon the effort. The philosophical content requires a mind attuned to subtlety; the most demanding of the mathematical ideas require familiarity with college-level mathematics or mathematical proof. The book covers Cantor's revolutionary thinking about infinity, which leads to the result that some infinities are bigger than others; time travel and free will, decision theory, probability, and the Banach-Tarski Theorem, which states that it is possible to decompose a ball into a finite number of pieces and reassemble the pieces so as to get two balls that are each the same size as the original. Its investigation of computability theory leads to a proof of Gödel's Incompleteness Theorem, which yields the amazing result that arithmetic is so complex that no computer could be programmed to output every arithmetical truth and no falsehood. Each chapter is followed by an appendix with answers to exercises. A list of recommended reading points readers to more advanced discussions. The book is based on a popular course (and MOOC) taught by the author at MIT.
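As a taste of the results mentioned above, Cantor's theorem is compact enough to state and prove in a few lines; the following is the standard formulation, not an excerpt from the book.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb, amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
% Cantor's theorem, the result behind "some infinities are bigger than others":
% no set can be put in one-to-one correspondence with its own power set.
\begin{theorem}[Cantor]
For every set $X$, no function $f\colon X \to \mathcal{P}(X)$ is surjective.
\end{theorem}
\begin{proof}
Let $D = \{x \in X : x \notin f(x)\}$. If $D = f(d)$ for some $d \in X$,
then $d \in D \iff d \notin f(d) = D$, a contradiction. So $D$ is not in the
image of $f$; hence $|X| < |\mathcal{P}(X)|$, and in particular
$|\mathbb{N}| < |\mathcal{P}(\mathbb{N})|$.
\end{proof}
\end{document}
```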
A unified treatment of the generation and analysis of brain-generated electromagnetic fields. In Brain Signals, Risto Ilmoniemi and Jukka Sarvas present the basic physical and mathematical principles of magnetoencephalography (MEG) and electroencephalography (EEG), describing what kind of information is available in the neuroelectromagnetic field and how the measured MEG and EEG signals can be analyzed. Unlike most previous works on these topics, which have been collections of writings by different authors using different conventions, this book presents the material in a unified manner, providing the reader with a thorough understanding of basic principles and a firm basis for analyzing data generated by MEG and EEG. The book first provides a brief introduction to brain states and the early history of EEG and MEG, describes the generation of electromagnetic fields by neuronal activity, and discusses the electromagnetic forward problem. The authors then turn to EEG and MEG analysis, offering a review of linear and matrix algebra and basic statistics needed for analysis of the data, and presenting several analysis methods: dipole fitting; the minimum norm estimate (MNE); beamforming; the multiple signal classification algorithm (MUSIC), including RAP-MUSIC with the RAP dilemma and TRAP-MUSIC, which removes the RAP dilemma; independent component analysis (ICA); and blind source separation (BSS) with joint diagonalization.
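For orientation, the minimum norm estimate mentioned above has a compact standard form; the notation below is the generic textbook formulation, not necessarily the book's own.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Linear forward model: measured MEG/EEG signals y arise from source amplitudes x
% through the lead-field (gain) matrix L, plus noise n.
\[
  \mathbf{y} = L\,\mathbf{x} + \mathbf{n}
\]
% The minimum norm estimate picks the smallest-norm source distribution that
% explains the data, with Tikhonov regularization parameter lambda:
\[
  \hat{\mathbf{x}}_{\mathrm{MNE}}
  = L^{\mathsf{T}}\,\bigl(L L^{\mathsf{T}} + \lambda I\bigr)^{-1}\,\mathbf{y}.
\]
\end{document}
```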
An introductory text in linguistic semantics, uniquely balancing empirical coverage and formalism with development of intuition and methodology. This introductory textbook in linguistic semantics for undergraduates features a unique balance between empirical coverage and formalism on the one hand and development of intuition and methodology on the other. It will equip students to form intuitions about a set of data, explain how well an analysis of the data accords with their intuitions, and extend the analysis or seek an alternative. No prior knowledge of linguistics is required. After mastering the material, students will be able to tackle some of the most difficult questions in the field even if they have never taken a linguistics course before. After introducing such concepts as truth conditions and compositionality, the book presents a basic symbolic logic with negation, conjunction, and generalized quantifiers, to serve as the basis for translation throughout the book. It then develops a detailed compositional semantics, covering quantification (scope and binding), adverbial modification, relative clauses, event semantics, tense and aspect, as well as pragmatic phenomena, notably deictic pronouns and narrative progression. A Course in Semantics offers a large and diverse set of exercises, interspersed throughout the text; those labeled “Important practice and looking ahead” prepare students for material to come; those labeled “Thinking about” invite students to think beyond the content of the book.
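To give a flavor of the generalized-quantifier translations such a course builds on, here is a standard textbook example; it is an illustration, not an excerpt from the book.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% "Every dog barked," translated with a generalized quantifier: the determiner
% relates the restrictor set (dogs) to the nuclear-scope set (things that barked).
\[
  \text{every}(\textit{dog})(\textit{bark}) \;\Longleftrightarrow\;
  \forall x\,\bigl(\textit{dog}(x) \rightarrow \textit{bark}(x)\bigr)
\]
% "Some dog barked" comes out existentially; determiners like "most" require a
% genuinely relational treatment not expressible with unary quantifiers alone.
\[
  \text{some}(\textit{dog})(\textit{bark}) \;\Longleftrightarrow\;
  \exists x\,\bigl(\textit{dog}(x) \wedge \textit{bark}(x)\bigr)
\]
\end{document}
```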
An advanced treatment of modern macroeconomics, presented through a sequence of dynamic equilibrium models, with discussion of the implications for monetary and fiscal policy. This textbook offers an advanced treatment of modern macroeconomics, presented through a sequence of dynamic general equilibrium models based on intertemporal optimization on the part of economic agents. The book treats macroeconomics as applied and policy-oriented general equilibrium analysis, examining a number of models, each of which is suitable for investigating specific issues but may be unsuitable for others. After presenting a brief survey of the evolution of macroeconomics and the key facts about long-run economic growth and aggregate fluctuations, the book introduces the main elements of the intertemporal approach through a series of two-period competitive general equilibrium models—the simplest possible intertemporal models. This sets the stage for the remainder of the book, which presents models of economic growth, aggregate fluctuations, and monetary and fiscal policy. The text focuses on a full analysis of a limited number of key intertemporal models, which are stripped down to essentials so that students can focus on the dynamic properties of the models. Exercises encourage students to try their hands at solving versions of the dynamic models that define modern macroeconomics. Appendixes review the main mathematical techniques needed to analyze optimizing dynamic macroeconomic models. The book is suitable for advanced undergraduate and graduate students who have some knowledge of economic theory and mathematics for economists.
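For readers new to the material, the canonical two-period consumer problem and its Euler equation, the building block of the intertemporal models described above, look like this in standard notation; this is a generic illustration, not the book's own presentation.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A household chooses consumption c1, c2 to maximize discounted utility subject
% to the intertemporal budget constraint at interest rate r (endowments y1, y2):
\[
  \max_{c_1,\,c_2}\; u(c_1) + \beta\, u(c_2)
  \quad \text{s.t.} \quad
  c_1 + \frac{c_2}{1+r} = y_1 + \frac{y_2}{1+r}
\]
% The first-order conditions give the consumption Euler equation, the workhorse
% optimality condition of dynamic general equilibrium models:
\[
  u'(c_1) = \beta\,(1+r)\,u'(c_2).
\]
\end{document}
```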
Researchers rethink tactics for inventing and disseminating research, examining the use of such unconventional forms as poetry, performance, catalogs, interactive machines, costume, and digital platforms. Transmission is the research moment when invention meets dissemination—the tactical combination of making (how theory, methods, and data shape research) and communicating (how research is shown and shared). In this book, researchers from a range of disciplines examine tactics for the transmission of research, exploring such unconventional forms as poetry, performance, catalogs, interactive machines, costume, and digital platforms. Focusing on transmissions draws attention to a critical part of the research process commonly overlooked and undervalued. Too often, the results of radically experimental research methodologies are pressed into conventional formats. The contributors to Transmissions rethink tactics for making and communicating research as integral to the kind of projects they do, pushing against disciplinary edges with unexpected and creative combinations and collaborations. Each chapter focuses on a different tactic of transmission. One contributor merges literary styles of the empirical and poetic; another uses an angle grinder to construct machines of enquiry. One project invites readers to participate in an exchange about value; another provides a series of catalog cards to materialize ordering systems of knowledge. All the contributors share a commitment to uniting the what with the how, firmly situating their transmissions in their research and in each unique chapter of this book. Contributors: Nerea Calvillo, Rebecca Coleman, Larissa Hjorth, Janis Jefferies, Kat Jungnickel, Sarah Kember, Max Liboiron, Kristina Lindström, Alexandra Lippman, Bonnie Mak, Julien McHardy, Julia Pollack, Ingrid Richardson, Åsa Ståhl, Laura Watts
An argument that what makes science distinctive is its emphasis on evidence and scientists' willingness to change theories on the basis of new evidence.
This “well-researched, nuanced” study of the rise of social media activism explores how marginalized groups use Twitter to advance counter-narratives, preempt political spin, and build diverse networks of dissent (Ms.). The power of hashtag activism became clear in 2011, when #IranElection served as an organizing tool for Iranians protesting a disputed election and offered a global audience a front-row seat to a nascent revolution. Since then, activists have used a variety of hashtags, including #JusticeForTrayvon, #BlackLivesMatter, #YesAllWomen, and #MeToo, to advocate, mobilize, and communicate. In this book, Sarah Jackson, Moya Bailey, and Brooke Foucault Welles explore how and why Twitter has become an important platform for historically disenfranchised populations, including Black Americans, women, and transgender people. They show how marginalized groups, long excluded from elite media spaces, have used Twitter hashtags to advance counternarratives, preempt political spin, and build diverse networks of dissent. The authors describe how such hashtags as #MeToo, #SurvivorPrivilege, and #WhyIStayed have challenged the conventional understanding of gendered violence; examine the voices and narratives of Black feminism enabled by #FastTailedGirls, #YouOKSis, and #SayHerName; and explore the creation and use of #GirlsLikeUs, a network of transgender women. They investigate the digital signatures of the “new civil rights movement”—the online activism, storytelling, and strategy-building that set the stage for #BlackLivesMatter—and recount the spread of racial justice hashtags after the killing of Michael Brown in Ferguson, Missouri, and other high-profile incidents of killings by police. Finally, they consider hashtags created by allies, including #AllMenCan and #CrimingWhileWhite.
An argument that Modernism is a cognitive phenomenon rather than a cultural one. At the beginning of the twentieth century, poetry, music, and painting all underwent a sea change. Poetry abandoned rhyme and meter; music ceased to be tonally centered; and painting no longer aimed at faithful representation. These artistic developments have been attributed to cultural factors ranging from the Industrial Revolution and the technical innovation of photography to Freudian psychoanalysis. In this book, Samuel Jay Keyser argues that the stylistic innovations of Western modernism reflect not a cultural shift but a cognitive one. Behind modernism is the same cognitive phenomenon that led to the scientific revolution of the seventeenth century: the brain coming up against its natural limitations. Keyser argues that the transformation in poetry, music, and painting (the so-called sister arts) is the result of the abandonment of a natural aesthetic based on a set of rules shared between artist and audience, and that this is virtually the same cognitive shift that occurred when scientists abandoned the mechanical philosophy of the Galilean revolution. The cultural explanations for Modernism may still be relevant, but they are epiphenomenal rather than causal. Artists felt that traditional forms of art had been exhausted, and they began to resort to private formats—Easter eggs with hidden and often inaccessible meaning. Keyser proposes that when artists discarded their natural rule-governed aesthetic, it marked a cognitive shift; general intelligence took over from hardwired proclivity. Artists used a different part of the brain to create, and audiences were forced to play catch-up.
Why economists' attempts to help poorer countries improve their economic well-being have failed. Since the end of World War II, economists have tried to figure out how poor countries in the tropics could attain standards of living approaching those of countries in Europe and North America. Attempted remedies have included providing foreign aid, investing in machines, fostering education, controlling population growth, and making aid loans as well as forgiving those loans on condition of reforms. None of these solutions has delivered as promised. The problem is not the failure of economics, William Easterly argues, but the failure to apply economic principles to practical policy work. In this book, Easterly shows how these solutions all violate the basic principle of economics, that people—private individuals and businesses, government officials, even aid donors—respond to incentives. Easterly first discusses the importance of growth. He then analyzes the development solutions that have failed. Finally, he suggests alternative approaches to the problem. Written in an accessible, at times irreverent, style, Easterly's book combines modern growth theory with anecdotes from his fieldwork for the World Bank.
In an 1828 letter to his partner, Nicéphore Niépce, Louis Daguerre wrote, "I am burning with desire to see your experiments from nature." In this book, Geoffrey Batchen analyzes the desire to photograph as it emerged within the philosophical and scientific milieus that preceded the actual invention of photography. Recent accounts of photography's identity tend to divide between the postmodern view that all identity is determined by context and a formalist effort to define the fundamental characteristics of photography as a medium. Batchen critiques both approaches by way of a detailed discussion of photography's conception in the late eighteenth and early nineteenth centuries. He examines the output of the various nominees for "first photographer," then incorporates this information into a mode of historical criticism informed by the work of Michel Foucault and Jacques Derrida. The result is a way of thinking about photography that persuasively accords with the medium's undeniable conceptual, political, and historical complexity.
The most comprehensive English-language overview of the modern Chinese economy, covering China's economic development since 1949 and post-1978 reforms—from industrial change and agricultural organization to science and technology.
The relationship of texts and maps, and the mappability of literature, examined from Homer to Houellebecq. Literary authors have frequently called on elements of cartography to ground fictional space, to visualize sites, and to help readers get their bearings in the imaginative world of the text. Today, the convergence of digital mapping and globalization has spurred a cartographic turn in literature. This book gathers leading scholars to consider the relationship of literature and cartography. Generously illustrated with full-color maps and visualizations, it offers the first systematic overview of an emerging approach to the study of literature. The literary map is not merely an illustrative guide but represents a set of relations and tensions that raise questions about representation, fiction, and space. Is literature even mappable? In exploring the cartographic components of literature, the contributors have not only brought literary theory to bear on the map but have also enriched the vocabulary and perspectives of literary studies with cartographic terms. After establishing the theoretical and methodological terrain, they trace important developments in the history of literary cartography, considering topics that include Homer and Joyce, Goethe and the representation of nature, and African cartographies. Finally, they consider cartographic genres that reveal the broader connections between texts and maps, discussing literary map genres in American literature and the coexistence of image and text in early maps. When cartographic aspirations outstripped factual knowledge, mapmakers turned to textual fictions. Contributors: Jean-Marc Besse, Bruno Bosteels, Patrick M. Bray, Martin Brückner, Tom Conley, Jörg Dünne, Anders Engberg-Pedersen, John K. Noyes, Ricardo Padrón, Barbara Piatti, Simone Pinet, Clara Rowland, Oliver Simons, Robert Stockhammer, Dominic Thomas, Burkhardt Wolf
A comprehensive textbook that integrates tools from technology, economics, markets, and policy to approach energy issues using a dynamic systems and capital-centric perspective. The global energy system is the vital foundation of modern human industrial society. Traditionally studied through separate disciplines of engineering, economics, environment, or public policy, this system can be fully understood only by using an approach that integrates these tools. This textbook is the first to take a dynamic systems perspective on understanding energy systems, tracking energy from primary resource to final energy services through a long and capital-intensive supply chain bounded by both macroeconomic and natural resource systems. The book begins with a framework for understanding how energy is transformed as it moves through the system with the aid of various types of capital, its movement influenced by a combination of the technical, market, and policy conditions at the time. It then examines the three primary energy subsystems of electricity, transportation, and thermal energy, explaining such relevant topics as systems thinking, cost estimation, capital formation, market design, and policy tools. Finally, the book reintegrates these subsystems and looks at their relation to the economic system and the ecosystem that they inhabit. Practitioners and theorists from any field will benefit from a deeper understanding of both existing dynamic energy system processes and potential tools for intervention.
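As one concrete example of the kind of cost-estimation tool such a text covers, the levelized cost of energy (LCOE) is the standard way to compare capital-intensive generation options; the formula below is the generic definition, not the book's own notation.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Levelized cost of energy: total discounted lifetime costs (investment I_t,
% operations and maintenance M_t, fuel F_t) divided by total discounted energy
% output E_t, at discount rate r over project lifetime T.
\[
  \mathrm{LCOE}
  = \frac{\displaystyle\sum_{t=0}^{T} \frac{I_t + M_t + F_t}{(1+r)^t}}
         {\displaystyle\sum_{t=0}^{T} \frac{E_t}{(1+r)^t}}
\]
\end{document}
```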
A work that reveals the profound links between the evolution, acquisition, and processing of language, and proposes a new integrative framework for the language sciences.
A comprehensive, state-of-the-art guide to site planning, covering planning processes, new technologies, and sustainability, with extensive treatment of practices in rapidly urbanizing countries.
A foundational analysis of the co-evolution of the internet and international relations, examining resultant challenges for individuals, organizations, firms, and states. In our increasingly digital world, data flows define the international landscape as much as the flow of materials and people. How is cyberspace shaping international relations, and how are international relations shaping cyberspace? In this book, Nazli Choucri and David D. Clark offer a foundational analysis of the co-evolution of cyberspace (with the internet as its core) and international relations, examining resultant challenges for individuals, organizations, and states. The authors examine the pervasiveness of power and politics in the digital realm, finding that the internet is evolving much faster than the tools for regulating it. This creates a “co-evolution dilemma”—a new reality in which digital interactions have enabled weaker actors to influence or threaten stronger actors, including the traditional state powers. Choucri and Clark develop a new method for addressing control in the internet age, “control point analysis,” and apply it to a variety of situations, including major actors in the international and digital realms: the United States, China, and Google. In doing so they lay the groundwork for a new international relations theory that reflects the reality in which we live—one in which the international and digital realms are inextricably linked and evolving together.
An examination of how artists have combined performance and moving image for decades, anticipating our changing relation to images in the internet era. In Performing Image, Isobel Harbison examines how artists have combined performance and moving image in their work since the 1960s, and how this work anticipates our changing relations to images since the advent of smart phones and the spread of online prosumerism. Over this period, artists have used a variety of DIY modes of self-imaging and circulation—from home video to social media—suggesting how and why Western subjects might seek alternative platforms for self-expression and self-representation. In the course of her argument, Harbison offers close analyses of works by such artists as Robert Rauschenberg, Yvonne Rainer, Mark Leckey, Wu Tsang, and Martine Syms. Harbison argues that while we produce images, images also produce us—those that we take and share, those that we see and assimilate through mass media and social media, those that we encounter in museums and galleries. Although all the artists she examines express their relation to images uniquely, they also offer a vantage point on today's productive-consumptive image circuits in which billions of us are caught. This unregulated, all-encompassing image performativity, Harbison writes, puts us to work, for free, in the service of global corporate expansion. Harbison offers a three-part interpretive framework for understanding this new proximity to images as it is negotiated by these artworks, a detailed outline of a set of connected practices—and a declaration of the value of art in an economy of attention and a crisis of representation.
A new approach to Hume's problem of induction that justifies the optimality of induction at the level of meta-induction. Hume's problem of justifying induction has been among epistemology's greatest challenges for centuries. In this book, Gerhard Schurz proposes a new approach to Hume's problem. Acknowledging the force of Hume's arguments against the possibility of a noncircular justification of the reliability of induction, Schurz demonstrates instead the possibility of a noncircular justification of the optimality of induction, or, more precisely, of meta-induction (the application of induction to competing prediction models). Drawing on discoveries in computational learning theory, Schurz demonstrates that a regret-based learning strategy, attractivity-weighted meta-induction, is predictively optimal in all possible worlds among all prediction methods accessible to the epistemic agent. Moreover, the a priori justification of meta-induction generates a noncircular a posteriori justification of object induction. Taken together, these two results provide a noncircular solution to Hume's problem. Schurz discusses the philosophical debate on the problem of induction, addressing all major attempts at a solution to Hume's problem and describing their shortcomings; presents a series of theorems, accompanied by a description of computer simulations illustrating the content of these theorems (with proofs presented in a mathematical appendix); and defends, refines, and applies core insights regarding the optimality of meta-induction, explaining applications in neighboring disciplines including forecasting sciences, cognitive science, social epistemology, and generalized evolution theory. Finally, Schurz generalizes the method of optimality-based justification to a new strategy of justification in epistemology, arguing that optimality justifications can avoid the problems of justificatory circularity and regress.
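For readers curious what a regret-based, attractivity-weighted strategy looks like in practice, here is a minimal sketch of one such meta-predictor. The weighting rule is a generic illustration in the spirit of the approach described above, not Schurz's exact definition, and the scoring function is an assumption for the example.

```python
import numpy as np

def meta_induction(predictions, outcomes):
    """Toy regret-weighted meta-predictor (illustrative only).

    predictions: array of shape (T, K), forecasts of K candidate methods per round
    outcomes:    array of shape (T,), the realized values, assumed to lie in [0, 1]
    Returns the meta-predictor's forecasts, each formed before seeing that round's outcome.

    A method's "attractivity" here is its cumulative success minus the meta-predictor's
    own cumulative success, clipped at zero; weights are proportional to attractivities.
    """
    T, K = predictions.shape
    meta_preds = np.zeros(T)
    method_success = np.zeros(K)   # cumulative score of each candidate method
    meta_success = 0.0             # cumulative score of the meta-predictor itself

    for t in range(T):
        attractivity = np.clip(method_success - meta_success, 0.0, None)
        if attractivity.sum() == 0.0:
            weights = np.full(K, 1.0 / K)   # no method is ahead yet: plain average
        else:
            weights = attractivity / attractivity.sum()
        meta_preds[t] = weights @ predictions[t]

        # Score with a bounded payoff: success = 1 - squared error.
        method_success += 1.0 - (predictions[t] - outcomes[t]) ** 2
        meta_success += 1.0 - (meta_preds[t] - outcomes[t]) ** 2

    return meta_preds

# Example: three candidate methods predicting values in [0, 1] over four rounds.
# preds = np.array([[0.2, 0.8, 0.5]] * 4); outs = np.array([0.8, 0.7, 0.9, 0.8])
# meta_induction(preds, outs)  # forecasts shift weight toward the best-performing method
```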
Leading economists discuss post-financial crisis policy dilemmas, including the dangers of complacency in a period of relative stability. The Great Depression led to the Keynesian revolution and dramatic shifts in macroeconomic theory and macroeconomic policy. Similarly, the stagflation of the 1970s led to the adoption of the natural rate hypothesis and to a major reassessment of the role of macroeconomic policy. Should the financial crisis and the Great Recession lead to yet another major reassessment, to another intellectual revolution? Will it? If so, what form should it, or will it, take? These are the questions taken up in this book, in a series of contributions by policymakers and academics. The contributors discuss the complex role of the financial sector, the relative roles of monetary and fiscal policy, the limits of monetary policy to address financial stability, the need for fiscal policy to play a more active role in stabilization, and the relative roles of financial regulation and macroprudential tools. The general message is a warning against going back to precrisis ways—to narrow inflation targeting, little use of fiscal policy for stabilization, and insufficient financial regulation. Contributors: David Aikman, Alan J. Auerbach, Ben S. Bernanke, Olivier Blanchard, Lael Brainard, Markus K. Brunnermeier, Marco Buti, Benoît Coeuré, Mario Draghi, Barry Eichengreen, Jason Furman, Gita Gopinath, Pierre-Olivier Gourinchas, Andrew G. Haldane, Philipp Hildebrand, Marc Hinterschweiger, Sujit Kapadia, Nellie Liang, Adam S. Posen, Raghuram Rajan, Valerie Ramey, Carmen Reinhart, Dani Rodrik, Robert E. Rubin, Jay C. Shambaugh, Tharman Shanmugaratnam, Jeremy C. Stein, Lawrence H. Summers