New Things Under the Sun

Matt Clancy

Synthesizing academic research about innovation, science, and creativity.

Do Academic Citations Measure the Impact of New Ideas?
Jul 5 2022
A huge quantity of academic research that seeks to understand how science works relies on citation counts to measure the value of the knowledge scientists create. This measure of scientific impact is so deeply embedded in the literature that it is crucial to know whether it is reliable. So today I want to look at a few recent articles that examine this foundational question: are citation counts a good measure of the value of scientific contributions?

This podcast is an audio read-through of the (initial draft of the) post Do Academic Citations Measure the Impact of New Ideas?, originally published on New Things Under the Sun.

Articles mentioned:
Teplitskiy, Misha, Eamon Duede, Michael Menietti, and Karim R. Lakhani. 2022. How Status of Research Papers Affects the Way They are Read and Cited. Research Policy 51(4): 104484.
Gerrish, Sean M., and David M. Blei. 2010. A Language-based Approach to Measuring Scholarly Impact. Proceedings of the 27th International Conference on Machine Learning: 375-382.
Gerow, Aaron, Yuening Hu, Jordan Boyd-Graber, and James Evans. 2018. Measuring Discursive Influence Across Scholarship. Proceedings of the National Academy of Sciences 115(13): 3308-3313.
Poege, Felix, Dietmar Harhoff, Fabian Gaessler, and Stefano Baruffaldi. 2019. Science Quality and the Value of Inventions. Science Advances 5(12).
Yin, Yian, Yuxiao Dong, Kuansan Wang, Dashun Wang, and Benjamin Jones. 2021. Science as a Public Good: Public Use and Funding of Science. NBER Working Paper 28748.
Card, David, and Stefano DellaVigna. 2020. What do Editors Maximize? Evidence from Four Economics Journals. The Review of Economics and Statistics 102(1): 195-217.
Tahamtan, Iman, and Lutz Bornmann. 2019. What do Citation Counts Measure? An Updated Review of Studies on Citations in Scientific Documents Published Between 2006 and 2018. Scientometrics 121: 1635-1684.
Kousha, Kayvan, and Mike Thelwall. 2016. Are Wikipedia Citations Important Evidence of the Impact of Scholarly Articles and Books? Journal of the Association for Information Science and Technology 68(3): 762-779.
How common is independent discovery?
Jun 22 2022
An old divide in the study of innovation is whether ideas come primarily from individual or group creativity, or whether they are "in the air," such that anyone with the right background knowledge will be able to see them. In this episode, I look at how much redundancy there is in innovation: if the discoverer of some idea had failed to find it, would someone else have figured it out later?

This podcast is an audio read-through of the (initial draft of the) post How common is independent discovery?, originally published on New Things Under the Sun.

Articles mentioned:
Ogburn, William F., and Dorothy Thomas. 1922. Are Inventions Inevitable? A Note on Social Evolution. Political Science Quarterly 37(1): 83-98.
Hagstrom, Warren O. 1974. Competition in Science. American Sociological Review 39(1): 1-18.
Hill, Ryan, and Carolyn Stein. 2020. Scooped! Estimating Rewards for Priority in Science. Working paper.
Painter, Deryc T., Frank van der Wouden, Manfred D. Laubichler, and Hyejin Youn. 2020. Quantifying simultaneous innovations in evolutionary medicine. Theory in Biosciences 139: 319-335.
Bikard, Michaël. 2020. Idea Twins: Simultaneous discoveries as a research tool. Strategic Management Journal 41(8): 1528-1543.
Ganguli, Ina, Jeffrey Lin, and Nicholas Reynolds. 2020. The Paper Trail of Knowledge Spillovers: Evidence from Patent Interferences. American Economic Journal: Applied Economics 12(2): 278-302.
Lück, Sonja, Benjamin Balsmeier, Florian Seliger, and Lee Fleming. 2020. Early Disclosure of Invention and Reduced Duplication: An Empirical Test. Management Science 66(6): 2677-2685.
Iaria, Alessandro, Carlo Schwarz, and Fabian Waldinger. 2018. Frontier Knowledge and Scientific Production: Evidence from the Collapse of International Science. Quarterly Journal of Economics: 927-991.
Borjas, George J., and Kirk B. Doran. 2012. The Collapse of the Soviet Union and the Productivity of American Mathematicians. The Quarterly Journal of Economics 127(3): 1143-1203.
Hill, Ryan, and Carolyn Stein. 2021. Race to the bottom: competition and quality in science. Working paper.
Cotropia, Christopher Anthony, and David L. Schwartz. 2018. Patents Used in Patent Office Rejections as Indicators of Value. SSRN Working Paper.
Science is getting harder
Jun 1 2022
A basket of indicators all point to a similar trend: even as the number of scientists and publications rises substantially, we do not appear to be seeing a concomitant rise in new discoveries that supplant older ones. Science is getting harder.

This podcast is an audio read-through of the (initial draft of the) post Science is getting harder, published on New Things Under the Sun.

Articles mentioned:
Bloom, Nicholas, Charles I. Jones, John Van Reenen, and Michael Webb. 2020. Are Ideas Getting Harder to Find? American Economic Review 110(4): 1104-1144.
Wang, Dashun, and Albert-László Barabási. 2021. The Science of Science. Cambridge: Cambridge University Press.
Li, Jichao, Yian Yin, Santo Fortunato, and Dashun Wang. 2019. A dataset of publication records for Nobel Laureates. Scientific Data 6: 33.
Collison, Patrick, and Michael Nielsen. 2018. Science is Getting Less Bang for Its Buck. The Atlantic.
Chu, Johan S.G., and James A. Evans. 2021. Slowed canonical progress in large fields of science. PNAS 118(41): e2021636118.
Milojević, Staša. 2015. Quantifying the cognitive extent of science. Journal of Informetrics 9(4): 962-973.
Carayol, Nicolas, Agenor Lahatte, and Oscar Llopis. 2019. The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science. SSRN working paper.
Larivière, Vincent, Éric Archambault, and Yves Gingras. 2007. Long-term patterns in the aging of the scientific literature, 1900–2004. Proceedings of ISSI 2007, eds. Daniel Torres-Salinas and Henk F. Moed.
Cui, Haochuan, Lingfei Wu, and James A. Evans. 2022. Aging scientists and slowed advance. arXiv 2202.04044.
Marx, Matt, and Aaron Fuegi. Reliance on Science: Worldwide Front-Page Patent Citations to Scientific Articles.
Steering Science with Prizes
Mar 24 2022
New scientific research topics can sometimes face a chicken-and-egg problem. Professional success requires a critical mass of scholars to be active in a field, so that they can serve as open-minded peer reviewers and can validate (or at least cite!) new discoveries. Without that critical mass, working on a new topic might be professionally risky. But if everyone thinks this way, how do new research topics emerge? How do groups of people pick which topics to focus on?

One way is via coordinating mechanisms: a small number of universally recognized markers of promising research topics. This podcast looks at some evidence about how well prizes and other honors work at steering researchers towards specific research topics.

This podcast is an audio read-through of the (initial version of the) post Steering Science with Prizes, published on New Things Under the Sun.

Articles mentioned:
Azoulay, Pierre, Toby Stuart, and Yanbo Wang. 2014. Matthew: Effect or Fable? Management Science 60(1): 92-109.
Reschke, Brian P., Pierre Azoulay, and Toby E. Stuart. 2018. Status Spillovers: The Effect of Status-conferring Prizes on the Allocation of Attention. Administrative Science Quarterly 63(4): 819-847.
Jin, Ching, Yifang Ma, and Brian Uzzi. 2021. Scientific prizes and the extraordinary growth of scientific topics. Nature Communications 12: 5619.
Azoulay, Pierre J., Michael Wahlen, and Ezra W. Zuckerman Sivan. 2019. Death of the Salesman but Not the Sales Force: How Interested Promotion Skews Scientific Valuation. American Journal of Sociology 125(3): 786-845.
Azoulay, Pierre, Christian Fons-Rosen, and Joshua S. Graff Zivin. 2019. Does Science Advance One Funeral at a Time? American Economic Review 109(8): 2889-2920.
Pulling more fuel efficient cars into existence
Feb 25 2022
If you want to shape the direction of technology, you can try to pull the kinds of technology you want into existence by shaping how markets will receive them.

One specific context where we have some really nice evidence about the efficacy of pull policies is the automobile market. Making fuel more expensive, or simply mandating that carmakers meet certain emissions standards, seems to pretty reliably nudge automakers into developing cleaner and more fuel efficient vehicles. We have two complementary lines of evidence here: patents and measures of progress in fuel economy.

This podcast is an audio read-through of the (initial version of the) article Pulling more fuel efficient cars into existence, published on New Things Under the Sun.

Articles mentioned:
Aghion, Philippe, Antoine Dechezleprêtre, David Hémous, Ralf Martin, and John Van Reenen. 2016. Carbon Taxes, Path Dependency, and Directed Technical Change: Evidence from the Auto Industry. Journal of Political Economy 124(1): 1-51.
Rozendaal, Rik, and Herman R.J. Vollebergh. 2021. Policy-Induced Innovation in Clean Technologies: Evidence from the Car Market. CESifo working paper no. 9422.
Knittel, Christopher R. 2011. Automobiles on Steroids: Product Attribute Trade-Offs and Technological Progress in the Automobile Sector. American Economic Review 101: 3368-3399.
Klier, Thomas, and Joshua Linn. 2016. The effect of vehicle fuel economy standards on technology adoption. Journal of Public Economics 133: 41-63.
Kiso, Takahiko. 2019. Environmental Policy and Induced Technological Change: Evidence from Automobile Fuel Economy Regulations. Environmental and Resource Economics 74: 785-810.
Reynaert, Mathias. 2021. Abatement Strategies and the Cost of Environmental Regulations: Emission Standards on the European Car Market. The Review of Economic Studies 88(1): 454-488.
Ahmadpoor, Mohammad, and Benjamin F. Jones. 2017. The Dual Frontier: Patented inventions and prior scientific advance. Science 357(6351): 583-587.
Roach, Michael, and Wesley M. Cohen. 2013. Lens or Prism? Patent Citations as a Measure of Knowledge Flows from Public Research. Management Science 59(2): 504-525.
"Patent Stocks" and Technological Inertia
Feb 9 2022
"Patent Stocks" and Technological Inertia
There’s this idea that technology is characterized by path dependency: once you start going down one technology trajectory, you kind of get locked in and it’s hard to switch to another, possibly better trajectory. That can happen for lots of reasons, but one possibility is that it’s something about the nature of knowledge itself. The more you know, the more you can learn: knowledge begets more knowledge. So whichever technology trajectory we start on becomes the one we know the most about, and therefore the one it makes most sense to stick with. One line of evidence about this comes from dynamics of patenting. This podcast is an audio read through of the (initial version of the) article "Patent Stocks" and Technological Inertia, published on New Things Under the Sun. Articles Mentioned:Aghion, Philippe, Antoine Dechezleprêtre, David Hemous, Ralf Martin, and John Van Reenen. 2016. Carbon taxes, path dependency, and directed technical change: Evidence from the auto industry. Journal of Political Economy 124(1): 1-51. Rik, and Herman R.J. Vollebergh. 2021. Policy-Induced Innovation in Clean Technologies: Evidence from the Car Market. CESifo working paper no. 9422. Joëlle and Roger Smeets. 2015. Directing technical change from fossil-fuel to renewable energy innovation: An application using firm-level data. Journal of Environmental Economics and Management 72: 15-37. David. 2002. Induced Innovation and Energy Prices. American Economic Review 92(1): 160-180. Michael E., and Scott Stern. 2000. Measuring the “ideas” production function: evidence from international patent output. NBER Working Paper 7891. Itziar, Linda Nøstbakken, and Martino Pelli. 2017. From fossil fuels to renewables: the role of electricity storage. European Economic Review 99: 113-129. Gwangman, and Yongtae Park. 2006. On the measurement of patent stock as knowledge indicators. Technological Forecasting and Social Change 73(7): 793-812. Matthew S. 2017. Combinations of technology in US patents, 1926-2009: a weakening base for future innovation? Economics of Innovation and New Technology 27(8): 770-785.
Conservatism in Science
Jan 21 2022
It might seem obvious that we want bold new ideas in science. But in fact, really novel work poses a tradeoff: while novel ideas are sometimes much better than the status quo, they may usually be much worse. Moreover, it is hard to assess the quality of novel ideas precisely because they are so, well, novel; existing knowledge is less applicable to sizing them up. For those reasons, it might actually be better to discourage novel ideas and instead encourage slow, incremental expansion of the knowledge frontier. Or maybe not.

For better or worse, the scientific community has settled on a set of norms that appear to encourage safe and creeping science, rather than risky and leaping science.

This podcast is an audio read-through of the (initial version of the) article Conservatism in Science, published on New Things Under the Sun.

Articles mentioned:
Azoulay, Pierre, Christian Fons-Rosen, and Joshua S. Graff Zivin. 2019. Does Science Advance One Funeral at a Time? American Economic Review 109(8): 2889-2920.
Wang, Jian, Reinhilde Veugelers, and Paula Stephan. 2017. Bias against novelty in science: A cautionary tale for users of bibliometric indicators. Research Policy 46(8): 1416-1436.
Li, Danielle. 2017. Expertise versus bias in evaluation: evidence from the NIH. American Economic Journal: Applied Economics 9(2): 60-92.
Ayoubi, Charles, Michele Pezzoni, and Fabiana Visentin. 2021. Does it pay to do novel science? The selectivity patterns in science funding. Science and Public Policy 48(5): 635-648.
Boudreau, Kevin J., Eva C. Guinan, Karim R. Lakhani, and Christoph Riedl. 2016. Looking across and looking beyond the knowledge frontier: intellectual distance, novelty, and resource allocation in science. Management Science 62(10): 2765-2783.
Publication Bias is Real
Jan 20 2022
Publication bias is when academic journals make publication of a paper contingent on the results obtained. How big of an issue is this, really?

This podcast is an audio read-through of the (initial version of the) article Publication Bias is Real, published on New Things Under the Sun.

Articles mentioned:
Frankel, Alexander, and Maximilian Kasy. Forthcoming. Which findings should be published? American Economic Journal: Microeconomics.
Breznau, Nate, Eike Mark Rinke, Alexander Wuttke, Muna Adem, Jule Adriaans, Amalia Alvarez-Benjumea, Henrik K. Andersen, et al. 2021. Observing Many Researchers Using the Same Data and Hypothesis Reveals a Hidden Universe of Uncertainty. MetaArXiv. March 24. doi:10.31222/osf.io/cd5j9.
Dwan, Kerry, Douglas G. Altman, Juan A. Arnaiz, Jill Bloom, An-Wen Chan, Eugenia Cronin, et al. 2008. Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias. PLoS ONE 3(8): e3081.
Franco, Annie, Neil Malhotra, and Gabor Simonovits. 2014. Publication bias in the social sciences: Unlocking the file drawer. Science 345(6203): 1502-1505. DOI: 10.1126/science.1255484.
Andrews, Isaiah, and Maximilian Kasy. 2019. Identification of and Correction for Publication Bias. American Economic Review 109(8): 2766-94.
Camerer, Colin F., Anna Dreber, Eskil Forsell, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, et al. 2016. Evaluating replicability of laboratory experiments in economics. Science 351(6280): 1433-1436.
Open Science Collaboration. 2015. Estimating the reproducibility of psychological science. Science 349(6251): aac4716.
Christensen, Garret, and Edward Miguel. 2018. Transparency, Reproducibility, and the Credibility of Economics Research. Journal of Economic Literature 56(3): 920-80.
Wolfson, Paul J., and Dale Belman. 2015. 15 years of research on U.S. employment and the minimum wage. Tuck School of Business Working Paper No. 2705499.
Publication bias without editors? The case of preprint servers
Jan 20 2022
Publication bias can distort our picture of scientific evidence. One plausible solution is to create a home for work that, for whatever reason, struggles to find a home in a good journal. Would that work? One place to get some evidence on this is our experience with preprint servers.

This podcast is an audio read-through of the (initial version of the) article Publication bias without editors? The case of preprint servers, published on New Things Under the Sun.

Articles mentioned:
Frankel, Alexander, and Maximilian Kasy. Forthcoming. Which Findings Should be Published? American Economic Journal: Microeconomics.
Baumann, Alexandra, and Klaus Wohlrabe. 2020. Where have all the working papers gone? Evidence from four major economics working paper series. Scientometrics 124: 2433-2441.
Larivière, Vincent, Cassidy R. Sugimoto, Benoit Macaluso, Staša Milojević, Blaise Cronin, and Mike Thelwall. 2014. arXiv E-prints and the journal of record: An analysis of roles and relationships. Journal of the Association for Information Science and Technology 65(6): 1157-1169.
Tsunoda, Hiroyuki, Yuan Sun, Masaki Nishizawa, Xiaomin Liu, and Kou Amano. 2020. The influence of bioRxiv on PLOS ONE's peer-review and acceptance time. Proceedings of the Association for Information Science and Technology 57(1): e398.
Fanelli, Daniele, Rodrigo Costas, and John P. A. Ioannidis. 2017. Meta-assessment of bias in science. PNAS 114(14): 3714-3719.
Franco, Annie, Neil Malhotra, and Gabor Simonovits. 2014. Publication bias in the social sciences: Unlocking the file drawer. Science 345(6203): 1502-1505.
Brodeur, Abel, Nikolai Cook, and Anthony Heyes. 2020. Methods Matter: p-hacking and publication bias in causal analysis in economics. American Economic Review 110(11): 3634-60.
How a field fixes itself: the applied turn in economics
Jan 20 2022
Getting an academic field to change its ways is hard. But it does happen, and I think changes in the field of economics are a good illustration of some of the dynamics that make it possible.

This podcast is an audio read-through of the (initial version of the) article How a field fixes itself: the applied turn in economics, published on New Things Under the Sun.

Articles mentioned:
Leamer, Edward E. 1983. Let's Take the Con Out of Econometrics. American Economic Review 73(1): 31-43.
Hamermesh, Daniel S. 2013. Six Decades of Top Economics Publishing: Who and How? Journal of Economic Literature 51(1): 162-72.
Backhouse, Roger E., and Béatrice Cherrier. 2017. The age of the applied economist: the transformation of economics since the 1970s. History of Political Economy 49 (annual supplement): 1-33.
Angrist, Joshua D., and Jörn-Steffen Pischke. 2010. The credibility revolution in empirical economics: how better research design is taking the con out of econometrics. Journal of Economic Perspectives 24(2): 3-30.
Angrist, Josh, Pierre Azoulay, Glenn Ellison, Ryan Hill, and Susan Feng Lu. 2020. Inside job or deep impact? Extramural citations and the influence of economic scholarship. Journal of Economic Literature 58(1): 3-52.
Bédécarrats, Florent, Isabelle Guérin, and François Roubaud. 2020. Randomized control trials in the field of development. Oxford University Press.
Mercier, Hugo, and Dan Sperber. 2017. The enigma of reason. Harvard University Press.
Akerlof, George A., and Pascal Michaillat. 2018. Persistence of false paradigms in low-power sciences. PNAS 115(52): 13228-13233.
Kuhn, Thomas. 1970. The Structure of Scientific Revolutions. University of Chicago Press.
Smaldino, Paul E., and Cailin O'Connor. 2021. Interdisciplinarity can aid the spread of better methods between scientific communities. Preprint.
Heckman, James J., and Sidharth Moktan. 2020. Publishing and promotion in economics: the tyranny of the top five. Journal of Economic Literature 58(2): 419-70.
Maher, Thomas V., Charles Seguin, Yongjun Zhang, and Andrew P. Davis. 2020. Social scientists' testimony before Congress in the United States between 1946-2016, trends from a new dataset. PLOS ONE 15(3): e0230104.
Panhans, Matthew, and John D. Singleton. 2017. The empirical economist's toolkit: from models to methods. History of Political Economy 49 (annual supplement): 127-157.
de Souza Leão, Luciana, and Gil Eyal. 2019. The rise of randomized controlled trials (RCTs) in international development in historical perspective. Theory and Society 48: 383-418.
An example of successful innovation by distributed teams: academia
Jan 20 2022
It has long been assumed that the best sorts of innovation happen when smart people work in an environment where spontaneous face-to-face interaction is the norm. Importantly, if that is true, it implies the widespread transition to more remote work, where spontaneous face-to-face interaction is not possible, poses a threat to innovation. In this podcast, I want to look at a case study of a sector that:
- engages in frontier knowledge work
- has strong incentives to adopt practices that produce better outcomes
- has been well studied
- has increasingly moved to a model of remote collaboration
I am talking, of course, about academia.

This podcast is an audio read-through of the (initial version of the) article An example of successful innovation by distributed teams: academia, published on New Things Under the Sun.

Articles mentioned:
Agrawal, Ajay, John McHale, and Alexander Oettl. 2015. Collaboration, Stars, and the Changing Organization of Science: Evidence from Evolutionary Biology. In The Changing Frontier: Rethinking Science and Innovation Policy, eds. Adam B. Jaffe and Benjamin F. Jones, pgs. 75-102.
Freeman, Richard B., Ina Ganguli, and Raviv Murciano-Goroff. 2015. Why and Wherefore of Increased Scientific Collaboration. In The Changing Frontier: Rethinking Science and Innovation Policy, eds. Adam B. Jaffe and Benjamin F. Jones, pgs. 17-48.
Clancy, Matthew. 2020. The Case for Remote Work. The Entrepreneurs Network Briefing Paper.
Agrawal, Ajay, John McHale, and Alexander Oettl. 2017. How stars matter: Recruiting and peer effects in evolutionary biology. Research Policy 46(4): 853-867.
Dubois, Pierre, Jean-Charles Rochet, and Jean-Marc Schlenker. 2014. Productivity and mobility in academic research: evidence from mathematicians. Scientometrics 98: 1669-1701.
Waldinger, Fabian. 2012. Peer Effects in Science: Evidence from the Dismissal of Scientists in Nazi Germany. The Review of Economic Studies 79(2): 838-861.
Waldinger, Fabian. 2016. Bombs, Brains, and Science: The Role of Human and Physical Capital for the Creation of Scientific Knowledge. The Review of Economics and Statistics 98(5): 811-831.
Azoulay, Pierre, Joshua S. Graff Zivin, and Jialan Wang. 2010. Superstar Extinction. The Quarterly Journal of Economics 125(2): 549-589.
Kim, E. Han, Adair Morse, and Luigi Zingales. 2009. Are elite universities losing their competitive edge? Journal of Financial Economics 93(3): 353-381.
Head, Keith, Yao Amber Li, and Asier Minondo. 2019. Geography, Ties, and Knowledge Flows: Evidence from Citations in Mathematics. The Review of Economics and Statistics 101(4): 713-727.
Hellmanzik, Christiane, and Lukas Kuld. 2021. No place like home…
Measuring Knowledge Spillovers: The Trouble with Patent Citations
Jan 20 2022
As a source of data for studying innovation, patents are really seductive. There's nothing else quite like them. And at first glance, one of the most appealing things about patents is that they cite each other. That means patents might help us understand how knowledge spills over from one application to another, which is one of the most distinctive things about innovation compared to other economic activities.

But there are dangers: a citation might not mean quite what you think. This podcast looks at the shortcomings of patent citations, while ultimately concluding they can still provide value, especially when complemented with other sources of data.

This podcast is an audio read-through of the (initial version of the) post Measuring Knowledge Spillovers: The Trouble with Patent Citations, published on New Things Under the Sun.

Articles mentioned:
Moser, Petra, Joerg Ohmstedt, and Paul W. Rhode. 2016. Patent Citations—An Analysis of Quality Differences and Citing Practices in Hybrid Corn. Management Science 64(4): 1926-1940.
Jaffe, Adam B., Manuel Trajtenberg, and Michael S. Fogarty. 2000. The Meaning of Patent Citations: Report on the NBER/Case-Western Reserve Survey of Patentees. NBER Working Paper 7631.
Kuhn, J., K. Younge, and A. Marco. 2020. Patent citations reexamined. The RAND Journal of Economics 51: 109-132.
Lampe, Ryan. 2012. Strategic Citation. The Review of Economics and Statistics 94(1): 320-333.
Jaffe, Adam B., Manuel Trajtenberg, and Rebecca Henderson. 1993. Geographic Localization of Knowledge Spillovers as Evidenced by Patent Citations. The Quarterly Journal of Economics 108(3): 577-98.
Roach, Michael, and Wesley M. Cohen. 2013. Lens or Prism? Patent Citations as a Measure of Knowledge Flows from Public Research. Management Science 59(2): 504-525.
Younge, Kenneth A., and Jeffrey M. Kuhn. 2016. Patent-to-Patent Similarity: A Vector Space Model. SSRN Working Paper.
Feng, Sijie. 2020. The proximity of ideas: An analysis of patent text using machine learning. PLoS ONE 15(7): e0234880.
One question, many answers
Jan 20 2022
Suppose you set loose a bunch of scientists on the same question, letting each use their best judgment about the method used to answer it. Would you expect them to come to the same conclusions?

Unfortunately, the truth is that the state of our "methodological technology" just isn't there yet. There remains a core of unresolvable uncertainty and randomness in the best of circumstances. Science isn't certain.

This podcast is an audio read-through of the (initial version of the) article One question, many answers, published on New Things Under the Sun.

Articles mentioned:
Huntington-Klein, Nick, Andreu Arenas, Emily Beam, Marco Bertoni, Jeffrey R. Bloem, Pralhad Burli, et al. 2021. The influence of hidden researcher decisions in applied microeconomics. Economic Inquiry 59: 944-960.
Silberzahn R, Uhlmann EL, Martin DP, et al. 2018. Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results. Advances in Methods and Practices in Psychological Science 1(3): 337-356.
Breznau, Nate, Eike Mark Rinke, Alexander Wuttke, Muna Adem, Jule Adriaans, Amalia Alvarez-Benjumea, Henrik K. Andersen, et al. 2021. Observing Many Researchers Using the Same Data and Hypothesis Reveals a Hidden Universe of Uncertainty. MetaArXiv. March 24.
Bastiaansen, Jacqueline A., Yoram K. Kunkels, Frank J. Blaauw, Steven M. Boker, Eva Ceulemans, Meng Chen, Sy-Miin Chow, et al. 2020. Time to get personal? The impact of researchers' choices on the selection of treatment targets using the experience sampling methodology. Journal of Psychosomatic Research 137: 110211.
Swensen, Isaac, Jason M. Lindo, and Krishna Regmi. 2020. Stable Income, Stable Family. NBER Working Paper 27753.