Neil Armstrong: 'Research is creating new knowledge.'

Research is creating new knowledge.

The quote by Neil Armstrong, "Research is creating new knowledge," is a concise yet powerful statement that encapsulates the essence and significance of research. In straightforward terms, it implies that research is not merely about gathering existing information but about generating fresh insights and uncovering previously unknown truths. It emphasizes the transformative nature of research and highlights its role in driving progress and innovation across fields.

At first glance, this quote may seem self-explanatory, as it points to the fundamental purpose of conducting research. However, by introducing an unexpected philosophical concept, we can delve deeper into its meaning and stimulate intriguing discussions. Let us consider the philosophy of constructivism in the context of this quote to illustrate the subtle complexities involved in the process of knowledge creation.

Constructivism, in essence, posits that knowledge is actively constructed by individuals based on their experiences, interpretations, and interactions with the world. It suggests that knowledge is not simply absorbed passively from the environment but emerges through the active engagement of an individual's cognitive processes. This approach challenges the traditional notion of knowledge as an objective and static entity, and instead portrays it as a dynamic and subjective construct.

When we apply the concept of constructivism to Armstrong's quote, a thought-provoking comparison and contrast emerge. Research, as discussed earlier, is undoubtedly a means of creating new knowledge. However, the philosophical lens of constructivism encourages us to ponder the extent to which researchers themselves shape and construct the knowledge they produce, and to question the role of researchers as active participants in the process of knowledge creation.

In the realm of scientific research, for instance, researchers often rely on a hypothesis-driven methodology. They formulate a hypothesis, design experiments, gather data, and analyze the results to draw conclusions. In this process, researchers engage in critical thinking, creative problem-solving, and reasoning to interpret the data and derive meaningful insights. These cognitive processes are inherently subjective and influenced by factors such as personal biases, cultural backgrounds, and intellectual perspectives.

While this subjectivity does not undermine the rigor and credibility of scientific research, it does remind us that the process of knowledge creation is not devoid of human influence. Researchers, as active agents, contribute their unique perspectives, interpretations, and intellectual abilities to generate new knowledge. This understanding calls for a more nuanced appreciation of research as a dynamic interplay between the objective world and the subjective interpretations of researchers.

Another intriguing aspect to consider is the collaborative nature of research. In many cases, research efforts are not undertaken by lone individuals but involve teams of researchers from diverse backgrounds and disciplines. The interplay of different perspectives, expertise, and methodologies in collaborative research projects can lead to an even richer creation of new knowledge. Through collective brainstorming, interdisciplinary exchanges, and the synthesis of ideas, collaborative research can produce innovative and multidimensional insights that surpass the boundaries of individual contributions.

In conclusion, Neil Armstrong's quote, "Research is creating new knowledge," encapsulates the transformative nature and significance of research. While appearing straightforward at first, a deeper exploration of the philosophical concept of constructivism sheds light on the dynamic and subjective nature of knowledge creation. By acknowledging the role of researchers as active participants who construct knowledge through their unique perspectives and experiences, we gain a more profound understanding of the complexities involved in the research process. Additionally, the collaborative nature of research highlights the value of diverse perspectives and the potential for even greater knowledge creation. Ultimately, this quote serves as a reminder of the boundless possibilities that research offers in expanding our understanding of the world.



How research is creating new knowledge and insight

31 May 2017

Darshan Patel

The pursuit of knowledge and discovery has always been an intrinsic human characteristic, but when new knowledge is curated and put in the right hands it has the power to bring about high value change to society. Here at the Health Foundation developing and sharing evidence on what works and why is central to our work to bring about better health and health care for people in the UK.

Generating new knowledge and insight

We have a long-standing commitment to research. As I write this, the Health Foundation is currently supporting or working on over 160 research projects. And since 2004, for every £3 of grant funding we have awarded, around £1 has been invested in research and evaluation. All of this work has developed our understanding of how to improve aspects of health and health care.

But how do we go about this? Well, we generate new knowledge and insight through a blend of in-house and externally commissioned analysis and research, innovative researcher-led open calls, support for individuals through research Chairs and Fellowships, and rigorous evaluations of our improvement work, as well as evaluations of the work of others.

And we don’t stop there. We try to ensure that our researchers are also well connected to their research colleagues, policy makers and practitioners working in the health service. We hold safe-space networking and knowledge exchange events, share new research widely and support researchers to develop relationships and new connections with policy makers and health care professionals. We also draw on independent expertise to ensure our research is well grounded in policy and practice.

One example of this is our Efficiency Research Programme Advisory Group, which provides expertise and guidance on efficiency across the health sector to the seven research teams we are funding. This month its chair, Professor Peter Smith, Emeritus Professor of Health Policy at Imperial College, explains why efficiency is such an important subject area in health and social care and outlines some of the challenges that researchers face. Our aim in proactively supporting research in these ways is to share new knowledge and insights quickly and in ways that can be applied by those who make decisions or deliver care.

Informing policy and improvement work

We are proud to see all our effort having impact. Over the last few years we have actively contributed to effective policy development nationally and seen an increase in the number of areas where our views and expertise have been used.

Our research has contributed to the Five Year Forward View, formed the basis for the key recommendations on safety measurement in the Berwick Review of Patient Safety in England, and also underpinned funding increases for the NHS in Wales, to name but a few.

Our research and evaluation places a strong emphasis on improving health service delivery and patient care, and our recent work to advise on the national evaluation of complex new models of care is bearing fruit. Our Improvement Analytics Unit is providing rapid feedback on the impact of new care models, allowing those delivering frontline improvements to make real-time course corrections.

As we look to a future with a healthier population, we want decision makers to have evidence that is useful and that reflects the nature of the health challenges we face today. We are working with Dr Harry Rutter of the London School of Hygiene and Tropical Medicine on ways to overcome the research challenges of evaluating complex systems in population health.

Taking the long view

But often impact can take time. Our research and evaluations into patient safety and person-centred care have been active for more than a decade. When we began working in these areas it was first necessary to convince people of the need for change before action could be taken at scale. Now we’re pleased to see that both patient safety and person-centred care are embedded in the understanding of high quality care, and national policy reflects some of the insights and approaches we have pioneered with the NHS. In both areas we have been the chosen partner for system-wide implementation plans, for example the Q initiative, which is connecting people with improvement expertise across the UK, and Realising the Value, which builds on what we know about person-centred care.

Our research investment is also helping to build capability and capacity within the research community. Our funding has led to academic career progression, provided stability to enable research teams to grow, and allowed multi-disciplinary teams to carve out a reputation for high-quality research in their particular areas. In some instances our funding has even been leveraged to establish financially sustainable research units.

Plans for the coming year

This year we are going even further to promote research capability, but this time within the NHS. Our exciting new programme, Advancing Applied Analytics, launching in June, will support NHS analysts to develop and test novel analytical applications that have the potential to contribute to improvements in patient care and population health.

And, in six short years our contribution to building a stronger scientific underpinning for quality improvement has culminated in the establishment of a ground-breaking Improvement Research Institute, the first of its kind in Europe. Led by Professor Mary Dixon-Woods of the University of Cambridge, the Institute will strengthen the evidence base for how to improve health care, growing capacity in research skills in the NHS, academia and beyond. I look forward to seeing how the Institute generates new and exciting areas of research and enables wide participation in large scale research programmes.

2017 is going to be particularly exciting as we will be launching a number of researcher-led open calls over the year.

Our Insight 2017 programme is currently open, until 25 July 2017, for ideas to support research that can advance the use of national clinical audits and patient registries to improve healthcare quality.

Later this year we will also be launching a second round of our Behavioural Insights researcher-led open call. Behavioural insights research is gaining widespread traction as a complementary policy lever in tackling the many challenges in improving health and health care. We are thrilled to be a leading funder in this area, working alongside experts such as the Behavioural Insights Team. Hannah Burd tells us more this month, explaining how small behaviourally informed changes can lead to significant reductions in inefficiency and waste in health care.

So there you have it, our research in a nutshell.

And, what excites me the most about these 160 or so research projects is the impact that we hope to make in improving health and health care over the next decade. So, keep watching this space and please do get in touch if you would like to know more.

Darshan Patel is a Senior Research Manager at the Health Foundation





What is knowledge and when should it be implemented?

Affiliation

  • Knowledge Translation Research Network, Ontario Institute for Cancer Research, Toronto, ON, Canada. [email protected]
  • PMID: 22994990
  • DOI: 10.1111/j.1365-2753.2012.01899.x

A primary purpose of research is to generate new knowledge. Scientific advances have progressively identified optimal ways to achieve this purpose. Included in this evolution are the notions of evidence-based medicine, decision aids, shared decision making, measurement and evaluation as well as implementation. The importance of including qualitative and quantitative methods in our research is now understood. We have debated the meaning of evidence and how to implement it. However, we have yet to consider how to include in our study findings other types of information such as tacit and experiential knowledge. This key consideration needs to take place before we translate new findings or 'knowledge' into clinical practice. This article critiques assumptions regarding the nature of knowledge and suggests a framework for implementing research findings into practice.

© 2012 Blackwell Publishing Ltd.


Open access | Published: 21 July 2020
The evolution of knowledge within and across fields in modern physics

Ye Sun & Vito Latora

Scientific Reports volume 10, Article number: 12097 (2020)


The exchange of knowledge across different areas and disciplines plays a key role in the process of knowledge creation, and can stimulate innovation and the emergence of new fields. We develop here a quantitative framework to extract significant dependencies among scientific disciplines and turn them into a time-varying network whose nodes are the different fields, while the weighted links represent the flow of knowledge from one field to another at a given period of time. Drawing on a comprehensive data set on scientific production in modern physics and on the patterns of citations between articles published in the various fields in the last 30 years, we are then able to map, over time, how the ideas developed in a given field in a certain time period have influenced later discoveries in the same field or in other fields. The analysis of knowledge flows internal to each field displays a remarkable variety of temporal behaviours, with some fields of physics proving more self-referential than others. The temporal networks of knowledge exchanges across fields reveal cases of one field continuously absorbing knowledge from another field over the entire observed period, pairs of fields mutually influencing each other, but also cases of evolution from absorbing to mutual or even to back-nurture behaviors.


Introduction

Knowledge creation and knowledge sharing go hand in hand. Knowledge is in fact created through the combination and integration of different concepts, and can benefit from social interactions and interdisciplinary collaborations. Recent works have explored from many angles how knowledge flows across scholars 1, 2, 3, 4, 5, institutions 6, 7, 8, 9 and disciplines 10, 11, 12. In particular, it has been shown that knowledge exchange across fields can influence the evolution of culture and language 13, 14, strengthen multi-faceted cooperation 3, 15, and drive the innovation and development of science 16, 17, 18, 19. Research publications are one of the primary channels of communication for the exchange and spreading of knowledge in science 20. By publishing their own articles and citing works by their peers, researchers continuously contribute to the processes of knowledge creation, knowledge sharing and knowledge acquisition 21, thereby promoting the advancement of science. The presence of a citation between two research articles often denotes a certain transfer of knowledge from the cited article to the citing article. It is therefore natural to use citations between articles published in different scientific fields to investigate the flow of knowledge across different domains of science. Although some works in this direction have already started to elucidate the main mechanisms of knowledge sharing and diffusion 22, 23, 24, a systematic study of how knowledge evolves in time 25 and of the complex interactions and influences between different fields 26 is still lacking.

In this article, we propose a novel framework to detect and quantify relevant transfers of knowledge across disciplines and between different time periods. One of the outcomes of the method is the construction of a time-varying network mapping the structure of knowledge and the relations between disciplines. In particular, we present an application to the evolution of scientific knowledge in modern physics, namely to investigate how influences from one field of physics to another have evolved over the last 30 years. Building on bibliographic information of over 430,000 articles published by the American Physical Society (APS) between 1985 and 2015, and making use of the highest-level Physics and Astronomy Classification Scheme (PACS) codes, which indicate the fields of physics an article belongs to, we construct a temporal network where the nodes represent the fields of modern physics and the directed links denote the presence of a significant dependence of one field on another. Such a network changes over time and, as we will show below, its analysis by the methods of network science 27, 28 is able to reveal essential properties of how knowledge is exchanged among fields and over different time periods. We have found that, overall, knowledge flows have become increasingly homogeneous over recent years, indicating the important role of interdisciplinary research 10, 11, 29, 30, 31. In spite of this, some typical patterns of influence, such as cases of one field absorbing knowledge from another field, or two fields mutually influencing each other, clearly emerge at the microscopic scale. Our findings provide insights into the basic mechanisms of knowledge exchange in science, and can prove very useful for understanding the dynamics of scientific production and the growth of novelties in scientific domains 32, 33, 34.

The fields of modern physics

An overall insight into the main fields of modern physics can be obtained by a basic analysis of the characteristics of the APS data set. Considered at their highest level, PACS codes divide modern physics into ten major fields (see Table 1). A measure of the relevance of each field can then be derived from the volume of papers published in each field. Since each paper can be listed with multiple PACS codes, we assign it to multiple fields. We therefore consider each paper as one unit of knowledge and define the field composition of the paper as the relative frequency of its PACS codes. For instance, if a paper is listed with the three PACS codes \({\textit{89.75.-k}}\), \({\textit{81.05.-t}}\) and \({\textit{05.45.-a}}\), we assign two-thirds of this paper to Interdisciplinary Physics (PACS 80), and the remaining one-third to General Physics (PACS 00). The total number of papers \(N_{paper}\) associated with each field over the entire time period of 31 years is reported in Table 1. One can see that the three largest fields are Condensed Matter (PACS 60 and 70) and General Physics (PACS 00), capturing \(57\%\) of all publications in APS journals. GPE is the smallest field, with only 8325 papers, which is roughly one-fifteenth the size of the largest field CM 2. To quantify and compare the growth rate of each field, the average yearly change in the number of papers \(\Delta N_{paper}\) is also reported in Table 1. Consistent with the rankings based on field sizes, GEN and CM 2 also exhibit the highest growth rates, larger than 100 papers per year, while GPE shows the slowest increase, with only 5 papers per year on average. On the contrary, CM 1, the third-largest field by size, is ranked fourth from the bottom according to the average growth \(\Delta N_{paper}\). An opposite trend is observed for Interdisciplinary Physics and Astrophysics, which respectively take fifth and sixth place according to their growth rates, although their field sizes are ranked eighth and ninth among these fields, reflecting their rapid development during the observation period.
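To make the fractional paper-counting scheme concrete, here is a minimal sketch in plain Python (helper names and the printed example are illustrative, not code from the paper) of how a paper's PACS codes can be turned into the field composition described above; summing these fractions over all papers would give field sizes of the kind reported in Table 1.

```python
from collections import Counter

def field_composition(pacs_codes):
    """Fractional assignment of one paper to top-level fields.

    Each PACS code is mapped to its top-level field via its first digit
    (e.g. '89.75.-k' -> 'PACS 80'), and the paper contributes to each field
    in proportion to the relative frequency of its codes.
    """
    fields = ["PACS " + code[0] + "0" for code in pacs_codes]
    counts = Counter(fields)
    return {field: n / len(fields) for field, n in counts.items()}

# Example from the text: 2/3 of the paper goes to PACS 80, 1/3 to PACS 00.
print(field_composition(["89.75.-k", "81.05.-t", "05.45.-a"]))
# {'PACS 80': 0.666..., 'PACS 00': 0.333...}
```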

We found that \(91\%\) of the papers have more than one PACS code, with \(36\%\) of them carrying PACS codes from at least two different fields. To quantify the level of interdisciplinarity of a given field, we have collected all the papers with at least one PACS code from that field, and then calculated the proportion J of these papers that are also classified by at least one PACS code from other fields. The results in Table 1 show that Interdisciplinary Physics is the field with the largest value of J: almost \(90\%\) of the Interdisciplinary Physics papers are also classified by PACS codes from other fields of physics. This result is consistent with the expectation that interdisciplinary research combines knowledge from various disciplines. By contrast, papers in the fields of Nuclear, Particles and Condensed matter 2 physics are more likely to use PACS codes from their own fields. Summing up, the above analyses indicate that the differences between fields of physics are remarkable, both in terms of the size and growth of the fields and in terms of their interactions with other fields.
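In the same spirit, a rough sketch of the interdisciplinarity proportion J (again plain Python with made-up example papers, assuming the same first-digit mapping from PACS code to top-level field) could look as follows.

```python
def top_level(code):
    # Top-level field of a PACS code, e.g. '64.60.aq' -> 'PACS 60'.
    return "PACS " + code[0] + "0"

def interdisciplinarity(papers, field):
    """Proportion J of `field` papers that also use codes from other fields."""
    in_field = [p for p in papers if any(top_level(c) == field for c in p)]
    mixed = [p for p in in_field if any(top_level(c) != field for c in p)]
    return len(mixed) / len(in_field) if in_field else 0.0

# Two illustrative papers touching PACS 80; only the first also uses PACS 00.
papers = [["89.75.-k", "81.05.-t", "05.45.-a"], ["89.20.-a"]]
print(interdisciplinarity(papers, "PACS 80"))  # 0.5
```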

The knowledge flow network

Interactions among scientific fields can be better characterized by making use of scientific citations. When a published article in one scientific field cites articles from another field, the cited field provides a piece of previously existing knowledge that the citing field builds upon. This, in turn, indicates a flow of knowledge from the cited field to the citing field. Hence, we can construct a network of knowledge flow across fields by analyzing the pattern of citations among papers of different fields. The nodes of such a network represent the ten fields of modern physics as indicated by the PACS codes, while the directed links between fields denote the flows of knowledge from one area of physics to another.

Figure 1. The knowledge flow network and its time evolution. (a) Illustration of how a citation between two papers is translated into a contribution to the knowledge flow between the two corresponding fields. (b) Construction of a weighted network of knowledge flow based on the significance of each link. (c) The knowledge flow network among different fields of physics in the years 1990, 2000 and 2010. Node sizes are proportional to the number of papers published in each field in the given year, and line widths correspond to the weights of knowledge flows between two fields. Links with weights larger than 1 are highlighted in red. Arrows represent the direction of knowledge flows.

Specifically, for a given citation c there will be a transfer of knowledge from each PACS code in the cited reference to all the PACS codes in the citing article. We hence indicate as \(f^{c}_{\alpha \rightarrow \beta }\) the volume of knowledge flow from field \(\alpha\) to field \(\beta\) due to citation c. As shown in Fig. 1a, this is calculated as the product of the proportion of PACS codes from field \(\beta\) in the citing article times the proportion of PACS codes from field \(\alpha\) in the cited article. This ensures the normalization \(\sum _{\alpha }\sum _{\beta }{f}^{c}_{\alpha \rightarrow \beta }=1\) for each citation c, meaning that each citation contributes a unit of knowledge transfer that is then split among the different fields. For instance, in Fig. 1a, two of the three PACS codes of the cited article belong to field \(\alpha\), while one of the two PACS codes in the citing article is from field \(\beta\). Consequently, we assume that the volume of knowledge flowing from field \(\alpha\) to field \(\beta\), due to citation c, is \(f^{c}_{\alpha \rightarrow \beta }=2/3 \times 1/2\). Similarly, we can calculate the quantities \(f^{c}_{\alpha \rightarrow \alpha }\), \(f^{c}_{\beta \rightarrow \alpha }\) and \(f^c_{\beta \rightarrow \beta }\). In order to characterize the flow of knowledge across fields and to study its evolution over the years, we construct yearly aggregated networks by selecting different pairs of years for citing and cited articles respectively. This is done by analyzing all the citations from papers published in a given year t to papers published in year \(t-n\), and defining the total volume of knowledge flowing from field \(\alpha\) to field \(\beta\) as:

\(F_{\alpha \rightarrow \beta }^{t-n \rightarrow t} = \sum _{c} f^{c}_{\alpha \rightarrow \beta }\),

where the sum runs over all citations c from papers published in field \(\beta\) in year t to papers published in field \(\alpha\) in year \(t-n\). Notice that n is a tunable parameter, denoting the relative age of cited papers with respect to the citing year t. Having the possibility to vary both t and n allows us to take into account that the probability of a citation is influenced by: (1) the relative age of the two papers 35, and (2) the number of papers published in the cited year \(t-n\). The quantities \(F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}\) are, however, affected by field-specific characteristics and publishing conventions, such as typical field sizes and time-varying growth rates, which, as shown in Table 1, may vary a lot from field to field. Other influencing factors not discussed here might exist, such as differences in reference list length across fields, which could be considered in a more detailed study. Hence, an increase of \(F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}\) over time does not automatically reflect a closer relation between fields \(\alpha\) and \(\beta\), as it may simply be due to a rapid growth in the number of publications in these two fields. In order to account for this, we define the statistical significance \(\phi (\alpha _{t-n},\beta _{t})\), which quantifies how much the observed knowledge flow \(F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}\) exceeds the flow expected in an appropriately chosen null model (see the “Methods” section).

Figure 1b illustrates an example of how to calculate the quantities \(\phi (\alpha _{t-n},\beta _{t})\) in Eq. (5). Suppose, for instance, field \(\alpha\) in year \(t-n\) provides a total of 75 units of knowledge to all the fields in year t, with field \(\alpha\) itself receiving 50 of these 75 units and \(\beta\) getting 25, i.e. \(F_{\alpha \rightarrow \alpha }^{t-n \rightarrow t}=50\) and \(F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}=25\). Analogously, we assume field \(\beta\) in year \(t-n\) provides a total of 25 units to year t, 15 to field \(\beta\) itself and 10 to \(\alpha\). Now, the marginal probability that the citing field is \(\beta\) can be obtained as \(Pr(X_{t}^{citing}=\beta )=\sum _{\alpha } F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}/\sum _{\alpha ,\beta } F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}=(25+15)/(75+25)\), while \(Pr(Y_{t-n}^{cited}=\alpha | X_{t}^{citing}=\beta )= F_{\alpha \rightarrow \beta }^{t-n \rightarrow t}/\sum _{\alpha } F_{\alpha \rightarrow \beta }^{t-n \rightarrow t} = 25/(25+15)\). We therefore have \(P(\alpha _{t-n}, \beta _{t}) = 25/40 \cdot 40/100\). Such a probability needs to be compared to that of a null model in which \(P^{\mathrm{rand}}(\alpha _{t-n}, \beta _{t})=Pr(Y_{t-n}^{cited}=\alpha ) \cdot Pr(X_{t}^{citing}=\beta )= 80/150 \cdot 40/100\), since the probability \(Pr(Y_{t-n}^{cited}=\alpha )\) that the cited paper in year \(t-n\) is in field \(\alpha\) is equal to \(80/(80+70)\), where 80 and 70 are the amounts of knowledge available in fields \(\alpha\) and \(\beta\) in year \(t-n\), as shown in Fig. 1b. Finally, the ratio \(\phi (\alpha _{t-n},\beta _{t})\) in Eq. (5) is equal to \(25/40 \cdot 150/80\).
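As a self-contained illustration, the following plain-Python sketch reproduces the per-citation flow of Fig. 1a and the significance of the worked example above. The per-field totals of available knowledge in year t-n (80 units for \(\alpha\), 70 for \(\beta\)) are read off Fig. 1b rather than stated in the text, and the field labels are of course illustrative; this is not the authors' code.

```python
def citation_flow(cited_fields, citing_fields, cited, citing):
    """f^c: fraction of the cited paper's PACS codes in `cited`, times the
    fraction of the citing paper's PACS codes in `citing`."""
    return (cited_fields.count(cited) / len(cited_fields)) * \
           (citing_fields.count(citing) / len(citing_fields))

# Fig. 1a example: cited paper has 2 of 3 codes in alpha, citing paper has
# 1 of 2 codes in beta, so f^c_{alpha->beta} = 2/3 * 1/2.
print(citation_flow(["alpha", "alpha", "beta"], ["beta", "alpha"], "alpha", "beta"))

# Fig. 1b example: observed aggregated flows F[(cited, citing)] and the
# knowledge available in each field in the cited year t-n.
F = {("alpha", "alpha"): 50, ("alpha", "beta"): 25,
     ("beta", "beta"): 15, ("beta", "alpha"): 10}
available = {"alpha": 80, "beta": 70}

def significance(cited, citing):
    total = sum(F.values())
    p_citing = sum(F[(a, citing)] for a in available) / total             # Pr(X = citing)
    p_cond = F[(cited, citing)] / sum(F[(a, citing)] for a in available)  # Pr(Y = cited | X = citing)
    p_obs = p_cond * p_citing                                             # P(cited, citing)
    # Null model: citing papers pick cited papers at random, so the cited
    # field only matters through how much knowledge it has available.
    p_null = (available[cited] / sum(available.values())) * p_citing
    return p_obs / p_null

print(significance("alpha", "beta"))  # 25/40 * 150/80 = 1.171875
```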

Analogously, we can calculate the statistical significance of all the other flows reported in Fig. 1b. Such quantities allow us to capture the intrinsic variation of knowledge flows among fields and also to compare different pairs of fields. Finally, to obtain the weights of knowledge flows in year t from a cited time window \(\Delta t'\), we define the flow weights \(w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t}\) for each couple of cited field \(\alpha\) and citing field \(\beta\) as:

\(w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t} = \frac{1}{\vert \Delta t' \vert } \sum _{n \in \Delta t'} \phi (\alpha _{t-n}, \beta _{t})\),

where \(\vert \Delta t' \vert\) is the length of the time window. Letting \(\Delta t'=[1, 5]\) and \(\vert \Delta t' \vert =5\), one can construct the significant knowledge flow network in each year t from the previous 5 years. Furthermore, for each given source period \(\Delta t'\), one can also investigate the knowledge flows within an observing period \(\Delta t\):

\(w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow \Delta t} = \frac{1}{\vert \Delta t \vert } \sum _{t \in \Delta t} w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t}\).

For example, with \(\vert \Delta t \vert =5\), we can divide the entire time period into five observing windows, namely [1990, 1994], [1995, 1999], [2000, 2004], [2005, 2009] and [2010, 2014]. The weight of each link in the network reflects how significant the knowledge flow between the two related fields is. This quantitative framework enables us to investigate the evolution of knowledge flows along two time dimensions: (1) for each given observing period \(\Delta t\), the weights of knowledge flows from different time intervals \(\Delta t'\) can be observed; and (2) for each fixed \(\Delta t'\), the weights of knowledge flows within different observing periods \(\Delta t\) can be compared. Although in this paper we have studied knowledge flows across the ten major fields of physics, we believe that our framework can also give important information when applied to investigate knowledge transfer among subfields at any possible level of the hierarchy.
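A minimal sketch of how the yearly significances could be averaged into the two levels of flow weights just defined; the callable phi, assumed to return \(\phi (\alpha _{t-n}, \beta _{t})\), is a placeholder supplied by the user and not part of the paper's code.

```python
def flow_weight_year(phi, cited, citing, t, cited_window=range(1, 6)):
    """w for year t from the cited window Delta t' = [1, 5]: the average of
    phi(cited_{t-n}, citing_t) over the relative ages n in the window."""
    return sum(phi(cited, citing, t, n) for n in cited_window) / len(cited_window)

def flow_weight_period(phi, cited, citing, years, cited_window=range(1, 6)):
    """w for an observing period Delta t: the average of the yearly weights."""
    return sum(flow_weight_year(phi, cited, citing, t, cited_window) for t in years) / len(years)

# e.g. flow_weight_period(phi, "GAA", "EPF", range(1990, 1995)) for the
# observing window [1990, 1994].
```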

Temporal analysis of knowledge flow networks

We first investigate how the overall properties of the knowledge flow networks have changed over time. Specifically, for each year we have evaluated the flows of knowledge from the previous 5 years, i.e. we have fixed \(\Delta t'=[1,5]\) and \(\vert \Delta t \vert =1\) in our framework. To better visualize the temporal changes, the whole knowledge flow networks obtained for the three years 1990, 2000 and 2010 are reported in Fig. 1c. Links representing a significant flow of knowledge (\(w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t}>1\)) are shown in red.

Figure 2. Temporal analysis of knowledge flow networks. (a) The number of significant links in the knowledge flow network as a function of the year, together with its time average (dashed grey line). (b) Network reciprocity, measuring the proportion of bidirectional links, shows a pattern with a peak around 1998. The top \(50\%\) of bidirectional links with the largest sum of mutual weights were considered in the computation of the reciprocity. (c) The Z-score of two types of three-node motifs as a function of time. (d) Mean and standard deviation of the weights of significant links gradually decrease over time, indicating that the knowledge flows between fields tend increasingly towards the random expectation.

The first thing to notice is that the number of significant links is roughly constant over the years, as also illustrated in Fig. 2a. In addition, we observe that more links are reciprocated in 2000 than in 1990 and 2010, which suggests that situations in which pairs of fields mutually influence each other were more common around 2000. To further examine this, we have computed the network reciprocity (see the “Methods” section) for each year. The results reported in Fig. 2b indicate that the value of the reciprocity \(\rho\) increased in the first few years, reached a peak around 1998, and then began to decrease in the following years. This has led us to conclude that the highest levels of mutuality in knowledge transfer among different fields of physics were experienced between 1995 and 2000.
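One possible reading of the reciprocity measure used in Fig. 2b is sketched below in plain Python; the way the top-50% restriction on bidirectional pairs is applied here is an assumption based on the figure caption, not the authors' implementation. Here `edges` maps directed pairs of fields to the weights of their significant links.

```python
def reciprocity(edges, top_fraction=0.5):
    """Proportion of directed links that belong to a retained bidirectional pair."""
    pairs = {}
    for (u, v), w in edges.items():
        if u == v:
            continue  # internal flows are not counted as reciprocated pairs
        pairs.setdefault(tuple(sorted((u, v))), {})[(u, v)] = w
    # Bidirectional pairs, ranked by the sum of their mutual weights.
    mutual = {k: sum(d.values()) for k, d in pairs.items() if len(d) == 2}
    ranked = sorted(mutual, key=mutual.get, reverse=True)
    kept = ranked[: max(1, round(top_fraction * len(ranked)))] if ranked else []
    return 2 * len(kept) / len(edges) if edges else 0.0
```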

We have then extracted the typical patterns of knowledge transfer in the network. To this end, we have focused on the statistically significant three-node motifs in the knowledge flow networks 36, i.e. the directed connected subgraphs of three nodes that appear in the network more often than they would occur by chance. Figure 2c illustrates the Z-scores (see the “Methods” section) of two relevant three-node motifs over the years. One can see that the subgraph represented by bi-directed paths (diamond symbol) is the most significant motif throughout the whole time period, with a Z-score on average equal to about 6. Furthermore, complete subgraphs of three nodes, corresponding to three mutually connected fields of physics, are only statistically significant in the period from 1998 to 2000, where the complete subgraph GEN, EOA and IPR appears. Notice that this period also corresponds to the period of high reciprocity in Fig. 2b.
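The motif Z-score mentioned above compares the motif count in the real network with the counts in an ensemble of randomized networks; a generic sketch follows, where `count_motif` and `randomize` are placeholder callables supplied by the user rather than code from the paper.

```python
from statistics import mean, stdev

def motif_zscore(network, count_motif, randomize, n_random=1000):
    """Z = (N_real - <N_rand>) / std(N_rand) for a given three-node motif."""
    n_real = count_motif(network)
    n_rand = [count_motif(randomize(network)) for _ in range(n_random)]
    spread = stdev(n_rand)
    return (n_real - mean(n_rand)) / spread if spread > 0 else float("inf")
```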

In addition, Fig. 1c indicates that there are fewer links with large weights in 2010 than in 1990 and 2000. To further investigate this trend, Fig. 2d reports the mean \(\overline{w}\) and standard deviation \(\sigma _w\) of the weights of significant links between 1990 and 2016. We find that both \(\overline{w}\) and \(\sigma _{w}\) gradually decrease over time and eventually stabilize at values slightly above 1 and 0 respectively. This indicates that the exchange of knowledge across domains has become increasingly homogeneous with respect to the beginning of the observation period, when each field only absorbed knowledge from a handful of close domains. From a different perspective, this also reflects a rise in the interdisciplinary character of research in physics.

Internal knowledge flows

The weights of the internal flows \(w_{\alpha \rightarrow \alpha }^{\Delta t' \rightarrow t}\) from a field \(\alpha\) to itself are an indication of the degree of self-dependence of the research field. To investigate the evolution of the internal knowledge flows, we have computed, for each of the ten fields of physics, the weights of the internal flows in every observing year t (between 1990 and 2015) from each of the previous 5 years, namely adopting a cited time window \(\Delta t'=[1,5]\). Figure 3a indicates that the internal knowledge flows are significant (\(w_{\alpha \rightarrow \alpha }^{\Delta t' \rightarrow t}>1\)) for all ten fields over the whole observation period, although the temporal trends can vary from field to field. The two fields with the largest variations are GAA and GPE. Field GAA exhibits a remarkable decrease in the degree of self-reference after 1993, indicating that in this field the internal transfer of knowledge has become less and less significant over time. Conversely, field GPE shows an increasing trend and becomes the most self-referential field after 1995. Other fields exhibit decreasing (EOA and IPR) or increasing (NUC and ATM) patterns, while the contributions of internal flows are relatively low and remain nearly constant for fields such as GEN, CM1 and CM2.

In Fig. 3b-e we focus on the evolution of internal flows of the four fields EPF, NUC, IPR and GAA. In particular, we perform a two-dimensional analysis in which we change the positions of both the observing time window \(\Delta t\) and the source time window \(\Delta t'\). We consider the case where the lengths of the two time windows are the same and equal to 5 years. The colours in Fig. 3b-e represent the values of the internal knowledge flows \(w_{\alpha \rightarrow \alpha }^{\Delta t' \rightarrow \Delta t}\). By looking at the variation of colours in each row we find that field NUC shows an increasingly high degree of self-reference over time, while IPR and GAA tend to lower their degree of internal flows, which is consistent with the results in Fig. 3a.

By looking at the variation of colours over each column of Fig. 3b-e we can instead investigate the influence of a reference's age on the internal flows of knowledge. One can see that fields such as NUC and IPR show a decreasing trend from the most recent times to the past, in agreement with previous studies stating that the likelihood of a paper being discovered significantly decreases with the paper's age 37. By contrast, we observe an unexpected and very clear pattern for EPF and GAA, since both fields exhibit a maximum of the values along the anti-diagonal line. Notice that each square along the anti-diagonal line represents the same cited time window, namely the window of 5 years before the period [1990, 1994]. This may be due to important discoveries and the publication of pioneering research works in the fields EPF and GAA during the period [1985, 1990], which would clearly increase the probability for researchers in the field to cite, in the following years, papers published in that period. A possible explanation is, for instance, the rapid development in the period [1985, 1989] of the new research area “astroparticle physics”, emerging at the intersection of particle physics, astronomy and astrophysics 38, which mainly combines knowledge from fields EPF and GAA. As evidence of this rapid development, notice that a new journal named “Astroparticle Physics” was established in 1992. Moreover, the fact that the weights of the internal flows in GAA are nearly three times larger than those in EPF can be attributed to the Hubble Space Telescope, one of the major scientific breakthroughs in field GAA. The telescope is one of the largest and most productive research tools for astronomy, and it was launched in 1990 (within the period of interest on the anti-diagonal line), greatly promoting the development of astronomy in GAA. We have further examined the evolution of internal flows for the remaining six fields and found similar patterns (see Supplementary Information (SI)).

The evolution of knowledge flows across fields

Examining how the discoveries in one field have contributed to a different field of physics is even more important than studying the flows of knowledge within a given field. In order to get an overall picture of the existing influences across different fields of modern physics, we report in Fig. 4a the average weights of knowledge flows between each pair of fields over the whole period under study. To highlight the mutual exchange of flows, the results are shown in a (\(\overline{w}_{\alpha \rightarrow \beta }\))-(\(\overline{w}_{\beta \rightarrow \alpha }\)) plane. Each point refers to a pair of fields, and the distance from the point to the bisector (red line) measures the level of asymmetry in the exchange of knowledge between the two fields. We notice that most of the points are concentrated around the bisector, especially those in the lower-left corner corresponding to pairs of fields with small significance weights. However, there are also points far from the line, such as the one corresponding to the pair GPE and ATM (red up-triangle in the lower right of the panel), indicating asymmetric transfers of knowledge between the two fields.

Figure 3. Evolution of knowledge flows within a field. (a) For each field \(\alpha\) and each year t, we plot the internal knowledge flow \(w_{\alpha \rightarrow \alpha }^{\Delta t' \rightarrow t}\) from a window of the previous 5 years to t. (b)-(e) The evolution of internal flows for four specific fields in two-dimensional plots over \(\Delta t\) and \(\Delta t'\). Comparing the change in each row shows that field NUC becomes increasingly self-referential over time, while, conversely, IPR and GAA tend to become less and less self-dependent. Focusing on the variation in each column, we can examine the effect of reference age on the significance of internal knowledge transfer. The lengths of the citing period \(\Delta t\) and cited period \(\Delta t'\) in (b)-(e) are both equal to 5 years.

To investigate the temporal evolution of the exchange of knowledge between two fields, in Fig. 4b-e we consider the same type of plot over time. In such a case, each pair of fields corresponds to a trajectory joining the points corresponding to the different years from 1990 to 2015. The colour of the symbols, from light to dark, indicates the years from the past to the most recent. Although the significance of the links in general decreases over time, the temporal patterns can vary from one pair of fields to another. The four panels illustrate the four major classes of behaviour (modes) we have found, namely: absorbing, absorbing to mutual, back-nurture and mutual mode. The absorbing mode can be seen in field GPE, which has absorbed more knowledge from fields ATM and EOA throughout the whole period under study (Fig. 4b). Fields GEN and EPF show a similar behaviour to GPE in the beginning, absorbing more knowledge from GAA, while in the last few years GEN and EPF tend to mutually exchange knowledge with GAA, although the weights of the links in both directions become less significant (Fig. 4c). More interestingly, we also find a back-nurture mode, as shown in Fig. 4d. Field GEN at first absorbs more knowledge from EOA than it provides to EOA, but later the situation is inverted. Finally, fields IPR and CM1 show another pattern, the mutual mode, indicating that they have exchanged knowledge in an almost symmetric way over the whole period. Similar evolution modes are also seen in the remaining six fields (see SI). These different evolution patterns clearly demonstrate that the processes of knowledge creation and transfer across fields can be highly heterogeneous.

Figure 4. Evolution of knowledge across fields. (a) For each pair of fields, we plot the average flows of knowledge in either direction, averaged over the entire observation period of 26 years. (b)-(e) Some of the typical patterns of temporal evolution observed over the years. Symbol colour (from light to dark) indicates the years from 1990 to 2015, while the lines join consecutive years to help follow the trajectories. The red bisector corresponds to the case of perfectly symmetric knowledge flows between the two fields. (b) “Absorbing mode”: field GPE has been absorbing knowledge from fields ATM and EOA throughout the entire time period. (c) “From absorbing to mutual mode”: GEN and EPF initially absorbed more knowledge from field GAA and then tend towards a balanced case in which they absorb from GAA as much knowledge as they provide to it. (d) “Back-nurture mode”: while during the first few years GEN absorbed more knowledge from field EOA than it contributed, at a later stage the situation is inverted. (e) “Mutual mode”: fields IPR and CM1 tend to share knowledge in a symmetric way over the whole period.

Knowledge sharing, transfer, and cross-fertilization across scientific disciplines are increasingly recognized as crucial factors for breakthrough innovation in science 12, 39, 40. The temporal network approach we have proposed in this article can be useful to shed light on the evolution of knowledge within a field and on the dynamic patterns of influence between different fields. Our case study application has shown that major developments in physics can influence a field for many decades and can even trigger knowledge production in other fields. Indeed, the patterns of cross-fertilization vary greatly among the different disciplines of physics and can also show marked transitions over time. For instance, the physics of gases and plasmas has consistently absorbed knowledge from atomic and molecular physics and from electromagnetism over the last three decades. Other fields, such as condensed matter and interdisciplinary physics, have instead always shared and mutually exchanged knowledge. Finally, we have revealed interesting transitions from absorbing to mutual modes, for instance in the case of the physics of elementary particles, a field that has initially been strongly influenced by astronomy and astrophysics but in the new century has also contributed to the progress of these latter disciplines. Our findings not only shed new light on the basic laws governing the development of scientific fields, but can also have practical implications for the future development of economic policies and research strategies.

The data set contains 435,717 articles published by the American Physical Society (APS) from 1985 until the end of 2015. Publication date, Physics and Astronomy Classification Scheme (PACS) codes, and bibliography have been extracted for each article. The PACS codes are grouped into a five-level hierarchy, and each of them indicates a very specific field of physics. As an example, the PACS code \(\textit{64.60.aq}\), indicating the field “Networks”, belongs to the broader field “Equations of state, phase equilibria, and phase transitions” (PACS 64), which in turn belongs to the top-level field “Condensed Matter: Structural, Mechanical and Thermal Properties” (PACS 60). Here, we consider the PACS codes at the highest level, which classify physics into ten main fields (Table  1 ). Each article is associated with up to four PACS codes. With regard to the bibliography, only the citations referring to articles published in the APS journals were considered.
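As a minimal illustration of this aggregation step, and not the authors' code, one might map each PACS code to its top-level field by taking the tens digit of its leading pair, assuming the ten main fields of Table 1 are labelled 00, 10, ..., 90:

```python
# Minimal sketch: map a PACS code to its top-level field (tens digit of the
# leading pair), assuming the ten main fields are labelled 00, 10, ..., 90.
def top_level_field(pacs_code: str) -> int:
    return int(pacs_code.split(".")[0]) // 10 * 10

# Example from the text: 64.60.aq belongs to the top-level field PACS 60.
assert top_level_field("64.60.aq") == 60
```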

Null model and statistically significant networks

To characterize the flow of knowledge across fields and at different time periods, the statistical significance of each contribution has been validated with respect to an appropriately chosen null model. For each pair of fields, all the citations from papers published in citing year t to papers published in cited year \(t-n\) have been considered. Let \({\mathrm{X}}_{t}^{\mathrm{citing}}\) be the field of citing papers published in year t , and \({\mathrm{Y}}_{t-n}^{\mathrm{cited}}\) the field of cited papers published in year \(t-n\) . We indicate as \(P (\alpha _{t-n}, \beta _{t}) = {\mathrm{Pr}}({\mathrm{Y}}_{t-n}^{\mathrm{cited}}=\alpha , {\mathrm{X}}_{t}^{\mathrm{citing}}=\beta )\) the joint probability that papers published in year t in field \(\beta\) cite papers published in year \(t-n\) in field \(\alpha\) . Such a probability can be written as:

$$P(\alpha_{t-n}, \beta_{t}) = {\mathrm{Pr}}({\mathrm{Y}}_{t-n}^{\mathrm{cited}}=\alpha \mid {\mathrm{X}}_{t}^{\mathrm{citing}}=\beta)\,{\mathrm{Pr}}({\mathrm{X}}_{t}^{\mathrm{citing}}=\beta) \qquad (4)$$

where \({\mathrm{Pr}}({\mathrm{Y}}_{t-n}^{\mathrm{cited}}=\alpha | {\mathrm{X}}_{t}^{\mathrm{citing}}=\beta )\) is the conditional probability of \({\mathrm{Y}}_{t-n}^{\mathrm{cited}}=\alpha\) given that \({\mathrm{X}}_{t}^{\mathrm{citing}}=\beta\) , and \({\mathrm{Pr}}({\mathrm{X}}_{t}^{\mathrm{citing}}=\beta )\) is the marginal probability. We then consider a null model in which the papers published in year t in field \({\mathrm{X}}\) randomly select papers published in year \(t-n\) as their citations, regardless of which fields they belong to. Hence, the joint probabilities in the null model can be written in terms of the marginal probabilities as:

$$P^{\mathrm{null}}(\alpha_{t-n}, \beta_{t}) = {\mathrm{Pr}}({\mathrm{Y}}_{t-n}^{\mathrm{cited}}=\alpha)\,{\mathrm{Pr}}({\mathrm{X}}_{t}^{\mathrm{citing}}=\beta) \qquad (5)$$

By calculating the ratio \(\phi (\alpha _{t-n}, \beta _{t})\) between the two probabilities in Eqs. ( 4 ) and ( 5 ):

$$\phi(\alpha_{t-n}, \beta_{t}) = \frac{P(\alpha_{t-n}, \beta_{t})}{P^{\mathrm{null}}(\alpha_{t-n}, \beta_{t})} \qquad (6)$$

we were able to quantify how the observed flows of knowledge deviate from the flows expected to arise simply from random choices. A value \(\phi (\alpha _{t-n},\beta _{t})=1\) has been adopted as the critical threshold to distinguish whether the knowledge flow from field \(\alpha\) to field \(\beta\) is statistically significant, with \(\phi (\alpha _{t-n},\beta _{t})>1\) indicating that field \(\beta\) in year t is more likely to have extracted knowledge from field \(\alpha\) in year \(t-n\) than would be expected at random.
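A minimal sketch of this significance ratio, not the authors' code, is given below; `citations` is assumed to be a list of (citing_field, cited_field) pairs, one per citation from papers published in year t to papers published in year \(t-n\) (in practice, papers carrying several PACS codes would call for fractional counting).

```python
from collections import Counter

def phi_ratios(citations):
    """Ratio of observed to null-model citation probabilities between fields."""
    total = len(citations)
    joint = Counter(citations)                        # counts of (beta, alpha) pairs
    citing = Counter(beta for beta, _ in citations)   # marginal over citing fields beta
    cited = Counter(alpha for _, alpha in citations)  # marginal over cited fields alpha
    return {
        (beta, alpha): (n / total) / ((citing[beta] / total) * (cited[alpha] / total))
        for (beta, alpha), n in joint.items()
    }

# Flows with a ratio greater than 1 are kept as statistically significant links.
```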

Network analysis

To characterize the networks of knowledge flow we have evaluated the network reciprocity and we have performed a motif analysis.

Reciprocity. For each year t we have computed the reciprocity coefficient 41 of the knowledge flow network as:

$$\rho_{t} = \frac{\sum_{\alpha \ne \beta}\left(w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t}-\overline{w}_{\Delta t' \rightarrow t}\right)\left(w_{\beta \rightarrow \alpha }^{\Delta t' \rightarrow t}-\overline{w}_{\Delta t' \rightarrow t}\right)}{\sum_{\alpha \ne \beta }\left(w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t}-\overline{w}_{\Delta t' \rightarrow t}\right)^{2}}$$

where the average value \(\overline{w}_{\Delta t' \rightarrow t} \equiv \sum _{\alpha \ne \beta }w_{\alpha \rightarrow \beta }^{\Delta t' \rightarrow t}/N(N-1)\) denotes the mean of the link weights and \(\Delta t'=[1, 5]\) . The reciprocity coefficient \(\rho _{t}\) ranges from \(-1\) to 1 and allows us to distinguish between antireciprocal ( \(\rho _t<0\) ) and reciprocal ( \(\rho _t>0\) ) networks. In the calculation we have only considered the top half of the mutual link pairs, namely those with the largest weights.
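A minimal sketch of this correlation-style reciprocity, not the authors' code, is shown below for a weight matrix `W` whose entry (i, j) is assumed to hold the flow from field i to field j; the restriction to the top half of mutual link pairs is omitted for brevity.

```python
import numpy as np

def reciprocity(W):
    """Weighted reciprocity coefficient of a flow matrix (diagonal excluded)."""
    N = W.shape[0]
    off = ~np.eye(N, dtype=bool)              # keep only entries with alpha != beta
    w_bar = W[off].sum() / (N * (N - 1))      # mean link weight
    dev = W - w_bar
    return (dev * dev.T)[off].sum() / (dev[off] ** 2).sum()
```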

Motifs. In this study, we focus on three-node motif analysis 36. There are 13 different possible connected subgraphs of three nodes. In order to measure the statistical significance of each subgraph g, we have computed the Z-score \(Z_g\) defined as:

$$Z_{g} = \frac{N_{g} - \langle N^{rand}_{g} \rangle }{\sigma _{g}}$$

where \(N_{g}\) is the number of times subgraph g appears in the network, and \(\langle N^{rand}_{g} \rangle\) and \(\sigma _{g}\) are, respectively, the average and the standard deviation of the number of times subgraph g occurs in an ensemble of randomized graphs with the same degree distribution as the original network. For each year, we have generated an ensemble of 5000 different randomized samples of the original network using the configuration model.
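The sketch below, not the authors' code, illustrates one way to obtain such Z-scores with networkx, using the directed triadic census for the 13 connected triads and a configuration-model ensemble; collapsing the multigraph returned by the configuration model slightly perturbs the degree sequence, so this is only an approximation of the procedure described above.

```python
import numpy as np
import networkx as nx

# The 13 connected three-node subgraphs, in networkx's triad-census notation.
CONNECTED_TRIADS = ["021D", "021U", "021C", "111D", "111U", "030T", "030C",
                    "201", "120D", "120U", "120C", "210", "300"]

def motif_zscores(G, n_samples=5000):
    """Z-score of each connected triad against a configuration-model ensemble."""
    observed = nx.triadic_census(G)
    din = [d for _, d in G.in_degree()]
    dout = [d for _, d in G.out_degree()]
    counts = {g: [] for g in CONNECTED_TRIADS}
    for i in range(n_samples):
        R = nx.directed_configuration_model(din, dout, seed=i)
        R = nx.DiGraph(R)                                   # collapse parallel edges
        R.remove_edges_from(list(nx.selfloop_edges(R)))     # drop self-loops
        census = nx.triadic_census(R)
        for g in CONNECTED_TRIADS:
            counts[g].append(census[g])
    zscores = {}
    for g in CONNECTED_TRIADS:
        mean, std = np.mean(counts[g]), np.std(counts[g])
        zscores[g] = (observed[g] - mean) / std if std > 0 else float("nan")
    return zscores
```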

Sekara, V. et al. The chaperone effect in scientific publishing. Proc. Natl. Acad. Sci. USA 115 , 12603–12607 (2018).


Li, W., Aste, T., Caccioli, F. & Livan, G. Early coauthorship with top scientists predicts success in academic careers. Nat. Commun. 10 , 1–9 (2019).


Monechi, B., Pullano, G. & Loreto, V. Efficient team structures in an open-ended cooperative creativity experiment. Proc. Natl. Acad. Sci. USA 116 , 22088–22093 (2019).

Armano, G. & Javarone, M. A. The beneficial role of mobility for the emergence of innovation. Sci. Rep. 7 , 1781 (2017).


Milojević, S., Radicchi, F. & Walsh, J. P. Changing demographics of scientific careers: the rise of the temporary workforce. Proc. Natl. Acad. Sci. USA 115 , 12616–12623 (2018).


Clauset, A., Arbesman, S. & Larremore, D. B. Systematic inequality and hierarchy in faculty hiring networks. Sci. Adv. 1 , e1400005 (2015).

Gargiulo, F. & Carletti, T. Driving forces of researchers mobility. Sci. Rep. 4 , 4860 (2014).


Deville, P. et al. Career on the move: geography, stratification, and scientific impact. Sci. Rep. 4 , 4770 (2014).


Ma, A., Mondragón, R. J. & Latora, V. Anatomy of funded research in science. Proc. Natl. Acad. Sci. USA 112 , 14760–14765 (2015).

Van Noorden, R. Interdisciplinary research by the numbers. Nature 525 , 306–307 (2015).


Sinatra, R., Deville, P., Szell, M., Wang, D. & Barabási, A.-L. A century of physics. Nat. Phys. 11 , 791 (2015).


Battiston, F. et al. Taking census of physics. Nat. Rev. Phys. 1 , 89–97 (2019).

Bhagat, R. S., Kedia, B. L., Harveston, P. D. & Triandis, H. C. Cultural variations in the cross-border transfer of organizational knowledge: an integrative framework. Acad. Manag. Rev. 27 , 204–221 (2002).

Chen, J., Sun, P. Y. & McQueen, R. J. The impact of national cultures on structured knowledge transfer. J. Knowl. Manag. 14 , 228–242 (2010).

Bell, G. G. & Zaheer, A. Geography, networks, and knowledge flow. Organ. Sci. 18 , 955–972 (2007).

Sorenson, O., Rivkin, J. W. & Fleming, L. Complexity, networks and knowledge flow. Res. Policy 35 , 994–1017 (2006).

Agrawal, A., Kapur, D. & McHale, J. How do spatial and social proximity influence knowledge flows? Evidence from patent data. J. Urban Econ. 64 , 258–269 (2008).

Meyer, M. Tracing knowledge flows in innovation systems. Scientometrics 54 , 193–212 (2002).

Acemoglu, D., Akcigit, U. & Kerr, W. R. Innovation network. Proc. Natl. Acad. Sci. USA 113 , 11483–11488 (2016).


Zeng, A. et al. The science of science: from the perspective of complex systems. Phys. Rep. 714 , 1–73 (2017).


Zhang, Q., Perra, N., Gonçalves, B., Ciulla, F. & Vespignani, A. Characterizing scientific production and consumption in physics. Sci. Rep. 3 , 1640 (2013).

Börner, K., Penumarthy, S., Meiss, M. & Ke, W. Mapping the diffusion of scholarly knowledge among major us research institutions. Scientometrics 68 , 415–426 (2006).

Zhuge, H. A knowledge flow model for peer-to-peer team knowledge sharing and management. Expert Syst. Appl. 23 , 23–30 (2002).

Yan, E. Disciplinary knowledge production and diffusion in science. J. Assoc. Inf. Sci. Technol. 67 , 2223–2245 (2016).

Perc, M. Self-organization of progress across the century of physics. Sci. Rep. 3 , 1–5 (2013).

Shen, Z. et al. Interrelations among scientific fields and their relative influences revealed by an input–output analysis. J. Informetr. 10 , 82–97 (2016).

Latora, V., Nicosia, V. & Russo, G. Complex Networks: Principles, Methods and Applications (Cambridge University Press, Cambridge, 2017).


Newman, M. Networks (Oxford University Press, Oxford, 2018).

Pan, R. K., Sinha, S., Kaski, K. & Saramäki, J. The evolution of interdisciplinarity in physics research. Sci. Rep. 2 , 551 (2012).

Bonaventura, M., Latora, V., Nicosia, V. & Panzarasa, P. The advantages of interdisciplinarity in modern science. arXiv:1712.07910 (2017).

Pluchino, A. et al. Exploring the role of interdisciplinarity in physics: success, talent and luck. PLoS ONE 14 , e0218793 (2019).

Tria, F., Loreto, V., Servedio, V. D. P. & Strogatz, S. H. The dynamics of correlated novelties. Sci. Rep. 4 , 5890 (2014).

Iacopini, I., Milojević, S. C. V. & Latora, V. Network dynamics of innovation processes. Phys. Rev. Lett. 120 , 048301 (2018).

Chinazzi, M., Gonçalves, B., Zhang, Q. & Vespignani, A. Mapping the physics research space: a machine learning approach. EPJ Data Sci. 8 , 33 (2019).

Zhu, H., Wang, X. & Zhu, J.-Y. Effect of aging on network structure. Phys. Rev. E 68 , 056121 (2003).


Milo, R. et al. Network motifs: simple building blocks of complex networks. Science 298 , 824–827 (2002).

Tahamtan, I., Afshar, A. S. & Ahamdzadeh, K. Factors affecting number of citations: a comprehensive review of the literature. Scientometrics 107 , 1195–1225 (2016).

Cirkel-Bartelt, V. History of astroparticle physics and its components. Living Rev. Relativ. 11 , 2 (2008).


Rinia, E. J., Van Leeuwen, T. N., Bruins, E. E., Van Vuren, H. G. & Van Raan, A. F. Measuring knowledge transfer between fields of science. Scientometrics 54 , 347–362 (2002).

Phene, A., Fladmoe-Lindquist, K. & Marsh, L. Breakthrough innovations in the us biotechnology industry: the effects of technological space and geographic origin. Strat. Manag. J. 27 , 369–388 (2006).

Garlaschelli, D. & Loffredo, M. I. Patterns of link reciprocity in directed networks. Phys. Rev. Lett. 93 , 268701 (2004).


Acknowledgements

This work was funded by the Leverhulme Trust Research Fellowship “CREATE: the network components of creativity and success” and EPSRC grant EP/N013492/1.

Author information

Authors and affiliations.

School of Mathematical Sciences, Queen Mary University of London, London, E1 4NS, UK

Ye Sun & Vito Latora

Dipartimento di Fisica ed Astronomia, Università di Catania and INFN, 95123, Catania, Italy

  • Vito Latora

The Alan Turing Institute, The British Library, London, NW1 2DB, UK

Complexity Science Hub Vienna (CSHV), Vienna, Austria


Contributions

Y.S. and V.L. designed research; Y.S. performed research; Y.S. and V.L. analyzed data; and Y.S. and V.L. wrote the paper.

Corresponding author

Correspondence to Vito Latora .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Sun, Y., Latora, V. The evolution of knowledge within and across fields in modern physics. Sci Rep 10 , 12097 (2020). https://doi.org/10.1038/s41598-020-68774-w


Received : 17 February 2020

Accepted : 24 June 2020

Published : 21 July 2020

DOI : https://doi.org/10.1038/s41598-020-68774-w






  • © 2023 by Joseph M. Moxley - Professor of English - USF

Research refers to a systematic investigation carried out to discover new knowledge, test existing knowledge claims, solve practical problems, and develop new products, apps, and services. This article explores why different research communities have different ideas about what research is and how to conduct it. Learn about the different epistemological assumptions that undergird informal, qualitative, quantitative, textual, and mixed research methods.



What is Research?

Research may refer to:

  • strategic searching to learn what the current and best research, theory, and scholarship on a topic is (for most researchers, the first step in any research project);
  • the ways scholars create knowledge by engaging in textual research, interpretation, and hermeneutics;
  • the ways scientists create knowledge by engaging in observation and systematic experimentation, for example through ethnography, participant observation, or survey research;
  • “a systematic application of knowledge toward the production of useful materials, devices, and systems or methods, including design, development, and improvement of prototypes and new processes” (NSF n.d.);
  • a process, a research methodology, that follows the principles of lean design.

Key Words: Research Community ; Research Methodology ; Research Methods ; Epistemology


Why Does Research Matter?

Overall, research is essential for advancing knowledge, solving problems, informing decision-making, fostering innovation, and promoting critical thinking. It plays a crucial role in shaping the world we live in and the future we create.

  • Research allows us to better understand the world around us, from the fundamental workings of the universe to the intricacies of human behavior. By conducting research, scholars can uncover new information, develop new theories and models, and identify gaps in existing knowledge that need to be filled. This knowledge can help students and teachers to better understand the world around them and develop new solutions to the problems facing society.
  • Research helps us identify and solve problems. It can help us find ways to improve our health, protect the environment, reduce poverty, and develop new technologies.
  • Research provides important information that can inform policy decisions, business strategies, and individual choices. By studying trends, analyzing data, and conducting experiments, researchers can help us make better-informed decisions.
  • Research often leads to new technologies, products, and services. By pushing the boundaries of what is currently possible, researchers can inspire and fuel innovation.
  • Research teaches us to question assumptions, evaluate evidence, and think critically. These skills are important for students to develop because they enable them to become more informed and engaged citizens, able to make more informed decisions and contribute to society in meaningful ways.
  • Research experience can be an asset in many career fields, including academia, business, government, and nonprofit organizations. By conducting research as an undergraduate student, students can develop valuable skills and experience that can help them to succeed in their future careers.

Types of Research


The choice of research methods depends on the epistemological assumptions of the researchers and the practices of a particular methodological community , the research question , the type of data needed, and the resources available.

  • Applied research is conducted to solve a particular problem in a specific situation. Investigators ask: what services, applications, and products can we create?
  • Basic research is conducted to advance knowledge and theory without consideration for commercial gain or practical application. In basic research, investigators strive to understand the most fundamental questions: who are we? how did we get here? what should we do next?
  • Case study research involves in-depth exploration of a particular case or phenomenon.
  • Content analysis involves analyzing written, visual, or audio material to identify patterns and themes.
  • Correlational research examines the relationship between two or more variables without manipulating them.
  • Development research is used to develop commercial services, products, and applications.
  • Descriptive research aims to describe a phenomenon or situation, usually without attempting to establish cause-and-effect relationships.
  • Empirical research relies on observation and experimentation; investigators gather data in systematic ways.
  • Ethnography involves studying a culture or group of people in their natural environment.
  • Experimental research involves manipulating variables to determine cause-and-effect relationships between them.
  • Exploratory research is used when little is known about a topic, and the goal is to gain a preliminary understanding of it.
  • Informal research gathers information anecdotally or based on convenience; it is directed by an investigator’s hunches and curiosity rather than a methodological community’s expectations and conventions, and is unplanned, unstructured, and intuitive.
  • Quasi-experimental research is similar to experimental research, but it lacks random assignment of participants to conditions.
  • Survey research involves collecting data from a sample of participants through questionnaires or interviews.
  • Textual research focuses on the discourse practices of scholars who engage in textual hermeneutics: interpretation and criticism.
  • Usability research: “Usability is the art of making sure that any kind of communication deliverable (e.g. a website, a handbook, a user guide, etc.) is intuitive, easy-to-use, and helps users achieve their goals. Usability is part of the broader discipline known as User Experience Design (or UX), which encompasses all aspects of the look, feel, and information contained in a communication deliverable” (2019).


Epistemology and Research Communities

Investigators across academic disciplines — the humanities, social sciences, sciences, and the arts — share some common methods and values. For instance, in both workplace writing and academic writing , investigators are careful

  • to cite sources , particularly sources that have changed the conversation on a topic
  • to provide evidence for claims (as opposed to opinion or other forms of anecdotal knowledge).

Yet it is also important to note that different research communities develop unique approaches to exploring and solving problems in their knowledge domains. Research communities develop different ways of conducting research because they face different problems and because they may have different epistemological assumptions about what knowledge is and how to measure it. For example, if a researcher believes that knowledge can only be gained through observation and empirical evidence, they may choose to use quantitative research methods such as experiments or surveys. Conversely, if a researcher believes that knowledge can also be gained through subjective experience and interpretation, they may choose to use qualitative research methods such as case study, ethnography, or participant observation.

While there are many nuanced definitions of epistemology , scholars have identified three major epistemological perspectives that inform the work of three research communities:

  • The Scholars – aka Scholarship
  • The Positivists – aka Positivism
  • The Postpositivists – aka Postpositivism


Research & Mindset

Researchers are curious about the world. They embrace openness, a growth mindset, and collaboration. They undertake research projects in order to review existing knowledge and generate original knowledge claims about the topic, thesis, or research question they are investigating. Research finds evidence.

Research Ethics

Researchers and consumers of research are wise to view research claims and research plans from an ethical perspective. Given human nature — such as the tendency to look for confirming evidence and ignore disconfirming evidence and to allow emotions to cloud reasoning — it’s foolhardy to disregard critical literacy practices when consuming the research of others.

Ethics are important to undergraduate students as researchers because ethics provide a framework for conducting research that is responsible, respectful, and accountable :

  • Ethics ensure that participants in research are treated with respect and dignity, and that their rights and well-being are protected. As a student researcher, it is important to obtain informed consent from participants, ensure their confidentiality, and minimize any potential harm or discomfort.
  • Ethics ensure that research is conducted with integrity and honesty. This means that data is collected and analyzed accurately, and that findings are reported truthfully and transparently.
  • Ethics help to build trust between researchers and the public. When research is conducted ethically, participants and the wider community are more likely to trust the findings and the researchers themselves.
  • Adhering to ethical standards in research can help students to develop important professional skills, such as critical thinking, problem-solving, and communication . These skills can be useful in a wide range of career fields, including academia, healthcare, and government.
  • Ethical research is a professional obligation. By conducting research ethically, students are fulfilling their obligations to the wider research community.

Research as an Iterative, Recursive, Chaotic Process

Research is commonly depicted on websites and textbooks on research methods as systematic work (see, e.g., Wikipedia’s Research page).

Depicting research as systematic work is certainly valid, especially in natural and social science research. For instance, scientists in the lab working with a virus like COVID-19 or Ebola aren’t going to play around. Their professionalism and safety are tied to rigorously following research protocols.

That said, it’s an oversimplification to suggest research processes are invariably systematic. Discoveries that have emerged from basic research have turned into wildly popular and useful real-world applications (see, for example, 24 Unintended Scientific Discoveries). Scientists may begin researching hypothesis A but rewrite that hypothesis multiple times until they find hypothesis Z, something that explains the data. Then they go back and repackage their investigation, following ethical standards, for a wider audience.

Ultimately, because research is such an iterative process, the thesis or hypothesis a researcher began with may not be the one the researcher ends up with. The takeaway here is that research is a learning process. Research efforts can lead to unpredictable applications and insights. Research finds evidence. Ultimately, research is about curiosity and openness. The question that initiates a research effort may morph into other questions as researchers

  • dig deeper into the literature on the topic and become more conversant
  • endeavor to make sense of the data/information they have gathered during the conduct of the study.


Related Concepts

Research methods.

Research results, that is, knowledge claims, are important. But how researchers claim to know what they know, their research methods and research methodology, is equally important.

Information Literacy

During the early stages of a writing project, you can identify research questions worth asking by engaging in Information Literacy practices.

Using Evidence

Learn to summarize, paraphrase, and cite sources. Weave others’ ideas and words into your texts in ways that support your thesis/research question, information, and rhetorical stance.


Research could be organized under Information Literacy. After all, as articulated by the ACRL, information literacy addresses how research is conducted.

However, we have chosen to present research as a major heading and not subsume it under Information Literacy, because information literacy is more commonly associated with being consumers of information, whereas research is associated with the efforts of people who produce it.

Information Literacy is focused on getting and vetting information, whereas research is focused on producing new knowledge and developing new products and services.



1.2 Ways of Creating Knowledge

What constitutes knowledge?

To have a deep understanding of what research entails, we need to first consider the historical context of ways of creating knowledge and what constitutes knowledge. Remember that “Research is creating new knowledge”. Our knowledge, thoughts, perceptions and actions are influenced by our worldview, which is a collection of attitudes, values, tales, and expectations about the world. 3 One’s view of the world is at the heart of one’s knowledge. There are different methods of acquiring knowledge, including intuition, authority, logical reasoning and the scientific method. 4

The Cambridge Dictionary defines intuition as knowledge arising from an ability to understand or know something immediately, based on feelings rather than facts. 1 It is also described as instinctive knowing without the use of cognitive processes, or as emotionally charged judgments that result from quick, unconscious, and holistic associations. 5 The impression that something is right comes from intuition. Instincts and intuition are sometimes used interchangeably. 4 Justifications like “that feels right to me” are often used to support intuition. However, as there is no means to evaluate the accuracy of knowledge based on intuition, there is no way to distinguish between accurate and inaccurate knowledge using such an approach. As a result, it is challenging to assess the correctness of intuition in the absence of action. 4 In research, intuition may lead to generating hypotheses, especially in areas with limited or no prior information. Nonetheless, the hypothesis has to be tested before the knowledge is accepted in modern healthcare settings.

Getting knowledge from an authority figure is another common way of acquiring knowledge. 6 Authority refers to a person or organisation having political or administrative power, influence and control. The information generated by such an authority is regarded as true because it is expressed by, for example, a social media influencer or an expert in a certain field. 4 This approach entails embracing novel concepts because an authority figure declares them true. 4 It is one of the quickest and simplest ways to learn; therefore, it can often be a good place to start. 6 Some of these authorities are parents, the media, physicians, priests and other religious leaders, the government, and professors. 4 Although we should be able to trust authority figures in an ideal world, there is always a chance that the information they provide may be incorrect or out of context. 4 Atrocities such as the Holocaust and the Guatemala syphilis experiments, where crimes against humanity were committed, are only a few instances in which people blindly listened to authoritative leaders without scrutinising the information they were given. 4 Information on research topics obtained from authorities could generate new ideas about the concept being investigated. However, these ideas must be subjected to rigorous scientific scrutiny rather than accepted at face value.

Logical reasoning

Logical reasoning, or rationalism, is any process of knowledge generation that requires the application of reasoning or logic. 4 This approach is predicated on the idea that reason is the primary source of knowledge. 6 It is based on the premise that people can discover the laws that govern the behaviour of natural objects through their own efforts. 6 Human behaviour is frequently explained using rationalism. In order to reach sound conclusions using this method, premises are provided, and logical principles are followed. However, if any assumptions are wrong, then the conclusion will be invalid. 4 For example, if a student fails to attend a series of compulsory lectures or tutorials, the professor may reason that the student is taking a lackadaisical approach to the subject. However, the assumption that attendance is an indicator of engagement may be untrue and lead to an erroneous conclusion. Perhaps the student was ill or genuinely absent for some other unavoidable reason. This highlights the disadvantage of rationalism: relying solely on this approach can be misleading and lead to inaccurate conclusions. 4 Thus, while rationalism may be helpful when developing or thinking of a research hypothesis, all research hypotheses need to be tested using the scientific method.

Scientific method

The scientific method is an empirical method for systematically gathering and analysing data to test hypotheses and answer questions. 4 Let’s go back to our example of the professor who concluded that the student who skipped the required classes had a lax attitude. This could possibly be due to some prior interactions with students who had demonstrated a lack of interest in studying the subject. This illustration shows the fallacy of drawing conclusions solely from experience and observation. The amount of experience we have could be constraining, and our sensory perceptions may be misleading. 4 Therefore, it is important to use the scientific method, which allows the researcher to observe, ask questions, test hypotheses, collect data, examine the results and draw conclusions. While researchers often draw on intuition, authority, and logical reason to come up with new questions and ideas, they don’t stop there. 4 In order to test their theories, researchers utilise systematic approaches by making thorough observations under a variety of controlled situations to draw reliable conclusions. 6 Systematic techniques are used in scientific methods, and every technique or design has a set of guidelines or presumptions that make it scientific. 4 Thus, empirical evidence based on observations becomes an item of knowledge. In the following chapters, we will go into greater detail about what the scientific method comprises.

How does the scientific method contribute to evidence?

While everyday activities such as cooking, as seen in the opening scenario, may involve research, this type of research may not involve a systematic or controlled approach. Scientific research requires a systematic approach, and it is defined as a systematic inquiry/data-gathering process used to investigate a phenomenon or answer a question. 4 Research is also a way of knowing that involves critical examination of various aspects of a given phenomenon that is under investigation. It requires formulation and understanding of principles that guide practice and the development and testing of new ideas/theories. 7 Research aims to be objective and unbiased and contributes to the advancement of knowledge. Research adds to existing knowledge by offering an understanding or new perspective on a topic, describing the characteristics of individuals or things, or establishing causal links between factors. 8

An Introduction to Research Methods for Undergraduate Health Profession Students Copyright © 2023 by Faith Alele and Bunmi Malau-Aduli is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License , except where otherwise noted.

The Research Whisperer

Just like the Thesis Whisperer – but with more money

What is research?


We all know what research is – it’s the thing we do when we want to find something out. It is what we are trained to do in a PhD program. It’s what comes before development.

The wonderful people at Wordnet define research as

Noun: systematic investigation to establish facts; a search for knowledge. Verb: attempt to find out in a systematically and scientific manner; inquire into.

An etymologist might tell us that it comes from the Old French word cerchier , to search , with re- expressing intensive force. I guess it is saying that before 1400 in France, research meant to search really hard.

If I was talking to a staff member at my university, though, I would say that searching hard was scholarship . The difference? Research has to have an element of discovering something new, of creating knowledge. While a literature search is one important part of a research project, it isn’t research in and of itself. It is scholarship.

Don’t take my word for it. In Australian universities, we define research this way:

Research is defined as the creation of new knowledge and/or the use of existing knowledge in a new and creative way so as to generate new concepts, methodologies and understandings. This could include synthesis and analysis of previous research to the extent that it leads to new and creative outcomes. This definition of research is consistent with a broad notion of research and experimental development (R&D) as comprising of creative work undertaken on a systematic basis in order to increase the stock of knowledge, including knowledge of humanity, culture and society, and the use of this stock of knowledge to devise new applications This definition of research encompasses pure and strategic basic research, applied research and experimental development. Applied research is original investigation undertaken to acquire new knowledge but directed towards a specific, practical aim or objective (including a client-driven purpose).

Drawn from the 2012 Higher Education Research Data Collection (HERDC) specifications for the collection of 2011 data .

What research sounds like

Sometimes, however, you don’t want to talk about ‘Research’. If you are applying to a philanthropic foundation, for example, they may not be interested in your new knowledge so much as the impact that your work will have, your capacity to help them to solve a problem. Industry partners may also be wary of the ‘R’ word. “Don’t bank your business on someone’s PhD”, they will say (and I would wholeheartedly agree).

This creates something of a quandary, as the government gives us money based on how much research income we bring in. They audit our claims, so everything we say is research has to actually be research. So, it helps to flag it as research, even if you don’t say it explicitly.

Instead, you might talk about innovation , or about experimentation . You could describe the element of risk associated with discovery . Investigation might lead to analysis . There might be tests that you will undertake to prove your hypothesis . You could just say that this work is original and has never been done before. You could talk about what new knowledge your work will lead to.

You might describe a new method or a new data source that will lead to a breakthrough or an incremental improvement over current practice. You could make it clear that it is the precursor to development , in the sense of ‘research and development’.

It really helps if you are doing something new .

What research looks like

Sometimes, it isn’t what you say, but what you do. If your work will lead to a patent, book or book chapter, refereed journal article or conference publication, or an artwork or exhibition (in the case of creative outputs), then it almost always fulfills the definition no matter what you call it.

What research isn’t

Sometimes, you can see a thing more clearly by describing what it isn’t.

Research isn’t teaching. Don’t get me wrong – you can research teaching, just like you can research anything else. However, teaching itself is generally regarded as the synthesis and transfer of existing knowledge. Generally, the knowledge has to exist before you can teach it. Most of the time, you aren’t creating new knowledge as you teach. Some lecturers may find that their students create strange new ‘knowledge’ in their assignments, but making stuff up doesn’t count as research either.

Research isn’t scholarship. As I said at the start, a literature search is an important aspect of the research process but it isn’t research in and of itself. Scholarship (the process of being a scholar) generally describes surveying existing knowledge. You might be looking for new results that you hadn’t read before, or you might be synthesizing the information for your teaching practice. Either way, you aren’t creating new knowledge, you are reviewing what already exists.

Research isn’t encyclopaedic. Encyclopedias, by and large, seek to present a synthesis of existing knowledge. Collecting and publishing existing knowledge isn’t research, as it doesn’t create new knowledge.

Research isn’t just data-gathering. Data-gathering is a vital part of research, but it doesn’t lead to new knowledge without some analysis, some further work. Just collecting the data doesn’t count, unless you do something else with it.

Research isn’t just about methodology. Just because you are using mice, or interviewing people, or using a High Performance Liquid Chromatograph (HPLC) doesn’t mean you are doing research. You might be, if you are using a new data set or using the method in a new way or testing a new hypothesis. However, if you are using the same method, on the same data, exploring the same question, then you will almost certainly get the same results. And that is repetition, not research.

Research isn’t repetition, except in some special circumstances. If you are doing the same thing that someone else has already done, then generally that isn’t research unless you are specifically trying to prove or disprove their work. What’s the difference? Repeating an experiment from 1400 isn’t research. You know what the result will be before your start – it has already been verified many times before. Repeating an experiment reported last year probably is research because the original result can’t be relied upon until it is verified.

Is development research? Development (as in ‘research and development’) may or may not be classified as research, depending on the type of risk involved. Sometimes, the two are inextricably linked: the research leads to the development and the development refines the research. At other times, you are creating something new, but it is a new product or process, not new knowledge. It is based on new knowledge, rather than creating new knowledge. If the risk involved is a business risk, rather than intellectual risk, then the knowledge is already known.

Help me out here – what are your favourite words that signal research?


26 comments.

currently, im doing postgraduate education for both social science and technological science. i can’t help but to feel slightly amused by your assertion ..

“Don’t bank your business on someone’s PhD”, they will say (and I would wholeheartedly agree).

this is quite true when you’re doing phd for social science. however, if your phd is technologically inclined, the business entity who intends to commercialize it, may have to bank on your research for success.

illustrating this would not be a feat.

are you using google? well, did you know that google was actually a phd research? if they hadn’t banked on page’s and brin’s research, there wouldn’t be google today, would it? presently, it is rumoured that google and microsoft are competing for phd graduates from ivy leagues and what not.

personally, i’ve met a couple of ‘technopreneurs’ who have successfully commercialized their phd research. though they may not be as successful as google, financially speaking, their achievement should not be trivialized.

Thanks, pikir kool.

You are right, of course. I’m a big fan of businesses who provide scholarships for PhD students. It is a great way for the student to get funded, and for the business to get a bit of an edge.

‘Chercher’, the modern French word for chercier means to explore or get. Re-chercher adds the concept of re- or ‘again’ to indicate looking-again, usually on the basis of evidence or experience pointing to the object of the search being in a particular place, hence to ‘search really hard’. French-speaking individuals will ‘rechercher’ a criminal on the run, ‘rechercher’ the more probable destinations of a friend who is out shopping, and so forth. I agree Australian businesses consider PhD graduates are overpriced ‘scholars’ and ‘technicians’ trained to avoid risk, hold similar opinions, and assume as little responsibility for group/enterprise outcomes as possible. What shocks me is your suggestion graduates should misinform potential employers by suggesting they might be able to innovate, discover, and lead the business toward new markets and technologies by simply choosing hot button words. In France, universities are centres of ‘learning’ where individuals experience a rich intellectual environment that the government believes ‘develops’ curiosity, opens up new horizons, tests principles to live by, and rewards leadership. The ‘elitist’ French haut écoles are often criticised by Anglo-saxon countries, but I say the learning environment, which – by the way – focuses less on methodology, reflects human diversity (unique identity). The Australian system is based on an equal opportunity social objective and is funded to produce an intellectual resource pipeline .

Hello Gordon

Thanks for your information on ‘Chercher’.

I was not trying to suggest that anybody misinform anybody else with the use of words, hot button or otherwise, but I can understand how you read it that way.

I wrote that section, in part, as a guide to staff who are trying to satisfy two audiences – the people who are providing funding and the government auditors who are deciding what is counted as research. The easiest way to satisfy the government auditors that something is research is to call it ‘research’. However, in some funding situations, that simply isn’t appropriate. One way forward is to describe the work using words other than ‘research’ that signal to the auditors that the work satisfies the criteria for research.

I’m afraid that I’m not experienced enough with research in France to reply to your comparison of the French and Australian research training environments. I work within the Australian environment, and try to do the best job that I can.

Thank you for this post – very relevant for me right now and thought-provoking. I’m 13 months into my PhD investigating communication designers’ engagement with research and I’m astounded that there is so little consensus in academic literature (not to mention in professional practice) about what legitimate research is.

It seems that any definition or criteria for research that I find, I can also find an example of research that contradicts it. For example, in your post you note “data gathering is a vital part of research” but when I included this in my definition, a highly respected scholar in my field pointed out that research in his own field of Philosophy did not involve data gathering, yet he believed constituted research. So I’m still thinking about it : )

Your philosopher is right, of course. Some researchers are working with ideas and recombining them, reworking them, creating new ideas.

I deal with applied research, mostly, and I guess my definition reflects that.

I would love to see your definition when you are done.

Your article is rad. It shaped the whole concept of research in my mind. And I think that it exactly is a ‘re- search’, where you will be searching the facts again & again, on grass root level, following a sequence of systematic processes to reach a novel & efficient conclusion .

Thanks. Glad I could help, anonymouswailer.

Thank you for the post on ‘What is Research?’ Interesting and useful posts and comments. Since I am considering naming a blog page The Synthesist, I got off on a tangent relating to the words thesis, synthesis, etc. A couple thoughts …

I think you may be undervaluing the function of “synthesis” when it is only referred to in relation to encyclopedic summaries of existing knowledge, I think true synthesis is when 2 or more ideas combine to create a new idea. I also learned, when I served a literacy tutor, that “synthesis” is considered to be a more sophisticated learned literacy skill than “analysis,” which I thought was interesting. We live in analysis culture, creating deep silos of knowledge, with few strong horizontal threads that truly support “learning.”

Interesting comment on French value of learning as the highest human capacity. Not feeling that here in America.

Also, I was hoping to see in your answer of what research IS, a reference to the importance of questions and question formation.

Thanks– Amy

I’m prompted to comment by Amy’s:

After a long time working outside of academia I’m returning to begin a Masters in Disaster Communication and Resilience; I’m still at that early stage of being excited by ideas, and not quite ready to decide on a research topic. What I am sure of is that, in the area of disaster (post-typhoon for example) one of the biggest challenges is that the specialists don’t feel comfortable talking to each other and therefore need the generalist communicators / networkers to listen to what they are on about, develop a general understanding of what they are saying, and link them together with people in other specialist areas whose work might be strikingly different but potentially have enormous potential for synergy/ synthesis.

And I doubt that any research is being done on this.


This is perhaps a slightly different point of view/perspective from a reasonably long career in applied research, and I am now enrolled in a Doctorate program.

What I find really interesting is pondering where does ‘innovation’ especially in terms of various forms of professional practice or creative endeavour actually come from, if not from ‘research’ as you describe it above? (I often heard and still hear people in industry or the professional practice word using the word ‘research’ to describe an often fairly informal literature search to back up what they have already decided to do in practice – but that is probably another story.)

However, I often wonder where do the ideas for ‘innovation’ actually come from?

When they are drawn from research conclusions (or initially drove the research question) this probably makes that particular research more valuable from a funder point of view.

But it kind of begs the question as to what comes or should come first especially in terms of good applied research.

And then finally, where does creativity come in – especially when deciding what to research, and how to interpret the data and conclusions from the research?

I am off to think of some more concrete examples and to ponder the nexus between research – innovation and creativity.

BTW love this discussion so far!

The nexus between research, innovation and creativity is a great topic! If you are interested in writing it up as a blog post, let me know. We’d be happy to consider it for a guest post on the Research Whisperer.

Jonathan Let me think about it – this has provoked my thinking about the issues but not sure if I am there yet in terms of writing a post about it. I will let you know! Jane

Well, it certainly was interesting to see this comment thread come back to me after three years.

I was about to reply to this person named Amy who said she was going to start a blog called The Synthesist to tell her that I had myself started a blog called Neon Synthesist.

Then I realized it was myself. Strange mirror of time. In 2014, I discovered there was a rock band called Synthesist and named it Neon Synthesist instead, since it tends to be provocative.

There are some fun posts there like “What is an Idea?” and “Why Philosophy Isn’t Dead” and, funding researchers might like, a four-part series called “The Philanthropy Games” … but alas this page will probably go away. No subscribers that I could tell.

http://www.neonsynthesist.blogspot.com

Cheers! Amy


I am enriched with the discussions. Thanks.

Thanks, Raton Kumar. I’m glad that you found it useful. Jonathan


Research is creating new knowledge through systematic investigation and analysis of data. Research leads to development but not in all cases and Repetition of a research already done can be said valid only when we try to prove or disprove it. It sounds great!!!

Research is the effort done by an individual or group of people, to explore something new. It can be an effort to prove the same matter but applying new methods, it also can be done to prove a different findings.


Understanding Knowledge Creation

  • First Online: 12 March 2020


  • Yuh Huann Tan 3 &
  • Seng Chee Tan 4  


This chapter presents a literature review on the concepts of knowledge creation, including four prominent theories: organisational knowledge creation theory by Nonaka and Takeuchi, the theory of expansive learning by Engeström, knowledge building theory by Scardamalia and Bereiter, and collaborative knowledge creation by Paavola and Hakkarainen. Following that, we discuss how these theoretical models are related to the field of education and their connection to K-12 school education.

References

Aristotle. (2000) Nicomachean ethics (R. Crisp, Trans.). Cambridge: Cambridge University Press. (Original work published date unknown). Retrieved from http://site.ebrary.com/lib/nielib/docDetail.action?docID=2000816 .

Audi, R. (2011). Epistemology: A contemporary introduction to the theory of knowledge (3rd ed.). New York: Routledge.

Bauters, M., Lakkala, M., Paavola, S., Kosonen, K., & Markkanen, H. (2012). KPE (knowledge practices environment) supporting knowledge creation practices in education. In A. Moen, A. I. Mørch, & S. Paavola (Eds.), Collaborative knowledge creation: Practice, tools, concepts (pp. 53–74). Rotterdam, The Netherlands: Sense Publishers.

Bereiter, C. (2002). Education and mind in the knowledge age . Mahwah, NJ: Lawrence Erlbaum Associates.

Bereiter, C., & Scardamalia, M. (1996). Rethinking learning. In D. R. Olson & N. Torrance (Eds.), The handbook of education and human development: New models of learning, teaching and schooling (pp. 485–513). Cambridge, MA: Basil Blackwell.

Bereiter, C., & Scardamalia, M. (2003). Learning to work creatively with knowledge. In E. De Corte, L. Verschaffel, N. Entwistle, & J. van Merriënboer (Eds.), Powerful learning environments: Unraveling basic components and dimensions (Advances in Learning and Instruction Series) (pp. 55–68). Oxford: Elsevier Science.

Bereiter, C., & Scardamalia, M. (2014). Knowledge building and knowledge creation: One concept, two hills to climb. In S.-C. Tan, H.-J. So, & J. Yeo (Eds.), Knowledge creation in education (pp. 35–52). Singapore: Springer Science + Business Media.

Carriero, J. (2008). The Cartesian circle and the foundations of knowledge. In J. Broughton & J. Carriero (Eds.), A companion to Descartes (pp. 302–319). Malden, MA: Blackwell Publishing Ltd.

Chai, C. S., & Tan, S. C. (2011). Two exploratory studies of the relationships between teachers’ epistemic beliefs and their online interactions. International Journal of Continuing Engineering Education & Lifelong Learning, 21 (1), 13–24.

Curado, C., & Bontis, N. (2011). Parallels in knowledge cycles. Computers in Human Behavior, 27 (4), 1438–1444.

Engeström, Y. (1987). An activity-theoretical approach to developmental research . Helsinki: Orienta-Konsultit. Retrieved from http://lchc.ucsd.edu/mca/Paper/Engestrom/expanding/toc.htm .

Engeström, Y. (1999a). Innovative learning in work teams: Analyzing cycles of knowledge creation in practice. In Y. Engeström, R. Miettinen, & R. L. Punamaki (Eds.), Perspectives on activity theory (pp. 377–404). Cambridge: Cambridge University Press.

Engeström, Y. (1999b). Learning by expanding: Ten years after . Introduction to the German and also to the Japanese edition of Learning by Expanding, published in 1999. In English. Retrieved http://lchc.ucsd.edu/mca/Paper/Engestrom/expanding/intro.htm .

Engeström, Y. (2001). Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work, 14 (1), 133–156.

Engeström, Y. (2011). Activity and learning at work. In M. Malloch, L. Cairns, K. Evans, & B. N. O’Connor (Eds.), The SAGE handbook of workplace learning (pp. 86–104). London: SAGE. Retrieved http://www.helsinki.fi/cradle/documents/Engestrom%20Publ/Chapter%20for%20Malloch%20book.pdf .

Engeström, Y. (2015). Learning by expanding: An activity-theoretical approach to developmental research (2nd ed.). Cambridge: Cambridge University Press.

Engeström, Y., Miettinen, R., & Punamäki, R.-L. (Eds.). (1999). Perspectives on activity theory . Cambridge: Cambridge University Press.

Engeström, Y., & Sannino, A. (2010). Studies of expansive learning: Foundations, findings and future challenges. Educational Research Review, 5 (1), 1–24.

Engeström, Y., & Sannino, A. (2011). Discursive manifestations of contradictions in organizational change efforts: A methodological framework. Journal of Organizational Change Management, 24 (3), 368–387.

Garber, D. (2010). Philosophia, historia, mathematica: Shifting sands in the disciplinary geography of the seventeenth century. In T. Sorell, G. A. J. Rogers, & J. Kraye (Eds.), Scientia in early modern philosophy: Seventeenth-century thinkers on demonstrative knowledge from first principles, studies in history and philosophy of science 24 (pp. 1–17). New York: Springer.

Gladwell, M. (2000). The tipping point: How little things can make a big difference . Little Brown.

Gloor, P. (2006). Swarm creativity: Competitive advantage through collaborative innovation networks . New York, NY: Oxford University Press.

Gumperz, J. J., & Levinson, S. C. (1996). Introduction: Linguistic relativity re-examined. In J. J. Gumperz & S. C. Levinson (Eds.), Rethinking linguistic relativity (pp. 1–18). Cambridge, UK: Cambridge University Press.

Hakkarainen, K., & Paavola, S. (2009). Toward a trialogical approach to learning. In B. Schwarz, T. Dreyfus, & R. Hershkowitz (Eds.), Transformation of knowledge through classroom interaction (pp. 65–80). Abingdon, UK: Routledge.

Hamlyn, D. W. (1995). Epistemology, history of. In T. Honderich (Ed.), The Oxford companion to philosophy (pp. 242–245). Oxford, UK: Oxford University Press.

Hong, H.-Y., & Lin, S.-P. (2010). Teacher-education students’ epistemological belief change through collaborative knowledge building. Asia-Pacific Education Researcher, 19 (1), 99–110.

Il’enkov, E. V. (1977). Dialectical logic: Essays in its history and theory . Moscow: Progress.

Lakkala, M., Paavola, S., Kosonen, K., Muukkonen, H., Bauters, M., & Markkanen, H. (2009). Main functionalities of the Knowledge Practices Environment (KPE) affording knowledge creation practices in education. In C. O’Malley, D. Suthers, P. Reimann, & A. Dimitracopoulou (Eds.), Computer Supported Collaborative Learning Practices: CSCL2009 Conference Proceedings (pp. 297–306). Rhodes, Greece: International Society of the Learning Sciences (ISLS). Retrieved from http://www.helsinki.fi/science/networkedlearning/texts/Lakkala_et_al_2009_KPE_cscl09.pdf .

Leont’ev, A. N. (1981). Problems of the development of the mind. Moscow: Progress Publishers.

Li, Y., & Kettinger, W. J. (2006). An evolutionary information-processing theory of knowledge creation. Journal of Association of Information Systems, 7 (9), 593–617.

Marx, K. (1887). Capital: A critique of political economy (Vol. 1) (S. Moore, & E. Aveling, Trans.). Moscow, USSR: Progress Publishers. (Original work published 1867). Retrieved from https://www.marxists.org/archive/marx/works/download/pdf/Capital-Volume-I.pdf .

Marx, K. (1973). Grundrisse [Foundations of the Critique of Political Economy] (M. Nicolaus, Trans.). Penguin Books in association with New Left Review (Original work published 1939–41). Retrieved from https://www.marxists.org/archive/marx/works/download/pdf/grundrisse.pdf .

Miller, R. W. (1987). Fact and method: Explanation, confirmation and reality in the natural and the social sciences . Princeton, NJ: Princeton University Press.

Moen, A., Mørch, A. I., & Paavola, S. (Eds.). (2012). Collaborative knowledge creation: Practice, tools, concepts . Rotterdam, The Netherlands: Sense Publishers.

Nonaka, I. (1991). The knowledge-creating company. Harvard Business Review, 6 (8), 97–104.

Nonaka, I. (1994). A dynamic theory of organizational knowledge creation. Organization Science, 5 (1), 14–37.

Nonaka, I., & Takeuchi, H. (1995). The knowledge creating company: How Japanese companies create the dynamics of innovation . New York, NY: Oxford University Press.

Nonaka, I., & Konno, N. (1998). The concept of ‘ba’: Building a foundation for knowledge creation. California Management Review, 40 (3), 40–54.

Nonaka, I., Konno, N., & Toyama, R. (2001). Emergence of “ba”: A conceptual framework for the continuous and self-transcending process of knowledge creation. In I. Nonaka & T. Nishiguchi (Eds.), Knowledge emergence: Social, technical, and evolutionary dimensions of knowledge creation (pp. 13–29). New York: Oxford University Press.

Nonaka, I., Toyama, R., & Hirata, T. (2008). Managing flow: A process theory of the knowledge-based firm . New York: Palgrave Macmillan.

Nonaka, I., Toyama, R., & Konno, N. (2000). SECI, ba and leadership: A unified model of dynamic knowledge creation. Long Range Planning, 33 (1), 5–34.

Paavola, S., & Hakkarainen, K. (2005). The knowledge creation metaphor: An emergent epistemological approach to learning. Science & Education, 14 (6), 535–557.

Paavola, S., Engeström, R., & Hakkarainen, K. (2012). The trialogical approach as a new form of mediation. In A. Moen, A. I. Mørch, & S. Paavola (Eds.), Collaborative knowledge creation: Practice, tools, concepts (pp. 1–14). Rotterdam, The Netherlands: Sense Publishers.

Paavola, S., Lipponen, L., & Hakkarainen, K. (2004). Models of innovative knowledge communities and three metaphors of learning. Review of Educational Research, 74 (4), 557–576.

Paavola, S., Lakkala, M., Muukkonen, H., Kosonen, K., & Karlgren, K. (2011). The roles and uses of design principles for developing the trialogical approach on learning. Research in Learning Technology, 19 (3), 233–246.

Parry, R. (2014). Episteme and techne. In E.N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Fall 2014 ed.). Retrieved from http://plato.stanford.edu/archives/fall2014/entries/episteme-techne/ .

Peca, K. (2000). Positivism in education: Philosophical, research and organizational assumptions . Portales, NM: Eastern New Mexico University. Retrieved from the ERIC database. (ED456536).

Polanyi, M. (1966/2009). The tacit dimension . Illinois: University of Chicago Press.

Popper, K. R. (1972). Objective knowledge: An evolutionary approach . Oxford: Clarendon Press.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Ryan, A. B. (2006). Post-Positivist approaches to research. In M. Antonesa, H. Fallon, A. B. Ryan, A. Ryan, T. Walsh, & L. Borys (Eds.), Researching and writing your thesis: A guide for postgraduate students (pp. 12–26). MACE: Maynooth Adult and Community Education.

Scardamalia, M. (2002). Collective cognitive responsibility for the advancement of knowledge. In B. Smith (Ed.), Liberal education in a knowledge society (pp. 67–98). Chicago: Open Court.

Scardamalia, M. (2004). CSILE/Knowledge Forum ® . In A. Kovalchick & K. Dawson (Eds.), Education and technology: An encyclopedia (pp. 183–192). Santa Barbara, CA: ABC-CLIO.

Scardamalia, M., & Bereiter, C. (1999). Schools as knowledge building organizations. In D. Keating & C. Hertzman (Eds.), Today’s children, tomorrow’s society: The developmental health and wealth of nations (pp. 274–289). New York: Guilford.

Scardamalia, M., & Bereiter, C. (2006). Knowledge building: Theory, pedagogy, and technology. In R. K. Sawyer (Ed.), The cambridge handbook of the learning sciences (pp. 97–118). NY: Cambridge University Press.

Scardamalia, M., & Bereiter, C. (2014). Knowledge building and knowledge creation: Theory, pedagogy, and technology. In K. Sawyer (Ed.), Cambridge handbook of the learning sciences (2nd ed., pp. 397–417). New York: Cambridge University Press.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action . New York: Basic Books.

Sfard, A. (1998). On two metaphors for learning and the dangers of choosing just one. Educational Researcher, 27 (2), 4–13.

Sterelny, K. (2004). Externalism, epistemic artefacts and the extended mind. In R. Schantz (Ed.), The externalist challenge: New studies on cognition and intentionality (pp. 239–254). Berlin: de Gruyter. Retrieved from http://www.victoria.ac.nz/hppi/about/staff/publications/artefacts.pdf .

Steup, M. (2014). Epistemology. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Spring 2014 Edition). Retrieved from http://plato.stanford.edu/archives/spr2014/entries/epistemology/ .

Tan, Y. H., & Tan, S. C. (2016, June). Teachers’ understanding of knowledge creation: A phenomenography of Singapore Chinese Language teachers . Paper presented at Knowledge Building Summer Institute 2016, Nanyang Technological University, Singapore.

Tuomi, I., & Miller, R. (2011). Learning and education after the industrial age: A discussion paper for the Confederation of Finnish Industries EK project Oivallus. Retrieved from http://www.meaningprocessing.com/personalPages/tuomi/articles/LearningAndEducationAfterTheIndustrialAge.pdf .

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Cambridge, MA: Harvard University Press.

Wagner, T. (1993). Systemic change: Rethinking the purpose of school. Educational Leadership, 51 (1), 24–28.

Whorf, B. L. (1956). Language, thought, and reality: Selected writings of Benjamin Lee Whorf. New York: Technology Press of MIT. Retrieved from https://archive.org/details/languagethoughtr00whor .

Wierzbicki, A. P., & Nakamori, Y. (2006). Creative space: Models of creative processes for the knowledge civilization age . Heidelberger, Berlin: Springer.

Xie, X., Fang, L., & Zeng, S. (2016). Collaborative innovation network and knowledge transfer performance: A fsQCA approach. Journal of Business Research, 69 (11), 5210–5215. https://doi.org/10.1016/j.jbusres.2016.04.114 .

Author information

Authors and Affiliations

Yusof Ishak Secondary School, Ministry of Education, Singapore, Singapore

Yuh Huann Tan

National Institute of Education, Nanyang Technological University, Singapore, Singapore

Seng Chee Tan

Corresponding author

Correspondence to Yuh Huann Tan.

Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this chapter

Tan, Y.H., Tan, S.C. (2020). Understanding Knowledge Creation. In: Conceptions of Knowledge Creation, Knowledge and Knowing. Springer, Singapore. https://doi.org/10.1007/978-981-15-3564-2_2

DOI: https://doi.org/10.1007/978-981-15-3564-2_2

Published: 12 March 2020

Publisher Name: Springer, Singapore

Print ISBN: 978-981-15-3563-5

Online ISBN: 978-981-15-3564-2

eBook Packages: Education, Education (R0)


Capturing Change in Science, Technology, and Innovation: Improving Indicators to Inform Policy (2014)

5 Measuring the Three K’s: Knowledge Generation, Knowledge Networks, and Knowledge Flows

Knowledge generation can occur formally through directed research and experimental development in academic institutions, firms, and public and nonprofit institutions. Knowledge generation can also occur informally in a working environment through the activities and interactions of actors in an organization or the general economy. People are the critical input for knowledge generation, whether as individual researchers; in research teams; or even in collectives such as organizational subunits, entire organizations, or nation-states. 1 Therefore, indicators of knowledge generation focus on attributes of human capital inputs and related outputs. Knowledge can be acquired by using codified (written) sources such as publications or patents, or in tacit form by hiring people with the needed knowledge or participating in networks where the knowledge is stored ( Chapter 6 focuses on knowledge embodied in people). Knowledge can be both an intermediate input and a final output and can depreciate over time. 2
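To make the depreciation point concrete, a knowledge stock is often modelled with a perpetual-inventory recursion in which each year's R&D adds to the stock while past knowledge loses value at some rate. The sketch below is illustrative only: the 15 percent depreciation rate and the R&D outlays are hypothetical numbers, not NCSES data or any specific published method.

```python
# Illustrative perpetual-inventory knowledge stock with geometric depreciation:
#   K_t = (1 - delta) * K_{t-1} + R_t
# The depreciation rate and R&D outlays below are hypothetical.

def knowledge_stock(rd_outlays, delta=0.15, k0=0.0):
    """Accumulate annual R&D outlays into a stock that depreciates at rate delta."""
    stock, series = k0, []
    for r in rd_outlays:
        stock = (1 - delta) * stock + r
        series.append(round(stock, 1))
    return series

# The stock grows with new R&D while older knowledge fades at 15% per year.
print(knowledge_stock([100, 110, 120, 125]))
```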

Knowledge networks link actors, organizations, and technologies in the global economy, revealing new discoveries and transferring know-how on the development of new techniques, processes, and at times breakthroughs that can be commercialized (Chapter 4 focuses on innovation). Knowledge networks include research collaborations, coinventorships, coauthorships, and strategic alliances. 3 Knowledge flows transmit across knowledge networks and point to comparative advantage, presence in other markets, and access to foreign technologies. To use acquired knowledge, recipients must have absorptive capacity. 4

Knowledge generation, diffusion, and use, as well as conduits for knowledge flows, are all key elements for economic growth (Romer, 1990). Therefore, it is critically important for the National Center for Science and Engineering Statistics (NCSES) to produce indicators of these varied dimensions of knowledge at the national, international, and subnational levels.

Quite a few data elements, such as research and development (R&D), patents, bibliometrics, and trade in technology, capture knowledge generation, networks, and flows (referred to as “the three K’s”). NCSES has been collecting these data for several decades in order to publish indicators on these topics, drawing on both its own and other data sources, such as the Bureau of Economic Analysis for data on global multinational R&D activities. International R&D is well covered by NCSES’s Business Research and Development and Innovation Survey (BRDIS). While NCSES has good measures of knowledge creation, a number of complex issues remain unaddressed, and challenges for measurement remain in the area of knowledge flows.

Therefore, the purpose of this chapter is to discuss the dynamics and outcomes of scientific R&D. To illustrate specific uses of science, technology, and innovation (STI) indicators in this context, the focus is on the policy questions that can be addressed using indicators on the three K’s; however, it should be noted that these indicators have several other uses. Box 5-1 highlights key policy questions relating to the generation and transfer of knowledge. 5 While raw data on R&D expenditures and patent citations are useful for understanding whether the United States is falling behind other countries in R&D expenditures and outcomes, more sophisticated statistics are required to address other issues pertaining to the competitiveness of U.S. companies and the benefits of buying and selling R&D internationally. The focus of this chapter is on the latter set of indicators.

BOX 5-1 Policy Questions Related to Knowledge Generation, Networks, and Flows

  • What new technologies or fields are emerging from current research?
  • Is the United States promoting platforms in information and communication technology, biotechnology, and other technologies to enable innovation in applications?
  • Is the United States falling behind other countries in R&D expenditures and outcomes?
  • How much are U.S. companies spending to be present in emerging markets? How much R&D are they conducting in these nations?
  • Is the United States losing or gaining advantage by buying and selling its R&D abroad?
  • Is the United States benefiting from research conducted in other countries?

1 See Phelps and colleagues (2012, p. 7) for a description of repositories of knowledge. Romer (1990, p. S84) makes the following distinction between knowledge as an intermediate and final output: “… knowledge enters into production in two distinct ways. A new design enables the production of a new good that can be used to produce output. A new design also increases the total stock of knowledge and thereby increases the productivity of human capital in the research sector.”

2 See Huang and Diewert (2011) for methods of measuring knowledge depreciation.

3 For an extensive definition of knowledge networks, see Phelps et al. (2012, p. 61, endnote 1).

4 OECD (2013a) gives definitions of knowledge flows and classifications of indicators of knowledge flows in science, technology, and innovation sectors.

5 See Appendix B for the full list of policy questions.

A recent OECD (2012c) study titled Knowledge Networks and Markets in the Life Sciences describes key aspects of the three K’s in which indicators require further development. The following findings are particularly in accord with those presented in this chapter:

  • Individuals, firms, and countries are not uniformly linked to knowledge networks.
  • Evidence gaps persist with respect to capturing differences between knowledge production and use (as in the case of R&D), capturing partnerships and their financial dimension, monitoring the combined outward and inward dimensions of knowledge flows, and going beyond intellectual property indicators as measures of knowledge outputs.
  • Measurement standards need to be adapted if improvements are to be achieved in the interoperability of STI data sources across different domains, such as R&D, patents, other forms of registered intellectual property, scientific publications, innovation survey data, and administrative sources. Solutions need to be developed that address the impact of knowledge flows on the interpretation, relevance, and international comparability of existing STI indicators.

NCSES is poised to make important contributions to the improvement of indicators on the three K’s. Collaborative efforts with other agencies in the United States and abroad should be fruitful for this endeavor.

CODIFIED DEFINITIONS

The internationally accepted definition of “research and experimental development”—more commonly referred to as R&D—comes from OECD (2002, p. 30): “creative work undertaken on a systematic basis in order to increase the stock of knowledge, including knowledge of man, culture and society, and the use of this stock of knowledge to devise new applications.” 6 In BRDIS, NCSES expands on this definition, providing the following guidance (U.S. Department of Commerce, 2011, p. 3):

R&D is planned, creative work aimed at discovering new knowledge or developing new or significantly improved goods and services. This includes (a) activities aimed at acquiring new knowledge or understanding without specific immediate commercial applications or uses (basic research); (b) activities aimed at solving a specific problem or meeting a specific commercial objective (applied research); and (c) systematic use of research and practical experience to produce new or significantly improved goods, services, or processes (development).

The term “research and development” does NOT include expenditures for:

  • costs for routine product testing, quality control, and technical services unless they are an integral part of an R&D project;
  • market research;
  • efficiency surveys or management studies;
  • literary, artistic, or historical projects, such as films, music, or books and other publications; and
  • prospecting or exploration for natural resources.

The term “science and technology” (S&T) covers a wide range of activities, including R&D, but is rarely defined in the literature, perhaps because its breadth leads to its being used in different ways in different contexts. The United Nations Educational, Scientific and Cultural Organization (UNESCO) (1984, p. 17) provides a definition of the term that is used for this chapter:

For statistical purposes, Scientific and Technological Activities (STA) can be defined as all systematic activities which are closely concerned with the generation, advancement, dissemination, and application of scientific and technical knowledge in all fields of science and technology, that is the natural sciences, engineering and technology, the medical and the agricultural sciences (NS), as well as the social sciences and humanities (SSH). 7

6 This is the definition used in all OECD, European Union, African Union, and Latin American countries. All elaborate on this definition in their survey instruments as the United States has done to incorporate definitions for basic, applied, and experimental development.

7 Also included in the definition of S&T are “scientific and technological services” and “scientific and technological education and training,” the definitions of which are found in United Nations Educational, Scientific and Cultural Organization (1978).

Because S&T includes but is not limited to R&D, 8 the focus of this chapter is on indicators of foreign direct investment in R&D and trade in knowledge-intensive services. Measurement of intangible assets also is touched upon, although the panel does not view the development of such measures as more appropriate for NCSES than for the Bureau of Economic Analysis.

MEASURING SCIENCE AND TECHNOLOGY: MAJOR GAPS IN INTERNATIONAL COMPARABILITY

Comparability is a universal challenge for statistics and for indicators based on those statistics. The comparability of data can be affected by the survey techniques used to collect the data and the conversion of the data into statistics through the use of weighting schemes and aggregation techniques. These problems are amplified when statistics are used to create indicators, as the indicators may be a combination of statistics (e.g., an average, a sum, or a ratio) with different comparability problems. In addition to the international or geographic comparison of indicators that describe an aspect of a system (e.g., R&D as a percentage of gross domestic product [GDP]), there are problems with intertemporal and intersectoral comparisons. Users of indicators need to recognize that all statistics and indicators have a margin of error beyond which they should not be pushed. The problem is growing as response rates to official surveys continue to decline.

International comparisons entail fundamental issues such as language (e.g., the Japanese term for “innovation” is actually closer to what most Americans think of as “technology”), and NCSES is to be congratulated for supporting a project with OECD and the European Union (EU) on the cognitive testing of survey questions in multiple languages. Differences in institutions (e.g., the accounting for the European Union Framework program across EU member states) pose problems, as do cultural differences (e.g., the Nordic world has access to “cradle to grave” linked microdata on individuals) and differences in governance structures (e.g., the importance of subnational R&D programs in some countries). These differences can limit comparability and increase the margin of error that should be applied to international comparisons of statistics and indicators.

In the area of S&T indicators, a number of key comparability problems are well known. OECD compiles S&T statistics, monitors the methodology used to produce them, and publishes international comparisons; it has documented the problems summarized below.

Research and Development 9

Each country depends for its R&D data on the coverage of national R&D surveys across sectors and industries. In addition, firms and organizations of different sizes are measured, and national classifications for firm sizes differ. Countries also do not necessarily use the same sampling and estimation methods. Because R&D typically involves a few large organizations in a few industries, R&D surveys use various techniques to maintain up-to-date registers of known performers. Analysts have developed ways to avoid double counting of R&D by performers and by companies that contract with those firms or fund R&D activities of third parties. These techniques are not standardized across nations.

R&D expenditure data for the United States are somewhat underestimated for a number of reasons:

  • R&D performed in the government sector covers only federal government activities. State and local government establishments are excluded from the national figures. 10
  • In the higher education sector, R&D in the humanities is excluded, as are capital expenditures. 11
  • R&D expenditures in the private nonprofit sector include only current expenditures. Depreciation is reported in place of gross capital expenditures in the business enterprise sector.

Allocation of R&D by sector poses another challenge to the comparability of data across nations. Using an industry-based definition, the distinction between market and public services is an approximate one. In OECD countries, private education and health services are available to varying degrees, while some transport and postal services remain in the public realm. Allocating R&D by industry presents a challenge as well. Some countries adopt a “principal activity” approach, whereby a firm’s R&D expenditures are assigned to that firm’s principal industrial activity code. Other countries collect information on R&D by “product field,” so the R&D is assigned to the industries of final use, allowing reporting companies to break expenditures down across product fields when more than one applies. Many countries follow a combination of these approaches, as product breakdowns often are not required in short-form surveys.

8 The OECD Frascati Manual (OECD, 2002, p. 19) notes that “R&D (defined similarly by UNESCO and the OECD) is thus to be distinguished from both STET [scientific and technological education and training] and STS [scientific and technological services].” The Frascati definition of R&D includes basic research, applied research, and experimental development, as is clear from NCSES’s presentation of the definition in the BRDIS for use by its respondents.

9 This description draws heavily on OECD (2009, 2011) and Main Science and Technology Indicators (MSTI) (OECD, 2012b).

10 NCSES reports state R&D figures separately.

11 In general, OECD’s reporting of R&D covers R&D both in the natural sciences (including agricultural and medical sciences) and engineering and in the social sciences and humanities. A large number of countries collect data on R&D activities in the business enterprise sector for the natural sciences and engineering only. NCSES does report data on social science R&D.

The Frascati Manual (OECD, 2002) recommends following a main activity approach when classifying statistical units, but recommends subdividing the R&D by units or product fields for firms carrying out significant R&D for several kinds of activities. This applies to all industry groups and, at a minimum, to the R&D industry (International Standard Industrial Classification [ISIC] Rev. 3, Division 73, or North American Industry Classification System [NAICS] 5417 in North America), although not all countries follow this method.
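The practical difference between the principal-activity and product-field approaches can be seen in a small, hypothetical example. Everything below (the firm name, the NAICS codes chosen, and the dollar amounts) is invented for illustration; it does not describe how any particular national survey processes its returns.

```python
# Illustrative contrast between the two R&D allocation approaches discussed above.
# The firm, industry codes, and dollar figures are hypothetical.

firm_rd = {
    "firm": "Acme Corp",
    "principal_naics": "3361",            # main activity: motor vehicle manufacturing
    "product_fields": {"3361": 40.0,      # R&D aimed at vehicle products ($M)
                       "5112": 60.0},     # R&D aimed at software products ($M)
}

# Principal-activity approach: all R&D is assigned to the firm's main industry code.
by_principal = {firm_rd["principal_naics"]: sum(firm_rd["product_fields"].values())}

# Product-field approach: R&D is assigned to the industries of final use.
by_product_field = dict(firm_rd["product_fields"])

print(by_principal)       # {'3361': 100.0}
print(by_product_field)   # {'3361': 40.0, '5112': 60.0}
```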

Comparability problems are also caused by the need to preserve the confidentiality of survey respondents (see Chapter 4 ). National statistical practice will prevent publication of the value of a variable if it is based on too few responses. This not only results in suppression of a particular cell in a table, but also requires additional suppression if there are subtotals that could be used to infer the suppressed information. The result is reduced comparability, which can be overcome only by microdata analysis under controlled conditions.
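As a rough illustration of the suppression logic described above, the sketch below applies a primary-suppression rule (too few respondents) and one simple form of complementary suppression so that the hidden cell cannot be recovered from a row total. Actual agency disclosure-control rules are considerably more elaborate; the threshold and the figures here are hypothetical.

```python
# Minimal sketch of disclosure control for one row of a table: suppress any cell
# based on fewer than 3 respondents (primary suppression), then suppress a second
# cell so the first cannot be inferred from the row total (complementary suppression).

def suppress(cells, min_count=3):
    # cells is a list of (value, respondent_count) pairs
    flags = [n < min_count for (_value, n) in cells]
    if sum(flags) == 1 and len(cells) > 1:
        # a single gap could be recovered from the row total, so hide one more cell
        candidates = [i for i, f in enumerate(flags) if not f]
        extra = min(candidates, key=lambda i: cells[i][0])  # smallest publishable cell
        flags[extra] = True
    return ["D" if f else value for (value, _n), f in zip(cells, flags)]

print(suppress([(120.0, 14), (5.5, 2), (48.0, 7)]))  # [120.0, 'D', 'D']
```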

In principle, R&D institutes serving enterprises are classified according to the industry they serve. When this is not done, the percentage of business enterprise expenditure on R&D (BERD) performed by what is most likely a service industry is overestimated compared with estimates for other countries.

Finally, R&D performers recently have been asked in surveys to break down their R&D activities across sites in different national territories or regions. Estimating R&D intensity by region or other subnational unit presents additional challenges. The existence of multinationals headquartered in a given country that conduct R&D and trade in R&D services worldwide makes it difficult to pinpoint where the R&D is funded and performed and where it has impact. For example, the R&D could be funded by a head office in Rome, performed in a research institute in Israel, and have an impact on consumers of the resulting product in the United States.

Government Budget Appropriations or Outlays for R&D (GBAORD) 12

GBAORD data are assembled by national authorities using statistics collected from budgets. This process entails identifying all the budget items involving R&D and measuring or estimating their R&D content. The series generally cover the federal or central government only. GBAORD is a good reflection of government priorities based on socioeconomic objectives. These statistics often are used for cross-country comparisons, particularly to address such questions as: Is the United States falling behind other countries in R&D expenditures and outcomes? While it is not necessarily the case that high government expenditures foreshadow international preeminence in S&T, it is important to understand whether such expenditures indeed lead to better employment, health, and security outcomes.

However, comparability problems arise because some countries do not include in their GBAORD estimates funding for general support of universities (e.g., the United States) or R&D funded as part of military procurement (e.g., Japan, Israel). Moreover, it currently is not possible for all countries to report, on the basis of budget data, which sectors are responsible for performing the R&D funded by government.

Business Enterprise Expenditures on R&D 13

BERD statistics convey business R&D expenditures. OECD breaks down business R&D expenditure data into 60 manufacturing and service sectors for OECD countries and selected nonmember economies. The reported data are expressed in national currencies (as well as in purchasing power parity U.S. dollars), at both current and constant prices.
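As a minimal sketch of what reporting "at constant prices and in PPP dollars" involves, the snippet below deflates a hypothetical local-currency BERD series to a base year and converts it with an assumed purchasing power parity rate. The deflators, PPP rate, and BERD values are invented, and the calculation is deliberately simplified relative to OECD practice.

```python
# Illustrative conversion of a BERD series from national currency at current prices
# to constant-price PPP dollars. All numbers below are hypothetical.

berd_current_lcu = {2008: 950.0, 2009: 1010.0, 2010: 1080.0}   # local currency, millions
gdp_deflator     = {2008: 0.96, 2009: 0.98, 2010: 1.00}        # base year 2010 = 1.00
ppp_lcu_per_usd  = 1.25                                        # local currency per PPP dollar

berd_constant_ppp = {
    year: (value / gdp_deflator[year]) / ppp_lcu_per_usd
    for year, value in berd_current_lcu.items()
}
print(berd_constant_ppp)  # constant-price (2010) PPP-dollar series
```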

When assessing changes in BERD over time, it is necessary to take account of changes in methods and breaks in series, notably in terms of the extension of survey coverage, particularly in the service sector, and the privatization of publicly owned firms. Identifying new and occasional R&D performers is also a challenge, and OECD countries take different approaches to this challenge in their BERD surveys. In addition, not all activities related to foreign affiliates’ R&D are recorded in company transactions. There are intracompany transfers (e.g., intracompany mobility of researchers) with no monetary counterparts that lead to R&D efforts that do not appear in the statistics as R&D spending by foreign affiliates. The increasing internationalization of R&D and other economic activities also makes it difficult to accurately identify inflows of R&D funds to companies and their precise nature (as discussed later in this chapter). For example, there is a growing need to measure international R&D transactions properly and to deal with the problem of nonpriced transfer of R&D within multinational enterprises. All of these issues require expert data manipulation and statistical analysis, thereby presenting challenges to the international comparability of indicators derived from these statistics.

Technology Receipts and Payments 14

Technology receipts and payments, including those for R&D services, show a country’s ability to sell technology abroad and its use of foreign technologies, respectively. Further qualitative and quantitative information is needed to analyze a country’s deficit or surplus because a deficit (surplus) on the technology balance does not necessarily indicate the lack (presence) of competitiveness.

12 This section is based on OECD (2011) and OECD (2012b).

13 This section is based on OECD (2011).

14 This section is based on OECD (2011).

Measurement errors may lead to underestimation or overestimation of technology transfers. Licensing contracts provide payment channels other than technology payments, and payment/receipt flows may be only part of the total price paid and received. Alternatively, national tax and control regulations on technology receipts and payments may bias data on technology flows, notably for international transfers of multinationals. If royalties are less taxable than profits, then they may be preferred to other transfer channels and exceed the value of technology transferred. On the other hand, if limitations are imposed on royalty remittances, then some portion of repatriated profits will represent remuneration of technology transfer.

Each of the above reasons for international incomparability of some S&T measures goes beyond what NCSES can deal with on its own. An OECD Working Party, the National Experts on Science and Technology Indicators (NESTI), has been in place for 50 years to discuss these issues and support collaboration to resolve them. Nonetheless, there are some areas in which NCSES has opportunities to adjust definitions and improve methodologies to obtain more accurate STI indicators. For example, finer-grained size classes for firms would allow a better understanding of the relationship between firm size and innovation (as discussed in Chapter 4 ). In addition, improved measures of business enterprise R&D would shed some light on the question of whether the United States is increasingly depending on knowledge generated in other countries. And better measuring of technology receipts and payments would show which countries are net buyers or sellers of knowledge-intensive services. Recommendations for how NCSES could go about improving these measures appear later in this chapter.

TRADITIONAL INDICATORS OF THE THREE K’S

Patent 15 data and bibliometrics (data on publication counts and citations) can be used to measure new knowledge, knowledge networks, and knowledge flows.

Patent administrative records—including citations, claims, technical classifications, families, 16 and countries where the patents are effective—contain a wealth of information about invention. They also contain detail on inventors and applicants and on the regulatory and administrative processes of the patenting system. 17 Patent information is useful for determining when a new product or process was developed and its linkages to prior inventions and to research that was the foundation for the invention. Observing where patents are registered can also yield clues to how new knowledge is diffused from nation to nation.
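One simple diffusion-style indicator that can be computed from such records is sketched below: the share of patent families that have been filed in more than one patent office, taken here as a crude signal of knowledge moving across borders. The toy records and field names are hypothetical; real calculations would draw on USPTO, EPO, or similar files and are subject to the caveats discussed later in this section.

```python
# Rough sketch of one patent-based diffusion indicator: the share of patent
# families filed in more than one office. Records below are invented.

families = [
    {"family_id": 1, "offices": {"US"}},
    {"family_id": 2, "offices": {"US", "EP", "JP"}},
    {"family_id": 3, "offices": {"US", "CN"}},
]

multi_office = sum(1 for f in families if len(f["offices"]) > 1)
print(f"Share filed in multiple offices: {multi_office / len(families):.0%}")  # 67%
```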

Patent data often are used to develop indicators of knowledge generation, flows, and linkages. OECD’s (2008) Compendium of Patent Statistics 2008 gives several examples:

  • Patent-based statistics can be derived that reflect the inventive performance of countries, regions, and firms.
  • The inventors’ addresses can be used to monitor linkages, including the internationalization of and international collaboration in S&T activities.
  • Knowledge networks can be determined by observing cooperation in research and diffusion of technology across industries or countries in specific technological areas.
  • The market strategy of businesses can be inferred from information contained in the patent file.

At the same time, information derived from patent records must be used with caution (OECD, 2006):

  • The value distribution of patents is skewed as many patents have no industrial application (and hence are of little value to society), whereas a few are of substantial value.
  • Many inventions are not patented because they are not patentable, or inventors may protect them using other methods, such as secrecy or lead time.
  • The propensity to patent differs across countries and industries.
  • Differences in patent regulations make it difficult to compare counts across countries.
  • Changes in patent law over the years make it difficult to analyze trends over time.

The panel emphasizes the first point on the above list: patents may be used strategically in some sectors of an economy to deter competition. Andrew Updegrove of Gesmer Updegrove LLP captured this sentiment by saying, “Patents don’t give value; they cause friction” (Updegrove, 2012). Therefore, substantial patent activity is not necessarily an indicator of major leaps in S&T capabilities or innovation. In some instances, patenting could have a negative impact on knowledge creation and innovation. Thus, whether observed patent activity is a useful indicator of knowledge generation or innovation should be determined sector by sector.

15 “Patents are an exclusive right issued by authorised bodies to inventors to make use of and exploit their inventions for a limited period of time (generally 20 years). Patents are granted to firms, individuals or other entities as long as the invention is novel, non-obvious and industrially applicable. The patent holder has the legal authority to exclude others from commercially exploiting the invention (for a limited time period). In return for the ownership rights, the applicant must disclose information relating to the invention for which protection is sought” (Khan and Dernis, 2006, p. 6).

16 “A patent family is the same invention disclosed by a common inventor(s) and patented in more than one country” (United States Patent and Trademark Office, http://www.uspto.gov/main/glossary/#p [June 2013]). The European Patent Office has the following definition: “A patent family is a set of either patent applications or publications taken in multiple countries to protect a single invention by a common inventor(s) and then patented in more than one country. A first application is made in one country—the priority—and is then extended to other offices” ( http://www.epo.org/searching/essentials/patent-families.html [June 2013]).

17 As administrative records, patent applications and grants are a rich microdata source that do not rely on surveys and do not generate the respondent burden associated with traditional statistical surveys.

In his presentation to the panel in February 2012, Stuart Graham, chief economist at the United States Patent and Trademark Office (USPTO), outlined USPTO’s Economic Data Agenda. In the near term, the agency will improve its databases, particularly the Patent Assignment, Trademark Casefile, and Trademark Assignment datasets. Over time, USPTO is also “considering providing a forum that would facilitate the posting of additional matched datasets, papers and findings” and working with other agencies to create “matched datasets to other economically-relevant information.” For NCSES’s activities on STI indicators, particularly those related to producing better measures of knowledge generation, flows, and networks, continued collaboration with USPTO should be beneficial. NCSES already relies on USPTO data for basic measures of patenting activity. However, linking basic research outputs to patents and trademarks (including the human capital and demographic markers that are indicated on the records) and ultimately to outcomes that have significant societal impacts would be of great benefit to users of NCSES indicators. In addition, these linked files would be helpful to researchers who work with the datasets of USPTO, NCSES, and others to understand relationships and rates of return in the STI system.

The panel makes no explicit recommendation here for NCSES to do more than continue to explore wider use of patent indicators and to engage in international cooperation on the development of indicators based on patent records to address user needs. There is no standard method for calculating indicators from patent data, and as noted earlier, analysis of these data without reservation can lead to incorrect inferences and misleading policy decisions. It is important to improve data quality and analytical techniques in this area—an active role for NCSES in collaboration with other agencies and organizations worldwide. As NCSES continues to disseminate patent data as part of its STI indicators program, it would be valuable to users to have clear cautions regarding the use and misuse of these statistics for decision-making purposes.

Bibliometrics

Publication is a major vehicle for disseminating and validating research results. Bibliometric data on publication counts and citations thus are a valuable source for measuring scientific performance, tracking the development of new technologies and research areas, and mapping linkages among researchers. Publication counts are based on science and engineering (S&E) articles, notes, and reviews published in a set of the world’s most influential scientific and technical journals (Ruegg and Feller, 2003, p. 31).

A number of characteristics can be used for categorization of publications and indicator development. Fields are determined by the classification of each journal. Publications are attributed to countries by the author’s institutional affiliation at the time of publication. Indicators of coauthorship appear to be affected by two factors. The first is language, although this has become less of an issue as English has become the language most commonly used internationally by researchers. The second is geographic location, although information and communication technologies have undoubtedly lessened its effect on knowledge flows. The quality of publications can be measured both by the quality of the journal and by how often the publication is cited in other publications. Citations can also be used to measure knowledge flows and linkages between different research areas. Coauthorship provides an additional measure of linkages and often is used as an indicator of collaboration patterns.

NCSES currently publishes a number of indicators based on bibliometric data. These include counts of S&E articles, shares of articles with domestic or international coauthors, counts and shares of citations and top-cited articles, and citation rates. These indicators can be used primarily to measure the output of scientific research. For example, counts of articles and citations and shares of world totals show how the United States is faring compared with other countries or regions. These indicators can also be used to measure the extent of collaboration and linkage. An example is the network maps used in the report Knowledge, Networks and Nations: Global Scientific Collaboration in the 21st Century , by the UK Royal Society (The Royal Society, 2011). These network maps are based on authorship of articles and show patterns of collaboration between countries. They are based on numbers of jointly authored research papers, with linkages being displayed when the collaboration between two countries amounts to 5-50 percent of the overall publication output of one of the partners. The OECD (2010) report Measuring Innovation: A New Perspective uses citation data to measure the interrelatedness of different research areas. 18
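A minimal sketch of the linkage rule just described (display a country pair when their jointly authored papers amount to 5-50 percent of either partner's overall publication output) might look like the following; the country labels and counts are invented for illustration.

```python
# Sketch of the Royal Society-style linkage rule described above. Counts are invented.

totals = {"A": 2000, "B": 400, "C": 10000}          # total papers per country
joint = {("A", "B"): 120, ("A", "C"): 80, ("B", "C"): 10}   # jointly authored papers

for (x, y), n in joint.items():
    shares = (n / totals[x], n / totals[y])
    # show the link if the joint output is 5-50% of either partner's total output
    if any(0.05 <= s <= 0.50 for s in shares):
        print(f"link {x}-{y}: joint={n}, shares={shares[0]:.1%}/{shares[1]:.1%}")
```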

Bibliometric data potentially can be used to create a number of additional indicators to provide further detail on linkages across research areas or by geographic location. This information can be particularly valuable for mapping the development of new research areas, such as green technologies, or the spread of general-purpose technologies.

There are some limitations to the use of bibliometric analysis for the production of S&T indicators, particularly when used to measure causal relationships, such as socioeconomic impacts of funding basic science. It is also difficult to isolate how much research networks have changed because of a given research funding award or the existence of a new collaborative agreement. Impact factors and Hirsch’s h index, commonly used by bibliometricians, do not allow for comparisons with counterfactual analysis. Furthermore, measures must be normalized to be helpful for comparing research outputs, or they are no better than “nose-prints”—metaphorically, signs of high window-shopping activity, with no true indication that a substantive purchase has occurred. There are ways for numbers of patents and articles to be inflated by their producers without substantive advances in S&T having been achieved. Bornmann and Marx (2013) state that “… mere citation figures have little meaning without normalization for subject category and publication year…. We need new citation impact indicators that normalize for any factors other than quality that influence citation rates and that take into account the skewed distributions of citations across papers.” Bornmann and Marx describe techniques using percentiles to create normalized indicators, an improvement on impact factors and Hirsch’s h index. 19 To its credit, the National Science Board (for which NCSES produces the Science and Engineering Indicators [ SEI ] biennial volumes) is mentioned by Bornmann and Marx as one of the federal agencies that uses percentile ranks of publications. Although this is good practice, it is important to note that these indicators are not appropriate for impact assessment, for which counterfactual evidence is necessary.

18 This report references the citation technique used in Saka et al. (2010).
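In the spirit of the percentile approach that Bornmann and Marx describe, the sketch below ranks each paper's citation count against papers in the same subject category and publication year. It is a simplified illustration with invented records, not their exact estimator.

```python
# Hedged sketch of percentile-based citation normalization: compare each paper
# with papers in the same subject category and publication year. Toy records only.

from collections import defaultdict

papers = [
    {"id": 1, "field": "chem", "year": 2010, "cites": 3},
    {"id": 2, "field": "chem", "year": 2010, "cites": 40},
    {"id": 3, "field": "chem", "year": 2010, "cites": 12},
    {"id": 4, "field": "math", "year": 2010, "cites": 5},
]

groups = defaultdict(list)
for p in papers:
    groups[(p["field"], p["year"])].append(p["cites"])

for p in papers:
    ref = groups[(p["field"], p["year"])]
    # percentile rank: share of papers in the reference set cited no more than this one
    p["percentile"] = round(100 * sum(c <= p["cites"] for c in ref) / len(ref), 1)
    print(p["id"], p["percentile"])
```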

RECOMMENDATION 5-1: The National Center for Science and Engineering Statistics should expand its current set of bibliometric indicators to develop additional measures of knowledge flows and networking patterns. Data on both coauthorship and citations should be exploited to a greater extent than is currently the case.

BUSINESS R&D SERVICES AND INTANGIBLE ASSETS

Although NCSES publishes a rich set of data on R&D expenditures and performance, measures of spillover effects still are needed to aid in determining the effects of scientific investment on socioeconomic outcomes. Policy makers would benefit from such measures in addressing such questions as: What is the effect of federal spending on R&D on innovation and economic health, and over what time frame? What is the international balance of trade in R&D services? How much R&D do U.S. multinational companies conduct outside the United States, and how much R&D do foreign multinational companies carry out in the United States? How much are U.S. companies spending to be present in emerging markets? How much R&D are they conducting in these nations?

This section addresses the question of how R&D data can best be exploited, focusing in particular on the measurement of trade in R&D services. BRDIS contains a rich dataset on R&D that is only partially exploited in present indicators. Given the size and complexity of BRDIS, however, a tradeoff is entailed in terms of the time and resources needed to process these data. BRDIS can be exploited by researchers within and outside government, subject to appropriate restrictions to protect respondents, but only if a researcher database is provided with sufficient metadata 20 to define all the variables and the degree of imputation for each.

At the same time, the panel acknowledges that further exploitation of BRDIS would require additional resources and might also involve a trade-off in terms of the timeliness of the release of key R&D indicators. The time required to process and release R&D statistics increased significantly following the introduction of BRDIS, which is a longer and more complex survey than its predecessor, the Survey of Industrial Research and Development. The panel views timeliness as an important factor in determining the value of R&D and other indicators and encourages NCSES to place high priority on reducing the time lag in the release of BRDIS data.

Trade in R&D Services 21

One important aspect of R&D is R&D services, which are services for the performance of R&D provided by one organization for another. R&D services are for the most part provided by companies and organizations involved in biotechnology; contract research (including physical, engineering, and life sciences firms); and professional, scientific, and technical areas (including social sciences and humanities). These are companies or organizations categorized under NAICS code 5417 (scientific R&D services). Specifying NAICS codes for R&D services (as does BRDIS) is important, since firms in almost any industry can buy or sell R&D services. For example, Boeing can buy services to fill a gap in its R&D program for wing design; Walmart can sell its knowledge, based on R&D, on supply chains; and extraction firms can buy or sell R&D services related to extraction.

Currently, R&D services are captured through the use of a number of indicators published in the SEI. These include R&D by sector and location of performance, funding of R&D by companies and others, R&D performed abroad by U.S.-owned companies, R&D performed in the United States by foreign multinationals (foreign direct investment in R&D), and exports and imports of R&D and testing services. For the SEI, data on R&D performance and funding are taken from BRDIS, while the Bureau of Economic Analysis (BEA) provides the data on foreign direct investment in R&D and on international trade in R&D testing services.

19 “The percentile of a publication is its relative position within the reference set—the higher the percentile rank, the more citations it has received compared with publications in the same subject category and publication year” (Bornmann and Marx, 2013, p. 2).

20 Metadata describe the data and how they were constructed.

21 “Services are the result of a production activity that changes the conditions of the consuming units, or facilitates the exchange of products or financial assets. These types of service may be described as change-effecting services and margin services respectively. Change-effecting services are outputs produced to order and typically consist of changes in the conditions of the consuming units realized by the activities of producers at the demand of the consumers. Change-effecting service are not separate entities over which ownership rights can be established. They cannot be traded separately from their production. By the time their production is completed, they must have been provided to the consumers” (European Commission, 2009, Chapter 6, paragraph 17).

NCSES is expanding its data-linking activities to match BRDIS microdata with BEA survey microdata on U.S. foreign direct investment. The agency also has undertaken fruitful interagency collaboration with BEA to integrate R&D into the system of national accounts.

The panel deliberated on globalization and its impact on the research enterprise in the United States. An immediate policy question was how much R&D, measured in terms of expenditures, was outsourced to countries such as Brazil, China, or India, and whether R&D was performed by foreign affiliates or purchased from other companies. A related question was how much knowledge produced by U.S. R&D is being purchased by other countries, and which countries are leading purchasers. These are important but also complex questions that present a number of difficult challenges for data collection.

The panel thus commissioned a paper on this subject by Sue Okubo (2012). The paper reviews the current work of BEA in this area and compares it with recent NCSES work on BRDIS. 22 Several observations follow from this comparison.

One key observation in Okubo’s paper is the difference between the classifications used by BEA and NCSES and the fact that BEA measures trade in R&D and testing services, whereas NCSES in BRDIS measures R&D services only. While BEA and NCSES are cooperating on survey activity, the panel emphasizes the importance of this cooperation’s leading to comparability of the data produced by these and other agencies (see Recommendation 5-2 later in this section).

The surveys on international transactions administered by BEA and the R&D surveys 23 carried out by NCSES follow different guidance: BEA follows the sixth edition of the International Monetary Fund’s (IMF) (2011) Balance of Payments and International Investment Position Manual , while NCSES follows the Frascati Manual (OECD, 2002a). However, the two approaches are not far apart. The IMF manual includes some R&D and intellectual property elements that are consistent with the Frascati Manual . Therefore, the geographic and ownership scope of BEA’s international transaction surveys and that of the BRDIS are conceptually close. For example, BEA’s international transaction surveys encompass any company with activities in the United States, regardless of ownership. The surveys cover transactions of U.S.-located units of foreign multinational enterprises with entities outside the United States, including transactions with their own foreign parents, and affiliated and unaffiliated trade. Similarly, for the United States, the surveys cover affiliated and unaffiliated trade and transactions by purely domestic companies (no relationship with any multinational enterprise). BRDIS also covers any company with activities in the United States, regardless of ownership, and foreign affiliates of U.S. multinational enterprises.

On the other hand, BRDIS treats foreign parent companies differently from the way they are treated in both BEA’s trade surveys and BEA’s surveys of foreign direct investment. Other differences exist between BRDIS and BEA data on the international balance of payments in R&D trade: BEA’s testing services, which are part of the research, development, and testing measure, may include R&D and non-R&D components, and R&D is treated by NCSES basically as a cost measure, while transactions are treated more like market values. Moris (2009, p. 184) suggests a matrix for use in parsing the data from BEA’s trade surveys and R&D surveys (including BRDIS).

A second key observation in Okubo’s paper relates to the results of the BEA surveys with respect to the sale of R&D and testing services abroad. For 2010, the largest buyers of U.S. R&D and testing services were Bermuda, 24 Ireland, Japan, the Netherlands, and Switzerland, accounting for 6.6 percent of the total trade of $30.9 billion. Such a distribution of trade statistics is rare, as is illustrated by trade in professional, business, and technical (PBT) services. In 2010, the largest buyers of U.S. PBT services were Germany, Ireland, Japan, Switzerland, and the United Kingdom, accounting for 37 percent of total trade; the largest sellers of PBT services to the United States—the countries to which these services were outsourced—were Germany, India, Japan, the Netherlands, Switzerland, and the United Kingdom, which accounted for 40 percent of total U.S. payments for these services (Okubo, 2012). The dominance of the leading countries in the sale and purchase of PBT services is seen in other trade figures, but not in the sale and purchase of R&D and testing services. This difference in the concentration of R&D and testing services merits further analysis.
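The concentration comparison described above reduces to a simple share calculation once country-level trade values are tabulated. The sketch below uses entirely invented figures, not the BEA data cited by Okubo, to show how a top-five-buyer share might be computed.

    import pandas as pd

    # Invented receipts for R&D and testing services, in billions of dollars.
    receipts = pd.Series({
        "Bermuda": 4.1, "Ireland": 3.8, "Japan": 3.5, "Netherlands": 3.2,
        "Switzerland": 2.9, "Germany": 2.5, "United Kingdom": 2.3, "All other": 8.6,
    })

    # Share of total receipts accounted for by the five largest buyers.
    top5_share = receipts.nlargest(5).sum() / receipts.sum()
    print(f"Top-five-buyer share of total receipts: {top5_share:.1%}")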

In summary, the questions that beg to be answered are: Under what circumstances does the R&D activity of multi-

22 In September 2012, NCSES inaugurated a website with two new publications on the International Investment and R&D Data Link project. The site will also house future publications on the BRDIS link (National Science Foundation, 2013b). It should be noted that BEA plans to incorporate R&D as investment in the core economic accounts in 2014.

23 The NCSES surveys referred to include BRDIS and its predecessor, the Survey of Industrial Research and Development.

24 If one were to start with R&D performers only and then look at their R&D exports and imports, legitimate non-R&D performers that only import their R&D from overseas would be eliminated from the analysis. This exercise would require access to the microdata, which are not publicly available. However, NCSES could conduct this analysis and publish the statistics and rankings. There is no escape from accounting and transfer price issues, such as allocated costs that are not related to actual R&D trade. R&D performance data for multinational enterprises are not immune to this issue. Conditioning on performance for trade flows can eliminate unwanted R&D and training data.

national corporations enhance U.S. economic performance, including leadership and strength in S&T? What effect do tax laws have on the location of R&D services? Clearly, the R&D activity of multinational corporations has grown, but the data available with which to analyze and track this activity have limitations. BRDIS includes data on domestic and foreign activities of firms and can provide a more detailed picture of R&D activities than has previously been possible or been fully exploited. Specifically, BRDIS offers more information on R&D service production and flows of R&D services in the United States and in U.S. firms abroad than has heretofore been published. Understanding outsourcing and trade in R&D services is particularly important because the developed economies are dominated by service industries. BRDIS data also can support measures of payments and receipts for R&D services abroad, by leading countries, which is critically important for policy purposes.

RECOMMENDATION 5-2: The National Center for Science and Engineering Statistics (NCSES) should make greater use of data from its Business Research and Development and Innovation Survey to provide indicators of payments and receipts for research and development services purchased from and sold to other countries. For this purpose, NCSES should continue collaboration with the U.S. Bureau of Economic Analysis on the linked dataset.

The panel believes NCSES can provide these estimates and, if necessary, include appropriate questions on BRDIS in 2013 and subsequent years. The 2008, 2009, and 2010 BRDIS did not allow NCSES to collect all of the elements described above, but the 2011 and 2012 questionnaires are more comprehensive in this dimension, collecting data on R&D production, funding, and transactions. Data would be available with which to produce statistics on payments and receipts for R&D services involving U.S. company affiliates at home and abroad and on how those data differ, if at all, from the BEA measures. Similar information on foreign company affiliates from other sources could be used for parallel comparisons. 25 NCSES could consider developing two series—payments and receipts for R&D services—for three to five leading countries. The resulting statistics would show what knowledge creation is being outsourced and which countries are buying U.S. knowledge. This information would enable users to track trends over time and have a better understanding of knowledge flows and the formation of R&D networks.
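If the underlying transaction data were available, assembling the two series would amount to a straightforward aggregation by flow direction, partner country, and year. The sketch below, with invented values and column names, illustrates the shape of such an indicator table.

    import pandas as pd

    # Invented transaction records: year, partner country, flow direction, value ($ millions).
    flows = pd.DataFrame({
        "year":    [2011, 2011, 2011, 2012, 2012, 2012],
        "country": ["Ireland", "Japan", "Ireland", "Ireland", "Japan", "Germany"],
        "flow":    ["receipts", "receipts", "payments", "receipts", "payments", "receipts"],
        "value":   [1200, 800, 450, 1350, 500, 300],
    })

    # Two indicator series -- payments and receipts for R&D services -- by
    # leading partner country and year.
    series = flows.pivot_table(index="year", columns=["flow", "country"],
                               values="value", aggfunc="sum", fill_value=0)
    print(series)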

Over time, this exercise would provide answers to a range of questions: Is the United States losing or gaining advantage by buying and selling its R&D abroad? Is the United States benefiting from research conducted in other countries? What is the United States learning from other countries, and what are other countries learning from the United States? In what technological areas are other countries accelerating development using knowledge sourced in the United States? What is the role of multinational enterprises in transferring R&D capacity from country to country? The data could also be used in regression analysis to answer another important question: What impact does the international flow of R&D have on U.S. economic performance? Users of the data on international flows of R&D services are likely to be interested in seeing how emerging economies are advancing in R&D capacity, in what fields U.S. companies are sourcing or outsourcing R&D and whether it is increasingly being sourced or outsourced in specific countries, and which countries 5-10 years from now may be the hub of new scientific knowledge—possibly countries in Latin America, the Middle East, or sub-Saharan Africa.

Intangible Assets

Until recently, the important role of knowledge-based capital (KBC) was rarely recognized, one exception being Nakamura’s (1999) research on intangibles 26 and the “New Economy.” This situation has changed primarily as a result of the pioneering research of Corrado and colleagues (2005) on intangibles. In their 2006 paper, these authors point out that most knowledge-based investment is excluded from measured GDP and from most productivity and economic growth models. The authors recognize three broad categories of KBC: computerized information (software and databases); innovative property (patents, copyrights, designs, trademarks); and economic competencies (including brand equity, firm-specific human capital, networks joining people and institutions, and organizational know-how that increases enterprise efficiency). Another important form of KBC is human capital that is not firm specific, such as most human capital that is created through education. 27 The World Bank (1997) estimates that for most countries, intangibles, including human capital more broadly defined, represent the majority of a country’s wealth. 28 By all accounts, failing to recognize KBC in any analysis of economic growth or the potential for innovation is a significant omission.

For this reason, a major development in the measurement of KBC occurred when the status of R&D was changed in the 2008 System of National Accounts (SNA) from an expense to an (intangible) capital investment. Efforts are still ongoing both in the United States (see, e.g., U.S. Bureau of Economic Analysis, 2010) and internationally to integrate R&D fully into national accounts. This work requires not only high-

25 See, for example, Eurostat 2010 statistics (Eurostat, 2013). Also see statistics for Germany (Deutsche Bank Research, 2011) and on the Indian engineering R&D offshoring market (NASSCOM and Booz & Company, 2010). These two reports cite private company estimates, as well as published Eurostat statistics.

26 Part of the broad category of KBC; see, e.g., OECD (2012a).

27 Human capital is discussed in Chapter 6.

28 World Bank intangibles include human capital, the country’s infrastructure, social capital, and the returns from net foreign financial assets.

quality data on R&D, but also methods for estimating the depreciation of R&D capital, appropriate R&D deflators, and the estimation of price changes. Although the integration of R&D into the SNA is mainly the responsibility of BEA, NCSES has an important role through its long-standing expertise in the collection of R&D data.

The estimates of Corrado, Hulten, and Sichel for the United States give a sense of the relative importance of various components of KBC as defined above (Corrado et al., 2006). Almost 35 percent of their measured KBC either is currently in GDP (computer software) or is in GDP beginning with estimates for 2013 (mainly scientific R&D). Some data on nonscientific R&D (e.g., social science R&D) are now collected through National Science Foundation (NSF) surveys. Total nonscientific R&D is estimated by Corrado, Hulten, and Sichel to be in excess of 20 percent of total R&D. The largest portion of the unmeasured component, economic competencies, accounts for somewhat less than 40 percent of spending on business intangibles.

More than 70 percent of spending on economic competencies is for firm-specific resources. This spending includes employer-provided worker training and management time devoted to increasing firm productivity. Examples given for management time are time for strategic planning, adaptation, and reorganization. Corrado, Hulten, and Sichel used management consulting industry revenues, trends in compensation, and numbers of individuals in executive occupations to estimate spending in the management time category. Sixty percent of advertising expenditures is allocated to business spending on brand equity intangibles. 29

A number of researchers have estimated KBC for individual countries following the lead of Corrado, Hulten, and Sichel. These individual countries include Australia (Barnes, 2010; Barnes and McClure, 2009), Canada (Baldwin et al., 2008), China (Hulten and Hao, 2012), Finland (Jalava et al., 2007), France and Germany (Delbecque and Bounfour, 2011), Japan (Fukao et al., 2007, 2009, 2012; Miyagawa and Hisa, 2012), the Netherlands (van Rooijen-Horsten et al., 2008), and the United Kingdom (Gil and Haskel, 2008; Marrano et al., 2009). Corrado and colleagues (2012) recently completed KBC estimates for the 27 EU countries and the United States. In addition, the methodology for estimating individual components of KBC has been refined, most notably by Gil and Haskel (2008).

A discussion paper by Corrado and colleagues (2012) provides the broadest view of the importance of KBC, as it covers the largest number of countries. 30 In their estimates, the United States stands out for two reasons as compared with regional EU country averages: it has the largest share of intangible investment in GDP (11 percent), and it is the only country/region for which intangible investment is a larger share of GDP than tangible investment. In all country/regional comparisons, however, the rate of growth in intangible investment exceeds that in tangible investment. The authors report three main results. First, capital deepening is the dominant source of economic growth once intangibles are recognized. Second, deepening of intangible capital accounts for one-fifth to one-third of the growth of labor productivity. Finally, the contribution of intangible capital in some large European countries (e.g., Germany, Italy, and Spain) is lower than that in the United Kingdom and the United States. However, there are significant country differences in the distribution of intangibles by broad types: computerized information, innovative property, and economic competencies.
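The results just cited come from a sources-of-growth decomposition in which intangible capital enters symmetrically with tangible capital. A stylized version of that accounting identity, in the spirit of Corrado, Hulten, and Sichel and written here only as an illustration, is:

    \Delta \ln (Y/H) = s_T\,\Delta \ln (K_T/H) + s_I\,\Delta \ln (K_I/H) + s_L\,\Delta \ln q + \Delta \ln A

where Y is output inclusive of intangible investment, H is hours worked, K_T and K_I are tangible and intangible capital services, q is labor composition, the s terms are the corresponding income shares, and A is multifactor productivity. Saying that capital deepening is the dominant source of growth means that the first two terms together account for most of the left-hand side, while the intangible term alone accounts for roughly one-fifth to one-third of it.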

Aizcorbe and colleagues (2009) review various definitions of innovation; propose how measures of innovation like those addressed by Corrado, Hulten, and Sichel could be integrated into a satellite account; and outline future BEA plans. They note that whether advertising and marketing expenditures should be treated as investment is being debated. They question whether cumulating all firms’ advertising expenditures should be registered as increasing aggregate output. In addition, they comment on the difficulty of measuring spending on organizational change. As Corrado, Hulten, and Sichel also recognize, they describe how developing deflators and depreciation rates for most intangibles can be difficult. Their paper calls for cultivation of sources for spending on the development and implementation of new business models, the creation of new artistic originals (see below), the design of new products, and intermediate inputs to innovation. Finally, they hope to work toward better price and depreciation estimates and, in cooperation with the Census Bureau and NSF, the publication of firm and establishment innovation statistics.

Since Corrado, Hulten, and Sichel published their first paper on intangibles in 2005, U.S. government agencies have moved forward to measure and recognize intangibles more comprehensively. As mentioned above, efforts are under way to capitalize R&D and fully integrate it into the SNA. Investment in artistic originals was incorporated into U.S. GDP in 2013 (Aizcorbe et al., 2009). 31 BEA-defined artistic originals include theatrical movies, original songs and recordings, original books, long-lived television programming, and miscellaneous artwork (Soloveichik, 2010a,b,c,d, 2011a,b). For many years, mineral exploration, a relatively small component, has been recognized as investment in U.S. GDP.

Many reports and monographs and at least one book have been produced on KBC, most of them published since 2005. An interim project report from OECD (2012a)

29 More information on how business spending in intangibles was estimated is available in Corrado et al. (2005).

30 The years covered vary in Corrado et al. (2012): the earliest beginning year is 1995, and the latest is 2009. Regions include Scandinavian (Denmark, Finland, and Sweden), Anglo-Saxon (Ireland and the United Kingdom), Continental (Austria, Belgium, France, Germany, Luxembourg, and the Netherlands), and Mediterranean (Greece, Italy, Portugal, and Spain).

31 See Chapter 7 of this report for more detail on how Aizcorbe and colleagues at BEA are using administrative records and web-based data in the agency’s project to capitalize intangible assets for inclusion in the SNA.

echoes the Corrado and colleagues (2012) conclusion that intangibles have been estimated to account for a substantial share of labor productivity growth: 20-25 percent across Europe and 27 percent in the United States. In addition, the OECD report notes that there are substantial spillovers from and repeated use of KBC, and that global competitiveness may increasingly be determined by KBC. After offering answers to the question of why business is investing in KBC, the OECD report focuses on policy questions. The policy challenges discussed with respect to KBC are in the areas of taxation, competition, intellectual property rights, personal data, and corporate reporting. Other publications focus on KBC more from an accounting or business perspective. Lev (2001) uses movements in stock market prices to estimate the impact and importance of intangibles. A long report by Stone and colleagues (2008), written from the business/accounting perspective, includes an extensive list of references. Among its contributions are a summary of efforts to measure firm- and aggregate-level innovation and a taxonomy of possible types of measures—indicator indices, monetary, and accounting. Many authors recognize the complexity of measuring and estimating the contribution of KBC to economic growth.

The potential definition of KBC is far broader than that employed by Corrado, Hulten, and Sichel. Aside from including all formal education, not just employer-provided training, Stone and colleagues (2008) cite two major categories—relational capital and open innovation. Relational capital refers to relationships with external stakeholders, including customers and suppliers. Its value can encompass the complementarity of user needs, such as customers and advertisers using Google for similar purposes. Companies that use open innovation post R&D and commercialization challenges on web-based forums or “marketplaces” that are accessible to communities of scientists, engineers, and entrepreneurs. A component of the Corrado, Hulten, and Sichel definition that is featured less prominently in related research, including that of Stone and colleagues (2008), is general networking. Stone and colleagues comment that general networking is particularly useful for businesses operating in emerging economies. Facebook provides a form of social capital/networking that by extension has information and business value. Each of these expansions or extensions of the Corrado, Hulten, and Sichel definition of intangibles presents substantial measurement challenges.

As stated by Stone and colleagues (2008, p. II-4), “Intangible assets are not innovations, but they may lead to innovations.” And as stated by Ben Bernanke in the concluding sentence of a 2011 speech, “We will be more likely to promote innovative activity if we are able to measure it more effectively and document its role in economic growth.” The open question, however, is which components of KBC lead to economic growth and to what degree; answering it is part of the broader challenge of making a direct and quantifiable connection between innovative activity and economic growth. Certainly some components of KBC have been studied extensively to document their role; scientific R&D is the prime example. Other components of KBC have been less well studied; organizational know-how is an example. The value of KBC as an STI indicator therefore depends on drawing such connections. At the same time, it is critical to recognize both KBC and tangible capital as factors that may be important indicators of future growth. Although the panel believes work on intangible assets may generate useful STI indicators, it believes NCSES should not seek to produce these statistics on its own, but should support and draw on the work of other agencies, particularly BEA, in this area. However, NCSES still has an important role to play through its collection of high-quality R&D data, and it may also be able to contribute through other data sources. This might be the case, for example, if NCSES were to begin collecting data on innovation-related expenditures, as outlined in Chapter 4.

RECOMMENDATION 5-3: The National Center for Science and Engineering Statistics (NCSES) should continue to report statistics on knowledge-based capital and intangible assets obtained from other agencies as part of its data repository function. In addition, NCSES should seek to use data from the Business Research and Development and Innovation Survey on research and development and potentially also on innovation-related expenditures as valuable inputs to ongoing work in this area.

Indicators of General-Purpose Technologies

“General-purpose technology” (Lipsey et al., 2005) is a term used to describe technologies with the potential to transform the economy and activities across a broad range of sectors and industries (Jovanovic and Rousseau, 2005). Earlier examples are steam, electricity, and internal combustion, while more recent examples include information and communication technologies, biotechnology, nanotechnology, and green technologies. Given their potential importance for innovation and growth, tracking the development of these technologies and their diffusion and application is important to inform policy. In this area, there is one particular policy question that users of STI indicators are eager to have answered: Is the United States promoting platforms in information and communication technology, biotechnology, and other technologies to enable innovation in applications?

Bresnahan and Trajtenberg (1995) outline three characteristics of general-purpose technologies: their pervasiveness across sectors, their development and improvement over time, and their ability to spur innovation in their own and other sectors. These characteristics are useful for guiding the measurement of general-purpose technologies. Tracking knowledge generation in these technologies, their diffusion to other sectors, and the linkages among them is important for understanding innovation and other sources of growth in the economy.

Measuring general-purpose technologies poses two main difficulties. The first is that not all of these technologies can be properly identified as belonging to a particular sector, because they are spread across different industry classifications. The second difficulty arises in identifying the use of these technologies in other sectors. Clearly, the extent of these difficulties varies from technology to technology. Information and communication technology is by far the best covered in statistics, in terms of both industry classification and identification of investments in other sectors.

A number of the data sources discussed in this chapter can be used to generate indicators of general-purpose technologies. For example, patents and trademarks can be used to measure the use of such technologies for knowledge creation in sectors other than those in which they were developed, and both patent and bibliometric data can be used to measure the linkages among general-purpose technology sectors. R&D data provide an indicator of knowledge generation in sectors that develop general-purpose technologies, as do broader measures of investment in these technologies. In addition, the BRDIS contains data on the percentage of R&D in energy applications, environmental protection applications, software, medical clinical trials, biotechnology, and nanotechnology. These data can potentially be used to investigate the extent of R&D in these technologies across sectors (thus giving a picture of how “general-purpose” these technologies are).
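As a simple illustration of how the BRDIS application-area items might be turned into a pervasiveness indicator, the sketch below uses invented sector-by-technology shares (not actual BRDIS tabulations) and computes the fraction of sectors reporting any R&D directed at each technology.

    import pandas as pd

    # Invented shares of each sector's R&D devoted to selected technologies (percent).
    rd_shares = pd.DataFrame(
        {"biotechnology":  [12.0, 0.5, 0.0, 3.0],
         "nanotechnology": [1.0, 4.0, 0.2, 0.0],
         "software":       [5.0, 30.0, 8.0, 2.0]},
        index=["pharmaceuticals", "information", "machinery", "chemicals"],
    )

    # A crude pervasiveness indicator: the fraction of sectors with any R&D in the
    # technology, reflecting the cross-sector reach that marks a general-purpose technology.
    pervasiveness = (rd_shares > 0).mean()
    print(pervasiveness.round(2))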

NCSES currently publishes a number of statistics on general-purpose technologies—particularly for information and communication technology, but increasingly also for green technologies. The panel encourages NCSES to continue this work and also to build on current indicators in this area. In particular, NCSES should examine possibilities for better coverage of the diffusion and uptake of general-purpose technologies in sectors other than those in which they were developed, using both BRDIS and other data sources.

RECOMMENDATION 5-4: The National Center for Science and Engineering Statistics (NCSES) should develop a suite of indicators that can be used to track the development and diffusion of general-purpose technologies, including information and communication technologies, biotechnology, nanotechnology, and green technologies. NCSES should attempt to make greater use of data from the Business Research and Development and Innovation Survey for this purpose while also exploring the use of other sources, such as patent and bibliometric data.

Subnational Issues in Measuring New Knowledge and Knowledge Networks

Compared with the measurement of innovation, the measurement of knowledge production is more clearly connected to geographic location. A number of initiatives by successive administrations have emphasized the ability to locate federal S&T research grants geographically, down to very detailed levels within neighborhoods. Of course, some of this detail is spurious: establishment data may link to a postal address while the actual economic activity is carried out over a territory of some size, depending on the industry. Moreover, much of the value derived from these targeted investments comes from the trade of goods and services, which is dispersed geographically.

Still, disaggregating to levels much finer than the states is certainly possible. For example, universities are well-behaved geographic phenomena in that they remain in one place, and their relation to various nested administrative hierarchies is straightforward. Their laboratories and research facilities are similar to those of other establishments; in fact, some of them resemble industrial facilities, with loading docks, employees, and so on. The movement of goods and people in the university establishment can be accounted for in the production of scientific work.

Some success appears to have been achieved in gathering data on basic science output tied to spatial units. Geographic identifiers appear in many contexts, including the author lists of publications and patent applications. With some care, and some level of error, these outputs can be linked to a location. But measuring the impacts of research investments, particularly with spatial disaggregation, entails real difficulties. Particularly challenging to measure are the geographic instantiation of a knowledge network and the flows of knowledge from place to place.
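A minimal sketch of the kind of linkage described here, assuming raw affiliation strings like those on publications and accepting a known error rate, might extract a state code and aggregate counts by state. The data and the parsing rule below are purely illustrative.

    import pandas as pd

    # Invented records with raw affiliation strings, as might be pulled from
    # publication author lists or patent filings.
    pubs = pd.DataFrame({
        "pub_id": [1, 2, 3, 4],
        "affiliation": ["Univ. of Michigan, Ann Arbor, MI",
                        "Stanford University, Stanford, CA",
                        "University of Michigan, Ann Arbor, MI",
                        "MIT, Cambridge, MA"],
    })

    # A crude rule: take the trailing two-letter token as the state. Real pipelines
    # need institution disambiguation and geocoding, each with its own error rate.
    pubs["state"] = pubs["affiliation"].str.extract(r",\s*([A-Z]{2})$")
    print(pubs.groupby("state")["pub_id"].count())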

As a reference for understanding the national system of R&D, it may be worthwhile to examine the results of a major study conducted in Canada in 2011 (Jenkins et al., 2011). A six-member expert panel carried out a full review of the country’s federal R&D support programs. While one important theme concerned Canada’s balance between tax credits and direct R&D support, the authors’ comprehensive study of the whole system of R&D support programs bears examination for application to the United States. The Canadian panel surveyed more than 60 institutes and programs engaged in supporting business innovation. The distribution was highly skewed, with a few relatively large entities and many small ones. Because each was created under a distinct charter, there is little coherence in the criteria used to evaluate effectiveness, a common problem worldwide. The tendency, as in other countries, is to concentrate on generating more investment in R&D rather than on providing mechanisms for industry to obtain the assistance needed to overcome current problems in operations. Certain gaps also became evident from this comprehensive analysis, leading the Canadian panel to offer recommendations for short-term measures to improve the effectiveness of the country’s innovation system. The Canadian panel notes that the responsibility for fostering innovation cuts across many functions of government and therefore requires a system-wide perspective and whole-of-government priority. That panel’s recommendations include

making encouragement of innovation in the Canadian economy a stated objective of federal procurement policies and programs, restructuring procurement procedures to allow more latitude for innovative solutions to emerge, and reorienting existing federal laboratories to serve sectoral needs. 32

The important message in the present context is that certain aspects of the innovation system emerge from a comprehensive view of the whole. Canada invested the efforts of a distinguished panel in such a process, with clear results for managing its system. That panel’s analysis also raised the question of how to compare existing R&D support programs. Although NCSES, as a statistical office, does not conduct evaluation, it should be in a position to provide information on government programs that would support the evaluation done by others.

In this chapter, the panel has offered four recommendations regarding the development of indicators of knowledge generation, knowledge networks, and knowledge flows. The focus is on techniques that should be used to develop indicators that users want for specific market sectors and that improve the international comparability of the data. The panel also suggests that the production of certain measures is not in NCSES’s purview, and these measures should instead be acquired from other agencies. In the near term, NCSES should give priority to using tools that are readily available at the agency and continuing existing collaborations with other agencies while developing new techniques and cultivating new linkages over time.

32 The report lists the following sectors (p. 3-13): Goods Industries (agriculture, forestry, fishing and hunting; manufacturing; construction; utilities; and oil and gas and mining); Services Industries (transportation and warehousing; information and cultural industries; wholesale trade; retail trade; finance and insurance, real estate and rental and leasing; professional, scientific, and technical services; and other services); and Unclassified Industries.


Since the 1950s, under congressional mandate, the U.S. National Science Foundation (NSF) - through its National Center for Science and Engineering Statistics (NCSES) and predecessor agencies - has produced regularly updated measures of research and development expenditures, employment and training in science and engineering, and other indicators of the state of U.S. science and technology. A more recent focus has been on measuring innovation in the corporate sector. NCSES collects its own data on science, technology, and innovation (STI) activities and also incorporates data from other agencies to produce indicators that are used for monitoring purposes - including comparisons among sectors, regions, and with other countries - and for identifying trends that may require policy attention and generate research needs. NCSES also provides extensive tabulations and microdata files for in-depth analysis.

Capturing Change in Science, Technology, and Innovation assesses and provides recommendations regarding the need for revised, refocused, and newly developed indicators of STI activities that would enable NCSES to respond to changing policy concerns. This report also identifies and assesses both existing and potential data resources and tools that NCSES could exploit to further develop its indicators program. Finally, the report considers strategic pathways for NCSES to move forward with an improved STI indicators program. The recommendations offered in Capturing Change in Science, Technology, and Innovation are intended to serve as the basis for a strategic program of work that will enhance NCSES's ability to produce indicators that capture change in science, technology, and innovation to inform policy and optimally meet the needs of its user community.


J Pharmacol Pharmacother. 2013 Apr-Jun; 4(2).

The critical steps for successful research: The research proposal and scientific writing (A report on the pre-conference workshop held in conjunction with the 64th Annual Conference of the Indian Pharmaceutical Congress-2012)

Pitchai Balakumar

Pharmacology Unit, Faculty of Pharmacy, AIMST University, Semeling, 08100 Bedong, Kedah Darul Aman, Malaysia

Mohammed Naseeruddin Inamdar

1 Department of Pharmacology, Al-Ameen College of Pharmacy, Bengaluru, Karnataka, India

Gowraganahalli Jagadeesh

2 Division of Cardiovascular and Renal Products, Center for Drug Evaluation and Research, US Food and Drug Administration, Silver Spring, USA

An interactive workshop on ‘The Critical Steps for Successful Research: The Research Proposal and Scientific Writing’ was conducted in conjunction with the 64th Annual Conference of the Indian Pharmaceutical Congress-2012 at Chennai, India. In essence, research is performed to enlighten our understanding of a contemporary issue relevant to the needs of society. To accomplish this, a researcher begins the search for a novel topic based on purpose, creativity, critical thinking, and logic. This leads to the fundamental pieces of the research endeavor: question, objective, hypothesis, experimental tools to test the hypothesis, methodology, and data analysis. When correctly performed, research should produce new knowledge. The four cornerstones of good research are a well-formulated protocol or proposal that is well executed, well analyzed, thoroughly discussed, and clearly concluded. This recent workshop educated researchers in the critical steps involved in developing a scientific idea through its successful execution and eventual publication.

INTRODUCTION

Creativity and critical thinking are of particular importance in scientific research. Basically, research is original investigation undertaken to gain knowledge and understand concepts in major subject areas of specialization, and includes the generation of ideas and information leading to new or substantially improved scientific insights with relevance to the needs of society. Hence, the primary objective of research is to produce new knowledge. Research is both theoretical and empirical. It is theoretical because the starting point of scientific research is the conceptualization of a research topic and development of a research question and hypothesis. Research is empirical (practical) because all of the planned studies involve a series of observations, measurements, and analyses of data that are all based on proper experimental design.[ 1 – 9 ]

The purpose of this report is to inform readers of the proceedings of a recent workshop organized in conjunction with the 64th Annual Conference of the ‘Indian Pharmaceutical Congress’ at SRM University, Chennai, India, from 5 to 6 December 2012. The objectives of the workshop, titled ‘The Critical Steps for Successful Research: The Research Proposal and Scientific Writing,’ were to assist participants in developing a strong fundamental understanding of how best to develop a research or study protocol and how to communicate research findings in a conference setting or scientific journal. Completing any research project requires meticulous planning, experimental design and execution, and compilation and publication of findings in the form of a research paper. All of these are often unfamiliar to naïve researchers; thus, the purpose of this workshop was to teach participants to master the critical steps involved in the development of an idea through its execution and eventual publication of the results (see the last section for a list of learning objectives).

THE STRUCTURE OF THE WORKSHOP

The two-day workshop was formatted to include key lectures and interactive breakout sessions that focused on protocol development in six subject areas of the pharmaceutical sciences. This was followed by sessions on scientific writing. DAY 1 taught the basic concepts of scientific research, including: (1) how to formulate a topic for research and to describe the what, why, and how of the protocol, (2) biomedical literature search and review, (3) study designs, statistical concepts, and result analyses, and (4) publication ethics. DAY 2 educated the attendees on the basic elements and logistics of writing a scientific paper and thesis, and the preparation of poster as well as oral presentations.

The final phase of the workshop was the ‘Panel Discussion,’ including ‘Feedback/Comments’ by participants. There were thirteen distinguished speakers from India and abroad. Approximately 120 post-graduate and pre-doctoral students, young faculty members, and scientists representing industries attended the workshop from different parts of the country. All participants received a printed copy of the workshop manual and supporting materials on statistical analyses of data.

THE BASIC CONCEPTS OF RESEARCH: THE KEY TO GETTING STARTED IN RESEARCH

A research project generally comprises four key components: (1) writing a protocol, (2) performing experiments, (3) tabulating and analyzing data, and (4) writing a thesis or manuscript for publication.

Fundamentals in the research process

A protocol, whether experimental or clinical, serves as a navigator that evolves from a basic outline of the study plan to become a qualified research or grant proposal. It provides the structural support for the research. Dr. G. Jagadeesh (US FDA), the first speaker of the session, spoke on ‘Fundamentals in research process and cornerstones of a research project.’ He discussed at length the developmental and structural processes in preparing a research protocol. A systematic, step-by-step approach is necessary in planning a study. Without a well-designed protocol, there would be little chance of successful completion of a research project or an experiment.

Research topic

The first and most difficult task in research is to identify a topic for investigation. The research topic is the keystone of the entire scientific enterprise. It begins the project, drives the entire study, and is crucial for moving the project forward. It dictates the remaining elements of the study [Table 1] and thus should not be too narrow, too broad, or unfocused. Because of these potential pitfalls, it is essential that a good or novel scientific idea be based on a sound concept. Creativity, critical thinking, and logic are required to generate new concepts and ideas in solving a research problem. Creativity involves critical thinking and is associated with generating many ideas. Critical thinking is analytical, judgmental, and involves evaluating choices before making a decision.[4] Thus, critical thinking is convergent thinking that narrows and refines divergent ideas and finally settles on one idea for an in-depth study. The idea on which a research project is built should be novel, achievable within the existing conditions, and useful to society at large. Therefore, creativity and critical thinking assist biomedical scientists in research that results in funding support, novel discovery, and publication.[1,4]

Elements of a study protocol

[Table 1 appears as an image in the original article and is not reproduced here.]

Research question

The next most crucial aspect of a study protocol is identifying a research question. It should be a thought-provoking question. The question sets the framework. It emerges from the title, findings/results, and problems observed in previous studies. Thus, mastering the literature, attendance at conferences, and discussion in journal clubs/seminars are sources for developing research questions. Consider the following example in developing related research questions from the research topic.

Hepatoprotective activity of Terminalia arjuna and Apium graveolens on paracetamol-induced liver damage in albino rats.

How is paracetamol metabolized in the body? Does it involve P450 enzymes? How does paracetamol cause liver injury? What are the mechanisms by which drugs can alleviate liver damage? What biochemical parameters are indicative of liver injury? What major endogenous inflammatory molecules are involved in paracetamol-induced liver damage?

A research question is broken down into more precise objectives. The objectives lead to more precise methods and definition of key terms. The objectives should be SMART (Specific, Measurable, Achievable, Realistic, and Time-framed)[10] and should cover the entire breadth of the project. The objectives are sometimes organized into hierarchies: primary, secondary, and exploratory; or simply general and specific. Study the following example:

To evaluate the safety and tolerability of single oral doses of compound X in normal volunteers.

To assess the pharmacokinetic profile of compound X following single oral doses.

To evaluate the incidence of peripheral edema reported as an adverse event.

The objectives and research questions are then formulated into a workable or testable hypothesis. The latter forces us to think carefully about what comparisons will be needed to answer the research question, and establishes the format for applying statistical tests to interpret the results. The hypothesis should link a process to an existing or postulated biologic pathway. A hypothesis is written in a form that can yield measurable results. Studies that utilize statistics to compare groups of data should have a hypothesis. Consider the following example:

  • The hepatoprotective activity of Terminalia arjuna is superior to that of Apium graveolens against paracetamol-induced liver damage in albino rats.

Most biological research, including discovery science, is hypothesis-driven. However, not all studies need be conducted with a hypothesis. For example, descriptive studies (e.g., describing the characteristics of a plant or a chemical compound) do not need a hypothesis.[1]
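To show how such a hypothesis translates into a statistical comparison, the sketch below uses invented serum ALT values (a common liver-injury marker) for the two treatment groups; it is an illustration only and does not come from the workshop or from any actual study.

    from scipy import stats

    # Invented post-challenge serum ALT values (U/L); lower values suggest better protection.
    alt_terminalia = [62, 58, 71, 65, 60, 57, 69, 63]
    alt_apium      = [78, 85, 74, 90, 81, 77, 88, 83]

    # Two-sample t-test of the null hypothesis that the group means are equal.
    t_stat, p_value = stats.ttest_ind(alt_terminalia, alt_apium)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    # A small p-value together with lower mean ALT in the Terminalia group would be
    # consistent with the hypothesis of superior hepatoprotective activity.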

Relevance of the study

Another important section to be included in the protocol is ‘significance of the study.’ Its purpose is to justify the need for the research that is being proposed (e.g., development of a vaccine for a disease). In summary, the proposed study should demonstrate that it represents an advancement in understanding and that the eventual results will be meaningful, contribute to the field, and possibly even impact society.

Biomedical literature

A literature search may be defined as the process of examining published sources of information on a research or review topic, thesis, grant application, chemical, drug, disease, or clinical trial, etc. The quantity of information available in print or electronically (e.g., the internet) is immense and growing with time. A researcher should be familiar with the right kinds of databases and search engines to extract the needed information.[ 3 , 6 ]

Dr. P. Balakumar (Institute of Pharmacy, Rajendra Institute of Technology and Sciences, Sirsa, Haryana; currently, Faculty of Pharmacy, AIMST University, Malaysia) spoke on ‘Biomedical literature: Searching, reviewing and referencing.’ He schematically explained the basis of scientific literature, designing a literature review, and searching the literature. After an introduction to the genesis and diverse sources of scientific literature searches, the use of PubMed, one of the premier databases used for biomedical literature searches worldwide, was illustrated with examples and screenshots. Several companion databases and search engines are also used for finding information related to the health sciences, including Embase, Web of Science, SciFinder, The Cochrane Library, International Pharmaceutical Abstracts, Scopus, and Google Scholar.[3] Literature searches using alternative interfaces for PubMed such as GoPubMed, Quertle, PubFocus, Pubget, and BibliMed were discussed. The participants were additionally informed of databases on chemistry, drugs and drug targets, clinical trials, toxicology, and laboratory animals (reviewed in ref. [3]).
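The workshop demonstrated PubMed through its web interface; as a small illustration of the same kind of search done programmatically, the sketch below uses Biopython's Entrez utilities with a placeholder e-mail address and an example query.

    from Bio import Entrez

    Entrez.email = "your.name@example.org"  # NCBI requests a contact address

    # Search PubMed for an example topic from the workshop and list the top hits.
    handle = Entrez.esearch(db="pubmed",
                            term="paracetamol hepatotoxicity hepatoprotective",
                            retmax=10)
    record = Entrez.read(handle)
    handle.close()
    print(record["Count"], record["IdList"])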

Referencing and bibliography are essential in scientific writing and publication.[7] Referencing systems are broadly classified into two major types: parenthetical and notation systems. Parenthetical referencing is also known as the Harvard style, while the Vancouver style and ‘footnote’ or ‘endnote’ systems fall under notation referencing. The participants were educated on each referencing system with examples.

Bibliography management

Dr. Raj Rajasekaran (University of California at San Diego, CA, USA) enlightened the audience on ‘bibliography management’ using reference management software programs such as Reference Manager®, EndNote®, and Zotero® for creating and formatting bibliographies while writing a manuscript for publication. The discussion focused on the use of bibliography management software in avoiding common mistakes such as incomplete references. Important steps in bibliography management, such as creating reference libraries/databases, searching for references using PubMed/Google Scholar, selecting and transferring selected references into a library, inserting citations into a research article, and formatting bibliographies, were presented. A demonstration of Zotero®, a freely available reference management program, included the salient features of the software, adding references from PubMed using the PubMed ID, inserting citations, and formatting using different styles.

Writing experimental protocols

The workshop systematically instructed the participants in writing ‘experimental protocols’ in six disciplines of the pharmaceutical sciences: (1) Pharmaceutical Chemistry (presented by Dr. P. V. Bharatam, NIPER, Mohali, Punjab); (2) Pharmacology (presented by Dr. G. Jagadeesh and Dr. P. Balakumar); (3) Pharmaceutics (presented by Dr. Jayant Khandare, Piramal Life Sciences, Mumbai); (4) Pharmacy Practice (presented by Dr. Shobha Hiremath, Al-Ameen College of Pharmacy, Bengaluru); (5) Pharmacognosy and Phytochemistry (presented by Dr. Salma Khanam, Al-Ameen College of Pharmacy, Bengaluru); and (6) Pharmaceutical Analysis (presented by Dr. Saranjit Singh, NIPER, Mohali, Punjab). The purpose of the research plan is to describe the what (Specific Aims/Objectives), why (Background and Significance), and how (Design and Methods) of the proposal.

The research plan should answer the following questions: (a) what do you intend to do; (b) what has already been done in general, and what have other researchers done in the field; (c) why is this worth doing; (d) how is it innovative; (e) what will this new work add to existing knowledge; and (f) how will the research be accomplished?

In general, the format used by the faculty in all subjects is shown in Table 2.

Elements of a research protocol

[Table 2 appears as an image in the original article and is not reproduced here.]

Biostatistics

Biostatistics is a key component of biomedical research. Highly reputed journals like The Lancet, BMJ, and the Journal of the American Medical Association, as well as many other biomedical journals, include biostatisticians on their editorial boards or reviewer lists. This indicates the great importance given to learning and correctly employing appropriate statistical methods in biomedical research. The post-lunch session on day 1 of the workshop was largely committed to a discussion of ‘Basic biostatistics.’ Dr. R. Raveendran (JIPMER, Puducherry) and Dr. Avijit Hazra (PGIMER, Kolkata) reviewed, in parallel sessions, descriptive statistics, probability concepts, sample size calculation, choosing a statistical test, confidence intervals, hypothesis testing and ‘P’ values, and parametric and non-parametric statistical tests, including analysis of variance (ANOVA), t-tests, the Chi-square test, type I and type II errors, correlation and regression, and summary statistics. This was followed by a practice and demonstration session. A statistics CD, compiled by Dr. Raveendran, was distributed to the participants before the session began and was demonstrated live. Both speakers worked on a variety of problems that involved both clinical and experimental data. They discussed through examples the experimental designs encountered in a variety of studies and the statistical analyses performed for different types of data. For the benefit of readers, we have summarized the statistical tests applied frequently for different experimental designs and the post-hoc tests [Figure 1].

[Figure 1 appears as an image in the original article; its legend follows.]

Conceptual framework for statistical analyses of data. Of the two kinds of variables, qualitative (categorical) and quantitative (numerical), qualitative variables (nominal or ordinal) are not normally distributed. Numerical data that come from normal distributions are analyzed using parametric tests; if not, the data are analyzed using non-parametric tests. The most widely used test, Student's t-test, compares the means of two populations; the data for this test can be paired or unpaired. One-way analysis of variance (ANOVA) is used to compare the means of three or more independent populations that are normally distributed. Applying the t-test repeatedly in pairwise fashion (multiple comparisons) to compare the means of more than two populations increases the probability of a type I error (false positive); in this case, for proper interpretation, the P values need to be adjusted. Repeated measures ANOVA is used to compare population means when more than two observations come from the same subject over time. The null hypothesis is rejected at a ‘P’ value of less than 0.05, and the difference in population means is then considered statistically significant. Subsequently, appropriate post-hoc tests are used for pairwise comparisons of population means. Two-way or three-way ANOVA is considered if two (diet, dose) or three (diet, dose, strain) independent factors, respectively, are analyzed in an experiment (not described in the Figure). Categorical nominal unmatched variables (counts or frequencies) are analyzed by the Chi-square test (not shown in the Figure)
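For readers who want to trace the workflow in this legend, a minimal sketch (with invented data) runs a one-way ANOVA across three independent groups and, when it is significant, follows with Tukey's post-hoc pairwise comparisons instead of repeated t-tests.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Invented measurements for three independent, approximately normal groups.
    control   = [5.1, 4.8, 5.5, 5.0, 4.9, 5.2]
    low_dose  = [4.2, 4.5, 4.0, 4.4, 4.1, 4.3]
    high_dose = [3.1, 3.4, 2.9, 3.3, 3.0, 3.2]

    # One-way ANOVA: do the three population means differ?
    f_stat, p_value = stats.f_oneway(control, low_dose, high_dose)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    # If p < 0.05, Tukey's HSD handles pairwise comparisons without inflating
    # the type I error rate the way repeated t-tests would.
    values = np.concatenate([control, low_dose, high_dose])
    groups = ["control"] * 6 + ["low dose"] * 6 + ["high dose"] * 6
    print(pairwise_tukeyhsd(values, groups, alpha=0.05))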

Research and publication ethics

The legitimate pursuit of scientific creativity is unfortunately being marred by a simultaneous increase in scientific misconduct. A disproportionate share of such allegations involves scientists from many countries, some even from respected laboratories. Misconduct destroys faith in science and scientists and creates a hierarchy of fraudsters. Investigating misconduct also consumes valuable time and resources. In spite of these facts, most researchers are not aware of publication ethics.

Day 1 of the workshop ended with a presentation on ‘research and publication ethics’ by Dr. M. K. Unnikrishnan (College of Pharmaceutical Sciences, Manipal University, Manipal). He spoke on the essentials of publication ethics, including plagiarism (attempting to take credit for the work of others), self-plagiarism (multiple publications by an author on the same body of work with slightly different wording), falsification (manipulation of research data and processes or omission of critical data or results), gift (guest) authorship, ghostwriting (a major contribution made by someone other than the named author(s)), salami publishing (publishing many papers, with minor differences, from the same study), and sabotage (disrupting the research work of others to prevent its completion). Additionally, Dr. Unnikrishnan pointed out the ‘Ingelfinger rule,’ which stipulates that a scientist must not submit the same original research to two different journals. He also advised the audience that authorship is not just credit for the work but also responsibility for the scientific content of a paper. Although some Indian universities are instituting preventive measures (e.g., use of plagiarism-detecting software and Shodhganga digital archiving of doctoral theses), Dr. Unnikrishnan argued that there is a great need to sensitize young researchers to the nature and implications of scientific misconduct. Finally, he discussed how editors and peer reviewers should conduct themselves ethically while managing a manuscript for publication.

SCIENTIFIC COMMUNICATION: THE KEY TO SUCCESSFUL SELLING OF FINDINGS

Research outcomes are measured through quality publications. Scientists must not only ‘do’ science but must also ‘write’ science. The story of the project must be told in clear, simple language, weaving in previous work done in the field, answering the research question, and addressing the hypothesis set forth at the beginning of the study. Scientific publication is an organic process of planning, researching, drafting, revising, and updating the current knowledge for future perspectives. Writing a research paper is no easier than the research itself. The lectures of Day 2 of the workshop dealt with the basic elements and logistics of writing a scientific paper.

An overview of paper structure and thesis writing

Dr. Amitabh Prakash (Adis, Auckland, New Zealand) spoke on ‘Learning how to write a good scientific paper.’ His presentation described the essential components of an original research paper and thesis (e.g., introduction, methods, results, and discussion [IMRaD]) and provided guidance on the correct order in which data should appear within these sections. The characteristics of a good abstract and title and the creation of appropriate key words were discussed. Dr. Prakash suggested that the title of a paper is often its first chance to make a good impression and that a title may be either indicative (giving the purpose of the study) or declarative (giving the study conclusion). He also suggested that an abstract is a succinct summary of a research paper; it should be specific, clear, and concise, should follow the IMRaD structure in brief, and should be followed by key words. The selection of appropriate papers to be cited in the reference list was also discussed. Various unethical authorship practices were enumerated, and the International Committee of Medical Journal Editors (ICMJE) criteria for authorship were explained (http://www.icmje.org/ethical_1author.html; also see Table 1 in reference #9). The session highlighted the need for transparency in medical publication and provided a clear description of items that need to be included in the ‘Disclosures’ section (e.g., sources of funding for the study and potential conflicts of interest of all authors) and the ‘Acknowledgements’ section (e.g., writing assistance and input from all individuals who did not meet the authorship criteria). The final part of the presentation was devoted to thesis writing, and Dr. Prakash provided the audience with a list of common mistakes that are frequently encountered when writing a manuscript.

The backbone of a study is the description of results through text, tables, and figures. Dr. S. B. Deshpande (Institute of Medical Sciences, Banaras Hindu University, Varanasi, India) spoke on ‘Effective Presentation of Results.’ The Results section deals with the observations made by the authors and thus is not hypothetical. This section is subdivided into three segments: a descriptive account in the text, numerical data presented in tables, and observations visualized in graphs or figures. All of these are arranged in a sequential order to address the question hypothesized in the Introduction. The description in the text provides clear content of the findings, highlighting the observations. It should not repeat the facts presented in tables or graphs. Tables are used to summarize or emphasize descriptive content in the text or to present numerical data that are otherwise unrelated. Illustrations should be used when the evidence bearing on the conclusions of a paper cannot be adequately presented in a written description or in a table. Tables and figures should relate to each other logically in sequence and should be clear by themselves. Furthermore, the Discussion is based entirely on these observations. Additionally, how the results can be applied to further research in the field to advance our understanding of the research questions was discussed.

Dr. Peush Sahni (All-India Institute of Medical Sciences, New Delhi) spoke on effectively ‘structuring the Discussion’ of a research paper. The Discussion section offers a systematic interpretation of the study results within the available knowledge. He said the section should begin with the most important point relating to the subject studied, focus on key issues, provide linking sentences between paragraphs, and maintain the flow of the text. Authors were advised to avoid recounting history, not to repeat all the results, and to state the limitations of the study. The strengths and novel findings of the study should be presented in the Discussion, which should also open avenues for future research and raise new questions. The Discussion section should end with a conclusion summarizing the key findings. Dr. Sahni gave an example of a Discussion from a published paper. In another presentation, titled ‘Writing an effective title and the abstract,’ Dr. Sahni described the important attributes of a good title: it should be simple, concise, informative, interesting and eye-catching, accurate and specific about the paper's content, and should state the subject in full, indicating the study design and animal species. Dr. Sahni also explained structured (IMRaD) and unstructured abstracts and discussed a few selected examples with the audience.

Language and style in publication

The next lecture, by Dr. Amitabh Prakash, on ‘Language and style in scientific writing: Importance of terseness, shortness and clarity in writing,’ focused on actual sentence construction, language, grammar, and punctuation in scientific manuscripts. His presentation emphasized the importance of brevity and clarity when writing manuscripts describing biomedical research. Starting with a guide to the appropriate construction of sentences and paragraphs, attendees were given a brief overview of the correct use of punctuation with interactive examples. Dr. Prakash discussed common errors in grammar and proactively sought audience participation in correcting some examples. Additional discussion centered on discouraging the use of redundant and expendable words, jargon, and adjectives with incomparable words. The session ended with a discussion of words and phrases that are commonly misused in biomedical research manuscripts (e.g., data vs. datum, affect vs. effect, among vs. between, dose vs. dosage, and efficacy/efficacious vs. effective/effectiveness).

Working with journals

The appropriateness of the journal selected for submission, and hence the likelihood of the manuscript being accepted, depends largely on the experience of the author. The corresponding author must have a rationale for choosing a journal, based on the scope of the study and the quality of the work performed. Dr. Amitabh Prakash spoke on ‘Working with journals: Selecting a journal, cover letter, peer review process and impact factor,’ instructing the audience on assessing the true value of a journal, understanding the principles involved in the peer review process, making an initial approach to the editorial office, and drafting an appropriate cover letter to accompany the submission. His presentation defined the metrics most commonly used to measure journal quality (e.g., impact factor™, Eigenfactor™ score, Article Influence™ score, SCOPUS 2-year citation data, SCImago Journal Rank, and h-index) and guided attendees on the relative advantages and disadvantages of each metric. Factors to consider when assessing journal quality were discussed, and the audience was introduced to the ‘green’ and ‘gold’ open access publication models. Various peer review models (e.g., double-blind, single-blind, non-blind) were described, together with the role of the journal editor in assessing manuscripts and selecting suitable reviewers. A typical checklist sent to referees was shared with the attendees, and clear guidance was provided on the best way to address referee feedback. The session concluded with a discussion of the potential drawbacks of the current peer review system.
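
To make the arithmetic behind two of these metrics concrete, the short sketch below is an illustrative aside rather than part of the workshop material: it computes a two-year impact factor and an h-index in Python from invented citation counts.

    # Illustrative only: computes two of the metrics named above from
    # invented citation counts; not part of the workshop material.

    def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
        # Two-year impact factor: citations received in a given year to items
        # published in the previous two years, divided by the number of
        # citable items published in those two years.
        return citations_to_prev_two_years / citable_items_prev_two_years

    def h_index(citation_counts):
        # h-index: the largest h such that h papers each have at least h citations.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical journal: 150 citations in one year to the 60 citable items
    # it published in the previous two years gives an impact factor of 2.5.
    print(impact_factor(150, 60))        # 2.5
    # Hypothetical author whose papers are cited 10, 8, 5, 4, 3 and 0 times: h = 4.
    print(h_index([10, 8, 5, 4, 3, 0]))  # 4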

Poster and oral presentations at conferences

Posters have become an increasingly popular mode of presentation at conferences, as they can accommodate more papers per meeting, have no time constraint, provide better presenter-audience interaction, and allow attendees to select and attend the papers of interest. Figure 2 provides instructions on the design and layout of a scientific poster. In the final presentation, Dr. Sahni provided the audience with step-by-step instructions on how to write and format posters in terms of layout, content, font size, color, and graphics. Attendees were given specific guidance on the format of text on slides, the use of color, font type and size, and the use of illustrations and multimedia effects. Practical tips for delivering an oral or poster presentation were also provided: speak slowly and clearly, be informative, maintain eye contact, and listen carefully to questions from the judges or audience before answering.

Figure 2 (image file: JPP-4-130-g004.jpg)

Guidelines for the design of a scientific poster presentation. The objective of a scientific poster is to present laboratory work at scientific meetings. A poster is an excellent means of communicating scientific work because it is a graphic representation of data. Posters should have focal points, and the intended message should be clearly conveyed through simple sections: Text, Tables, and Graphs. Posters should be clear, succinct, striking, and eye-catching. Colors should be used only where necessary. Use one font (Arial or Times New Roman) throughout; fancy fonts should be avoided. All headings should be in bold capital letters at a font size of 44; the title may be somewhat larger. Subheadings should be in bold capitals at a font size of 36. References and Acknowledgments, if any, should be set at a font size of 24. Body text should be set between 24 and 30 points so that it is legible from a distance of 3 to 6 feet. Do not use lengthy notes.

PANEL DISCUSSION: FEEDBACK AND COMMENTS BY PARTICIPANTS

After all the presentations, Dr. Jagadeesh opened a panel discussion that included all the speakers. The discussion addressed what is currently done, and what could be done in the future, with respect to ‘developing a research question and then writing an effective thesis proposal/protocol followed by publication.’ Dr. Jagadeesh put the following questions to the panelists while also taking questions and suggestions from the participants and panelists.

  • Does a Post-Graduate or Ph.D. student receive adequate training, either through an institutional course, a workshop of the present nature, or from the guide?
  • Are these Post-Graduates self-taught (like most of us who learnt the hard way)?
  • How are these guides trained? How do we train them to become more efficient mentors?
  • Does a Post-Graduate or Ph.D. student struggle to find a method(s) to carry out studies? To what extent do seniors/guides help a Post-Graduate overcome technical difficulties? How difficult is it for a student to find chemicals, reagents, instruments, and technical help in conducting studies?
  • Analyses of data and interpretation: Most students struggle without adequate guidance.
  • Theses and publications frequently feature inadequate/incorrect statistical analyses and representation of data in tables/graphs. The student, their guide, and the reviewers all share equal responsibility.
  • Who initiates and drafts the research paper? The Post-Graduate or their guide?
  • What kind of assistance does a Post-Graduate get from the guide in finalizing a paper for publication?
  • Does the guide insist that each Post-Graduate thesis yield at least one paper, and each Ph.D. thesis more than two papers, plus a review article?

The panelists and audience expressed a variety of views, but were unable to arrive at a decisive conclusion.

WHAT HAVE THE PARTICIPANTS LEARNED?

At the end of this fast-moving two-day workshop, the participants had the opportunity to learn about the following topics:

  • Sequential steps in developing a study protocol, from choosing a research topic to developing research questions and a hypothesis.
  • Study protocols on different topics in their subject of specialization
  • Searching and reviewing the literature
  • Appropriate statistical analyses in biomedical research
  • Scientific ethics in publication
  • Writing and understanding the components of a research paper (IMRaD)
  • Recognizing the value of a good title, running title, abstract, key words, etc.
  • Importance of Tables and Figures in the Results section and their role in describing findings
  • Evidence-based Discussion in a research paper
  • Language and style in writing a paper and expert tips on getting it published
  • Presentation of research findings at a conference (oral and poster).

Overall, the workshop was deemed very helpful to the participants, who rated its quality from “satisfied” to “very satisfied.” A significant number of participants felt that the time allotted for each presentation was too short and that the workshop should be extended from the present two days to four, with adequate time to ask questions. They also suggested introducing a ‘hands-on’ session on writing a proposal and manuscript. A large number of attendees expressed their desire to attend a similar workshop, if conducted, in the near future.

ACKNOWLEDGMENT

We express our gratitude to the Organizing Committee, especially Professors K. Chinnasamy, B. G. Shivananda, N. Udupa, Jerad Suresh, Padma Parekh, A. P. Basavarajappa, Mr. S. V. Veerramani, Mr. J. Jayaseelan, and all the volunteers of SRM University. We thank Dr. Thomas Papoian (US FDA) for helpful comments on the manuscript.

The opinions expressed herein are those of Gowraganahalli Jagadeesh and do not necessarily reflect those of the US Food and Drug Administration.

Source of Support: Nil

Conflict of Interest: None declared.

What is Research

Research is defined as: “A systematic investigation (i.e., the gathering and analysis of information) designed to develop or contribute to knowledge.” The National Academy of Sciences states that the object of research is to “extend human knowledge of the physical, biological, or social world beyond what is already known.” Research is different from other forms of discovering knowledge (like reading a book) because it uses a systematic process called the Scientific Method.

The Scientific Method consists of observing the world around you and creating a hypothesis about relationships in the world. A hypothesis is an informed and educated prediction or explanation about something. Part of the research process involves testing the hypothesis and then examining the results of these tests as they relate to both the hypothesis and the world around you. When a researcher forms a hypothesis, it acts like a map through the research study: it tells the researcher which factors are important to study and how they might be related to each other or caused by something that the researcher introduces (e.g., a program, treatment, or change in the environment). With this map, the researcher can interpret the information he/she collects and can make sound conclusions about the results.

Research can be done with human beings, animals, plants, other organisms and inorganic matter. When research is done with human beings and animals, it must follow specific rules about the treatment of humans and animals that have been created by the U.S. Federal Government. This ensures that humans and animals are treated with dignity and respect, and that the research causes minimal harm.

No matter what topic is being studied, the value of the research depends on how well it is designed and done. Therefore, one of the most important considerations in doing good research is to follow the design or plan developed by an experienced researcher, who is called the Principal Investigator (PI). The PI is in charge of all aspects of the research and creates what is called a protocol (the research plan) that everyone doing the research must follow. By doing so, the PI and the public can be sure that the results of the research are real and useful to other scientists.


Knowledge creation in projects: an interactive research approach for deeper business insight

International Journal of Managing Projects in Business

ISSN : 1753-8378

Article publication date: 3 August 2022

Issue publication date: 1 March 2023

Purpose

The purpose of this paper is to shed light on different types of knowledge created and how this links to the project design, process, and content.

Design/methodology/approach

In this paper the authors investigate participants' experiences from a three-year interactive research project designed to trigger reflection among the participants. They apply a knowledge creation perspective to the experiences expressed by participants as a result of different research project activities.

Findings

The study resulted in five categories of insights with potential for sustainable influence on the participating organizations: an understanding of concepts and theories; an understanding of the impacts of collaborative, reflective work processes; an understanding of the meaning of one's own organizational context; an understanding of the importance of increased organizational self-awareness; and an understanding of the potential for human interaction and communication.

Practical implications

The authors' findings suggest that it is possible to design a project to promote more profound and sustainable effects on a business beyond the explicit purpose of the project. They advise practitioners to make room for iterative reflection; be mindful to create a trustful and open environment in the team; challenge results with opposing views and theories; and make room for sharing experiences and giving feedback.

Originality/value

This study contributes to unraveling key practices which can nurture conditions for knowledge creation in interactive research projects and business projects alike.

Keywords

  • Practice-based research
  • Collaborative research
  • Knowledge creation
  • Qualitative research
  • Project management

Engström, A. , Johansson, A. , Edh Mirzaei, N. , Sollander, K. and Barry, D. (2023), "Knowledge creation in projects: an interactive research approach for deeper business insight", International Journal of Managing Projects in Business , Vol. 16 No. 1, pp. 22-44. https://doi.org/10.1108/IJMPB-09-2021-0233

Emerald Publishing Limited

Copyright © 2022, Annika Engström, Anette Johansson, Nina Edh Mirzaei, Kristina Sollander and Daved Barry

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode .

1. Introduction

The project form dominates work in large parts of our society, and the term “projectification” is used to explain developments toward the creation of a project society ( Lundin et al. , 2017 ). Projects are seen as efficient ways of organizing people with different areas of expertise to work on a joint task toward common goals, often in contexts that demand collaboration between different competencies, functions, and departments ( Canonico et al. , 2013 ). This is typically the type of context involved in complex product and process development, which has contributed to the view of project-based organizations as key sites for knowledge creation and innovation ( Davies and Hobday, 2005 ). Projects are also often seen as learning spaces ( Nilsen, 2013 ) and used as learning mechanisms ( Scarborough et al. , 2004 ). Understanding how knowledge in projects is created, communicated, and shared in organizations is critical to both project management research and practice, given the strong influence of projects on society.

One arena where knowledge creation is at the very core is academic research projects. Plenty of evidence shows that projects within academic research using interactive, action-oriented, collaborative research forms—thus sharing similarities with typical business context projects—have positive impacts on learning ( Svejvig et al. , 2021 ). Nevertheless, despite the potential this research approach has for addressing complex organizational problems ( Avison et al. , 2018 ) by combining theoretical rigor and practical insights ( Geraldi and Söderlund, 2018 ), it has received little attention in the project research community ( Svejvig et al. , 2021 ).

The co-production of knowledge in research projects ( Lindhult and Axelsson, 2021 ) is a strong tradition in Scandinavian management research ( Gunnarsson et al. , 2015 ), as well as an important part of the sustainable Swedish model for innovation, renewal, and effectiveness in the industry ( Magnusson and Ottosson, 2012 ). In the family of co-productive research approaches (CARs) ( Lindhult and Axelsson, 2021 ), interactive research is developed from action research traditions ( Aagard-Nielsen and Svensson, 2006 ). Actions and changes in behaviors and mindsets are the focus, and shared experiences in joint learning through different phases in the research process are central ( Svensson et al. , 2002 , 2007 ). Interactive research differs from action research in that researchers (the academic system) and practitioners (the practice system) have an equal relationship with and impact on the knowledge created. At the same time, the roles and responsibilities connected with knowledge creation in the respective systems are clearly defined – researchers are responsible for knowledge creation in the academic system, whereas practitioners are responsible for knowledge creation in the practice system ( Aagard Nielsen and Svensson, 2006 ; Cronholm and Goldkuhl, 2003 ).

Interactive research approaches have recently been evaluated and described as powerful in terms of validity for meeting organizational and societal needs and for reaching rigorous research results ( Ellström et al. , 2020 ; Wallo, 2008 ; Wallo et al. , 2012 ; Andersson et al. , 2022 ). However, despite the seemingly common agreement among researchers that interactive research has an impact on learning, research in this context still lacks empirical descriptions and examples of cases and research project designs of this kind ( Lindhult and Axelsson, 2021 ). Interactive research is often designed as projects that include analytic seminars between scholars and practitioners ( Ellström et al. , 2020 ), feedback dialogue meetings with companies, and workshops that include researchers and practitioners ( Svensson et al. , 2002 , 2007 ). There is little understanding of how different types of activities, such as reflective conversations, meetings, and workshops, lead to various kinds of knowledge creation ( Lindhult and Axelsson, 2021 ). Additionally, the increasing demand for academia to collaborate with and contribute to society and the attention to the impact of publications rather than their quantity make robust models important for achieving sustainable effects in interactive, collaborative research projects ( Lindhult and Axelsson, 2021 ; Svejvig et al., 2021 ).

In this paper, we investigate the experiences from a three-year interactive research project with small and medium-sized manufacturing companies in Sweden focusing on innovation capabilities in terms of organizational ambidexterity ( Junni et al. , 2013; Lubatkin et al. , 2006; O’Reilly and Tushman, 2008; Zimmermann et al. , 2015 )—the ability to simultaneously exploit existing and explore new knowledge ( March, 1991 ). The purpose is to shed light on the different types of knowledge created during the project and how that links to the project design, process, and content. We apply a knowledge creation perspective ( Ellström, 2001 , 2010b , 2011 ) to the experiences that the participants expressed as a result of different activities in the project. In doing so, we respond to the challenges raised relating to understanding how knowledge in interactive projects is created and how it is linked to specific activities. Therefore, we contribute to unraveling key practices which can nurture conditions for knowledge creation in both interactive research projects and business projects.

2. The theoretical framework

The theoretical focus of this paper is knowledge creation processes in interactive projects in general, and in interactive research projects particularly. We refer to knowledge creation as an action-oriented learning process and highlight reflection as an influential mechanism.

2.1 Knowledge creation processes in interactive projects

Projects are often viewed as learning spaces ( Nilsen, 2013 ) and used as learning mechanisms ( Scarborough et al. , 2004 ), and there is ample evidence that interaction and collaboration are key ingredients in knowledge creation in projects. For example, a study of quality improvement projects by Choo et al. (2007) defined learning behavior as interaction between members and found that adhering to a specific method (in that case, the problem-solving steps of the Six Sigma framework) influenced this interaction and subsequently created knowledge. Other examples include the study by Faccin and Balestrin (2018) , who identified collaborative practices in R&D projects as key to ensuring the complementary exploration and exploitation approaches necessary for both innovation and knowledge creation, and Weck's (2006) study of interfirm R&D projects, which concluded that the exchange of complementary specialist knowledge was a key success factor in interfirm knowledge creation.

To combine equal relationships and critical thinking, to balance closeness with critical distance, to be proactive without being controlling (the process is owned by the participants), to start from the specific and local but to strive for general explanations, to have knowledge without being an authority, to be able to adapt and improvise while preserving integrity and independence, to be able to combine theory and practice, to be able to act as trailblazers, to think strategically but at the same time respecting ethical considerations, which requires practical wisdom—phronesis—to be part of the development process without being held ransom by it, and to have good knowledge of [the researcher’s] own discipline but at the same time aim for an interdisciplinary understanding. ( Johannisson et al. , 2008 , p. 371, author’s translation)

Face-to-face meetings are common in knowledge creation processes in collaborative research projects ( Palm, 2007 ). Workshops of different kinds have become common ways of carrying out project meetings, even though they have various names, such as dialogue conferences ( Gustavsen and Engelstad, 1986 ), interpretive forums ( Mohrman et al. , 2001 ), cooperative inquiries ( Heron and Reason, 2006 ), seminars ( Svensson et al. , 2007 ), meetings or group meetings ( Larsson, 2006 ), jam sessions ( Börjesson and Fredberg, 2004 ), and feedback sessions ( Ellström, 2007 ). Furthermore, these workshops can play different roles in a project. Their goals can be trust building and networking ( van de Ven, 2007 ), knowledge development ( Adler et al. , 2004 ), knowledge creation ( Jacob et al. , 2000 ), knowledge sharing ( Näslund et al. , 2010 ), joint learning ( Larsson, 2006 ), data analysis or interpretation ( Mohrman et al. , 2001 ; Ellström, 2007 ), or testing and validation of results ( Ellram and Tate, 2015 ).

Knowledge creation in interactive research projects depends on a democratic dialogue characterized by reflection and critical joint analysis, in which equally recognized knowledge interests in research and practice have the potential to complement and support each other, yielding more sustainable results ( Aagard Nielsen and Svensson, 2006 ). The role of the participants needs to be negotiated and renegotiated in a process characterized by critical reflection, as position shifts in relation to the phenomenon being studied may be needed ( Sandberg and Wallo, 2013 ). As highlighted in Figure 1 , in an interactive approach the researchers are responsible for academic knowledge development in the research system, whereas more context-specific knowledge is developed in the practice system. Both are equally important for different purposes.

The overlap between the systems, indicated in the center of Figure 1 , illustrates the joint activities in co-production, whereas the arrowed loops show that the roles and the desirable output are different for the two systems ( Svensson et al. , 2015 ).

2.2 Action-oriented perspectives on knowledge creation

There is no consensus on what knowledge creation really is ( Runsten and Werr, 2016 ). Rather, there are many different definitions of knowledge in relation to different philosophical points of view depending on how one sees science ( Chalmers, 2013 ). Knowledge in the cognitive, rational perspective ( Winn and Snyder, 2004 ) is defined as objective information, facts, or methods, which are separated from both situations and actions ( Runsten and Werr, 2016 ). This perspective on knowledge is common in technically oriented action projects focusing on the development of products, processes, or artifacts in co-creation with partners from industry ( Hevner et al. , 2004 ; Susman and Evered, 1978 ; Wieringa and Morali, 2012 ). Knowledge in the situational, contextual perspective ( Lave and Wenger, 1991 ) is regarded as an activity in a social system ( Engeström, 1987 ), in which the individual's ability and influence on the learning process is limited. This perspective on knowledge is common in social science-oriented projects, which focus on the critical analysis of conditions and agents in social systems ( Engeström, 2008 ; Gustavsson, 2007 ).

An alternative to the above perspectives on knowledge is the action-oriented perspective , in which cognition and context are tightly bound ( Ellström, 2001 , 2010a , 2010b ; Granberg and Ohlsson, 2005 ; Ohlsson, 1996 ). The individual's learning is seen neither as purely cognitive and rational nor completely dependent on the social system, without the ability to think rationally. In this action-oriented perspective ( Schön, 1983 ), knowledge is considered an action— knowing in action —in which knowledge in relation to a problem, specific situation, context, or task is created. Knowledge creation here is defined as a learning process for change in mindsets, behaviors, and action patterns ( Ellström, 1992 ). An action-oriented perspective draws upon Dewey's (2002) way of reasoning—we create knowledge while we are acting. When our habits or assumptions are disturbed, we act on impulse and gain experience. These experiences can, depending on the extent to which we reflect on them intellectually, create potential for knowledge creation, stimulate change in behavior, and lead to the development of new procedures in dealing with life ( Dewey, 2002 ). This perspective is based on interaction, dialogue, and reflection ( Döös and Wilhelmson, 2011 ; Ohlsson, 1996 ). When challenges are dealt with in a social context, individuals in groups can jointly form and create an understanding of and insights into common action alternatives ( Granberg and Ohlsson, 2005 ; Ohlsson, 1996 ). Ellström (2011) illustrates this by showing the tension between implicit and explicit action levels; tacit knowledge and routinized actions are based on habits, whereas awareness, transparency, and explicit work processes can increase knowledge- and reflection-based actions with a new and deeper understanding and insight.

There are different ways to categorize different types of knowledge or the content in learning processes. One way is inspired by anthropological emic and etic approaches ( Chilcott and Barry, 2016 ). The emic approach investigates the knowledge of local people within the system and how they think, whereas the etic approach investigates knowledge in the system from an outside perspective. The members of a culture might be too involved in what they are doing to interpret their behavior impartially. Their assumptions can be seen as social representations ( Moscovici, 1981 ), a mindset that is difficult to change. The etic approach functions as a perspective that a researcher or an outsider could have, which sometimes works as feedback or an eye opener for the people within the system. Another way to categorize different types of knowledge is inspired by the Greek episteme ( Gustavsson, 1996 , 2000 ), which means to understand how the world is structured and how it works; techne , which means to create and produce; and fronesis , which means to develop good judgment and to act as a democratic and ethical citizen. These different types of knowledge are closely related to one another and are formed in a dialectical process in which learning is based on what is already known, familiar, and recognizable in the encounter with the unknown. We experience the new based on how we interpret and understand the world. By doing and reflecting, we obtain insights into the larger context ( Gustavsson, 1996 , 2000 ).

2.3 Reflection in knowledge creation processes

A true reflection does not only mean that one has understood but also how the process of understanding occurred—when reflection leads to deeper knowledge. ( Wenestam and Lendahl Rosendahl, 2005 , pp. 82–83)
Usually reflection on knowing-in-action goes together with reflection on the stuff at hand. There is some puzzling, or troubling, or interesting phenomenon with which the individual is trying to deal. As he tries to make sense of it, he also reflects on the understandings which have been implicit in his action, understandings which he surfaces, criticizes, restructures, and embodies in further action. ( Schön, 1983 , p. 50)

Even though knowledge creation seems necessary in organizational research, individuals and groups often prevent development and resist learning by engaging in defensive routines that avoid critical reflection ( Argyris, 1994 , 2010 ). This defense indicates both preparedness and resistance to change and can be seen as energy in learning processes ( Illeris, 2007 ). Defensive behavior ( Aagard Nielsen and Svensson, 2006 ; Adler et al. , 2004 ; Andersson et al. , 2022 ; Argyris, 1990 , 2010 ) and learning difficulties have been discussed by Senge (1990) . An excessive focus on daily activities and on implementing what seems right now, rather than on development and adherence to long-term strategies, may be one reason for both defensive behavior and learning difficulties. Another possible reason that Senge highlights concerns the overconfidence that we obtain from experience. Learning does not happen automatically, but it requires special arrangements and focus ( Senge, 1990 ). It seems that humans, when most in need of learning, paradoxically hinder it even more ( Argyris, 1990 , 2010 ).

Actions such as defensive behaviors, or theories in use , must be made visible to break them and increase learning ( Argyris, 2010 ). Actively dealing with discrepancies and disturbances stimulates learning ( Engström, 2014 ), which is supported by a climate of psychological safety ( Edmondson, 1999 ) and the ability to learn from failure ( Edmondson, 2011 ). Robust learning includes three important components in relation to leading and analyzing learning activities: steering , challenging, and supporting knowledge creation processes ( Svensson et al. , 2009 ). Steering toward the goal and with certain structures keeps the focus on the content of the learning process. Challenging includes not only disturbances, such as dealing with contradictions, discrepancies, questioning, and uncertainty, but also engagement out of one's comfort zone. Supporting includes active empathetic listening, responding, and confirming someone's thoughts and opinions. Feedback can both challenge and support the learning process. Corrective feedback engages a person or group in dialogue to explore new ways of thinking or doing. Confirmatory feedback aims to support and strengthen a person's or group's pre-existing actions or knowledge ( Egan, 2002 ).

3. Research methodology

To understand the different types of knowledge created in interactive research projects and how they are linked to the project design, process, and content, we studied how the participants in a three-year collaborative research project perceived the learning outcomes. We used a qualitative research approach, in which we focused on the participants' experiences of the project activities. In the following sections, we provide a detailed description of the context in which the study was conducted, the data collection, and the analysis of the participants' experiences.

3.1 Research case

The context of the study presented in this paper is a research project conducted in 2018–2020 in small and medium-sized manufacturing companies. The overall project goal was to strengthen innovation capabilities, in terms of organizational ambidexterity. The project was run by five researchers from different disciplines together with participants from six small and medium-sized manufacturing companies in Sweden. The design of the research project was characterized by an interactive research approach ( Aagard Nielsen and Svensson, 2006 ; Ellström et al. , 2020 ; Svensson et al. , 2002 , 2007 ). Snowball sampling was used to find collaborating companies before the project started. As Yin (2014) notes, this type of participant selection can be beneficial when seeking knowledge within specific areas. The sample-finding phase ran from November 2017 to January 2018 and resulted in a group of six manufacturing small and medium-sized enterprises (SMEs). The companies were selected based on their history of working with operations improvements, their exemplary performance records, and their interest in enhancing their organizational ambidexterity capabilities. Information about these companies is provided in Table 1 .

Besides using an interactive research approach, the project adopted a problem-driven design and applied an emic approach with ethnographic roots ( Chilcott and Barry, 2016 ), which means that “accounts, descriptions, and analyses expressed in terms of the conceptual schemes and categories regarded as meaningful … by the native members of the culture whose beliefs and behaviours are being studied” ( Lett, 1990 , p. 130) are central to the endeavor. Taking an emic approach in this study meant holding on loosely to the researchers' understandings of the organizations' ambidexterity and innovation capabilities while carefully attending to the participants' framings and practices, rather than using pre-existing operationalizations. During the research process, following Raisch and Birkinshaw (2008) and Czarniawska (2007) , we collected data from multiple sources: three mixed-company focus groups, six company focus groups, 18 diaries, 257 survey respondents, and 25 days of shadowing and observations in the companies. Data collection was intertwined with data analysis in different stages and was later followed by feedback sessions with the companies and presentations of preliminary results in common workshops.

The project was planned in an iterative process with four-month cycles, which included the following steps during each cycle ( Figure 2 ): (1) Meetings in a steering group consisting of representatives from each company and all researchers were held, in which previous work was processed and subsequent steps were planned, including the content of the next stages, companies' homework, the data collection needed, and the invitation of guest speakers. (2) Both academic and industrial partners collected data and experimented with new ways of working. (3) Analysis and reflection followed, in which academic and industrial partners met in their own arenas to discuss what was learned. (4) Joint workshops were then arranged, in which partner companies and all academics met to share knowledge and experiences. These workshops were what Ellström et al. (2020) define as analytic seminars; typically, 14–19 participants were present from the industrial partners and the research team.

The project was designed to stimulate reflection among the participants through both discussion and feedback. Reflection within each company was needed before each workshop to fulfill the assigned homework. An example of homework during the project was reflecting upon the participants' own diaries, which were kept during one week. The reflections were conducted at both the individual and group levels. Furthermore, the participants reflected on work meetings that took place in their own organizations, and they investigated different work tasks and how they related to the phenomena we studied (ambidexterity). During each workshop, the participants presented their findings and discussed them with both the participants from other industrial partners and the research team. The industrial partners were encouraged to have more than one participant per company to ensure that the ideas, reflections, and insights gained during the workshop could be continuously discussed and worked with later within each company. Several companies started with one or two participants, but ended up with four or more participants toward the end of the project.

Each workshop lasted 24 hours, from lunch to lunch, starting with a visit to the hosting industrial partner in the morning. Different workshop themes ( Figure 2 ) were decided in the steering group, step by step, for each four-month cycle during the entire process. The workshops were often structured as follows: (1) joint lunch, (2) homework presentations, (3) mixed company group reflections on the presentations, (4) joint dinner, (5) theory input from a researcher or practical examples from a guest speaker, and (6) company-level group reflections on impressions from the workshop and the way forward.

3.2 Data collection—experiences from the project

This study is based on data collected in the form of oral testimonials and presentation materials on two separate occasions during the final phase of the research project: (1) the final workshop and (2) an open webinar. Both events provided meta-reflections on what the participants experienced during the project. This enabled an understanding of the link between the larger context of the interactive research project and its intended and unintended learning outcomes as perceived by the participants. The data consists of recordings and transcriptions of the final workshop and the webinar.

3.2.1 Accounts from the final workshop

The final workshop sought to address the learning outcomes from the project as understood from the perspective of each industrial partner. During the workshop, the participants' discussions took place in small groups, at the company level and in mixed constellations, and in a large group that included all project participants. Five companies were represented, and 19 people, all holding management positions, participated. During the workshop, the following questions were raised: (1) What areas within your organization have received the most innovation focus during the project? (2) Feel free to tell us more about your experiences with the changes you have tested and/or implemented. (3) When it comes to organizational ambidexterity, it is all about balancing the daily execution with the work around renewal in the business. What has facilitated and what has hindered the work on that balance? (4) What is the company's main challenge going forward? Each company presented their answers to the rest of the project team, followed by a joint discussion of thoughts and reflections. The presentation and joint discussion took between 45 and 60 min for each company and were recorded.

3.2.2 Accounts from the open webinar

The purpose of the concluding webinar was to function as an interactive platform for gathering the learning outcomes based on both the researchers' and the practitioners' perspectives and to disseminate the results of the project to a wider, primarily industrial, audience. The free online webinar on the difficult balance between stability and change in small and medium-sized manufacturing companies addressed the issue of working with both innovation and daily activities at the same time. It was a 1.5-h event divided into three parts: (1) a summary of the project background and purpose, (2) presentation of the findings, and (3) reflections by the research team and industrial partners on the findings. The participants representing the industrial partners were asked to prepare answers to three questions: (1) How has your view of innovation ability changed? (2) What exactly has happened in your organization? What focus in your business have you had during the project? (3) What advice do you want to give other companies based on the lessons you learned about innovation ability? One representative from each industrial partner held a presentation based on the above questions, and three selected participants took part in a panel discussion about the learning outcomes from the project. In total, 15 people, including the research team members, presented something at the webinar. The webinar was recorded and made available online afterward.

3.3 Data analysis

All the meetings were recorded and transcribed. Thematic analysis was carried out in two consecutive phases. In the first phase, NVivo software was used for the empirical analysis. The transcribed files were imported into the software, and all text was processed manually by the research team. All parts of the text indicating some sort of learning or change were highlighted and coded in different categories that simply described the content of that aspect (i.e. the first-order concepts). This procedure led to a combination of codes referring to company-specific aspects and very general ones. Once the transcribed files were fully covered, ensuring that no important aspects were left out, the second step of the empirical coding started. Here, the codes were clustered when deemed necessary (i.e. when there were overlaps in the aspects they covered). This was an iterative process that resulted in seven categories, which were the second-order themes: (1) interpretations and definitions of innovation; (2) the role of the project; (3) ownership and company size; (4) strategy, vision, and development; (5) self-image; (6) regional spirit; and (7) examples of changes. From these categories/themes, we derived five aggregated dimensions (business insights) in a final empirical analysis inspired by Gioia et al. (2013) , Aagard Nielsen and Svensson (2006) , and Adler et al. (2004) .

In the second phase of analysis, the theoretical examination took place. The five aggregated dimensions were compared to the theoretical framework to understand the outcomes of the project from a knowledge creation perspective and how such insights link to the design, process, and content of projects.

4. Findings

The empirical analysis of the data in NVivo resulted in five categories of insights derived from the participation in the interactive research project, with the potential for sustainable influence on the participating organizations: (1) an understanding of concepts and theories; (2) an understanding of the impacts of collaborative, reflective work processes; (3) an understanding of the meaning of one's own organizational context; (4) an understanding of the importance of increased organizational self-awareness; and (5) an understanding of the potential for human interaction and communication.

4.1 Elaborating five categories of business insights

4.1.1 An understanding of concepts and theories

Throughout the research project, concepts and theories that are related to and that capture innovation and ambidexterity have been constantly addressed. The initial kick-off activity, in which representatives from the companies formed mixed focus groups, concentrated on how the individuals understood innovation, how they defined innovation work, and what made it different from their daily work.

… with innovation, you just don’t know. / … / that’s the whole thing with innovation; you don’t have the methods or the time or the money—that is, you don’t know how it’s going to turn out / … / you’re on thin ice; we don’t know what choices we’ll make. So, making plans is not easily done beforehand.

They also suggested that participating in the project had given them the feeling that working with innovation is “something bigger” that could “lift them,” making them realize that they needed to include more people from the company taking part in the project.

… it’s this that happens, which isn’t planned, that’s really interesting. That’s when innovation happens or that’s when you see new patterns or get ideas. This part of the unplanned is what I think is the most exciting./ … /Researcher A talked about not feeling ashamed about this; it’s really part of being innovative or part of being in an organization, to be there for one another.

4.1.2 An understanding of the impacts of collaborative, reflective work processes

It’s been exciting and interesting to have been part of such a big project; it’s an inspiration to try new ideas and work methods to develop our business./ … /[what’s] most rewarding has been to network with other companies and academics and to benchmark both the good and the bad, the negative and the positive experiences. And what you’ve seen in the project, really regardless of what we manufacture or what we do, is that we all face the same challenges.
… to follow other companies for three years, to see their journey with the things they try, that makes us learn as well and see what we need to do next. It’s an amazing opportunity to get to be close to other companies in this way and to follow them all. It’s been very inspiring and rewarding, and we’ve received many tips and thoughts from the other companies, I think.

In one of the workshops, a manager from a company outside of the project shared some quite provocative thoughts on management. This company had grown fast and yet decided not to have any dedicated managers apart from its CEO. Collaboration, mutual trust, and feedback were brought forward as the company's key success factors. There was an intense and interesting discussion of these issues after the presentation was completed. One of the participating companies referred back to this session and stated that, afterward, the company started quarterly co-worker assessments to identify important issues and problems regarding the work situation. This, in turn, led to the closer involvement of manufacturing staff in project start-ups and in upcoming changes in the firm, which meant that problems and issues could be detected earlier. In other words, they learned that interacting and reflecting with more of their employees seemed to lead to better well-being and to better results in their operational work.

4.1.3 An understanding of the meaning of one's own organizational context

Throughout the project, the participants continuously brought up and emphasized the importance of their specific organizational contexts in their innovation capabilities and in the way in which these companies are managed. When they addressed their own contexts, the sizes of their companies (all of them are small to medium-sized companies), how they are owned and managed, the businesses they are in, the needs of their customers, and the region where they are located, it seems as if their ways of reasoning have expanded, and their appreciation and respect for their own specific contexts have developed.

… the pride is considerably larger in such a company, and you have a holistic view and a holistic picture in a different way. In a large company, it’s like, ‘Our department does this,’ but you don’t know the bigger picture … [here] even if you work with the introduction process, you’re fully aware what others do, what the company does and produces, and in what way. It builds on ‘I’m an important part of the whole puzzle.’
… a benefit we’ve seen too is the consensus in our management team. That we are all participating in the project meetings has given us strength. We know what vision we want to reach, and we form new goals and action plans to reach these new ideas, solutions, and everything that we’ve gotten from this project.
… as a small company, you think that everyone knows why; therefore, we don’t really establish why we’re here, but you think that everyone should know why we’re here.
Where are we heading? How shall we work? The entire management team can benefit from answers to these questions. All of us need to be involved in this.
… is it so that in owner-managed companies, SMEs, you’re prepared to take certain risks; no super advanced calculations are being made. You’re telling that you, as the owner, have stood for two and a half years and said, ‘I support this. I know it will cost something; we don’t have any calculations on it.’ If you had known, maybe you would have said no, but now you’re in it and then you just go for it… it’s more based on emotions …. “I’m not fully aware (of the costs). I follow my gut feelings. Then I can have nightmares about it.”

4.1.4 An understanding of the importance of increased organizational self-awareness

… this [project] was supposed to be about innovation, and I felt rather hesitant to do it. It didn’t really fit the vision. We’re a pretty small company… we don't have our own products… so it felt a bit weird, this thing with innovation…‘I didn’t even dare to say the word ‘innovation’ in any context involving our company. To me, innovation was only about one thing, a product that you invent; it’s about patents, research, laboratories, large research groups where you develop a new product for a new market, theories, yes.
… we’ve learned quite a lot from this project. One part of it is that we’re proud that we implement innovations. …We’ve realized that all of us do innovations, that we can make an impact, change, and come up with things. Our views, all the way from the management level to the individual co-worker, have changed.
… I was very hesitant from the beginning whether I should be involved in such a project…I considered myself extremely innovative with lots of ideas. But what I may have forgotten was to include others on that journey. I just started and forced it into the business, so it’s very much managed from the top down.
… it may be the curse of an entrepreneur that you think you can manage everything on your own. But it has turned out that that’s not the case; [throughout the project], we’ve gotten a really good activity going on throughout the organization.…If we see where we are today in comparison to when we started all of this, there’s a big difference, of course. Today, I work much more with strategies and the entire organization.
That others have asked, seen, and investigated [issues and problems in and about their organization] has given us very much. It gives a kind of boost—really, that's fun. We've seen our own innovation capability in a completely different way, and it has sparked a positive spiral of new innovations.

4.1.5 An understanding of the potential for human interaction and communication

… we’ve come to realize that the individual is important, all the way from the top to the bottom…it doesn’t help that we write a new routine for everyone to be involved; we have to have that feeling and build that feeling to get there. That’s something I believe we’ve learned in this project. It’s an important part; it’s the key to be able to move forward.
If I had involved them much earlier, which is what we do now…they’ll tell you, ‘No, you can’t think like that because this and that will happen’, because they work there every day…the best innovations you get, you get when they own it. When they come from the shop floor and start chasing white collar workers, that’s when you’ve started it.
… everyone is equally important, and we can learn a lot from one another. Many organizations have someone who’s very dominant with lots of ideas; with this method, we also let others speak up. There might be many people who are more cautious and a bit quiet but who are very clever and spend a lot of time at work and at home thinking about potential aspects. You want to capture those.
… we created a common platform, a common office where purchase, warehouse, and production management, all the ones who have many daily contacts with one another, sit together so that they can simply just talk to one another over the desk instead of moving to long meetings. So, we’ve shortened the ways of communication.
… when we’ve reached a stage with tangible suggestions, we put them into a sort of plan-do-control-action part, where we work on how to bring the process forward. In this group, when we use this method, we bring out the smartest [ideas] because it’s built on the knowledge of each and everyone in the group. When they’re allowed to participate and have a voice, it creates involvement, and you go from talking to actually doing, and doing creates value, partly building more value but also contributing to the culture in the company and encouraging co-workers to participate in many ways.

4.2 Linking business insights to project design, process and content

By identifying the five categories of business insights stemming from an interactive research project with SMEs, we have shed light on the complexities surrounding knowledge creation in projects and the associated learning outcomes. The following section focuses on how these five categories link to the design, process, and content of projects and on practices that can nurture conditions for knowledge creation in these types of settings.

First, we see that the category that captures an understanding of concepts and theories (in this particular case, concerning innovation capabilities and ambidexterity) relates to the content development of the research project. The participants not only captured mainstream definitions; they also formed their own understandings and beliefs. The learning outcome here is a change in mindset, as also mentioned by Ellström (1992) . The participants also followed the way of reasoning addressed by Dewey (2002) —to learn while acting without exactly knowing what the outcomes would be. It is obvious that habits and assumptions ( Moscovici, 1981 ) about innovation were disrupted ( Dewey, 2002 ) during the project's collective activities ( Granberg and Ohlsson, 2005 ; Ohlsson, 1996 ), such as the workshops. Reflection led to a new way of viewing innovation conceptually and to the use of an ambidextrous way of thinking in practice ( Sollander and Engström, 2021 ). This implies that the project design and the actual process that the participants followed were essential for the content development.

The second category of business insights captures the impacts of collaborative, reflective work processes. This insight mirrors a deeper understanding and appreciation of what can be gained when reflecting on issues together with others with similar challenges in an open and trusting environment, just as Edmondson (2011) calls for. The project activities, that is, the way the project was designed and executed, were founded in interaction, dialogue ( Döös and Wilhelmson, 2011 ; Ohlsson, 1996 ), and reflection ( Boud et al. , 2006 ; Dewey, 2002 ; Ellström, 2006 ; Wenestam and Lendahl Rosendahl, 2005 ), and gave the participants both challenges and support ( Svensson et al. , 2009 ) in getting out of their comfort zone.

The third category of business insights captures that, by using emic and etic approaches ( Chilcott and Barry, 2016 ) and feedback processes ( Egan, 2002 ) in the project, participants became aware of the meaning of their own organizational context in relation to their own and others' challenges and struggles. Throughout the project process, they all seemed to have gained the insight that everyone has their own specific conditions to adhere to. They also realized that these are not necessarily unique. Therefore, it seems that being part of this type of process advanced the participants' ways of using their own contexts as stepping stones to develop their organizations further.

For the fourth category of business insights (an understanding of the importance of increased organizational self-awareness) we see signs of learning outcomes related to a deeper understanding of how the participants view their own company and the roles they play in their organizations. We argue that this is stimulated by the emic approach used in the data collection and the reflective activities in the project ( Chilcott and Barry, 2016 ). This category of insights also indicates how project activities facilitated the avoidance of organizational traps and the participants' own defense behaviors ( Argyris, 1994 , 2010 ).

The fifth category of business insights emphasizes the potential for human interaction and communication offered by the design of these types of projects. We see how the project activities on inclusiveness and learning culture were supported by a climate of psychological safety ( Edmondson, 1999 ) in the research project and by the ability to learn from failure ( Edmondson, 2011 ). The project activities also became role models for how the companies organized knowledge creation activities (i.e. the project process) and actively dealt with discrepancies and disturbances as learning potentials ( Engström, 2014 ) within their own organizations.

These five insights originate from the project design, process, and content. The interactive and iterative design, including workshops and homework where the steering group decided the upcoming activities in the project, allowed the participants to investigate and dig deeper into their companies' challenges using the theoretical concepts discussed during workshops. This design ensured practical relevance for the companies, and during the process the companies gained a sense of project ownership, which strengthened their engagement and gave them time for both self-reflection and organizational reflection. To continue the path of learning, the process was essential for continuously creating a trustful and open environment, which paved the way for critical dialogue, reflection and feedback, all of which are important aspects of learning. The project activities, such as inspirational lectures within the area of innovation, the challenging and validating of results, and the companies' own input and shared experiences, acted as a final push for the five insights.

5. Discussion

In this project, the participants gained a new understanding of the impacts of collaborative, reflective work processes, along with new knowledge on concepts and theories. This type of knowledge creation corresponds to the knowledge type episteme ( Gustavsson, 1996 , 2000 ). Additionally, we have seen how the participants gained deeper understandings about the meaning of the unique context that their businesses, customers, organizations, and industries constitute together; about their organizational self-awareness; and about the potential for human interaction and communication for new, deeper insights. We connect this with the fact that the interactive cycles gradually made actions and thought patterns visible to the participants, which meant that they developed a reflection-based, deeper understanding, as suggested by Ellström (2011) . This is also in line with the ideas of Granberg and Ohlsson (2005) and Ohlsson (1996) that suggest there is a learning potential in dealing with challenges in a social context. One example of this is a clear shift in how the participants jointly shaped new insights regarding innovation capabilities and ambidexterity, which, in the long term, can strengthen the strategic processes in their organizations.

It is noteworthy that many of the above-mentioned learning outcomes indicate not only the fulfillment of the purpose of the research project (related to innovation capabilities) but also the generation of additional results. Examples include understanding themselves as leaders, as well as insights into the potential of human interaction and communication for deeper insights, which are key life lessons that impact innovation and other business process developments. After the completion of the project, the participants seemed to understand the concept of innovation in a completely different way, on a more general level, and in relation to other phenomena in the organization. A concept they previously barely used in the organizations has become a convenient term to use in their businesses. A shift in the evaluation of both the phenomenon and of themselves seemed to have taken place, or phronesis ( Gustavsson, 1996 , 2000 ). In the project, innovation and unplanned work became linked to one another, and the participants seem to have gained knowledge of how these entities are connected, or episteme ( Gustavsson, 1996 , 2000 ). They also said that they had the opportunity to test and introduce many new methods, or techne ( Gustavsson, 1996 , 2000 ), during the project, which indicates that applied knowledge was activated. This is related to dialectical processes around the known that are challenged by the unknown and that provide new insights, as well as to the fact that the boundaries between theory and practice were blurred ( Ellström, 2011 ). In sum, the interactive approach with the integrated learning cycles catalyzed all three of Aristotle's foundational knowledge types ( Gustavsson, 1996 , 2000 ).

We have managed to tease out several key practices in the project design, process, and content that have had a particular impact on the knowledge created. To start with, the participants' activities and attendance at workshops were consistent during the project, despite the many changes that took place in the management groups (i.e. people leaving for other companies). This indicates that commitment throughout the learning process remained; the companies' sense of commitment and value gained was strong. To achieve this, the project management team carefully aligned the project's research purpose and process with practical relevance ( Geraldi and Söderlund, 2018 ) and fostered an inclusive, psychologically safe environment ( Edmondson, 1999 , 2011 ). Furthermore, the fact that the workshops facilitated reflection ( Döös and Wilhelmson, 2011 ; Ohlsson, 1996 ) and enabled distancing from and the formation of perspectives on everyday problems appeared fundamental. Several participants attested that the resistance ( Argyris, 2010 ) they previously had regarding the ability to be innovative was alleviated by the project's approach and dialogue. Another key practice associated with the project's design was the guidance provided by the iterative process and by steering group decisions on themes and the homework. This seems to have triggered a sense of project ownership and a focus on the companies' own input to the project. The participants described the comments they received from others in the project group as supportive, and various types of input, such as guest lectures, during workshops were considered challenging, according to the three important components of a learning process. Throughout the project, critical dialogue facilitated reflection, which led to deeper insights into the companies' own operations. In all, this supports the notion of steering, supporting, and challenging to create robust knowledge ( Svensson et al. , 2009 ).

6. Conclusions

The purpose of this paper was to shed light on the different types of knowledge created in an interactive research project and to analyze how they are linked to the project design, process, and content. The key features of the project design, process, and content are all connected with state-of-the-art knowledge on how knowledge creation is orchestrated—stimulating psychological safety; steering, supporting, and challenging; and ensuring the alignment of theoretical rigor with practical relevance. In the present study, we confirm that this important knowledge and all three basic types of knowledge that were stimulated— episteme , phronesis, and techne —are indeed transferable to the context of interactive research when using the project form, especially if the goals are to stimulate both intended and unintended learning outcomes, including reflective knowledge and insights.

In this paper we shed light on a key potential of interactive research project management, namely, how to obtain deeper and potentially more sustainable learning effects for the participating partners beyond the explicit project purpose at hand. We have studied how knowledge is created in relation to the project design, process and content.

First, we want to highlight the findings in our study that confirm previous studies. We confirm that the model of interactive research ( Ellström, 2007 ; Ellström et al. , 2020 ) indeed provides conditions for generating valuable insights into the research problem at hand. In our case, we studied how small and medium-sized companies could increase their innovation capabilities while better balancing innovation activities with daily operations. The published results from the project were highly dependent on the reflection, validation, and feedback that took place in our meetings with the participating practitioners. Our results also confirm earlier studies on the productive relationship between the different roles of researchers and practitioners in collaboration ( Aagard Nielsen and Svensson, 2006 ), the importance of the level of interaction in different phases of the research process ( Svensson et al. , 2002 ; Cronholm and Goldkuhl, 2003 ), and the importance of steering, supporting, and challenging to create robust knowledge ( Svensson et al. , 2009 ).

Second, we provide substantial additions to existing knowledge. Our study shows that the interactive and iterative approach with the recurring homework, workshops, and guided reflections contributed not only to joint knowledge creation in a broader sense, but also to deeper insights. We sometimes referred to the metaphor of peeling an onion in our workshops with the companies to show them how we, together, could gain a better understanding of the questions at hand using reflection. Our findings suggest an alteration of Ellström's model with an empowering, expanded view of the taken-for-granted interest of participants in the practice system. We saw that the practitioners were interested not only in practical issues or implications but also in the theoretical underpinnings of their problems; they played a pivotal role in creating theoretical knowledge. The results also complement earlier research by exemplifying and unpacking the key practices of interaction. For example, steering group meetings that assigned homework to the companies fulfilled the steering aspects of the learning process. Inspirational and theoretical lecturers seemed to challenge existing knowledge and mindsets, while feedback meetings and workshops supported knowledge creation and strengthened work with innovation and meta-reflection. A surprising finding was that the interactive and iterative model changed the mindsets of the participating company representatives and increased organizational self-awareness. This, in turn, formed a crucial basis for driving change in work methods and making investments in the organizations, as well as for changing assumptions about customer offerings. While these theoretical contributions primarily belong to the domain of knowledge creation and interaction research, we also contribute more specifically to the field of project management research by illustrating how knowledge creation can take place in practice through examples and rich empirical accounts, thereby contributing to the call by Lindhult and Axelsson (2021) to expand project management research.

Project management scholars can also find practical implications for research in our study. Research seeking to examine the conditions for reflexive knowledge creation and deeper insights can benefit from searching for evidence of the key features of the project design, process, and content, as indicated in the discussion section. Researchers who are eager to design their own interactive, collaborative research projects can hopefully also be inspired by our learning loop design and the transfer of theoretical state-of-the-art knowledge into hands-on practical activities.

We advise practitioners interested in expanding the outcomes of projects beyond the explicit targets to pay careful attention to how they set up their projects. They need to make room for iterative reflection, be mindful of creating a trusting and open environment in the team, challenge results with opposing views and theories, and make room for sharing experiences and giving feedback. In doing so, our study suggests that it is possible to gain deeper insights into complex issues that have the potential to have long-lasting effects on both people and businesses.

As in any study, there are particularities that are not fully captured and explained. While we cannot tease out any direct cause–effect relationships between specific activities and specific learning outcomes, we can conclude that the project design, process, and content are related to the identified learning outcomes. Similarly, while we can verify that learning has taken place, we cannot quantify the learning outcomes in terms of how many participants gained knowledge. Further research is needed to validate our findings, so we encourage other authors to adopt the presented research process and activities and to focus on a meta-analysis of the impacts that the process has on the outcomes. Preferably, this could take place by assigning a dedicated researcher to follow the process in parallel to the focal problems defined in the project.

Illustration of joint activities, between the two systems involved, in interactive research

The project's iterative process in four-month cycles

Participating companies

Argon
  • Number of employees (2017): 41
  • Yearly revenue (2017): 12.4 MEUR
  • Ownership: Owned by current CEO
  • Type of production: Customized plastic injection molding items
  • Geographical markets: Sweden (international outreach via customers)
  • Customer industries (in order of turnover size): Subcontractor to primarily the furniture and automotive industries

Bismuth
  • Number of employees (2017): 90
  • Yearly revenue (2017): 21.5 MEUR
  • Ownership: Private. Part of company group of 37 branches
  • Type of production: Customized turned metallic components
  • Geographical markets: Global
  • Customer industries (in order of turnover size): Subcontractor to the automotive, hydraulic and pump and motor industries

Fermium
  • Number of employees (2017): 100
  • Yearly revenue (2017): 14 MEUR
  • Ownership: Family business, 3rd generation
  • Type of production: Parts or complete products for blower and fan solutions
  • Geographical markets: Europe, Americas and Asia
  • Customer industries (in order of turnover size): Subcontractor to the automotive and home electronics industries

Hydrogen
  • Number of employees (2017): 23
  • Yearly revenue (2017): 5 MEUR
  • Ownership: Private. Part of company group of 6 branches. Owned by family company group
  • Type of production: Ventilation and fire protection
  • Geographical markets: Sweden
  • Customer industries (in order of turnover size): The real estate and construction industry

Lithium
  • Number of employees (2017): 26
  • Yearly revenue (2017): 3.7 MEUR
  • Ownership: Family business, 2nd generation
  • Type of production: Customized high-pressure aluminum die casting items
  • Geographical markets: Sweden (international outreach via customers)
  • Customer industries (in order of turnover size): Subcontractor to the automotive, disability aid, machinery, telecom, furniture and building industries

Mercury
  • Number of employees (2017): 50
  • Yearly revenue (2017): 7.3 MEUR
  • Ownership: Family business, 1st generation. Part of company group of eight branches
  • Type of production: Customized cutting assignments in aluminum, steel, stainless steel and plastics
  • Geographical markets: Sweden (international outreach via customers)
  • Customer industries (in order of turnover size): Subcontractor to the defense and medical technology industries

Aagard Nielsen , K. and Svensson , L. ( 2006 ), Action and Interactive Research: Beyond Practice and Theory , Shaker Publishing , Maastricht .

Adler , N. , Shani , A.B.R. and Styhre , A. (Eds) ( 2004 ), Collaborative Research in Organizations: Foundations for Learning, Change, and Theoretical Development , SAGE , London .

Andersson , S. , Balkmar , D. and Callerstig , A.C. ( 2022 ), “ From glass ceiling to firewalls: detecting and changing gendered organizational norms ”, NORA-nordic Journal of Feminist and Gender Research , Vol.  30 No.  2 , pp. 140 - 153 , doi: 10.1080/08038740.2021.1931438 .

Argyris , C. ( 1990 ), Overcoming Organizational Defenses: Facilitating Organizational Learning , Prentice-Hall , New Jersey .

Argyris , C. ( 1994 ), “ Good communication that blocks learning ”, Harvard Business Review , July-August .

Argyris , C. ( 2010 ), Organizational Traps. Leadership, Culture, Organizational Design , Oxford University Press , New York .

Argyris , C. and Schön , D.A. ( 1978 ), Organizational Learning: A Theory of Action Perspective , Addison-Wesley , Reading, MA .

Avison , D.E. , Davison , R.M. and Malaurent , J. ( 2018 ), “ Information systems action research: debunking myths and overcoming barriers ”, Information and Management , Vol.  55 No.  2 , pp.  177 - 187 .

Börjesson , S. and Fredberg , T. ( 2004 ), “ Jam sessions for collaborative management research ”, in Adler , N. , Shani , A.B.R. and Styhre , A. (Eds), Collaborative Research in Organizations, Foundations for Learning, Change and Theoretical Development , Sage , Thousand Oaks , pp.  135 - 148 .

Boud , D. , Cressey , P. and Docherty , P. ( 2006 ), Productive Reflection at Work: Learning for Changing Organizations , Routledge , London .

Canonico , P. , Söderlund , J. , De Nito , E. and Mangia , G. ( 2013 ), “ Special issue on organizational mechanisms for effective knowledge creation in projects - guest editorial ”, International Journal of Managing Projects in Business , Vol.  6 No.  2 , pp.  223 - 235 .

Chalmers , A.F. ( 2013 ), What Is This Thing Called Science? , Hackett Publishing , Indianapolis, IN .

Chilcott , M. and Barry , D. ( 2016 ), “ Narrating creativity: developing an emic, first person approach to creativity research ”, International Journal of Narrative Therapy and Community Work , Vol.  3 , pp.  57 - 67 .

Choo , A. , Linderman , K. and Schroder , R. ( 2007 ), “ Method and psychological effects on learning behaviors and knowledge creation in quality improvement projects ”, Management Science , Vol.  53 No.  3 , pp.  437 - 450 .

Cronholm , S. and Goldkuhl , G. ( 2003 ), “ Conceptualising participatory action research—three different practices ”, Electronic Journal of Business Research Methods , Vol.  2 No.  2 , pp.  47 - 58 .

Czarniawska , B. ( 2007 ), “ Narrative inquiry in and about organizations ”, in Clandinin , J. (Ed.), Handbook of Narrative Inquiry: Mapping a Methodology , Sage Publications , pp.  383 - 404 .

Davies , A. and Hobday , M. ( 2005 ), The Business of Projects , Cambridge University Press , Cambridge .

Dewey , J. ( 2002 ), Human Nature and Conduct , Dover Publications , Chelmsford, MA .

Döös , M. and Wilhelmson , L. ( 2011 ), “ Collective Learning: interaction and a shared action arena ”, Journal of Workplace Learning , Vol.  23 No.  8 , pp.  487 - 500 .

Edmondson , A. ( 1999 ), “ Psychological safety and learning behavior in work teams ”, Administrative Science Quarterly , Vol.  44 No.  2 , pp.  350 - 383 , available at: http://www.jstor.org/stable/2666999 .

Edmondson , A. ( 2011 ), “ Strategies for learning from failure ”, Harvard Business Review , Vol.  89 No.  4 , pp.  48 - 55 .

Egan , G. ( 2002 ), The Skilled Helper. A Problem-Management and Opportunity-Development Approach to Helping , 7th ed. , Brooks/Cole , Pacific Grove, CA .

Ellram , L. and Tate , W.L. ( 2015 ), “ Redefining supply management's contribution in services sourcing ”, Journal of Purchasing and Supply Management , Vol.  21 No.  1 , pp.  64 - 78 .

Ellström , P.E. ( 1992 ), Kompetens, Utbildning Och Lärande I Arbetslivet: Problem, Begrepp Och Teoretiska Perspektiv , Norstedts Juridik AB , Stockholm .

Ellström , P.E. ( 2001 ), “ Integrating learning and work: problems and prospects ”, Human Resource Development Quarterly , Vol.  12 No.  4 , pp.  421 - 430 .

Ellström , P.E. ( 2006 ), “ The meaning and role of reflection in informal learning at work ”, in Boud , D. , Cressey , P. and Docherty , P. (Eds), Productive Reflection at Work , Routledge , New York .

Ellström , P.E. ( 2007 ), “ Knowledge creation through interactive research: a learning perspective ”, Paper presented at the HSS-07 Conference , Jönköping .

Ellström , P.-E. ( 2010a ), “ Organizational learning ”, in McGaw , B. , Peterson , P.L. and Baker , E. (Eds), International Encyclopedia of Education , Elsevier , Amsterdam , Vol.  1 , pp.  47 - 52 .

Ellström , P.-E. ( 2010b ), “ Practice-based innovation: a learning perspective ”, Journal of Workplace Learning , Vol.  22 Nos 1/2 , p. 27 .

Ellström , P.-E. ( 2011 ), “ Informal learning at work: conditions, processes and logics ”, in Malloch , M. , Cairns , L. , Evans , K. and O´Connor , B.N. (Eds), The SAGE Handbook of Workplace Learning , SAGE , Los Angeles .

Ellström , P.E. , Elg , M. , Wallo , A. , Berglund , M. and Kock , H. ( 2020 ), “ Interactive research: concepts, contributions and challenges ”, Journal of Manufacturing Technology Management , Vol.  31 No.  8 , pp.  1517 - 1537 .

Engeström , Y. ( 1987 ), Learning by Expanding: An Activity-Theoretical Approach to Developmental Research , Orienta Konsultit Oy , Helsinki .

Engeström , Y. ( 2008 ), From Team to Knots: Activity-Theoretical Studies of Collaboration and Learning at Work , Cambridge University Press , Cambridge .

Engström , A. ( 2014 ). Lärande Samspel För Effektivitet: En Studie Av Arbetsgrupper I Ett Mindre Industriföretag . ( Fil dr ). Linköpings Universitet , Linköping . ( Linköping Studies in Behavioural Science No 185 ).

Faccin , K. and Balestrin , A. ( 2018 ), “ The dynamics of collaborative practices for knowledge creation in joint R&D projects ”, Journal of Engineering and Technology Management , Vol.  48 , pp.  28 - 43 .

Geraldi , J. and Söderlund , J. ( 2018 ), “ Project studies: what it is, where it is going ”, International Journal of Project Management , Vol.  36 No.  1 , pp.  55 - 70 .

Gioia , D.A. , Corley , K.G. and Hamilton , A.L. ( 2013 ), “ Seeking qualitative rigor in inductive research: notes on the Gioia methodology ”, Organizational Research Methods , Vol.  16 No.  1 , pp. 15 - 31 , doi: 10.1177/1094428112452151 .

Granberg , O. and Ohlsson , J. ( 2005 ), “ Kollektivt lärande i team: om utveckling av kollektiv handlingsrationalitet ”, Pedagogisk Forskning I Sverige , Vol.  10 Nos 3/4 , pp.  227 - 243 .

Gunnarsson , E. , Hansen , H.P. , Nielsen , B.S. and Sriskandarajah , N. ( 2015 ), Action Research for Democracy: New Ideas and Perspectives from Scandinavia , Routledge , London .

Gustavsen , B. and Engelstad , P.H. ( 1986 ), “ The design of conferences and the evolving role of democratic dialogue in changing working life ”, Human Relations , Vol.  39 No.  2 , pp.  101 - 116 .

Gustavsson , B. ( 1996 ), Bildningens Väg , Wahlström & Widstrand , Borås .

Gustavsson , B. ( 2000 ), Kunskapsfilosofi: Tre Kunskapsformer I Historisk Belysning , Fälth & Hässler , Smedjebacken .

Gustavsson , M. ( 2007 ), “ The potential for learning in industrial work ”, Journal of Workplace Learning , Vol.  19 , No.  7 , pp.  453 - 463 .

Heron , J. and Reason , P. ( 2006 ), “ The practice of Co-operative inquiry: research ‘with’ rather than ‘on’ people ”, in Reason , P. and Bradbury , H. (Eds), Handbook of Action Research: The Concise Paperback Edition , Sage , London , pp.  144 - 154 .

Hevner , A.R. , March , S.T. , Park , J. and Ram , S. ( 2004 ), “ Design science in information systems research ”, MISQ , Vol.  28 No.  1 , pp.  75 - 105 .

Illeris , K. ( 2007 ), Lärande , Studentlitteratur , Lund .

Jacob , M. , Hellström , T. , Adler , N. and Norrgren , F. ( 2000 ), “ From sponsorship to partnership in academy‐industry relations ”, R&D Management , Vol.  30 No.  3 , pp.  255 - 262 .

Johannisson , B. , Gunnarsson , E. and Stjernberg , T. ( 2008 ), Gemensamt kunskapande: Den interaktiva forskningens praktik , University Press, Växjö universitet , Göteborg .

Junni , P. , Sarala , R.M. , Taras , V. and Tarba , S.Y. ( 2013 ), “ Organizational ambidexterity and performance: a meta-analysis ”, The Academy of Management Perspectives , Vol.  27 No.  4 , pp. 299 - 312 .

Larsson , A.-C. ( 2006 ), “ Interactive research - methods and conditions for joint analysis ”, in Aagaard Nielsen , K. and Svensson , L. (Eds), Action Research and Interactive Research - beyond Practice and Theory , Shaker Publishing , Maastricht, The Netherlands , pp.  241 - 258 .

Lave , J. and Wenger , E. ( 1991 ), Situated Learning. Legitimate Peripheral Participation , Cambridge University Press , New York .

Lett , J. ( 1990 ), “ Emics and etics: notes on the epistemology of anthropology ”, Emics and Etics: The Insider/outsider Debate , Vol.  7 , pp.  127 - 142 .

Lindhult , E. and Axelsson , K. ( 2021 ), “ The logic and integration of coproductive research approaches ”, International Journal of Managing Projects in Business , Vol.  14 No.  1 , pp. 13 - 35 .

Lubatkin , M.H. , Simsek , Z. , Ling , Y. and Veiga , J.F. ( 2006 ), “ Ambidexterity and performance in small-to medium-sized firms: the pivotal role of top management team behavioral integration ”, Journal of Management , Vol.  32 No.  5 , pp. 646 - 672 .

Lundin , R. , Arvidsson , N. , Brady , T. , Ekstedt , E. , Midler , C. and Sydow , J. (Eds) ( 2017 ), Managing and Working in Project Society - Institutional Challenges of Temporary Organizations , Cambridge University Press .

Magnusson , L. and Ottosson , J. ( 2012 ), Den hållbara svenska modellen , SNS Förlag , Stockholm .

March , J.G. ( 1991 ), “ Exploration and exploitation in organizational learning ”, Organization Science , Vol.  2 No.  1 , pp.  71 - 87 .

Mohrman , S.A. , Gibson , C.B. and Mohrman , A.M. ( 2001 ), “ Doing research that is useful to practice: a model and empirical exploration ”, Academy of Management Journal , Vol.  44 No.  2 , pp.  357 - 375 .

Moscovici , S. ( 1981 ), “ On social representations ”, Social Cognition: Perspectives on Everyday Understanding , Vol.  8 No.  12 , pp.  181 - 209 .

Näslund , D. , Kale , R. and Paulraj , A. ( 2010 ), “ Action research in supply chain management - a framework for relevant and rigorous research ”, Journal of Business Logistics , Vol.  31 No.  2 , pp.  331 - 355 .

Nilsen , E.R. ( 2013 ), “ Organizing for learning and knowledge creation–are we too afraid to kill it? Projects as a learning space ”, International Journal of Managing Projects in Business , Vol.  6 No.  2 , pp. 293 - 309 .

Ohlsson , J. ( 1996 ), Kollektivt Lärande: Lärande I Arbetsgrupper Inom Barnomsorgen. (PhD) , Stockholms universitet , Stockholm .

O’Reilly , C.A. and Tushman , M.L. ( 2008 ), “ Ambidexterity as a dynamic capability: resolving the innovator’s dilemma ”, Research in Organizational Behavior , Vol.  28 , pp. 185 - 206 .

Palm , J. ( 2007 ), Kunskapsbildning mellan träindustri och akademi: en studie av dess förutsättningar och möjligheter , Doctoral dissertation , Linnaeus University , Växjö .

Pettigrew , A.M. ( 1990 ), “ Longitudinal field research on change: theory and practice ”, Organization Science , Vol.  1 No.  3 , pp.  267 - 292 .

Raisch , S. and Birkinshaw , J. ( 2008 ), “ Organizational ambidexterity: antecedents, outcomes, and moderators ”, Journal of Management , Vol.  34 No.  3 , pp.  375 - 409 .

Runsten , P. and Werr , A. ( 2016 ), Kunskapsintegration: Om Kollektiv Intelligens I Organisationer , Studentlitteratur , Lund .

Sandberg , F. and Wallo , A. ( 2013 ), “ The interactive researcher as a virtual participant: a Habermasian interpretation ”, Action Research , Vol.  11 No.  2 , pp. 194 - 212 .

Scarborough , H. , Bresnen , M. , Edelman , L.F. , Laurent , S. , Newell , S. and Swan , J. ( 2004 ), “ The processes of project-based learning: an exploratory study ”, Management Learning , Vol.  35 No.  4 , pp.  491 - 506 .

Schön , D. ( 1983 ), The Reflective Practitioner: How Professionals Think in Action , Basic Books , New York .

Senge , P.M. ( 1990 ), The Fifth Discipline: The Art and Practice of the Learning Organization , Bantam Doubleday , New York .

Sollander , K. and Engström , A. ( 2021 ), “ Unplanned Managerial Work: An Ambidextrous Learning Potential ”, Studies in Continuing Education , pp.  1 - 19 , doi: 10.1080/0158037X.2021.1874903 .

Susman , G.I. and Evered , R.D. ( 1978 ), “ An assessment of the scientific merits of action research ”, Administrative Science Quarterly , Vol.  23 No.  4 , pp.  582 - 603 .

Svejvig , P. , Sankaran , S. and Lindhult , E. ( 2021 ), “ Guest editorial ”, International Journal of Managing Projects in Business , Vol.  14 No.  1 , pp.  1 - 12 , doi: 10.1108/IJMPB-02-2021-313 .

Svensson , L. , Brulin , G. and Ellström , P.-E. ( 2015 ), “ Interactive research and ongoing evaluation as joint learning processes ”, in Sustainable Development in Organizations , pp. 346 - 362 .

Svensson , L. , Brulin , G. , Ellström , P.-E. and Widegren , Ö. (Eds) ( 2002 ), Interaktiv Forskning - Förutveckling Av Teori Och Praktik , Arbetslivsinstitutet , Stockholm .

Svensson , L. , Ellström , P.E. and Brulin , G. ( 2007 ), “ Introduction–on interactive research ”, International Journal of Action Research , Vol.  3 No.  3 , pp.  233 - 249 .

Svensson , L. , Brulin , G. , Jansson , S. and Sjöberg , K. ( 2009 ), Lärande Utvärdering Genom Följeforskning , Studentlitteratur , Lund .

van de Ven , A.H. ( 2007 ), Engaged Scholarship A Guide for Organizational and Social Research , Oxford University Press , New York, NY .

Wallo , A. ( 2008 ), The Leader as a Facilitator of Learning at Work: A Study of Learning-Oriented Leadership in Two Industrial Firms , Doctoral dissertation, Linköping University Electronic Press .

Wallo , A. , Kock , H. and Nilsson , P. ( 2012 ), “ Accelerating and braking in times of economic crisis: organisational learning in a top management team ”, European Journal of Training and Development , Vol.  36 No.  9 , pp.  930 - 944 .

Weck , M. ( 2006 ), “ Knowledge creation and exploitation in collaborative R&D projects: lessons learned on success factors ”, Knowledge and Process Management , Vol.  13 No.  4 , pp.  252 - 263 .

Wenestam , C.G. and Lendahl Rosendahl , B. ( 2005 ), Lärande I Vuxenlivet , Studentlitteratur , Lund .

Westlander , G. ( 2008 ), Forskarroller I Interaktivt Utvecklingsarbete: Om Samverkansprocesser För Ergonomiska Förbättringar , Linköping University Electronic Press , Linköping .

Wieringa , R. and Morali , A. ( 2012 ), “ Technical action research as a validation method in information systems design science ”, Design Science Research in Information Systems: Advances in Theory and Practice LNCS , Vol.  7286 , pp.  220 - 238 .

Winn , W. and Snyder , D. ( 2004 ), “ Cognitive perspectives in psychology ”, in Jonassen , D.H. (Ed.), Handbook of Research on Educational Communications and Technology , Lawrence Erlbaum Associates Publishers , New Jersey .

Yin , R.K. ( 2014 ), Case Study Research Design and Methods , 5th ed. , SAGE , Los Angeles .

Zimmermann , A. , Raisch , S. and Birkinshaw , J. ( 2015 ), “ How is ambidexterity initiated? The emergent charter definition process ”, Organization Science , pp. 1119 - 1139 .

Acknowledgements

This work was conducted within Innovate and was supported by the Swedish Knowledge Foundation (grant number KK20170312).


When A.I.’s Output Is a Threat to A.I. Itself

As A.I.-generated data becomes harder to detect, it’s increasingly likely to be ingested by future A.I., leading to worse results.

By Aatish Bhatia

Aatish Bhatia interviewed A.I. researchers, studied research papers and fed an A.I. system its own output.

The internet is becoming awash in words and images generated by artificial intelligence.

Sam Altman, OpenAI’s chief executive, wrote in February that the company generated about 100 billion words per day — a million novels’ worth of text, every day, an unknown share of which finds its way onto the internet.

A.I.-generated text may show up as a restaurant review, a dating profile or a social media post. And it may show up as a news article, too: NewsGuard, a group that tracks online misinformation, recently identified over a thousand websites that churn out error-prone A.I.-generated news articles.

In reality, with no foolproof methods to detect this kind of content, much will simply remain undetected.

All this A.I.-generated information can make it harder for us to know what’s real. And it also poses a problem for A.I. companies. As they trawl the web for new data to train their next models on — an increasingly challenging task — they’re likely to ingest some of their own A.I.-generated content, creating an unintentional feedback loop in which what was once the output from one A.I. becomes the input for another.

In the long run, this cycle may pose a threat to A.I. itself. Research has shown that when generative A.I. is trained on a lot of its own output, it can get a lot worse.

Here’s a simple illustration of what happens when an A.I. system is trained on its own output, over and over again:

This is part of a data set of 60,000 handwritten digits.

When we trained an A.I. to mimic those digits, its output looked like this.

This new set was made by an A.I. trained on the previous A.I.-generated digits. What happens if this process continues?

After 20 generations of training new A.I.s on their predecessors’ output, the digits blur and start to erode.

After 30 generations, they converge into a single shape.

While this is a simplified example, it illustrates a problem on the horizon.

Imagine a medical-advice chatbot that lists fewer diseases that match your symptoms, because it was trained on a narrower spectrum of medical knowledge generated by previous chatbots. Or an A.I. history tutor that ingests A.I.-generated propaganda and can no longer separate fact from fiction.

Just as a copy of a copy can drift away from the original, when generative A.I. is trained on its own content, its output can also drift away from reality, growing further apart from the original data that it was intended to imitate.

In a paper published last month in the journal Nature, a group of researchers in Britain and Canada showed how this process results in a narrower range of A.I. output over time — an early stage of what they called “model collapse.”

The eroding digits we just saw show this collapse. When untethered from human input, the A.I. output dropped in quality (the digits became blurry) and in diversity (they grew similar).

How an A.I. that draws digits “collapses” after being trained on its own output

[Figure: example digits (“6”, “8”, “9”) shown as the original handwritten digits, the initial A.I. output, and the output after 10, 20 and 30 generations of retraining.]

If only some of the training data were A.I.-generated, the decline would be slower or more subtle. But it would still occur, researchers say, unless the synthetic data was complemented with a lot of new, real data.

Degenerative A.I.

In one example, the researchers trained a large language model on its own sentences over and over again, asking it to complete the same prompt after each round.

When they asked the A.I. to complete a sentence that started with “To cook a turkey for Thanksgiving, you…,” at first, it responded like this:

Initial A.I. output

Even at the outset, the A.I. “hallucinates.” But when the researchers further trained it on its own sentences, it got a lot worse…

After two generations, it started simply printing long lists.

And after four generations, it began to repeat phrases incoherently.

“The model becomes poisoned with its own projection of reality,” the researchers wrote of this phenomenon.
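
To make that setup concrete, here is a minimal sketch of what such a generational self-training loop could look like in code. It assumes the Hugging Face transformers library and a small GPT-2 model; the prompt is taken from the article, but the model choice, sampling settings and number of gradient steps are illustrative assumptions, not the configuration used in the Nature study.

```python
# Hypothetical sketch of generational self-training for a causal language model.
# Assumes Hugging Face `transformers`; hyperparameters are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

prompt = "To cook a turkey for Thanksgiving, you"

model.train()
for generation in range(4):                       # each loop is one "generation"
    # 1. Let the current model produce its own training text.
    inputs = tok(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=60, do_sample=True)
    synthetic = tok.decode(out[0], skip_special_tokens=True)

    # 2. Fine-tune the model on nothing but that synthetic text.
    batch = tok(synthetic, return_tensors="pt")
    for _ in range(20):                           # a few gradient steps
        loss = model(**batch, labels=batch["input_ids"]).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

    print(f"generation {generation}: {synthetic[:80]!r}")
```

The real experiments used far more synthetic text per generation than this single completion, but the feedback structure is the same: the model's own output becomes the next round's only training data.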

This problem isn’t just confined to text. Another team of researchers at Rice University studied what would happen when the kinds of A.I. that generate images are repeatedly trained on their own output — a problem that could already be occurring as A.I.-generated images flood the web.

They found that glitches and image artifacts started to build up in the A.I.’s output, eventually producing distorted images with wrinkled patterns and mangled fingers.

A grid of A.I.-generated faces showing wrinkled patterns and visual distortions.

When A.I. image models are trained on their own output, they can produce distorted images, mangled fingers or strange patterns.

A.I.-generated images by Sina Alemohammad and others .

“You’re kind of drifting into parts of the space that are like a no-fly zone,” said Richard Baraniuk, a professor who led the research on A.I. image models.

The researchers found that the only way to stave off this problem was to ensure that the A.I. was also trained on a sufficient supply of new, real data.

While selfies are certainly not in short supply on the internet, there could be categories of images where A.I. output outnumbers genuine data, they said.

For example, A.I.-generated images in the style of van Gogh could outnumber actual photographs of van Gogh paintings in A.I.’s training data, and this may lead to errors and distortions down the road. (Early signs of this problem will be hard to detect because the leading A.I. models are closed to outside scrutiny, the researchers said.)

Why collapse happens

All of these problems arise because A.I.-generated data is often a poor substitute for the real thing.

This is sometimes easy to see, like when chatbots state absurd facts or when A.I.-generated hands have too many fingers.

But the differences that lead to model collapse aren’t necessarily obvious — and they can be difficult to detect.

When generative A.I. is “trained” on vast amounts of data, what’s really happening under the hood is that it is assembling a statistical distribution — a set of probabilities that predicts the next word in a sentence, or the pixels in a picture.

For example, when we trained an A.I. to imitate handwritten digits, its output could be arranged into a statistical distribution that looks like this:

[Figure: the distribution of A.I.-generated data, with examples of initial A.I. output. The distribution shown here is simplified for clarity.]

The peak of this bell-shaped curve represents the most probable A.I. output — in this case, the most typical A.I.-generated digits. The tail ends describe output that is less common.

Notice that when the model was trained on human data, it had a healthy spread of possible outputs, which you can see in the width of the curve above.

But after it was trained on its own output, this is what happened to the curve:

Distribution of A.I.-generated data when trained on its own output

It gets taller and narrower. As a result, the model becomes more and more likely to produce a smaller range of output, and the output can drift away from the original data.

Meanwhile, the tail ends of the curve — which contain the rare, unusual or surprising outcomes — fade away.

This is a telltale sign of model collapse: Rare data becomes even rarer.

If this process went unchecked, the curve would eventually become a spike:

This was when all of the digits became identical, and the model completely collapsed.
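
The narrowing just described can be reproduced with a deliberately tiny numerical experiment: repeatedly refit a normal distribution to a finite sample drawn from the previous generation's fit. This is a toy sketch, not the researchers' code; the sample size and number of generations are arbitrary assumptions, and the only point is that the fitted spread tends to shrink when each generation learns from the last one's output.

```python
# Toy illustration of distribution collapse: each "generation" refits a
# Gaussian to a finite sample drawn from the previous generation's fit.
# Sample size and generation count are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(42)
mu, sigma = 0.0, 1.0             # generation 0: the "real" data distribution
n = 20                           # each generation only sees a small sample

for gen in range(1, 101):
    sample = rng.normal(mu, sigma, size=n)    # data produced by the previous model
    mu, sigma = sample.mean(), sample.std()   # the next model fits itself to that data
    if gen % 20 == 0:
        print(f"generation {gen:3d}: mean = {mu:+.3f}, std = {sigma:.3f}")

# Over many repetitions the fitted standard deviation tends to drift toward zero:
# rare, tail values vanish first and the distribution narrows into a spike,
# mirroring the curves described above.
```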

Why it matters

This doesn’t mean generative A.I. will grind to a halt anytime soon.

The companies that make these tools are aware of these problems, and they will notice if their A.I. systems start to deteriorate in quality.

But it may slow things down. As existing sources of data dry up or become contaminated with A.I. “slop,” researchers say, it becomes harder for newcomers to compete.

A.I.-generated words and images are already beginning to flood social media and the wider web . They’re even hiding in some of the data sets used to train A.I., the Rice researchers found .

“The web is becoming increasingly a dangerous place to look for your data,” said Sina Alemohammad, a graduate student at Rice who studied how A.I. contamination affects image models.

Big players will be affected, too. Computer scientists at N.Y.U. found that when there is a lot of A.I.-generated content in the training data, it takes more computing power to train A.I. — which translates into more energy and more money.

“Models won’t scale anymore as they should be scaling,” said Julia Kempe, the N.Y.U. professor who led this work.

The leading A.I. models already cost tens to hundreds of millions of dollars to train, and they consume staggering amounts of energy , so this can be a sizable problem.

‘A hidden danger’

Finally, there’s another threat posed by even the early stages of collapse: an erosion of diversity.

And it’s an outcome that could become more likely as companies try to avoid the glitches and “hallucinations” that often occur with A.I. data.

This is easiest to see when the data matches a form of diversity that we can visually recognize — people’s faces:

A grid of A.I.-generated faces showing variations in their poses, expressions, ages and races.

A.I. images generated by Sina Alemohammad and others .

After one generation of training on A.I. output, the A.I.-generated faces appear more similar.

This set of A.I. faces was created by the same Rice researchers who produced the distorted faces above. This time, they tweaked the model to avoid visual glitches.

This is the output after they trained a new A.I. on the previous set of faces. At first glance, it may seem like the model changes worked: The glitches are gone.

After two generations …

After three generations …

After four generations, the faces all appeared to converge.

This drop in diversity is “a hidden danger,” Mr. Alemohammad said. “You might just ignore it and then you don’t understand it until it's too late.”

Just as with the digits, the changes are clearest when most of the data is A.I.-generated. With a more realistic mix of real and synthetic data, the decline would be more gradual.

But the problem is relevant to the real world, the researchers said, and will inevitably occur unless A.I. companies go out of their way to avoid their own output.

Related research shows that when A.I. language models are trained on their own words, their vocabulary shrinks and their sentences become less varied in their grammatical structure — a loss of “linguistic diversity.”

And studies have found that this process can amplify biases in the data and is more likely to erase data pertaining to minorities .

Perhaps the biggest takeaway of this research is that high-quality, diverse data is valuable and hard for computers to emulate.

One solution, then, is for A.I. companies to pay for this data instead of scooping it up from the internet , ensuring both human origin and high quality.

OpenAI and Google have made deals with some publishers or websites to use their data to improve A.I. (The New York Times sued OpenAI and Microsoft last year, alleging copyright infringement. OpenAI and Microsoft say their use of the content is considered fair use under copyright law.)

Better ways to detect A.I. output would also help mitigate these problems.

Google and OpenAI are working on A.I. “watermarking” tools, which introduce hidden patterns that can be used to identify A.I.-generated images and text.

But watermarking text is challenging, researchers say, because these watermarks can’t always be reliably detected and can easily be subverted (they may not survive being translated into another language, for example).
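
For intuition, here is a toy sketch of the general “hidden pattern” idea, loosely modeled on published green-list watermarking proposals. It is not Google’s or OpenAI’s actual method; the hashing rule, secret key and word-level granularity are all simplifying assumptions.

```python
# Toy "green list" watermark: a keyed hash of the previous word assigns each
# candidate word to a green or red list. A watermarking generator would bias
# its choices toward green words; a detector measures that bias afterwards.
# This is an illustrative simplification, not a production scheme.
import hashlib

def is_green(previous_word: str, word: str, key: str = "secret-key") -> bool:
    digest = hashlib.sha256(f"{key}|{previous_word}|{word}".encode()).digest()
    return digest[0] % 2 == 0      # roughly half of all words land on the green list

def green_fraction(text: str, key: str = "secret-key") -> float:
    words = text.lower().split()
    if len(words) < 2:
        return 0.0
    hits = sum(is_green(prev, cur, key) for prev, cur in zip(words, words[1:]))
    return hits / (len(words) - 1)

# Unwatermarked text should hover near 0.5; text generated with a green-word
# bias would score noticeably higher, which is what a detector looks for.
print(green_fraction("the quick brown fox jumps over the lazy dog"))
```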

A.I. slop is not the only reason that companies may need to be wary of synthetic data. Another problem is that there are only so many words on the internet.

Some experts estimate that the largest A.I. models have been trained on a few percent of the available pool of text on the internet. They project that these models may run out of public data to sustain their current pace of growth within a decade.

“These models are so enormous that the entire internet of images or conversations is somehow close to being not enough,” Professor Baraniuk said.

To meet their growing data needs, some companies are considering using today’s A.I. models to generate data to train tomorrow’s models . But researchers say this can lead to unintended consequences (such as the drop in quality or diversity that we saw above).

There are certain contexts where synthetic data can help A.I.s learn — for example, when output from a larger A.I. model is used to train a smaller one, or when the correct answer can be verified, like the solution to a math problem or the best strategies in games like chess or Go .

And new research suggests that when humans curate synthetic data (for example, by ranking A.I. answers and choosing the best one), it can alleviate some of the problems of collapse.

Companies are already spending a lot on curating data, Professor Kempe said, and she believes this will become even more important as they learn about the problems of synthetic data.

But for now, there’s no replacement for the real thing.

About the data

To produce the images of A.I.-generated digits, we followed a procedure outlined by researchers. We first trained a type of neural network known as a variational autoencoder using a standard data set of 60,000 handwritten digits.

We then trained a new neural network using only the A.I.-generated digits produced by the previous neural network, and repeated this process in a loop 30 times.

To create the statistical distributions of A.I. output, we used each generation’s neural network to create 10,000 drawings of digits. We then used the first neural network (the one that was trained on the original handwritten digits) to encode these drawings as a set of numbers, known as a “latent space” encoding. This allowed us to quantitatively compare the output of different generations of neural networks. For simplicity, we used the average value of this latent space encoding to generate the statistical distributions shown in the article.
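
A condensed sketch of that procedure might look like the following, assuming PyTorch and torchvision for the MNIST digits. The network architecture, latent size and training schedule are illustrative assumptions; only the overall loop (train on the previous generation's output, sample new digits, repeat) mirrors the procedure described above.

```python
# Condensed sketch of the generational retraining loop described above.
# Assumes PyTorch and torchvision; architecture and hyperparameters are
# illustrative, not the article's exact setup.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
from torchvision import datasets, transforms

LATENT = 16

class VAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Linear(784, 400)
        self.mu = nn.Linear(400, LATENT)
        self.logvar = nn.Linear(400, LATENT)
        self.dec1 = nn.Linear(LATENT, 400)
        self.dec2 = nn.Linear(400, 784)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def decode(self, z):
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterisation
        return self.decode(z), mu, logvar

def train_vae(images, epochs=5):
    """Fit a fresh VAE to a tensor of flattened 28x28 images with values in [0, 1]."""
    model = VAE()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loader = DataLoader(TensorDataset(images), batch_size=128, shuffle=True)
    for _ in range(epochs):
        for (x,) in loader:
            recon, mu, logvar = model(x)
            rec = F.binary_cross_entropy(recon, x, reduction="sum")        # reconstruction
            kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL term
            loss = rec + kld
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

def sample(model, n=60000):
    """Draw n synthetic digit images from the trained decoder."""
    with torch.no_grad():
        return model.decode(torch.randn(n, LATENT))

# Generation 0: real handwritten digits (MNIST).
mnist = datasets.MNIST(".", download=True, transform=transforms.ToTensor())
data = torch.stack([img.view(784) for img, _ in mnist])

models = []
for generation in range(30):      # 30 generations, as in the article
    model = train_vae(data)
    models.append(model)
    data = sample(model)          # the next generation trains only on this output

# To compare generations, each generation's samples can be passed through the
# first model's encoder (models[0].encode) and the resulting latent values
# summarised into the distributions discussed in the article.
```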


SCImago Lab and The Lens Collaborate to Create New Mission-Aligned Metrics to Improve Societal Impact of Research

SCImago Lab and The Lens are pleased to announce a research collaboration to advance the use of open data to create new mission-aligned metrics that enable academic and research institutions, businesses, civil society and government to measure and map the influence of scholarly publications on enterprise and improve the relevance of investment, research and partnering for public-good outcomes.

Under this collaboration, SCImago Lab and The Lens will extend the International Industry and Innovation Influence Mapping ( In4M ) capability to new journal-level metrics that  illuminate the impact of scientific research on productive innovation based on nuanced understanding of the citations of journal articles within global patent documents. The project will utilize The Lens’ unique linked open data to calculate an In4M journal metric and create tools to map and explore the influence of research on innovation and help guide policy, research funding and partnerships.

In this collaboration, The Lens will supply uniquely-linked open data from both the global patent corpus and the entire research and scholarly literature and oversee its delivery, ensuring seamless access to comprehensive and current datasets. SCImago will  develop enhanced methodologies, working closely with The Lens to create new classifications and an impact indicator based on the normalized patent data. 

Advancements in journal metrics will reveal pathways for journals, publishers, authors and their institutions to better align their future strategies and actions with their core missions.

“Metrics for research and innovation must be aligned to institutional mission and priorities. Decision-making throughout the innovation ecosystem needs clarity to chart pathways and partnerships that can lead to new products, practices and processes that can change lives,” said Dr Richard Jefferson, CEO of Cambia, the parent organization of The Lens. “The tired and inward-looking Journal Impact Factor needs a transparent and customizable reboot to show and celebrate how research can lead to real-world outcomes we urgently need.”

“SCImago is excited to lead research in developing innovative methodologies through collaborations with The Lens. These advancements in journal metrics will illuminate paths for journals and authors to better align their strategies with their core missions, for publishers to focus and promote the relevant outlets, and ultimately maximizing the influence of research on innovation,” said Felix de Moya Anegon, CEO of SCImago Lab.

By combining The Lens’ robust data management capabilities, and world-class user experience with SCImago’s research metrics expertise, this collaboration seeks to drive positive societal impact through improved data accessibility and innovative research methodologies. Through this collaboration, SCImago Lab and The Lens are committed to fostering a more informed and connected global research community that serves society.

About SCImago Lab

SCImago Lab is a renowned research organization dedicated to analyzing and providing insights into scientific and scholarly activities worldwide. Utilizing advanced bibliometric and patent analysis tools, SCImago Lab offers comprehensive data and metrics that support academic institutions, researchers, and policymakers in evaluating research performance and trends. With a commitment to promoting transparency and accessibility in research, SCImago Lab plays a pivotal role in enhancing the understanding and impact of global scientific endeavors.

https://www.scimagolab.com/

About The Lens

https://www.lens.org/

The Lens is a world leader in providing free and open exploration, discovery and analysis of worldwide innovation knowledge, including patents and research knowledge, serving 270M+ scholarly work records, 155M+ patent documents from over 100 countries, and 490M+ biological sequences extracted from patents. The Lens has been operating for over twenty years as a project of the long-established non-profit social enterprise Cambia (doing business as The Lens), with support from leading global philanthropies and public institutions.

For more information, please contact [email protected] .


New temporary program announced to support research staff in Canada

The Government of Canada has a long-standing, foundational role in supporting and sustaining research at universities and health research institutions across the country. These institutions drive many of the discoveries and innovations that are important to Canada’s economic and societal well-being.  

It is critical that we sustain Canada’s research excellence, talent and knowledge as the academic research enterprise deals with the impacts of the COVID-19 pandemic. Canada’s universities and health research institutions play integral roles in supporting both the COVID-19 response and the post-pandemic economic recovery.

This is why the Government of Canada recently announced a new temporary program , with an investment of $450 million for one year. This program will:

  • provide wage support to universities and health research institutions for up to 12 weeks to avert layoffs of research personnel. The support will cover up to 75% of research personnel salaries funded through non-governmental sources. Consistent with the provisions of the Canada Emergency Wage Subsidy, funding will support up to a maximum of $847 per week per employee (a short worked example of this calculation appears after this list); and
  • provide support to universities and health research institutions to cover the unanticipated, incremental costs associated with maintaining research assets at risk due to the pandemic, and ramping up research activities once physical distancing measures are relaxed and ultimately lifted.
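
As a minimal illustration of how the wage-support cap described above works, the sketch below computes the subsidy for a single employee. The salary figures are invented examples and this is not an official calculator; it simply applies the 75% rate, the $847 weekly maximum, and the 12-week limit stated in the announcement.

```python
# Hypothetical sketch of the wage-support calculation described above.
# The salary figures are invented examples; this is not an official calculator.
WEEKS_MAX = 12        # support for up to 12 weeks
RATE = 0.75           # covers up to 75% of the salary
WEEKLY_CAP = 847.0    # maximum of $847 per week per employee

def weekly_subsidy(weekly_salary: float) -> float:
    return min(RATE * weekly_salary, WEEKLY_CAP)

def total_subsidy(weekly_salary: float, weeks: int = WEEKS_MAX) -> float:
    return weekly_subsidy(weekly_salary) * min(weeks, WEEKS_MAX)

for salary in (800, 1000, 1500):   # example weekly salaries in dollars
    print(f"${salary}/week salary -> ${weekly_subsidy(salary):.2f}/week,"
          f" ${total_subsidy(salary):.2f} over 12 weeks")
```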

Universities will receive funding as block grants, allocated in two phases.

  • The first portion of the funding will be provided shortly to institutions as an upfront grant, using an allocation formula based on research expenditures. Institutions must prioritize wage support to avert layoffs of research personnel during the COVID-19 pandemic.
  • The second portion of the funding is planned for fall 2020, and will be based on institutional reporting on use of the first portion and demonstrated ongoing needs related to COVID-19.

This is a tri-agency program jointly managed by the Social Sciences and Humanities Research Council (SSHRC), the Natural Sciences and Engineering Research Council (NSERC) and the Canadian Institutes of Health Research (CIHR). The Canada Research Coordinating Committee will provide strategic oversight of the fund.

The agencies are working with government departments on the program’s design and implementation, and will communicate with universities and health research institutions as soon as more details become available.

Associated links

For more information on agency-specific measures to support researchers and graduate students, as well as new programs and initiatives enabling our research communities to address the COVID-19 crisis, visit:

  • COVID-19: Impacts on SSHRC's policies and programs
  • NSERC program information in relation to COVID-19
  • CIHR’s latest updates on COVID-19

Title: Assessing Large Language Models for Online Extremism Research: Identification, Explanation, and New Knowledge

Abstract: The United States has experienced a significant increase in violent extremism, prompting the need for automated tools to detect and limit the spread of extremist ideology online. This study evaluates the performance of Bidirectional Encoder Representations from Transformers (BERT) and Generative Pre-Trained Transformers (GPT) in detecting and classifying online domestic extremist posts. We collected social media posts containing "far-right" and "far-left" ideological keywords and manually labeled them as extremist or non-extremist. Extremist posts were further classified into one or more of five contributing elements of extremism based on a working definitional framework. The BERT model's performance was evaluated based on training data size and knowledge transfer between categories. We also compared the performance of GPT 3.5 and GPT 4 models using different prompts: naïve, layperson-definition, role-playing, and professional-definition. Results showed that the best performing GPT models outperformed the best performing BERT models, with more detailed prompts generally yielding better results. However, overly complex prompts may impair performance. Different versions of GPT have unique sensitivities to what they consider extremist. GPT 3.5 performed better at classifying far-left extremist posts, while GPT 4 performed better at classifying far-right extremist posts. Large language models, represented by GPT models, hold significant potential for online extremism classification tasks, surpassing traditional BERT models in a zero-shot setting. Future research should explore human-computer interactions in optimizing GPT models for extremist detection and classification tasks to develop more efficient (e.g., quicker, less effort) and effective (e.g., fewer errors or mistakes) methods for identifying extremist content.
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI)
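
As a rough illustration of the zero-shot evaluation design described in the abstract, the sketch below sends the same post to a GPT model under two prompt styles of different specificity. The prompt wording, the working definition, and the label format are assumptions made for illustration only; they are not the paper's actual naïve, layperson-definition, role-playing, or professional-definition prompts, and the snippet assumes the openai Python package is installed and an API key is configured.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative prompt styles only; the study's exact prompts are not reproduced here.
PROMPTS = {
    "naive": "Is the following social media post extremist? Answer YES or NO.\n\n{post}",
    "professional_definition": (
        "Using this working definition of extremism -- advocacy of hostility or violence "
        "toward an out-group in pursuit of an ideological goal -- decide whether the post "
        "below is extremist. Answer YES or NO.\n\n{post}"
    ),
}

def classify(post: str, prompt_style: str, model: str = "gpt-4") -> str:
    """Return the model's YES/NO judgement for one post under one prompt style."""
    response = client.chat.completions.create(
        model=model,
        temperature=0,
        messages=[{"role": "user", "content": PROMPTS[prompt_style].format(post=post)}],
    )
    return response.choices[0].message.content.strip()

# A study along these lines would run classify() over a manually labelled sample for
# each prompt style and each model, then score the outputs (e.g., accuracy or F1)
# against the human labels to compare prompt detail and model versions.
```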

IMAGES

  Six image results of the Neil Armstrong quote, “Research is creating new knowledge.”

VIDEO

  1. The Role of Basic Research in Innovation

  2. Data Analytics & Storytelling Workshop

  3. The Scientific Approach

  4. What is research

  5. Advancing Knowledge and Learning

  6. Metho1: What Is Research?

COMMENTS

  1. Neil Armstrong: 'Research is creating new knowledge.'

    Research is creating new knowledge. The quote by Neil Armstrong, "Research is creating new knowledge," is a concise yet powerful statement that encapsulates the essence and significance of research. In straightforward terms, this quote implies that research is not merely about gathering existing information but rather about generating fresh ...

  2. How research is creating new knowledge and insight

    Generating new knowledge and insight. We have a long-standing commitment to research. As I write this, the Health Foundation is currently supporting or working on over 160 research projects. And since 2004 for every £3 of grant funding we have awarded, around £1 has been invested in research and evaluation. All of this work has developed our ...

  3. How research is creating new knowledge and insight

    How research is creating new knowledge and insight. The pursuit of knowledge and discovery has always been an intrinsic human characteristic, but when new knowledge is curated and put in the right hands it has the power to bring about high value change to society. I work in the research team at the Health Foundation, an independent charity committed to bringing about better health and health ...

  4. What is knowledge and when should it be implemented?

    Abstract. A primary purpose of research is to generate new knowledge. Scientific advances have progressively identified optimal ways to achieve this purpose. Included in this evolution are the notions of evidence-based medicine, decision aids, shared decision making, measurement and evaluation as well as implementation.

  5. The evolution of knowledge within and across fields in modern ...

    The exchange of knowledge across different areas and disciplines plays a key role in the process of knowledge creation, and can stimulate innovation and the emergence of new fields. We develop ...

  6. Knowledge sharing and innovation: A systematic review

    For some authors, innovation is a process wherein knowledge is acquired, shared, and assimilated to create new knowledge that embodies products and services (Herkema, 2003), methods and processes (Brewer & Tierney, 2012), and social and environmental contexts (Harrington et al., 2017). Characteristic of innovations is the creation of value.

  7. How to add to knowledge

    In one example, a study can add to knowledge by addressing a gap in the literature. Inherent to any good study is the identification of a research gap. This can be achieved by a systematic review of the literature to identify an area that has not been addressed. This does not require a completely new topic.

  8. What Is Research, and Why Do People Do It?

    Abstract. Every day people do research as they gather information to learn about something of interest. In the scientific world, however, research means something different than simply gathering information. Scientific research is characterized by its careful planning and observing, by its relentless efforts to understand and explain ...

  9. Research Definition

    Research refers to a systematic investigation carried out to discover new knowledge, test existing knowledge claims, solve practical problems, and develop new products, apps, and services. This article explores why different research communities have different ideas about what research is and how to conduct it.

  10. Basic Research, Its Application and Benefits

    "Basic research leads to new knowledge. It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn. ... intellectual challenges of inquiry-driven basic research and are trained in, or create, new questions and ways of thinking. As these skills are applied to societal priorities,

  11. 1.2 Ways of Creating Knowledge

    It requires formulation and understanding of principles that guide practice and the development and testing of new ideas/theories. Research aims to be objective and unbiased and contributes to the advancement of knowledge. Research adds to existing knowledge by offering an understanding or new perspective on a topic, describing the ...

  12. Fostering Transformative Research in the Geographical Sciences

    The central purpose of all research, whether basic or applied, is to create new knowledge. Research in the domain of the geographical sciences is generally driven by a desire to generate new knowledge about that specific domain; that is, about the relationships among space, place, and the anthropogenic and non-anthropogenic features and ...

  13. Exploring new knowledge through research collaboration: the ...

    Research collaboration has long been suggested as an effective way to obtain innovative outcomes. Nevertheless, relatively little is known about whether and how different research collaboration strategies inspire or inhibit firms in the exploration of new knowledge. Drawing upon the research collaboration literature and social network theory, this study examines the effects of two specific ...

  14. Module 1: Introduction: What is Research?

    Research is a process to discover new knowledge. In the Code of Federal Regulations (45 CFR 46.102 (d)) pertaining to the protection of human subjects research is defined as: "A systematic investigation (i.e., the gathering and analysis of information) designed to develop or contribute to generalizable knowledge.".

  15. What is research?

    Collecting and publishing existing knowledge isn't research, as it doesn't create new knowledge. Research isn't just data-gathering. Data-gathering is a vital part of research, but it doesn't lead to new knowledge without some analysis, some further work. Just collecting the data doesn't count, unless you do something else with it.

  16. Understanding Knowledge Creation

    In contrast, the design mode focuses on improving ideas to create new knowledge. Activities such as hypothesising new ideas, evaluating the usefulness and adequacy of ideas, inventing and designing are common in design mode. ... Among the five theories of knowledge creation, empirical research on K-12 school teachers mainly referred to the ...

  17. Essential Ingredients of a Good Research Proposal for Undergraduate and

    Research is a careful, systematic, and patient investigation in some field of knowledge, undertaken to establish facts or principles; it is a structured inquiry that utilizes an acceptable scientific methodology to collect, analyze, and interpret information to solve problems or answer questions and to create new knowledge that is generally ...

  18. What is the Co-Creation of New Knowledge? A Content Analysis and

    Some research has focused on "how to" co-create, especially in health and community settings; however, there remains a lack of consensus on the meaning and use of the term co-creation of new knowledge. Many terms are used interchangeably and with ill-defined or no definition as to the meaning behind the terms.

  19. 5 Measuring the Three K's: Knowledge Generation, Knowledge Networks

    Bibliometric data potentially can be used to create a number of additional indicators to provide further detail on linkages across research areas or by geographic location. This information can be particularly valuable for mapping the development of new research areas, such as green technologies, or the spread of general-purpose technologies.

  20. The critical steps for successful research: The research proposal and

    INTRODUCTION. Creativity and critical thinking are of particular importance in scientific research. Basically, research is original investigation undertaken to gain knowledge and understand concepts in major subject areas of specialization, and includes the generation of ideas and information leading to new or substantially improved scientific insights with relevance to the needs of society.

  21. What is Research

    Research is a process to discover new knowledge. In the Code of Federal Regulations (45 CFR 46.102(d)) pertaining to the protection of human subjects research is defined as: "A systematic investigation (i.e., the gathering and analysis of information) designed to develop or contribute to generalizable knowledge." The National Academy of Sciences states that the object of research is to ...

  22. Knowledge creation in projects: an interactive research approach for

    In all, this supports the notion of steering, supporting, and challenging to create robust knowledge (Svensson et al., 2009). 6. Conclusions. The purpose of this paper was to shed light on the different types of knowledge created in an interactive research project and to analyze how they are linked to the project design, process, and content.

  23. (PDF) How to create new knowledge in your research?

    Developing a theoretical/conceptual framework: identify the specific variables, create a variables map (literature review map), figure out how these are related, and group the variables into independent and dependent variables. Your research problem/research gap is the base ...

  24. When A.I.'s Output Is a Threat to A.I. Itself

    As A.I.-generated data becomes harder to detect, it's increasingly likely to be ingested by future A.I., leading to worse results. By Aatish Bhatia, who interviewed A.I. researchers ...

  25. Canadian Blood Services discovery research lab contributes to new

    Our dedicated research team and extended network of partners engage in exploratory and applied research to create new knowledge, inform and enhance best practices, contribute to the development of new services and technologies, and build capacity through training and collaboration. Find out more about our research impact.

  26. SCImago Lab and The Lens Collaborate to Create New Mission-Aligned

    The Lens is a world leader in providing free and open exploration, discovery and analysis of worldwide innovation knowledge, including patents and research knowledge, serving 270M+ scholarly work records, 155M+ patent documents from over 100 countries, and 490M+ biological sequences extracted from patents.

  27. No Dataset Needed for Downstream Knowledge Benchmarking

    This research seeks to obviate the need for creating QA datasets and grading (chatbot) LLM responses when comparing LLMs' knowledge in specific topic domains. This is done in an entirely end-user centric way without need for access to any inner workings of the LLM, so long as it can be prompted and given a random seed to create different generations to the same prompt. The paper does this by ...

  28. New temporary program announced to support research staff in Canada

    It is critical that we sustain Canada's research excellence, talent and knowledge as the academic research enterprise deals with the impacts of the COVID-19 pandemic. Canada's universities and health research institutions play integral roles in supporting both the COVID-19 response and the post-pandemic economic recovery.

  29. Assessing Large Language Models for Online Extremism Research

    Assessing Large Language Models for Online Extremism Research: Identification, Explanation, and New Knowledge, by Beidi Dong and 3 other authors. Abstract: The United States has experienced a significant increase in violent extremism, prompting the need for automated tools to detect and limit the spread of ...