
Writing a Science Report

With science fair season coming up, as well as end-of-year projects, students are often required to write a research paper or a report on their project. Use this guide to help you through the process, from finding a topic to revising and editing your final paper.

Brainstorming Topics

Sometimes one of the largest barriers to writing a research paper is figuring out what to write about. Often the topic is supplied by the teacher, or the curriculum dictates what the student should research and write about. However, this is not always the case. Sometimes the student is given a very broad concept to write a research paper on, for example, water. Within the category of water there are many suitable topics and subtopics: the three states of water, different water sources, minerals found in water, how water is used by living organisms, the water cycle, or how to find water in the desert. The point is that “water” is far too broad a topic to be adequately covered in a typical 3-5 page research paper.

When given a broad category to write about, it is important to narrow it down to a topic that is much more manageable. Sometimes research needs to be done in order to find the best topic to write about. (Look for searching tips in “Finding and Gathering Information.”) Listed below are some tips and guidelines for picking a suitable research topic:

  • Pick a topic within the category that you find interesting. It makes it that much easier to research and write about a topic if it interests you.
  • You may find while researching a topic that the details of the topic are very boring to you. If this is the case, and you have the option to do this, change your topic.
  • Pick a topic that you are already familiar with and research further into that area to build on your current knowledge.
  • When researching topics to do your paper on, look at how much information you are finding. If you are finding very little information on your topic or you are finding an overwhelming amount, you may need to rethink your topic.
  • If permissible, always leave yourself open to changing your topic. While researching, you may come across a topic you find really interesting that works just as well as the one you started with.
  • Most importantly, does your research topic fit the guidelines set forth by your teacher or curriculum?

Finding and Gathering Information

There are numerous resources out there to help you find information on the topic selected for your research paper. One of the first places to begin research is at your local library. Use the Dewey Decimal System or ask the librarian to help you find books related to your topic. There are also a variety of reference materials, such as encyclopedias, available at the library.

A relatively new reference resource has become available with the power of technology – the Internet. While the Internet allows the user to access a wealth of information that is often more up-to-date than printed materials such as books and encyclopedias, there are certainly drawbacks to using it. It can be hard to tell whether or not a site contains factual information or just someone’s opinion. A site can also be dangerous or inappropriate for students to use.

You may find that certain science concepts and science terminology are not easy to find in regular dictionaries and encyclopedias. A science dictionary or science encyclopedia can help you find more in-depth and relevant information for your science report. If your topic is very technical or specific, reference materials such as medical dictionaries and chemistry encyclopedias may also be good resources to use.

If you are writing a report for your science fair project, you will not only be finding information from published sources, you will also be generating your own data, results, and conclusions. Keep a journal that tracks and records your experiments and results. When writing your report, you can either write out your findings from your experiments or display them using graphs or charts.

*As you are gathering information, keep a working bibliography of where you found your sources. Look under “Citing Sources” for more information. This will save you a lot of time in the long run!

Organizing Information

Most people find it hard to take all the information they have gathered from their research and simply write it out in paper form. It is difficult to pick a starting point and work straight through from beginning to end. You probably have several ideas you know you want to put in your paper, but you may have trouble deciding where they should go. Organizing your information so that new thoughts can be added to a subtopic at any time is a great approach. Here are two of the more popular ways to organize information for a research paper:

  • Graphic organizers such as a web or mind map. A mind map states the main topic of your paper, then branches off into as many subtopics as possible about the main topic. Enchanted Learning has a list of several different types of mind maps as well as information on how to use them and which topics fit best with each type of mind map and graphic organizer.
  • Outlines. An outline states the main topic, then lists each subtopic with its supporting details beneath it. For example, for a paper on ice:
    • Subtopic: Glaciers
      • Sub-Subtopic: Low temperatures and adequate amounts of snow are needed to form glaciers.
      • Sub-Subtopic: Glaciers move large amounts of earth and debris.
      • Sub-Subtopic: Two basic types of glaciers: valley and continental.
    • Subtopic: Icebergs – large masses of ice floating on liquid water

Different Formats For Your Paper

Depending on your topic and your writing preference, the layout of your paper can greatly affect how well your information is presented.

1. Process . This method is used to explain how something is done or how it works by listing the steps of the process. For most science fair projects and science experiments, this is the best format. Reports for science fairs need the entire project written out from start to finish. Your report should include a title page, statement of purpose, hypothesis, materials and procedures, results and conclusions, discussion, and credits and bibliography. If applicable, graphs, tables, or charts should be included with the results portion of your report.

2. Cause and effect . This is another common science experiment research paper format. The basic premise is that because event X happened, event Y happened.

3. Specific to general . This method works best when trying to draw conclusions about how smaller topics and details connect to support one main topic or idea.

4. Climactic order . Similar to the “specific to general” format, here details are listed in order from least important to most important.

5. General to specific . Works in a similar fashion as the method for organizing your information. The main topic or subtopic is stated first, followed by supporting details that give more information about the topic.

6. Compare and contrast . This method works best when you wish to show the similarities and/or differences between two or more topics. A block pattern is used when you first write about one topic and all its details and then write about the second topic and all its details. An alternating pattern can be used to describe a detail about the first topic and then compare that to the related detail of the second topic. The block pattern and alternating pattern can also be combined to make a format that better fits your research paper.

Citing Sources

When writing a research paper, you must cite your sources! Otherwise you are plagiarizing (claiming someone else’s ideas as your own), which carries severe penalties, from failing the research paper assignment in primary and secondary grades to failing the entire course (most colleges and universities have this policy). To help you avoid plagiarism, follow these simple steps:

  • Find out what citation format your teacher or curriculum wishes you to use. One of the most widely used and widely accepted citation formats among scholars and schools is the Modern Language Association (MLA) format. We recommend that you do an Internet search for the most recent version of the citation style you will be using in your paper.
  • Keep a working bibliography when researching your topic. Have a document in your computer files or a page in your notebook where you write down every source that you found and may use in your paper. (You probably will not use every resource you find, but it is much easier to delete unused sources later rather than try to find them four weeks down the road.) To make this process even easier, write the source down in the citation format that will be used in your paper. No matter what citation format you use, you should always write down title, author, publisher, published date, page numbers used, and if applicable, the volume and issue number.
  • When collecting ideas and information from your sources, write the author’s last name at the end of the idea. When revising and formatting your paper, keep the author’s last name attached to the end of the idea, no matter where you move that idea. This way, you won’t have to go back and try to remember where the ideas in your paper came from.
  • There are two ways to use the information in your paper: paraphrasing and quotes. The majority of your paper will be paraphrasing the information you found. Paraphrasing is restating the idea being used in your own words. As a general rule of thumb, no more than two of the original words should be used in sequence when paraphrasing, and synonyms should be used for as many of the words as possible without changing the meaning of the main point. Sometimes, you may find something stated so well by the original author that it would be best to use the author’s original words in your paper. When doing so, use quotation marks only around the words being directly quoted, and work the quote into the body of your paper so that it makes sense grammatically. Search the Internet for more rules on paraphrasing and quoting information.
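The working-bibliography habit described above can be sketched as a small record-keeping script. This is an illustrative sketch only: the author and book in the example are hypothetical, and the output is a simplified approximation of an MLA-style entry, not an official formatter.

```python
# Minimal working-bibliography tracker (illustrative; output approximates
# an MLA-style entry and is not an official citation formatter).
from dataclasses import dataclass

@dataclass
class Source:
    author: str          # "Last, First"
    title: str
    publisher: str
    year: int
    pages: str           # pages used, e.g. "12-18"
    volume: str = ""     # optional, for journals
    issue: str = ""      # optional, for journals

    def citation(self) -> str:
        entry = f'{self.author}. "{self.title}." {self.publisher}, {self.year}'
        if self.volume:
            entry += f", vol. {self.volume}, no. {self.issue}"
        return entry + f", pp. {self.pages}."

# Add every source as you find it; delete unused ones later.
bibliography = [
    Source("Doe, Jane", "The Water Cycle", "Example Press", 2015, "12-18"),
]
for source in bibliography:
    print(source.citation())
```

Recording each source in this structure as you research means the final bibliography is mostly a matter of printing and pruning.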

Revising and Editing Your Paper

Revising your paper means improving its content and the way your ideas are expressed, not just fixing mechanical errors. After you have written the rough draft, read through it again to make sure the ideas flow and are cohesive. You may need to add information, delete extra information, use a thesaurus to find a word that better expresses a concept, reword a sentence, or simply make sure your ideas are stated in a logical and progressive order.

After revising your paper, go back and edit it, correcting the capitalization, punctuation, and spelling errors – the mechanics of writing. If you are not 100% positive a word is spelled correctly, look it up in a dictionary. Ask a parent or teacher for help on the proper usage of commas, hyphens, capitalization, and numbers. You may also be able to find the answers to these questions by doing an Internet search on writing mechanics or by checking your local library for a book on the subject.

It is also always a good idea to have someone else read your paper. Because this person did not write the paper and is not familiar with the topic, he or she is more likely to catch mistakes or ideas that do not quite make sense. This person can also give you insights or suggestions on how to reword or format your paper to make it flow better or convey your ideas better.


How to Write a Science Fair Project Report

Lab Reports and Research Essays

  • Ph.D., Biomedical Sciences, University of Tennessee at Knoxville
  • B.A., Physics and Mathematics, Hastings College

Writing a science fair project report may seem like a challenging task, but it is not as difficult as it first appears. This is a format that you may use to write a science project report. If your project included animals, humans, hazardous materials, or regulated substances, you can attach an appendix that describes any special activities your project required. Also, some reports may benefit from additional sections, such as abstracts and bibliographies. You may find it helpful to fill out the science fair lab report template to prepare your report.

Important: Some science fairs have guidelines put forth by the science fair committee or an instructor. If your science fair has these guidelines, be sure to follow them.

  • Title:  For a science fair, you probably want a catchy, clever title. Otherwise, try to make it an accurate description of the project. For example, I could entitle a project, "Determining Minimum NaCl Concentration That Can Be Tasted in Water." Avoid unnecessary words, while covering the essential purpose of the project. Whatever title you come up with, get it critiqued by friends, family, or teachers.
  • Introduction and Purpose:  Sometimes this section is called "background." Whatever its name, this section introduces the topic of the project, notes any information already available, explains why you are interested in the project, and states the purpose of the project. If you are going to state references in your report, this is where most of the citations are likely to be, with the actual references listed at the end of the entire report in the form of a bibliography or reference section.
  • The Hypothesis or Question:  Explicitly state your hypothesis or question.
  • Materials and Methods:  List the materials you used in your project and describe the procedure that you used to perform the project. If you have a photo or diagram of your project, this is a good place to include it.
  • Data and Results:  Data and results are not the same thing. Some reports will require that they be in separate sections, so make sure you understand the difference between the concepts. Data refers to the actual numbers or other information you obtained in your project. Data can be presented in tables or charts, if appropriate. The results section is where the data is manipulated or the hypothesis is tested. Sometimes this analysis will yield tables, graphs, or charts, too. For example, a table listing the minimum concentration of salt that I can taste in water, with each line in the table being a separate test or trial, would be data. If I average the data or perform a statistical test of a null hypothesis, the information would be the results of the project.
  • Conclusion:  The conclusion focuses on the hypothesis or question as it compares to the data and results. What was the answer to the question? Was the hypothesis supported (keep in mind a hypothesis cannot be proved, only disproved)? What did you find out from the experiment? Answer these questions first. Then, depending on your answers, you may wish to explain the ways in which the project might be improved or introduce new questions that have come up as a result of the project. This section is judged not only by what you were able to conclude but also by your recognition of areas where you could not draw valid conclusions based on your data.
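The data-versus-results distinction above can be made concrete with a short script. The numbers below are hypothetical taste-threshold measurements invented for illustration: the raw list of trial values is data, while the computed mean and standard deviation are results.

```python
from statistics import mean, stdev

# Data: the raw numbers recorded during the project -- here, hypothetical
# minimum NaCl concentrations (in g/L) that could be tasted, one per trial.
trials = [2.0, 1.8, 2.2, 1.9, 2.1]

# Results: values computed from the data to address the hypothesis.
avg = mean(trials)       # average threshold across trials
spread = stdev(trials)   # sample standard deviation
print(f"Mean taste threshold: {avg:.2f} g/L (SD {spread:.2f}, n={len(trials)})")
```

In a report, the `trials` list would appear in a data table, while the mean and standard deviation would appear in the results section.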

Appearances Matter

Neatness counts, spelling counts, grammar counts. Take the time to make the report look nice. Pay attention to margins, avoid fonts that are difficult to read or are too small or too large, use clean paper, and print the report as cleanly as you can on a good printer or copier.

  • Southern Minnesota Regional Science & Engineering Fair

Project/Paper Guide

In order to be successful at the Science Fair, it's important to plan your project well and understand the types of projects that will be accepted in the science fair. This section is designed to help you plan your project or paper and choose your research method, so that you can create a successful project.

Planning your Project

Through a science fair project you can learn what it is like to think like a scientist or engineer, to investigate and experiment in an area of your interest, and to share your results. This page explains the difference between a scientific method and an engineering method, and will help you decide which to use for your project. You will also find a link to the International Science and Engineering Fair rules.

Project Competition

This page explains how to execute a project that will be competitive at the fair. It explains the major features of a first-class project and provides tips that you can use in your process to help you be successful.

Research Paper Competition

Research paper entries will be allowed for students in grades 6-12 competing in the Middle/High School Fair. This page explains the paper requirements and provides the information you need to know if you are submitting a research paper individually or as a team of three or fewer.


VFM 8th Grade Science Fair Project: Step 4: Background Research

  • Step 1: Find a Project Idea
  • Step 2: Formulate a Research Question & do a Project Proposal
  • Step 3: State the Purpose
  • Step 4: Background Research
  • Free Web Search
  • Step 5: Bibliography
  • Step 6: Identify the Variables in your Experiment
  • Step 7: Form a Hypothesis
  • Step 8: Materials
  • Step 9: Design & Write the Procedure
  • Step 10: Perform the Experiment
  • Step 11: Record your Data and Results
  • Step 12: Analyze your Data & Results
  • Step 13: Make a Conclusion
  • Step 14: Write the Abstract
  • Step 15: Acknowledgments
  • Step 16: Title Page and Table of Contents
  • Step 17: Proofread!
  • Step 18: Write a Final Copy of your Lab Report
  • Step 19: Create your Display Board
  • Step 20: The VFMS Science Fair
  • Oral Presentation
  • Lab Journal/Notebook
  • Schedule and Due Dates

In-Text Citation

  • [APA] How do I write an APA parenthetical (in-text) reference? Give credit to the website or author in the body of your research.

Background Research

Background research should help you to educate the reader of your project about important aspects of your topic.  

Using multiple resources, students should learn about past results of other experiments that are similar to theirs. Students should know how and why previous experimenters arrived at their conclusions. The background research should help the students give the “because…” in the “if… then… because…” section of their hypothesis.

20-30 facts from 3 sources is a reasonable expectation for this section. In the final paper, this background research will be put into paragraph form.

Use the Background Research Planning Worksheet to help you formulate questions that you need to answer for your topic. Each student should become an expert on anything that is closely related to their area of research.

  • Background Research Worksheet Complete this worksheet prior to beginning your background research for your project.
  • Fact Collection Worksheet Collect 20-30 facts from a minimum of 3 sources: one source must be from Gale Science in Context
  • Sample of Background Research Paragraphs Here is a sample of what your background research paragraphs might look like. Sample found on the web here: http://www.oncoursesystems.com/images/user/2162/302482/img074.jpg

Why You Should Use Databases


Databases are sometimes called the "deep web" or "invisible web" because their information is usually only accessible through paid subscriptions using passwords and isn't usually found (indexed) by search engines such as Google.

Database records are organized using a variety of indexes such as author and subject but are keyword searchable as well. 

Databases are either subject specific such as World History in Context or content specific such as the newspaper and magazine database through EBSCO. 

Databases contain information that has been checked for the ABCs of authority & accuracy, bias, and content & currency. You can trust the information you find in databases, unlike much of what turns up through open web or Google searches, which is sometimes accurate but often is not.

Library Databases: Start your Search Here


What is Research?

Research is: 

  • Driven by a question that guides the process.
  • Seeking information with a clear goal.
  • A process, which works best when done step-by-step. The steps may need to be repeated.
  • Collection and interpretation of data in an attempt to resolve the problem.
  • Going beyond facts and old ideas.
  • Taking a new look at the information and taking a stand.

Research is not:

  • Copying and pasting information you find through a Google search.
  • Combining a paragraph from one article with a couple of paragraphs from websites. That's plagiarism.
  • Rearranging facts
  • Rewording each phrase and citing each source. That's just a summary of facts with someone else's name on them and still can be classified as plagiarism.

Words for the wise student: 

  • Remember, begin with a "wide net" and then narrow your search results.
  • If you only look for specific information to answer a specific question, you may miss many opportunities to broaden your understanding.
  • Allow for surprises: you may find your views on your topic will change and take you in an entirely new direction.
  • Remember that research is searching again and again.
  • In the process of doing research, you will be looking at information that others have looked at before, trying to see something that they have not seen.
  • Last Updated: Jan 27, 2016 2:09 PM
  • URL: https://tesd.libguides.com/VFMScienceFairProject

How to Write a Regeneron ISEF Abstract

What is the purpose of the abstract?

The abstract should be a brief, yet comprehensive synopsis of the research project. It should seek to highlight the research question(s), experimental procedures, data, and conclusions in a way that is concise and easy to understand. It will be reviewed by Special Award Organization and Grand Award Judges to determine whether the project stands out within its category or qualifies for special awards. The general public and other Regeneron ISEF visitors read the abstract for a quick overview of the research design and findings.

Rules for completion:

The abstract should be 250 words or less. Do not discuss specific aspects of the research in great detail, including experimental procedures and statistical methods. Any information that is unnecessary to include in a brief explanation should be saved for the written research paper or the project exhibit board.

If the project is a continuation from a previous year, the abstract should summarize the current year’s work only. If mention of supporting research from previous year(s) is necessary, it must be minimal.

If the abstract text includes special characters, such as mathematical symbols, which won’t be translated electronically, please spell out the symbol.

Do not include acknowledgements in the abstract. This includes any references to mentors, institutional facilities, and awards or patents received.

All abstracts must be submitted on the Regeneron ISEF online system. Many regional and state fairs also use the Regeneron ISEF Official Abstract Form, which can be found here. This form is not necessary for most local fairs.
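The 250-word limit can be checked before submitting with a quick whitespace split. This is an approximation of how most submission systems count words; the online system's exact count may differ slightly, and the sample text below is an arbitrary fragment for illustration.

```python
def abstract_word_count(text: str) -> int:
    """Approximate word count: split the text on any whitespace."""
    return len(text.split())

LIMIT = 250
draft = "Viruses, such as those that cause colds and influenza, spread via droplets of mucus."
count = abstract_word_count(draft)
print(f"{count} words;", "within" if count <= LIMIT else "over", f"the {LIMIT}-word limit")
```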

What should the abstract include?

  • Title of the project
  • Student's name (or names, if a team project)
  • School or institution
  • Purpose: the question or hypothesis being tested
  • Procedures: a brief overview of how the experiment was done
  • Results: a summary of the data and analysis
  • Conclusions: what the results showed about the hypothesis

Best practices:

Remember: Revision is Key

  • Make sure that the abstract includes all parts outlined in this guide
  • Omit unnecessary details and discussions
  • Use the past tense in descriptions
  • Write in short, but complete sentences
  • Avoid extra jargon and any slang
  • Use concise wording throughout, especially when expressing concepts and processes with scientific language
  • Check for correct spelling, grammar, and punctuation
  • Ask for writing help from an English teacher or librarian. Writing an abstract is an exercise in using language effectively to convey scientific ideas and procedures.
  • It never hurts to have an extra pair of eyes glance it over

Sample abstract

Please view the following example abstract, which is displayed two ways: In paragraph form, as will be presented at the Regeneron ISEF, and divided in parts to show how it would fit the general abstract template.

Snot Science: How far does a sneeze travel?

Bethany Brookshire, Ph.D.

Science News for Students, Society for Science & the Public, Washington, D.C.

Viruses, such as those that cause colds and influenza, spread via droplets of mucus that are produced when an infected person sneezes or coughs. Using thick and thin mucus and a model sneeze, we tested the hypothesis that thin mucus will travel farther than thick mucus.

Thin and thick mucus were represented by 1-milliliter volumes of colored water or a mixture of corn syrup and gelatin, respectively. Fluid was squirted from a plastic dropper with enough force to model a sneeze. Each sample was analyzed for maximum distance traveled and distribution of droplets. Data was analyzed using a two-tailed t test.

Compared to thick mucus (mean distance of 110.8 cm, SD 103.7 cm, n=26/group), thin mucus squirted a greater mean distance (302.4 cm, SD 45.06 cm, n=26/group, p<0.0001, Cohen’s d 2.395). Thick mucus traveled a maximum of 310 cm. Thin mucus traveled a maximum of 400 cm. Thick mucus also formed fewer visible droplets, and droplets concentrated closer to the origin of the “sneeze.”

This study showed that thin mucus travels farther than thick mucus in the plastic dropper sneeze model. Thin mucus traveled a maximum of 400 cm, suggesting a potential spread of virus-containing particles of up to 4 meters in our tests. Further experiments will clarify differences in viscosity between thick and thin mucus and potential differences in droplet size.

Purpose:  Viruses, such as those that cause colds and influenza, spread via droplets of mucus that are produced when an infected person sneezes or coughs. Using thick and thin mucus and a model sneeze, we tested the hypothesis that thin mucus will travel farther than thick mucus.

Procedure:  Thin and thick mucus were represented by 1-milliliter volumes of colored water or a mixture of corn syrup and gelatin, respectively. Fluid was squirted from a plastic dropper with enough force to model a sneeze. Each sample was analyzed for maximum distance traveled and distribution of droplets. Data was analyzed using a two-tailed t test.

Results:  Compared to thick mucus (mean distance of 110.8 cm, SD 103.7 cm, n=26/group), thin mucus squirted a greater mean distance (302.4 cm, SD 45.06 cm, n=26/group, p<0.0001, Cohen’s d 2.395). Thick mucus traveled a maximum of 310 cm. Thin mucus traveled a maximum of 400 cm. Thick mucus also formed fewer visible droplets, and droplets concentrated closer to the origin of the “sneeze.”

Conclusions:  This study showed that thin mucus travels farther than thick mucus in the plastic dropper sneeze model. Thin mucus traveled a maximum of 400 cm, suggesting a potential spread of virus-containing particles of up to 4 meters in our tests. Further experiments will clarify differences in viscosity between thick and thin mucus and potential differences in droplet size.
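The effect size reported in the sample abstract can be checked from its summary statistics alone. The sketch below recomputes Cohen's d using a pooled standard deviation; with equal group sizes, the pooled SD reduces to the root mean square of the two group SDs. The result lands within rounding distance of the 2.395 reported in the abstract.

```python
from math import sqrt

# Summary statistics from the sample abstract (equal groups, n = 26 each).
mean_thick, sd_thick = 110.8, 103.7    # cm, thick mucus
mean_thin,  sd_thin  = 302.4, 45.06    # cm, thin mucus

# Pooled SD for equal group sizes: root mean square of the two group SDs.
pooled_sd = sqrt((sd_thick**2 + sd_thin**2) / 2)

# Cohen's d: standardized difference between the group means.
d = (mean_thin - mean_thick) / pooled_sd
print(f"Cohen's d = {d:.2f}")   # within rounding of the reported 2.395
```

Showing judges that a reported statistic can be reproduced from the summary numbers is a good way to demonstrate you understand your own analysis.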



Classical Conversations

Your Guide to a Successful Homeschool Science Fair

A student conducts an experiment for science fair.

The late afternoon sun poured liquid gold through the west-facing windows. A girl, a dog, and a mom stood by the lit stove in a quiet house, methodically adding sugar to water in a pan to create a super-saturated solution. As the steam from the open pan drifted upward, the sun illuminated the molecules and transformed this scene from our homeschool science fair into something extraordinary.

How to Do a Homeschool Science Fair Project

A homeschool science fair is the perfect opportunity to explore the scientific method, encourage creativity, and foster a love for learning. We roughly followed the same steps presented in the Simplifying Science Fair video series available on CC Connected.

Step 0: Develop an Exploratory Question

Do you remember how, in your child’s early years, they would ask delightful and sometimes impossible questions? Why don’t dogs like cats? Will a watermelon grow in my tummy if I swallow a seed? Why can’t I see wind? We are born full of wonder, and the first step of Science Fair is to rekindle wonder. What makes your child curious? What puzzles them? What captures their interest again and again? Do they constantly test the material world with “I wonder what will happen if . . . ?” It is in these exploratory questions that the first step lies.

Of course, we can’t allow for infinite wonder, but we do want to encourage our students to be willing to wonder about their topic for at least 10 weeks. My daughter has had an ongoing fascination with crystals ever since she attended Science Fair in our local homeschooling community when she was yet a wee one! Her project was the perfect application for wonder!

After that initial step of dreaming and of wondering, we roughly followed the same steps outlined in CC Connected (condensing them into a single step in some places):

CC Connected: Simplifying Science Fair

1. The Lab Journal. My daughter prepared a simple, well-organized journal for her science project. She wrote everything in a notebook by hand, kept it consistent, and kept it neat.

2. Research. Starting from that place of initial wonderment and guided by the scientific method, my daughter developed a research plan, a research problem, and a research question. Once those were in place, she began her background research. She was interested in crystals, which led her to develop a research plan centered around crystal growth.

3. Hypothesis. A good hypothesis is the starting point for every experiment, and so my daughter spent a lot of time on this particular step.

4. Experimentation. Together, we gathered materials, and she drew up a procedure that included safety precautions (growing crystals involves a lot of heat—always a source of caution in scientific experimentation!) and a plan for analyzing her data. Then, and only then, did she conduct her experiment.

5. Analysis. My daughter measured her crystal growth, recorded her findings, compared them to her hypothesis, drew up her bibliography, and then wrote her research paper.

6. Presentation. Finally, as we organized the project according to the classical method, my daughter presented her findings using the five canons of rhetoric (more on that in a moment).

A Note on CC Connected and the Challenge A Science Fair

My daughter and I were very well equipped with content in our Challenge A guide and support from our Director. Classical Conversations members also have the benefit of wisdom and memory in their local community from parents and students who have done it before. You’ll learn so much from asking questions and having conversations. Further, there’s a wealth of resources in CC Connected, including a Parent’s Guide taking you step by step through the experience, videos, and even a certificate of completion to award your student!

Keep in mind that simple is best. It’s no accident that 14 of the 17 assets in the Challenge A Learning Center in CC Connected begin with “Simplifying” in their titles! There are amazing charts, connections to the Five Common Topics, sample questions, and more.

CC Connected Science Fair resources.

Classical Education and Science Fairs

It is important to note that the classical tools of learning played a vital role in shaping the order of the science fair for my family. We used the Socratic method during the initial stages to spark an ongoing dialogue about crystals, a dialogue that deepened both our understandings. Furthermore, the five canons of rhetoric naturally guided much of our inquiry and activity:

The canon of invention helped us engage some of the big ideas necessary for us to understand the elements of our project and helped us identify the resources we needed.

We used the canon of arrangement to organize our thoughts. My daughter delighted in being wholly responsible for generating her lab notebook, presentation board, and research report. It was necessary to use precise language to describe the process and report findings.

The canon of elocution helped my daughter articulate her thinking and conclusions well.

The canon of memory required attention and practice as my daughter had many big words to learn to pronounce correctly, some detailed technical information to share, and an involved process to explain.

And finally, this all came together in the canon of delivery as my daughter stood before judges, shared her abstract, detailed her process, shared her conclusions, and, of course, offered them some sugar crystal candy!

What Do Homeschoolers Learn From Science Fairs?

Sure, we were “just” following the scientific method as we conducted my daughter’s Challenge A science fair experiment. But, in the quietness of that moment, my eyes were opened to see more than an assignment. I saw ownership, order, and an expression of wonder and delight as my daughter interacted with her physical world.

Science Fairs Encourage Students to Take Ownership

From the beginning of her Challenge A experience, my daughter had an unwavering desire to grow sugar and borax crystals and develop an experiment for science fair that would allow her to test a dependent variable. She enthusiastically watched YouTube videos, one after the other, of different (but remarkably similar) ways to grow crystals. She talked about crystals, made shopping lists for supplies to grow crystals, set out rows of food coloring drops to make psychedelic crystals, and created a plan for producing and then presenting her classmates with edible crystals as part of her science fair presentation. She read her Challenge A guide faithfully, followed the weekly assignments, and utilized the feedback and support her Challenge A Director offered. She was on top of it, ready to go, eager to get started.

And then she hit a roadblock.

She had not designed a good experiment. Although there are plenty of resources for planning and executing a crystal-growing science fair project with good experiments, she had discounted any need for outside input. “I’ve got this, Mom!” she told me, all about the science and blissfully confident in the process. And it was when she bumped into herself and her unaccounted-for barriers that her ownership of her science fair project began to soar.

Back to research and YouTube she went! A review of her notes from Challenge A seminar, her guide, and her Director’s feedback was in order. She asked lots of questions, employed dialectic skills through the five common topics, and reset her course. She had a new hypothesis, a new experiment, and a measurable dependent variable for which she could expect satisfactory results.

She did it! Without too much drama and with steadfastness, she kept hold of science fair as her project. Mom wasn’t there to “bail her out.” Instead, I supported her steps. I let her hold the responsibility and discomfort of her initial mistakes. And, as I let her endure this modest failure, she overcame her frustrations, renewed her efforts toward a project she still loved, and took hold of her experience and bore up under its requirements. Indeed, the theme of Challenge A, “Attention Leads to Ownership,” rings true!

Science Fairs Show Students Order

Science fair in Challenge A is beautifully planned. Nothing is left to chance or guesswork, or up to the parent or student to invent. The Challenge A guide does a commendable job of moving your student along weekly. Its appendices are the perfect balance of supplemental information. Equipping and encouragement are available on CC Connected. And your Challenge A Director is sincerely interested in your student’s project and is a wealth of support and motivation!

It’s probably not fair to say that science fair is easy. There is quite a bit of work involved, and sometimes the “extra” elements may catch you off guard. For example, I hadn’t understood just how much sugar my daughter would need to complete her crystals, nor did I anticipate the totality of what she hoped to gift her classmates! But, on the flip side, it’s probably not accurate to say that science fair is difficult. It falls somewhere in the middle. And, in the scope and sequence of the Challenge years, Challenge A is its “sweet spot”!

There are character issues that come up. Ordering one’s time, following instructions, responding appropriately to authority, following through — all these and more will be a part of the science fair experience. And I would not be honest if I didn’t disclose that there were indeed times when the disorder of my mind or my daughter’s attitude was evident. But it was our behavior in the difficulties that arose which allowed us to break through, reorder our efforts, and move forward. We returned to order, and our path grew smooth.

Science Fairs Allow Students to Wonder and Delight

My daughter never grew tired of growing crystals. Instead, she delighted in each new batch as if it were the first time she’d ever seen such a thing. She held regular science fair chats with her dog, believing all along that Primrose (her pet) was equally interested in the unique formations growing on our kitchen counter. She acted as if she were the first human being to understand sugar crystals because, in a way, she was the first—it was her discovery!

In A Different Kind of Teacher: Solving the Crisis of American Schooling, John Taylor Gatto says, “The primary goal of real education is not to deliver facts but to guide students in the truths that will allow them to take responsibility for their lives.”

This is the beauty of the science fair. My daughter encountered truths about her physical world, herself and her capacities, and how to take responsibility for an extended project. In a small and impactful way, she took responsibility for her life in active participation in science fair.

A Challenge A student stirs a mixture on the stove for science fair

What is the Challenge A Science Fair?

I’m not great at remembering to take pictures, but we took some that golden afternoon. The light was so intense that you can see individual steam droplets rising in the photos. My daughter stands confidently in front of the range, her dog obediently at her feet; my daughter’s face is composed, her eyes sparkle, and it is evident that she was experiencing the harmony of the ages as she quietly stirred.

After all was said and done and the project board was tucked away for memory’s sake, I was asked to write my response to the question, “What is the Challenge A science fair?” I could offer academic, philosophical, or characterological answers, but they wouldn’t ring as true as this: science fair, in our particular case, is a gift. It is an unanticipated joy. And it now remains a precious memory of a specific afternoon, a successful project, and the treasures of ownership, order, wonder, and delight.

Written by:

Laura Kooistra, homeschool mom

Laura Kooistra

Parent and Lead Communications Writer

Laura Kooistra, the wife of Kent, mother to six, belongs body and soul to her Lord and Savior, Jesus Christ. A decades-long home educator, she has always employed classical learning tools but found her people and place in community when she joined CC as a Challenge B Director in 2012. Her youngest daughter is enrolled in a local CC community. A hobby farmer in Southwest Michigan, she enjoys both sunrise and sunset from her open property. Laura collects big words, loves challenging books, drinks strong coffee, devours podcasts, grows flowers, overuses commas, and enjoys time together (her love language)!


Data Science Journal

Collection: International Data Week

Practice Papers

A data-driven approach to monitor and improve open and FAIR research data in a federated research ecosystem

  • Markus Kubin
  • Mojeeb Rahman Sedeqi
  • Alexander Schmidt
  • Astrid Gilein
  • Tempest Glodowski
  • Vivien Serve
  • Gerrit Günther
  • Nina Leonie Weisweiler
  • Gabriel Preuß
  • Oonagh Mannix

In this contribution, we present a data-driven approach to monitoring and assessing the state of open and FAIR data in an interdisciplinary, federated research ecosystem. The project is part of a multi-method approach by the Helmholtz Metadata Collaboration (HMC) to monitor and assess the state of open and FAIR data practices in the Helmholtz Association of German research centers, Germany’s largest non-university research organization. The approach consists of two parts: a modular data harvesting and assessment pipeline, and an openly accessible dashboard with interactive statistics about the data publications identified with the pipeline. The dashboard provides insight into which data repositories research communities use to publish research data, and it allows for assessing systematic gaps in this data with respect to the FAIR data guidelines. We illustrate how the approach can be used to engage communities in FAIR data practices and to counsel data infrastructure towards improving FAIR data across a federated research organization. All software and data discussed here are published under an open license and are reusable by data professionals at other research performing organizations.

  • Open Science
  • FAIR metrics
  • Cultural change

1 Introduction

The FAIR principles, formulated in 2015 by Wilkinson et al., strive to enhance the findability, accessibility, interoperability, and reusability of research data (Wilkinson et al., 2016). Their goals align closely with those of the open data initiative, as mirrored by their inclusion in the UNESCO Recommendation on Open Science in 2021 (UNESCO, 2021).

Implementing FAIR data practices in a harmonized way at the level of research performing organizations and research data repositories is challenging. In the Helmholtz Association – Germany’s largest federated, non-university research organization, with 18 research centers – the Helmholtz Metadata Collaboration (HMC) platform was launched in 2019 to turn FAIR into reality. Its long-term goal is establishing a FAIR data space, defined by Nagel et al. as a ‘decentralized infrastructure enabling trustworthy data sharing’ and exchange within data ecosystems, founded on commonly agreed principles and extending across research centers and domains. The Helmholtz Metadata Collaboration focuses on bringing together technical and social solutions to improve FAIR data practices.

Here we discuss a three-step, data-driven approach to improving FAIR data practices in our federated research organization:

  • Measure: Where is research data in a cross-disciplinary, federated research organization published and what FAIR criteria does this data meet or not meet?
  • Learn: Identify data publishing themes and gaps towards improving the FAIRness of data.
  • Act: Engage communities and data infrastructure to implement steps to close identified gaps.

Iteration through these steps forms an effective feedback loop for improving FAIR data practices in a stepwise manner. After engaging communities and data infrastructure, we reassess progress, identify next steps, and engage again. Over time this builds trust with our communities, which in turn leads to more and faster progress.

The first of these steps – finding the research data outputs of a specific research performing organization or ecosystem – is challenging. Institutional catalogs often contain only small numbers of research data publications, since data historically was not recognized as a first-class research output. To find data publications associated with research articles, harvesting approaches often employ complex text-mining workflows to extract data availability statements from research articles. This requires access to full-text articles, often manual supervision, and hence additional resources (Bassinet et al., 2023; Iarkaeva et al., 2023).

Alternative approaches include harvesting metadata directly from repositories; however, repositories often lack application programming interfaces and standardized affiliation metadata that would allow linking data publications back to the research organizations where they were created. Such an approach also requires knowing which repositories to harvest from—you need to know where your community deposits its data. This itself is non-trivial—in HMC we have used a multi-method approach, ranging from community surveys (Arndt et al., 2022) to manual mapping, to answer the question of where Helmholtz research data lies.

In contrast, the data harvesting approach discussed in this contribution harvests the manually curated literature metadata provided by the research organization’s institutional libraries. As funding has historically been linked to this information, there is a strong interest in its correctness. Starting from these publications, linked data publications are identified by harvesting metadata that is openly available in so-called Scholix links (Burton et al., 2017), as described in the methods section. In contrast to harvesting data citations from research articles only, Scholix links unveil forward- and backward-references between literature and data publications. As the approach relies on metadata in standard formats, it is resource-light compared to text-mining workflows.

Since research data is on its way to becoming a first-class research output, quality metrics are required to allow for quality control of data publications (Bahim et al., 2020; Castell et al., 2024; Cobey et al., 2023). Evaluating and monitoring how several thousand data publications perform against the FAIR criteria requires automated tools like the F-UJI framework (Devaraju & Huber, 2020; Devaraju & Huber, 2021; Huber & Devaraju, 2021). This provides useful information for identifying and addressing gaps when counseling communities and infrastructure to effectively improve FAIR data practices.

In this contribution, the results obtained from a combined data harvesting and evaluation pipeline are presented in an openly accessible dashboard. Open science dashboards are widely used tools to monitor and communicate research outputs on institutional (BIH Quest, n.d.; BIH Quest Center for responsible research, 2024), national (Jeangirard, 2019), and international scales (Liu et al., 2024; OpenAIRE, n.d.; OpenAlex, n.d.). Here, a dashboard approach is presented as an initial way to communicate data and results and to engage communities and infrastructure. This aligns with our overall strategy to measure, learn, and act on our way to a unified FAIR data space in our federated research organization.

2 Methods

To address these challenges, we have built a pipeline to find the data publications of a research organization. Our key approach is to identify data publications linked to literature published at our research organization. The modules of this pipeline are schematically visualized in Figure 1.

Schematic illustration of the modular pipeline used to find and assess linked data publications of a research organization.

First, metadata of literature publications is harvested from the OAI-PMH interfaces (Lagoze et al., 2015) provided by the libraries of 15 research centers of our federated research organization (status: June 2023). Connection of further center libraries is underway. Library metadata is provided mainly in OAI-DC and MARC-XML formats. Harvesting from the manually curated literature catalogs of the research institutes in our federated research organization ensures that, with high probability, these and all linked ‘supplementary’ data publications found in the next step originate from these centers. This affiliation, and optionally other relevant metadata, is hence propagated from the literature to the linked data publications.
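The parsing involved in this first step can be sketched as follows. The OAI-PMH response envelope and the `oai_dc` namespaces are standardized by the protocol; the helper name and the trimmed sample record are illustrative:

```python
import xml.etree.ElementTree as ET

# Standard namespaces used in OAI-PMH ListRecords responses with oai_dc metadata
NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "oai_dc": "http://www.openarchives.org/OAI/2.0/oai_dc/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def parse_oai_dc(xml_text):
    """Extract (title, identifiers) pairs from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for record in root.iterfind(".//oai:record", NS):
        title = record.findtext(".//dc:title", default="", namespaces=NS)
        ids = [el.text for el in record.iterfind(".//dc:identifier", NS)]
        records.append((title, ids))
    return records

# Trimmed, hypothetical sample of a library's ListRecords response
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>An example article</dc:title>
      <dc:identifier>doi:10.1234/example</dc:identifier>
    </oai_dc:dc>
  </metadata></record></ListRecords>
</OAI-PMH>"""
print(parse_oai_dc(sample))  # [('An example article', ['doi:10.1234/example'])]
```

In practice, a harvester would page through `ListRecords` responses using resumption tokens; MARC-XML records need an analogous, format-specific parser.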

Second, for all literature PIDs (persistent identifiers) found in these literature catalogs, we identify linked datasets by harvesting Scholix links (Burton et al., 2017) via the ScholExplorer API (ScholExplorer, n.d.), which represents a subset of the OpenAIRE research graph. All software code and data presented in this contribution are based on data collected in the first two quarters of 2023 with version 1 of the ScholExplorer API. An upcoming, yet unpublished, software update employs version 2.

All harvested Scholix links are filtered for the target type dataset and for the Scholix RelationshipType (in version 2, SubType) value IsSupplementedBy. This combination of filters selects an incomplete yet reliable set of data publications with a ‘supplementary’, and hence closely associated, connection between literature and linked data publications. This is supported by manual analyses (Kubin, 2022) and by an evaluation of the most frequent Scholix RelationshipType (SubType) categories, in which we assumed that data publications with a ‘supplementary’ character share similar titles and author lists with their linked literature publications (Kubin, 2024). The essence of this analysis is shown in Figure 2 and substantiates our choice of filters to maximize confidence in the ‘supplementary’ character while minimizing false-positive hits for the data publications included in the dashboard.
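The filter itself reduces to a simple predicate over harvested link records. The dictionary layout below mimics Scholix link metadata (a Target object with a Type, plus a RelationshipType carrying a Name or, in version 2, a SubType); exact field capitalization varies between API versions, so treat the keys as assumptions:

```python
def is_supplementary_dataset_link(link):
    """Keep only links whose target is a dataset and whose relationship
    is IsSupplementedBy (Name in version 1; SubType in version 2)."""
    rel = link.get("RelationshipType", {})
    rel_name = rel.get("Name") or rel.get("SubType")
    return (
        link.get("Target", {}).get("Type") == "dataset"
        and rel_name == "IsSupplementedBy"
    )

# Hypothetical harvested links: only the first passes both filters
links = [
    {"Target": {"Type": "dataset"},
     "RelationshipType": {"Name": "IsSupplementedBy"}},
    {"Target": {"Type": "publication"},
     "RelationshipType": {"Name": "References"}},
    {"Target": {"Type": "dataset"},
     "RelationshipType": {"Name": "IsRelatedTo"}},
]
filtered = [l for l in links if is_supplementary_dataset_link(l)]
print(len(filtered))  # 1
```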

Evaluation of Scholix RelationshipTypes (SubTypes) based on sample data ( Kubin, 2024 ). The category ‘matches found (…)’ requires at least 25% matches, on average, of significant words in the publication titles (excluding articles, prepositions, etc.) or at least one same author name.
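The matching heuristic behind the ‘matches found (…)’ category can be sketched like this; the stop-word list and function names are illustrative, while the thresholds (at least 25% overlap of significant title words on average, or at least one shared author name) follow the description above:

```python
STOPWORDS = {"a", "an", "the", "of", "in", "on", "for", "and", "to", "with"}

def significant_words(title):
    """Title words with articles, prepositions, etc. removed."""
    return {w for w in title.lower().split() if w not in STOPWORDS}

def looks_supplementary(lit_title, lit_authors, data_title, data_authors):
    """Heuristic: >= 25% overlap of significant title words (relative to the
    average title length), or at least one author name in common."""
    a, b = significant_words(lit_title), significant_words(data_title)
    if a and b:
        overlap = len(a & b) / ((len(a) + len(b)) / 2)
        if overlap >= 0.25:
            return True
    return bool(set(lit_authors) & set(data_authors))

print(looks_supplementary(
    "Growth of sugar crystals in supersaturated solutions", ["A. Smith"],
    "Dataset: sugar crystal growth measurements", ["A. Smith", "B. Lee"],
))  # True
```

A real implementation would also normalize author-name variants and stem title words; the sketch only illustrates the thresholding logic.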

Third, as a first approach to automated FAIR assessment of the datasets thus identified, we have adopted the F-UJI framework (Devaraju & Huber, 2020; Devaraju & Huber, 2021). F-UJI automatically tests 15 out of 17 FAIR principles based on metrics developed by the European FAIRsFAIR project (Devaraju et al., 2020). F-UJI scores were determined automatically for each dataset included in the dashboard, using a locally deployed Docker image of F-UJI (Huber & Devaraju, 2021). The code and data discussed here are based on F-UJI version 1.4.7 and metrics version 0.4. An upcoming, yet unpublished, update employs F-UJI version 3, probing metrics version 0.5.

The results retrieved with this approach are stored in a relational database and visualized in an interactive dashboard (Helmholtz-Metadata Collaboration, 2023). Separate sub-pages of the dashboard allow users to explore statistics of data publications and FAIR-related aspects from different viewpoints, e.g., from the institute, repository, or individual perspective, to address target groups ranging from institutional data professionals and repository providers to individual researchers interested in improving FAIR data.

Disclaimer: Data publication statistics presented here and in the dashboard are neither complete nor entirely free of falsely identified data publications. These numbers can be used to identify qualitative trends. If they are to be used for sensitive purposes such as targeted funding, we highly recommend a manual review of the data. Please note that F-UJI cannot truly assess how FAIR data is, but it provides useful guidance for improving FAIR data.

3 Results

With the data harvesting approach described in the methods section, 300,684 literature publications and 32,817 ‘filtered’ linked data publications (see the methods section for details) were identified in the time span from January 2000 to June 2023 for the 15 research centers harvested at that time.

The time evolution of these publication numbers is shown in Figure 3, as presented on the landing page of the dashboard. Since the data was collected in the middle of 2023, publication numbers decrease towards the end of the timeline. Before this decrease, literature publications plateau on the order of 20,000 per year and data publications on the order of 3,000 per year.

Time evolution of literature- and linked data-publication numbers (after filtering), as presented on the dashboard. (status: June 2023).

As motivated in the introduction, identifying the repositories that research communities use to publish research data is often challenging. Our data pipeline provides one approach to collecting this information. For the data on the dashboard, we identified a list of 56 unique publishers (repository providers).

Figure 4 shows an exemplary decomposition of annual publication numbers for the five most frequently found publishers. These range from non-institutional generic to non-institutional disciplinary and institutional disciplinary data repositories. This information can be used to learn about the time-evolution of community-specific data publishing practices and is a starting point for further analyses.

Time evolution of linked data publications contained in the dashboard approach, after filtering (status: June 2023). The color codes indicate the five most frequent publisher names associated to that data.

Integration of research data into a federated FAIR data space requires closing systematic gaps in the FAIRness of this data. Automated evaluation tools can help to identify such systematic gaps. As a first approach, we adopted the F-UJI framework to run a number of automated tests, each probing a specific aspect of FAIR. A weighted sum of the scores for these tests yields an overall score.
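Such a weighted aggregation can be sketched as follows; the metric names and weights are placeholders, not F-UJI’s actual metric catalog:

```python
def overall_score(test_scores, weights):
    """Weighted sum of per-test scores, normalized to a 0-100 percentage."""
    total = sum(weights[name] * score for name, score in test_scores.items())
    max_total = sum(weights[name] for name in test_scores)
    return 100 * total / max_total

# Hypothetical per-test results (each score in [0, 1]) and weights
scores = {"test-F1": 1.0, "test-A1": 0.5, "test-R1.1": 0.0}
weights = {"test-F1": 2.0, "test-A1": 1.0, "test-R1.1": 1.0}
print(round(overall_score(scores, weights), 1))  # 62.5
```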

A histogram of overall F-UJI scores is shown in Figure 5. It reveals several clusters with similar scores. The figure also encodes how the five most frequently used data repositories are represented in the data. A prominent observation is the almost uniform F-UJI score for data published in a disciplinary data repository for the high-energy physics communities. This observation indicates a major influence of data repositories on specific aspects of FAIR, which we attribute to repository-specific technical implementations.

Histogram of scores determined with F-UJI version 1.4.7 for filtered data publications, as presented in the dashboard (status: June 2023). The color codes indicate the five most frequent publisher names associated with these data publications.

The dashboard allows for detailed inspection of the test results related to specific FAIR criteria and hence for targeted gap analyses. For example, a follow-up analysis for FAIR principle R1.1 (not shown) reveals that a large portion of data publications in crystallographic databases lack clearly identifiable human- or machine-readable license information. This contributes to the rather low overall scores for the cluster observed at the lower end of Figure 5 . This example illustrates how the data shown in the dashboard can help to identify and visualize specific gaps related to specific aspects of FAIR on the level of data repositories.

4 Discussion

4.1 Findability of research data publications

Data publications must meet minimum requirements of findability (the F in FAIR) to be discovered and included in the dashboard. The dashboard, hence, systematically lacks research data that is neither indexed nor linked to literature publications. The list of 56 research repositories identified by the data harvesting pipeline overlaps only partially with (i) a list of institutional repositories with active contributions from our federated research organization (Helmholtz Open Science Office, n.d.) and (ii) a list of repositories derived from a broad community survey (Arndt et al., 2022; Gerlich et al., 2022). We attribute this discrepancy (a) to the findability bias and (b) to the finding that much research data is published in community-established, non-institutional disciplinary repositories that are not included in the list of institutional data repositories. This also shows the importance of a multi-method approach.

Our advice to research data infrastructure providers who strive to improve the findability of data publications through automated approaches is, hence, (a) to register persistent identifiers (e.g., DOIs) with metadata records and (b) to include qualified links (for example, when choosing DataCite, by using the field relatedIdentifiers to link related research outputs while specifying the field relationType for the relationship between the primary and the linked research outputs). Specifically, when registering a PID at DataCite for data that directly supplements a research article, using the term IsSupplementTo improves findability and clarifies the data’s relationship to the research article. The evaluation of Scholix links for literature and linked data publications (Kubin, 2024) indicates that a considerable number of data publications are not identified by the filter mechanisms discussed here. A reason for this could be inconsistent registration of relationship types by research data infrastructure or publishers.
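As an illustration, a DataCite metadata fragment with such a qualified link might look like this (shown as a Python dict; both DOIs are placeholders):

```python
# Fragment of a DataCite metadata record for a dataset DOI registration.
# The relatedIdentifiers entry makes the supplement relationship explicit
# and machine-harvestable (e.g., via Scholix).
datacite_fragment = {
    "doi": "10.1234/example-dataset",  # placeholder dataset DOI
    "relatedIdentifiers": [
        {
            "relatedIdentifier": "10.1234/example-article",  # placeholder article DOI
            "relatedIdentifierType": "DOI",
            "relationType": "IsSupplementTo",
        }
    ],
}
print(datacite_fragment["relatedIdentifiers"][0]["relationType"])  # IsSupplementTo
```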

Our general advice to those publishing research articles is to incentivize research communities to formally cite supplementary data publications, with their persistent identifiers, in the reference list. Manual sampling of the harvested literature indicated that data citation practices in research articles are heterogeneous, ranging from casually mentioned database IDs to data availability statements with a data citation in the reference list. In line with Gregory et al. (2023), formal citation greatly improves the findability of supplementary data publications by humans and machines. Changing the culture of data citation will involve stakeholders from data infrastructure providers, journal publishers and research communities. Communication to include the topic of data citation practices in training curricula is underway. A future adaptation of the dashboard could focus on data citation practices.
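To illustrate why formal citations matter for machines, here is a naive sketch that scans reference-list entries for DOIs from data repositories. The regular expression and the repository DOI prefixes below are illustrative simplifications, not the dashboard's actual pipeline:

```python
import re

# Illustrative sketch: flag reference-list entries that formally cite a data
# publication via a DOI. The prefix list is a stand-in for a curated mapping
# of repository DOI prefixes (e.g. Zenodo), not an exhaustive registry.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[^\s\"<>]+")
DATA_DOI_PREFIXES = ("10.5281", "10.7802")  # assumed data-repository prefixes

def data_citations(references):
    hits = []
    for entry in references:
        for doi in DOI_PATTERN.findall(entry):
            if doi.startswith(DATA_DOI_PREFIXES):
                hits.append(doi)
    return hits

refs = [
    "Kubin, M. (2024) Sample data. DOI: https://doi.org/10.5281/zenodo.11372082",
    "Gregory, K. et al. (2023) Tracing data. DOI: https://doi.org/10.1162/qss_a_00264",
]
print(data_citations(refs))  # ['10.5281/zenodo.11372082']
```

A mere database ID mentioned in running text ("accession MX-1234") would be invisible to such a scan, whereas a formal reference-list citation with a DOI is trivially machine-findable.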

4.2 FAIR evaluation of research data

The data harvesting and evaluation pipeline presented here adopted the F-UJI framework as one prominent example of automated FAIR evaluation of research data. As illustrated in the Results section, testing specific, machine-actionable aspects of the FAIR principles can provide useful guidance for identifying and closing systematic gaps in FAIR data practices, particularly at the level of research data infrastructures. In the future, we aim to integrate complementary evaluation frameworks to enable cross-validation of results.
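As a rough sketch of how such an automated assessment can be scripted against a self-hosted F-UJI instance: the endpoint URL and payload fields below follow the public F-UJI documentation but should be verified against your own deployment, and the response layout and scores shown are illustrative:

```python
import json
import urllib.request

# Assumed local F-UJI deployment; adjust host, port, and auth to your setup.
FUJI_URL = "http://localhost:1071/fuji/api/v1/evaluate"

def evaluate(doi):
    """POST one persistent identifier to F-UJI and return its JSON report."""
    payload = json.dumps({"object_identifier": doi, "use_datacite": True}).encode()
    req = urllib.request.Request(
        FUJI_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def fair_percent(summary):
    """Reduce a summary block to per-principle score percentages."""
    return {k: v for k, v in summary.get("score_percent", {}).items()
            if k in ("F", "A", "I", "R", "FAIR")}

# Illustrative (invented) summary in the shape F-UJI reports use:
sample_summary = {"score_percent": {"F": 100.0, "A": 50.0, "I": 25.0,
                                    "R": 40.0, "FAIR": 53.8}}
print(fair_percent(sample_summary))
```

Looping `evaluate` over every harvested PID and aggregating `fair_percent` per repository yields exactly the kind of per-infrastructure gap analysis discussed here.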

Most machine-actionable aspects of FAIR depend on the research data infrastructure used to make data available. As discussed above, critical infrastructure ensuring findability and accessibility of research data is in the hands of repository providers. In addition, providers can help by technically supporting machine-understandable metadata in standardized metadata records and on landing pages, as well as by incentivizing the inclusion of license information to enable the reuse of research data.
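One machine-actionable measure on the landing-page side is embedding schema.org metadata as JSON-LD, which automated assessment tools such as F-UJI can harvest directly. A minimal, illustrative fragment, in which all identifiers and URLs are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dataset",
  "name": "Example measurement series",
  "identifier": "https://doi.org/10.5281/zenodo.0000000",
  "license": "https://creativecommons.org/licenses/by/4.0/",
  "creator": {"@type": "Person", "name": "Jane Doe"},
  "distribution": {
    "@type": "DataDownload",
    "contentUrl": "https://repo.example.org/files/series.csv",
    "encodingFormat": "text/csv"
  }
}
</script>
```

Note how the fragment carries both a resolvable identifier and an explicit license URI, the two items named above as decisive for findability and reuse.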

A pilot approach to counseling research data infrastructure providers revealed first insights into their needs, which range from (a) metadata considerations and consultation, (b) technical questions, and (c) general guidelines and resources to (d) FAIR metrics and tools. We would also like to highlight the expressed need for (e) networking and exchange with other institutional repository providers. We aim to further intensify our exchange with data infrastructure providers in the future. The dashboard proved a useful communication tool in such an individual counseling session, and further consultancy approaches are underway.

Manual FAIR evaluation of research data based on the FAIR Data Maturity Model (Bahim et al., 2020; FAIR Data Maturity Model Working Group, 2020), complementary to the automated assessment with F-UJI, shows that at the dataset level, interoperability and reusability depend more strongly on the researchers and data professionals producing and curating the data (Günther et al., 2024; Kubin et al., 2022). Improving these aspects requires effective strategies to implement and harmonize metadata practices across researchers and data professionals. Models for engaging communities in the practical implementation of the FAIR principles are described, for example, in recent articles by Belliard et al. (2023) and Rocca-Serra et al. (2023).

5 Conclusion

We have established a data collection and analysis pipeline to monitor and assess the state of FAIR data within our federated research organization. The data collected in this way can be explored interactively in an open-access dashboard aimed at various communities. The dashboard serves several purposes: it identifies the research data infrastructure that researchers within the organization use to make their research data available, it presents assessment results on the FAIRness of data publications, and it helps to identify gaps on the way to a unified FAIR data space. It engages both research communities and data infrastructure providers, fostering collaboration and improving data management practices. By providing this interactive dashboard, we aim to improve the visibility and accessibility of research data assets within our organization while making progress toward FAIR data practices on a broad scale.

Data Accessibility Statement

The data discussed in this contribution and presented on the dashboard (Helmholtz-Metadata Collaboration, 2023), and the data used to analyze Scholix RelationshipType categories, are published on Zenodo (Kubin, 2024; Kubin et al., 2024). The source code developed for the harvesting pipeline (Preuß et al., 2024) and the interactive dashboard (Sedeqi et al., 2024) is published on Zenodo and on GitLab (Helmholtz Metadata Collaboration, 2023).

Funding Statement

Funded by: Helmholtz-Zentrum Berlin (HZB); Helmholtz Metadata Collaboration (HMC).

Acknowledgements

We thank the Helmholtz libraries for the friendly support in setting up OAI-PMH harvesting pipelines, the Helmholtz Open Science Office for many useful discussions, individual test users providing essential feedback for improving the usability of the dashboard, and the staff of RODARE for the insightful discussions.

Funding information

This project was developed by HMC Hub Matter, located at Helmholtz-Zentrum Berlin (HZB) and part of the Helmholtz Metadata Collaboration (HMC), an incubator-platform of the Helmholtz Association within the framework of the Information and Data Science strategic initiative.

Competing Interests

The authors have no competing interests to declare.

Author Contributions

Author contributions are summarized following the CRediT taxonomy: M.K. conceptualized the study. M.K., M.R.S., G.P. curated the data. M.K., M.R.S., A.S., A.G., G.G. formally analyzed the data. M.K., M.R.S., A.S., A.G., T.G., V.S., G.G., N.L.W., G.P. conducted the investigation. M.K., M.R.S., A.S., A.G., T.G., V.S., G.P., O.M. developed the methodology. M.K. administered the project. M.K., A.G., G.G., N.L.W., G.P., O.M. provided study materials (resources). M.K., M.R.S., A.S., A.G., T.G., G.P. worked on the software. M.K., O.M. supervised the project. M.K., M.R.S., A.S., A.G., G.G., G.P. validated the data. M.K., M.R.S., A.S., A.G., V.S. visualized the data. M.K. wrote the original draft of this report. M.K., T.G., N.L.W., G.P., O.M. reviewed and edited the report.

Arndt, W., Gerlich, S., Hofmann, V., Kubin, M., Kulla, L., Lemster, C., Mannix, O., Rink, K., Nolden, M., Schweikert, J., Shankar, S., Söding, E., Steinmeier, L. and Süß, W. (2022) ‘A survey on research data management practices among researchers in the Helmholtz Association’ (tech. rep.). Germany: HMC Office, GEOMAR Helmholtz Centre for Ocean Research Kiel. DOI: https://doi.org/10.3289/hmc_publ_05  

Bahim, C., Casorrán-Amilburu, C., Dekkers, M., Herczog, E., Loozen, N., Repanas, K., Russell, K. and Stall, S. (2020) ‘The fair data maturity model: An approach to harmonise fair assessments’, Data Science Journal , p. 19. DOI: https://doi.org/10.5334/dsj-2020-041  

Bassinet, A., Bracco, L., L’Hôte, A., Jeangirard, E., Lopez, P. and Romary, L. (2023) Large-scale machine-learning analysis of scientific PDF for monitoring the production and the openness of research data and software in France (hal-04121339v3). Available at: https://hal.science/hal-04121339  

Belliard, F., Maineri, A.M., Plomp, E., Ramos Padilla, A.F., Sun, J. and Zare Jeddi, M. (2023) ‘Ten simple rules for starting fair discussions in your community’, PLOS Computational Biology , 19(12), pp. 1–16. DOI: https://doi.org/10.1371/journal.pcbi.1011668  

BIH QUEST Center for Responsible Research. (n.d.) Charité dashboard on responsible research . Available at: https://quest-dashboard.charite.de/ (Accessed: 27 March 2024).  

BIH QUEST Center for Responsible Research. (2024) ‘Dashboard on Responsible Research’, GitHub repository . Available at: https://github.com/quest-bih/dashboard (Accessed: 27 March 2024).  

Burton, A., Aryani, A., Koers, H., Manghi, P., Bruzzo, S. L., Stocker, M., Diepenbroek, M., Schindler, U. and Fenner, M. (2017) ‘The scholix framework for interoperability in data-literature information exchange’, D-Lib Magazine , 23(1/2). DOI: https://doi.org/10.1045/january2017-burton  

Castell, W., Dransch, D., Juckeland, G., Meistring, M., Fritzsch, B., Gey, R., Höpfner, B., Köhler, M., Meesen, C., Mehrtens, H., Mühlbauer, F., Schindler, S., Schnicke, T. and Bertelmann, R. (2024) Towards a quality indicator for research data publications and research software publications – A vision from the helmholtz association . arXiv [Preprint]. DOI: https://doi.org/10.48550/arXiv.2401.08804  

Cobey, K.D., Haustein, S., Brehaut, J., Dirnagl, U., Franzen, D.L., Hemkens, L.G., Presseau, J., Riedel, N., Strech, D., Alperin, J.P., Costas, R., Sena, E.S., van Leeuwen, T., Ardern, C.L., Bacellar, I.O.L., Camack, N., Correa, M.B., Buccione, R., Cenci, M.S., Fergusson, D.A., Gould van Praag, C., Hoffman, M.M., Bielemann, R.M., Moschini, U., Paschetta, M., Pasquale, V., Rac, V.E., Roskams-Edris, D., Schatzl, H.M., Stratton, J.A. and Moher, D. (2023) ‘Community consensus on core open science practices to monitor in biomedicine’, PLOS Biology , 21(1), e3001949. DOI: https://doi.org/10.1371/journal.pbio.3001949  

Devaraju, A. and Huber, R. (2020) F-UJI – an automated FAIR data assessment tool [Software v.1.0.0]. DOI: https://doi.org/10.5281/ZENODO.4063720  

Devaraju, A. and Huber, R. (2021) ‘An automated solution for measuring the progress toward fair research data’, Patterns , 2(11), 100370. DOI: https://doi.org/10.1016/j.patter.2021.100370  

Devaraju, A., Huber, R., Mokrane, M., Herterich, P., Cepinskas, L., de Vries, J., L’Hours, H., Davidson, J. and White, A. (2020) FAIRsFAIR data object assessment metrics (Working paper v.0.4). DOI: https://doi.org/10.5281/zenodo.4081213  

FAIR Data Maturity Model Working Group. (2020) Fair data maturity model: Specification and guidelines , Zenodo. DOI: https://doi.org/10.15497/RDA00050  

Gerlich, S. C., Hofmann, V., Kubin, M., Kulla, L., Lemster, C., Mannix, O., Rink, K., Schweikert, J., Shankar, S., Söding, E., Steinmeier, L. and Suess, W. (2022) HMC community survey 2021 [Data File v.1.0.0]. DOI: https://doi.org/10.7802/2433  

Gregory, K., Ninkov, A., Ripp, C., Roblin, E., Peters, I. and Haustein, S. (2023) ‘Tracing data: A survey investigating disciplinary differences in data citation’, Quantitative Science Studies , 4(3), pp. 622–649. DOI: https://doi.org/10.1162/qss_a_00264  

Günther, G., Baunack, S., Capozza, L., Freyermuth, O., Gonzalez-Caminal, P., Gou, B., Isaak, J., Karstensen, S., Lindner, A., Maas, F., Mannix, O., Mistry, A., Oceano, I., Schneide, C., Schwarz, K., Schörner-Sadenius, T., Serve, V., Stein, L. M., Typel, S. and Wilfert, M. (2024) ‘IR of FAIR – Principles at the Instrument Level’, Proc. ICALEPCS’23 , (19), pp. 1046–1050. DOI: https://doi.org/10.18429/JACoW-ICALEPCS2023-WE3BCO09  

Helmholtz-Metadata Collaboration. (2023) ‘HMC dashboard on open and FAIR data in Helmholtz’. Helmholtz Metadata Collaboration (HMC) . Available at: https://fairdashboard.helmholtz-metadaten.de/ (Accessed: 26 July 2024).  

Helmholtz Metadata Collaboration. (2023) ‘HMC dashboard on open and FAIR data in Helmholtz’. GitLab repository . Available at: https://codebase.helmholtz.cloud/hmc/hmc-public/FAIR-dashboard/ (Accessed: 29 March 2024).  

Helmholtz Open Science Office. (n.d.) Research data infrastructures at Helmholtz Association . Available at: https://os.helmholtz.de/open-research-data/forschungsdatenrepositorien/ (Accessed: 16 March 2023).  

Huber, R. and Devaraju, A. (2021) F-UJI: FAIRsFAIR Research Data Object Assessment Service. GitHub repository . Available at: https://github.com/pangaea-data-publisher/fuji (Accessed: 27 March 2024).  

Iarkaeva, A., Nachev, V. and Bobrov, E. (2023) Workflow for detecting biomedical articles with underlying open and restricted-access datasets. MetaArxiv [Preprint]. DOI: https://doi.org/10.31222/osf.io/z4bkf  

Jeangirard, E. (2019). ‘Monitoring open access at a national level: French case study’, ELPUB 2019 23rd edition of the International Conference on Electronic Publishing. Marseille, France, June 2019. Hal Open Science. DOI: https://doi.org/10.4000/proceedings.elpub.2019.20  

Kubin, M. (2022) ‘Monitoring data publications – A dashboard approach in HMC hub matter’, 2nd Helmholtz Open Science Practice Forum Research Data Management , Oct 20. Zenodo. DOI: https://doi.org/10.5281/zenodo.7313753  

Kubin, M. (2024) Sample data for evaluating Scholix relationship SubTypes for linked data publications (Version 1.0), Zenodo. DOI: https://doi.org/10.5281/zenodo.11372082  

Kubin, M., Günther, G., Cristiano, L., Görzig, H., Krahl, R. and Mannix, O. (2022). ‘Lessons learned from applying the fair data maturity model to a prototypical data pipeline in matter’, HMC Conference 2022, Zenodo. DOI: https://doi.org/10.5281/ZENODO.7313873  

Kubin, M., Sedeqi, M.R., Schmidt, A., Gilein, A., Glodwoski, T., Preuß, G. and Mannix, O. (2024). Dataset shown on the HMC FAIR Data Dashboard as of June 2023 (Version 1.0), Zenodo. DOI: https://doi.org/10.5281/zenodo.10890383  

Lagoze, C., de Sompel, H.V., Nelson, M. and Warner, S. (2015) The open archives initiative protocol for metadata harvesting (v.2.0). Available at: https://www.openarchives.org/OAI/openarchivesprotocol.html (Accessed: 26 March 2024).  

Liu, S., Golozar, A., Buesgens, N., McLeggon, J.A., Black, A. and Nagy, P. (2024) ‘A framework for understanding an open scientific community using automated harvesting of public artifacts’, JAMIA Open , 7(1). DOI: https://doi.org/10.1093/jamiaopen/ooae017  

Nagel, L. and Lycklama, D. (2021) ‘Design Principles for Data Spaces - Position Paper’. Zenodo . DOI: https://doi.org/10.5281/zenodo.5105743  

OpenAIRE. (n.d.) ‘OpenAIRE | Monitor’. OpenAIRE . Available at: https://monitor.openaire.eu (Accessed: 26 July 2024).  

OpenAlex. (n.d.) The open catalog to the global research system . Available at: https://openalex.org/ (Accessed: 27 March 2024).  

Preuß, G., Schmidt, A., Gilein, A., Glodowski, T., Serve, V., Sedeqi, M.R., Mannix, O. and Kubin, M. (2024, March) HMC toolbox for data mining (v.1.0.0), Zenodo. DOI: https://doi.org/10.52825/cordi.v1i.389  

Rocca-Serra, P., Gu, W., Ioannidis, V., Abbassi-Daloii, T., Capella-Gutierrez, S., Chandramouliswaran, I., Splendiani, A., Burdett, T., Giessmann, R.T., Henderson, D., Batista, D., Emam, I., Gadiya, Y., Giovanni, L., Willighagen, E., Evelo, C., Gray, A.J.G., Gribbon, P., Juty, N., Welter, D., Quast, K., Peeters, P., Plasterer, T., Wood, C., van der Horst, E., Reilly, D., van Vlijmen, H., Scollen, S., Lister, A., Thurston, M., Granell, R., Backianathan, G., Baier, S., Thomsen, A.C., Cook, M., Courtot, M., d’Arcy, M., Dauth, K., del Piico, E.M., Garcia, L., Goldmann, U., Grouès, V., Clarke, D.J.B., Lefloch, E., Liyanage, I., Papadopoulos, P., Pommier, C., Reynares, E., Ronzano, F., Delfin-Rossaro, A., Sagatopam, V., Sedani, A., Sedlyarov, V., Shilova, L., Singh, S., Strubel, J., van Bochove, K., Warnes, Z., Woollard, P., Xu, F., Zaliani, A., Sansone, S.A. and the FAIR Cookbook Contributors. (2023) ‘The fair cookbook – the essential resource for and by fair doers’, Scientific Data , 10(1), pp. 292. DOI: https://doi.org/10.1038/s41597-023-02166-3  

ScholExplorer. (n.d.) ‘The OpenAIRE Scholexplorer: the data literature interlinking service’. OpenAIRE . Available at: https://scholexplorer.openaire.eu/ (Accessed: 27 March 2024).  

Sedeqi, M.R., Preuß, G., Gilein, A., Glodowski, T., Serve, V., Schmidt, A., Mannix, O. and Kubin, M. (2024) HMC fair data dashboard (V.1.0.0), Zenodo. DOI: https://doi.org/10.52825/cordi.v1i.389  

United Nations Educational Scientific and Cultural Organization. (2021) ‘UNESCO recommendation on open science’, The General Conference of the United Nations Educational, Scientific and Cultural Organization (UNESCO). Paris, France, pp. 9–24, Nov 2021. UNESCO. DOI: https://doi.org/10.54677/MNMH8546  

Wilkinson, M.D., Dumontier, M., Aalbersberg, I.J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L.B., Bourne, P.E., Bouwman, J., Brookes, A.J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C.T., Finkers, R., Gonzalez-Beltran, A., Gray, A.J.G., Groth, P., Goble, C., Grethe, J.S., Heringa, J., ’t Hoen, P.A.C., Hooft, R., Kuhn, T., Kok, R., Kok, J., Lusher, S.J., Martone, M.E., Mons, A., Packer, A.L., Persson, B., Rocca-Serra, P., Roos, M., van Schaik, R., Sansone, S.-A., Schultes, E., Sengstag, T., Slater, T., Strawn, G., Swertz, M.A., Thompson, M., van der Lei, J., van Mulligen, E., Velterop, J., Waagmeester, A., Wittenburg, P., Wolstencroft, K., Zhao, J. and Mons, B. (2016) ‘The fair guiding principles for scientific data management and stewardship’, Scientific Data , 3(1), p. 160018. DOI: https://doi.org/10.1038/sdata.2016.18  

July 24, 2024

China-U.S. Science Collaborations Are Declining, Slowing Key Research

The U.S. and China are collaborating less on projects across scientific disciplines amid a culture of fear in both countries

By Gemma Conroy & Nature magazine


China’s scientific collaboration with other countries has declined since the pandemic, driven by falling partnerships with the United States, an analysis shows.

Scientists have been warning that political tensions between China and the United States, combined with the pandemic, have affected research collaborations between the two countries. But it takes time for evidence of this sort of decline to accumulate in research databases.

The latest evidence comes from an analysis conducted by Springer Nature’s team in China. ( Nature ’s news team is editorially independent of its publisher, Springer Nature.) The authors used InCites, a tool owned by publishing-analytics firm Clarivate, based in London, to analyse internationally co-authored articles that were published between 2013 and 2023. InCites draws on papers indexed in the science-citation database Web of Science.


They found that in 2022, the total number of papers co-authored by researchers from China and their international peers declined for the first time since 2013 (see ‘Research Powerhouse’).

RESEARCH POWERHOUSE. Chart compares the number of research papers by Chinese authors with those co-authored with international peers.

The proportion of research papers with Chinese and international co-authors has been falling for even longer. At its peak in 2018, 26.6% of China’s output in the InCites database — roughly 110,000 articles — was co-authored with international colleagues. By 2023, the proportion of the country’s articles with international co-authors had dropped by 7.2 percentage points, even though China’s overall number of articles almost doubled to 759,000 over the same period.
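Reading the 7.2% drop as percentage points, the reported figures can be cross-checked with a few lines of arithmetic (a back-of-the-envelope sketch, not part of the original analysis):

```python
# Back-of-the-envelope check of the reported InCites figures,
# reading the 7.2% drop as percentage points.
share_2018 = 0.266                   # share of output with international co-authors, 2018
intl_2018 = 110_000                  # internationally co-authored articles, 2018
total_2018 = intl_2018 / share_2018  # implied total Chinese output in 2018
share_2023 = share_2018 - 0.072      # 19.4% by 2023
total_2023 = 759_000                 # reported total output, 2023
intl_2023 = share_2023 * total_2023  # implied internationally co-authored articles, 2023

print(round(total_2018))  # ≈ 413,534, consistent with "almost doubling" to 759,000
print(round(intl_2023))   # ≈ 147,246
```

So even with total output nearly doubling, the implied absolute number of internationally co-authored papers grew only modestly, which is what a falling share against a rising total looks like.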

The drop in internationally co-authored papers is mainly due to China’s declining share of papers published with US researchers, which fell by 6.4 percentage points between its peak in 2017 and 2023 — the largest decline of any country included in the analysis. The findings were presented at the Zhongguancun Forum in Beijing on 25 April.

The decline in US-China collaborations echoes findings from a 2022 analysis conducted for Nature , which found that the number of researchers with dual US and China affiliations on research articles in Elsevier’s Scopus database had fallen by more than 20% between 2019 and 2021.

Although the latest analysis shows that the share of US–China articles has been slowly declining over the past six years, the pandemic exacerbated the downward trend, says Marina Zhang, an innovation researcher who focuses on China at the University of Technology Sydney in Australia.

Political tensions

Zhang says that ongoing geopolitical tensions between the United States and China have also fuelled the decline. “This is especially worrying for researchers,” says Zhang. The US Department of Justice’s controversial China Initiative — which was launched in 2018 to tackle espionage in research and industry — ended in 2022. The crackdown resulted in several scientists being arrested over their ties to collaborators or institutions in China, and has stoked fear among researchers of Chinese descent. Since then, the US government has adopted a range of policies focused on tightening research security . And in July 2023, the Chinese government implemented its revised counter-espionage law, which broadened the definition of what constitutes spying.

The crackdown on perceived foreign interference in both the United States and China is making researchers more cautious about collaborating, says Zhang. Restrictive policies and the climate of fear could end up driving talent away from certain countries and fields, leading to a “brain drain and a loss of valuable human capital”, she says.

This “chilling effect” on US–China collaborations is already hindering influential research, says Tang Li, a researcher who specializes in science and innovation policy at Fudan University in Shanghai, China. For instance, a 2024 study examined the effect that the foreign-interference investigations at the US National Institutes of Health (NIH) had on researchers and found that those in the United States with collaborators in China were less productive during this period than were their colleagues with scientific partners in other countries.

Zhang says that the faltering collaborative ties between the United States and China could also result in the countries pursuing the same types of research separately, instead of joining forces to tackle global problems such as climate change , pandemics and food security.

Turning inwards

More worryingly, the countries might increasingly prioritize domestic interests over international cooperation, which could make scientific research a more nationalistic endeavour, says Zhang.

China’s collaborations with other countries have also tapered off since 2020, but not as markedly as those with the United States. Tang says that reviving US–China collaborations is crucial because such scientific partnerships could help to bridge the gap between the two countries. “Given the increasing global disasters and uncertainties, humanity cannot afford to waste time on nationalistic rivalries,” she says.

This article is reproduced with permission and was first published on July 19, 2024 .


POLICY AND PRACTICE REVIEWS article

Politicizing science funding undermines public trust in science, academic freedom, and the unbiased generation of knowledge.

Igor R. Efimov

  • 1 McCormick School of Engineering, Northwestern University, Chicago, IL, United States
  • 2 Harvard Medical School, Boston, MA, United States
  • 3 James Madison Program in American Ideals and Institutions, Princeton University, Princeton, NJ, United States
  • 4 Department of Chemistry, University of Southern California, Los Angeles, CA, United States
  • 5 Department of Biology, Williams College, Williamstown, MA, United States
  • 6 Department of Molecular and Cell Biology, University of California Berkeley, Berkeley, CA, United States
  • 7 Independent Consultant, Pasadena, CA, United States
  • 8 Department of Mathematics, University of California, Davis, Davis, CA, United States

This commentary documents how federal funding agencies are changing the criteria by which they distribute taxpayer money intended for scientific research. Increasingly, STEMM (Science, Technology, Engineering, Mathematics, and Medicine) funding agencies are requiring applicants for funding to include a plan to advance DEI (“Diversity, Equity, and Inclusion”) in their proposals and to dedicate a part of the research budget to its implementation. These mandates undermine the academic freedom of researchers and the unbiased generation of knowledge needed for a well-functioning democracy. Maintaining excellence in science is fundamental to the continuation of the U.S. as a global economic leader. Science provides a basis for solving important global challenges such as security, energy, climate, and health. Diverting funding from science into activities unrelated to the production of knowledge undermines science's ability to serve humankind. When funding agencies politicize science by using their power to further a particular ideological agenda, they contribute to public mistrust in science. Hijacking science funding to promote DEI is thus a threat to our society.

Do we want the mixture of students who are going to be trained to do advanced medical research to be representative of the demographic make-up of the population as a whole—or do we want whatever students, from whatever backgrounds, who have track records demonstrating a mastery of medical science that gives them the highest probability of finding cures for cancer, Alzheimer's, and other devastating diseases? Endeavors have purposes . Is indulging ideological visions more important than ending cancer and Alzheimer's?

– Thomas Sowell, Social Justice Fallacies (2023)

1 Introduction

Science is essential for humankind to thrive. Science is the foundation of technologies that deliver food, energy, and medicine. Scientific progress has contributed to the greatly improved human condition worldwide, including higher standards of living, lengthened lifespans, and the eradication of deadly diseases and famine. Science was the key ingredient of the industrial revolution, which propelled Western democracies to economic prosperity. Maintaining strong basic science research and education is essential for a country's security and technological competitiveness. The U.S. and other developed countries have long recognized the need for sustained support of the STEMM (Science, Technology, Engineering, Mathematics, and Medicine) fields.

The U.S. invests hundreds of billions of dollars annually in support of STEMM (AIP, (n.d.)). These funds, which ultimately derive from the taxpayers, are managed by federal funding agencies, whose role is to distribute these resources to scientists and to ensure that the funds are used effectively to produce the best outcomes for the public good. Funding agencies, therefore, play a key role in the scientific enterprise, the production of knowledge, and—ultimately—technological progress and improved quality of life.

In the U.S., the major agencies responsible for science funding are the National Science Foundation (NSF), the National Institutes of Health (NIH), the Department of Energy (DOE), the National Aeronautics and Space Administration (NASA), and the Department of Defense (DOD). Funding agencies have domains and priorities defined by their missions. For example, NSF funds fundamental research, NIH focuses on research related to human health, DOE supports research related to energy, and NASA funds research related to space. Table 1 lists the mission statements of selected agencies.

Table 1 . Budgets and mission statements of NSF, NIH, DOE, and NASA.

The U.S. funding agencies have excellent track records, as evidenced by the immense success that American science has enjoyed ( Graham and Diamond, 1996 ; Urquiola, 2020 ). Recently, however, the function of these essential institutions has been undergoing significant changes. There has been a broad effort to use science funding to further the “Diversity, Equity, and Inclusion” (DEI) agenda ( OSTP, 2022 ; Barabino et al., 2023 ; EO 13985 ; EO 14091 ). While the terms “diversity,” “equity,” and “inclusion” connote lofty goals with which the majority of Americans agree, a close look at what is actually implemented under the DEI umbrella reveals that these words represent something entirely different.

Actual DEI policies do not promote viewpoint diversity, equitable treatment of individuals based on their accomplishments, or equal opportunity for individuals regardless of their identity (e.g., race, sex, ethnicity). It can scarcely be questioned ( Krylov and Tanzman, 2024 ) that DEI programs today are driven by an ideology, an offshoot of Critical Social Justice (CSJ) ( Pluckrose, 2021 ; Deichmann, 2023 ). DEI programs elevate the collective above the individual. They group people into categories defined by immutable characteristics (race, sex, etc.) and classify each group as either “privileged” or “victimized,” as “oppressor” or “oppressed.” The goals of DEI programs are to have each group participate in proportion to their fraction of the population in every endeavor of society and to obtain proportionate outcomes from those endeavors. Disproportionate outcomes (with respect to science, such outcomes as publications, funding, citations, salaries, and awards), or disparities, are axiomatically ascribed to systemic factors, such as systemic racism and sexism, without consideration of alternative explanations ( Sowell, 2019 , 2023 ). Claims such as “The presence of disparities is proof of systemic racism” and “Meritocracy is a myth” are propagated widely despite the vagueness of the claims and their lack of support by concrete data. Similarly, tenets that are central to DEI ideology—such as diversity is excellence, diverse teams outperform homogeneous teams, and the advancement of women is impeded by biases—lack a robust evidence base, particularly when applied to science ( Ceci et al., 2021 , 2023 ; Abbot et al., 2023 ; Krylov and Tanzman, 2023 ; Ceci and Williams, 2024 ). Table 2 lists several examples of such axiomatic statements that funding agencies have made.

Table 2 . Exhibits of statements made by funding agencies on the benefits of DEI.

Disturbingly, CSJ is increasingly infused into every domain of the scientific enterprise—education, publishing, hiring and promotion, conferences, awards, and the allocation of funding ( Abbot et al., 2023 ). Institutions (universities, professional associations and honor societies, and publishing houses) are subordinating scientific achievement and promise to CSJ-informed practices, such as DEI initiatives.

In this commentary, we discuss how U.S. funding agencies have begun to impose DEI requirements as a prerequisite for STEMM funding without evidence that it furthers their mission or improves the research they fund. The spread of the DEI agenda is driven both by grassroots activism and by the government, mandated by executive orders ( OSTP, 2022 ; EO 13985 ; EO 14091 ). We note that funding agencies spend significant amounts of money on specific DEI initiatives, such as research on systemic “-isms” or specialized training and support for prioritized groups. However, in this commentary, we focus on how DEI has become a mandatory part of fundamental research proposals. The current approach to linking DEI considerations to funding decisions dilutes achievement- and merit-based criteria, which means that the funds are not necessarily used to support the best scientific ideas and projects. By requiring specific demographic outcomes, it also undermines the academic freedom of scientists to execute their research plans in an optimal manner. This diversion of public funds undermines science's ability to serve society. Moreover, when funding agencies use their power to further a particular political or ideological agenda, they contribute to public mistrust of science and scientific institutions. For these reasons, using science funding for promotion and adoption of ideologically driven DEI programs undermines the integrity of science funding, contributes to politicization of science, and represents a threat to society.

2 Introduction of DEI considerations into science funding decisions

Historically, U.S. funding agencies have sought to allocate funds based on the intellectual merit of the proposed ideas, the soundness of the technical plan, the feasibility of the requested budget, the alignment of the proposed research with the agency mission, and the principal investigators' (PIs) track records (Geiger, 1993; Morin, 1993; Graham and Diamond, 1996). This merit-based approach, however, is now being deemphasized. Funding agencies are introducing additional requirements not related to the proposed science, such as mandated DEI statements and plans and requirements to formulate research projects through a lens sympathetic to DEI ideology. These requirements are often vague, and the related review criteria are opaque. Introducing non-scientific review criteria implies a trade-off between scientific merit and other factors. At present, there is no concrete evidence that such changes in funding policies result in better scientific outcomes. 2 Below, we illustrate the injection of DEI considerations into scientific funding decisions with examples from the DOE, NASA, NIH, and NSF.

The DOE has introduced a requirement that every research proposal include a PIER (Promoting Inclusive and Equitable Research) plan (DOE, 2023a), which “should describe the activities and strategies that investigators and research personnel will incorporate to promote diversity, equity, inclusion, and accessibility in their research projects.... The PIER Plans will be evaluated... as part of the peer review process” (DOE, 2022). The DOE webpage “Things to Consider When Developing a PIER Plan” (DOE, 2023b) “encourages” applicants to consider the composition of the project team, including project personnel and partnering institutions; the research environment; the implementation of the research project; and the scholarly and professional growth of project personnel. “This includes but is not limited to: distribution of leadership responsibilities among project key personnel; mentoring and/or training opportunities for project personnel; equitable access of project personnel to professional development opportunities; inclusive and equitable plans for recognition on publications and presentations; inclusive practices for community engagement and strategic planning meetings or events; and/or communication of research goals and results to broader audiences.” Even proposals requesting funds to support a conference must include a PIER plan (DOE, 2023a).

NASA now requires research proposals to explain how the proposed work will “further NASA's inclusion goals.” These inclusion plans will be evaluated by panels composed of 50% scientists and 50% DEI professionals using the criteria shown in Figure 1, which, notably, require that investigators applying for funds accept as axiomatic the existence of systemic barriers preventing their research team from being inclusive. Quoting Nahm and Watkins (2023), applicants for funding are encouraged to:

• Request time or funded work effort for team members to carry out proposed IP [Inclusion Plan] activities.

• Hire IDEA [Inclusion, Diversity, Equity, and Accessibility] experts as consultants to advise the team on the proposed IP activities (consider paying them well, too!). [emphasis ours]

• Cite references to appropriate [i.e., papers purporting existence of systemic -isms and biases] literature in a references section separate from that of the S/T/M [i.e., technical] section.

• Request funds to support IP activities, such as training for the proposal team.


Figure 1. Slides from the presentation (Nahm and Watkins, 2023) describing NASA's Inclusion Plan Pilot Program (June 22, 2023).

Hence, applicants must profess the belief (one not itself supported by science) that certain systemic barriers exist, dedicate time and budget to DEI activities to reduce said barriers, provide a lengthy plan (a recommendation of the Pilot Program is to extend the page limit for Inclusion Plans), and—if the funding application is successful—report on these activities.

The NIH's BRAIN initiative has implemented a requirement that, as part of the grant application, applicants submit a “Plan for Enhancing Diverse Perspectives (PEDP)” (NIH, 2021). NIH explains, however, that by “diverse perspectives” they mean people, without regard to their scientific or scholarly perspectives. In their own words, “PEDP is a summary of strategies to advance the scientific and technical merit of the proposed project through inclusivity. Broadly, diverse perspectives refer to the people who do the research, the places where research is done, as well as the people who participate in the research as part of the study population.... Applicants are expected to show how enhancing diverse perspectives is supported throughout the application and how this strengthens the scientific and technical merit of the project (in terms of significance, investigator(s), innovation, approach, and environment)” (NIH, 2023a; emphasis ours).

Other NIH programs require similar DEI plans and reporting. For example, applicants are required to describe how their strategies for recruiting students and postdocs (“trainees,” in NIH's lingo) will increase the participation of underrepresented groups (NIH, 2023a, b). This requirement implicitly makes the talent, skills, motivation, ability to carry out research, and scientific potential of future trainees secondary to the goal of increasing diversity:

Recruitment Plan to Enhance Diversity (NOT-OD-20-031):

The applicant must provide a recruitment plan to enhance diversity. Include outreach strategies and activities designed to recruit prospective participants from diverse backgrounds, e.g., those from groups described in the Notice of NIH's Interest in Diversity. Describe the specific efforts to be undertaken by the program and how the proposed plan reflects past experiences in recruiting individuals from underrepresented groups.

New applications must include a description of plans to enhance recruitment, including the strategies that will be used to enhance the recruitment of trainees from nationally underrepresented backgrounds and may wish to include data in support of past accomplishments.

Renewal applications must include a detailed account of experiences in recruiting individuals from underrepresented groups during the previous funding period, including successful and unsuccessful recruitment strategies. Information should be included on how the proposed plan reflects the program's past experiences in recruiting individuals from underrepresented groups.

For those individuals who participated in the research education program, the report should include information about the duration of education and aggregate information on the number of individuals who finished the program in good standing. Additional information on the required Recruitment Plan to Enhance Diversity is available at Frequently Asked Questions: Recruitment Plan to Enhance Diversity (Diversity FAQs).

Applications lacking a diversity recruitment plan will not be reviewed. [Emphasis ours.]

These requirements to incorporate DEI into each research proposal are alarming. They constitute compelled speech, they undermine the academic freedom of researchers, they dilute merit-based criteria for funding, they incentivize illegal discriminatory hiring practices, they erode public trust in science, and they contribute to administrative overload. “Diversity,” which is sometimes described as “diverse backgrounds” or “diverse views,” actually refers to select underrepresented identity groups (Honeycutt, 2022; Brint, 2023; Brint and Frey, 2023).

2.1 The integration of DEI into fundamental, unrelated research is compelled speech

DEI statements are compelled speech (AFA, 2022; Brint, 2023; Brint and Frey, 2023; Kennedy, 2024). Paraphrasing Sowell (2023): Spiritual leaders encourage—funding agencies compel. As amply documented in the context of hiring (UC Berkeley, (n.d.); Abbot et al., 2023; Brint, 2023; Brint and Frey, 2023; Sailer, 2023a, b), it is not sufficient for the applicant to make a thoughtful, reasonable statement about non-discrimination in line with applicable law; rather, the statement must fully align with DEI ideology. For example, applicants' DEI statements professing a doctrine of colorblindness have been systematically given the lowest score (UC Berkeley, (n.d.); Sailer, 2024). Similar rubrics are used by universities participating in the NIH program Faculty Institutional Recruitment for Sustainable Transformation (FIRST), 3 which provides support to new faculty hires (Sailer, 2024). The guidelines these agencies provide to PIs (see, for example, Figures 1, 2; Nelson, 2022; NIH, 2022; DOE, 2023b; Nahm and Watkins, 2023) and funded proposals shared with us suggest that DEI plans must fully align with DEI ideology in order for the proposal to have any hope of success. For example, according to NASA's guidelines (Nahm and Watkins, 2023):

The assessment of the Inclusion Plan will be based on […] the extent to which the Inclusion Plan demonstrated awareness of systemic barriers to creating inclusive working environments that are specific to the proposal team . [Emphasis ours.]


Figure 2. Slides from the presentation by an NIH program officer explaining DEI requirements for NIH training grants (Nelson, 2022).

However, not all observed disparities are due to systemic discrimination or biases (Sowell, 2019; Ceci et al., 2023; Ceci and Williams, 2024). If an applicant's institution has already overcome any systemic barriers it may have had (or the applicant believes it has done so), then the applicant must lie or the proposal is doomed to fail. The requirement to write an “inclusion plan” implies that remedial action is needed. But if access to an applicant's research team is already fair and non-discriminatory, why should an applicant be required to write an inclusion plan, a plan that requires “awareness of systemic barriers specific to the proposal team” (Nahm and Watkins, 2023), in order to succeed?

The demand to provide an inclusion plan without evidence that there is a need for one is compelled speech and an intrusion of ideology into the conduct of science. Forcing scientists to “acknowledge” and “show awareness of” systemic racism and “barriers to participation” in their institutions and teams (Nahm and Watkins, 2023), even if none can be documented, misrepresents reality, is an offense to scientists who have worked hard to establish fair and transparent hiring practices in their institutions, and is inconsistent with scientific professional ethics and, indeed, the very vocation of the scientist.

Based on feedback the authors have received from federal agencies, uncritical adoption of the doctrine of systemic racism is required, even if it is entirely unrelated, or even detrimental, to the proposed project. Similar to what has been observed in faculty hiring (UC Berkeley, (n.d.); AFA, 2022; Abbot et al., 2023; Brint, 2023; Brint and Frey, 2023; Sailer, 2023a, b), DEI statements informed by a doctrine of colorblindness and equal opportunity are generally rejected as “insufficient.” Proposing educational initiatives aimed at filling gaps in underrepresented candidates' skill sets is considered “deficit-centered” and is not well-received by review panels. In contrast to the well-established requirement that PIs document a track record of successful mentoring, DEI statements must contain lengthy narratives using prescribed terminology to explain how researchers and their institutions plan to uplift underrepresented groups. This conformity to ideological language is evident in funded proposals whose abstracts are in the public record (see, for example, Bamman, (n.d.); Simon, (n.d.) 4 ). Scientists cannot propose plans to help overcome what they observe to be the real barriers to success in their field; rather, they are compelled to conform to the DEI narrative.

2.2 DEI vs. merit: participation vs. substantive outcome; equity vs. equality

The interaction of DEI ideology with merit raises serious concerns (Abbot et al., 2023). Introducing DEI plans into the evaluation of scientific proposals dilutes the criterion of intellectual merit, creating fertile ground for social engineering and corruption. Which proposal should be given priority for funding by DOE—the one demonstrating genuine promise in advancing solar energy research or the one promising to involve more female students? Should NIH fund the best ideas in cancer research or the best plans for achieving higher representation of LGBTQ+ researchers? We know from the history of totalitarian regimes that subjugated science to ideology that, when merit is diluted by other criteria, the chances that the most meritorious research is funded are diminished (Graham, 1987; Josephson, 2005).

While the goal of achieving equal opportunity is uncontroversial in the scientific community and in American society at large (Gramlich, 2023), equality of outcome—so-called “equity”—is not. In the human rights literature, the “right to science” has been interpreted as a right to benefit fairly from the outcomes of science (AAAS, (n.d.)). Policymakers can and should ensure that the benefits of scientific progress are available to all. However, participation in the process of science must be merit-based, as in any field requiring specialized skill. The focus on “participation” in science treats science as an entitlement, requiring equal participation for all groups. Previously, the paradigm for allocating funding in science was to treat science as an investment and to strive to do the best possible science for the money, focusing on scientific outcomes rather than on group participation or representation. The merit-based system has historically outperformed the equity-based system in science by a wide margin (Graham, 1987; Josephson, 2005).

2.3 Legal considerations/civil rights laws

The interaction of DEI with the legal system is troubling. First, the demands that PIs “acknowledge” systemic racism and “barriers to participation” in their institutions (Nahm and Watkins, 2023) and insert land acknowledgments in their scientific publications [NSF, (n.d.(b))] raise grave legal concerns. The First Amendment of the Constitution of the United States strictly forbids compelling people to say things they do not believe are true. The circumstances under which government may condition grants or benefits on attesting that one holds a certain belief (e.g., “acknowledges” the truth to be this or that with respect to a contested matter), though somewhat obscure, are certainly limited (Supreme Court, 2013). At a minimum, government's engaging in such conditioning on contested questions raises significant civil liberties concerns and is in tension with core First Amendment values.

Second, there are strict laws against discrimination on the basis of race and gender at both the federal and state levels. Invoking DEI thus amounts to an explicit attempt to circumvent existing law. Any actual “barriers” or “systemic discrimination” can be prosecuted under existing anti-discrimination statutes, following due process.

Third, even more worrying is that successful applications require principal investigators and their home institutions to engage in practices that are likely illegal. 5 For example, DEI “equity”-based plans for equal gender or racial participation can, in practice, only be implemented by gender- and race-preferential hiring. This is strictly illegal under civil rights employment law (EEOC, (n.d.); Title VI; Title IX).

Direct evidence of the intent of funding agencies to consider race as a factor in funding was revealed in an NIH initiative from 2021. The NIH put out a notice encouraging black scientists and those from other underrepresented groups to fill out a box for race on the funding application, which would flag their application for further consideration “even if the quality score that peer-review panels award the proposal falls outside the cutoff for most grants” (Kaiser, 2021a). The initiative has since been rescinded (Kaiser, 2021b), but NIH continues to emphasize that “diversity of the teams” is an asset in funding decisions. This creates a moral dilemma for scientists of “diverse” ancestry, as explained by Professor Kevin Williams of Temple University:

Do I deserve to jump the line? If I say yes, I may play a leading role in ending the scourge of atherosclerosis—also known as hardening of the arteries. If I play fair, I may lose the opportunity to save people around the world from heart attacks and strokes. I'm angry at the National Institutes of Health for putting me in this position. I'm even angrier it has done so in the name of racial equity....

If I refuse to identify myself as African-American, our application is more likely to lose on “diversity” grounds. It's a double wrong. Not only is the system rigged based on nonscientific—and possibly illegal—criteria; it encourages me to join in the rigging.

Truth be told, I made my decision years ago. When my study team files our application, it won't note my West African origins. If we don't get the grant, so be it. I refuse to engage in a moral wrong in pursuit of a moral good—even one as important as saving lives from the leading killer on earth. My father, who struggled against racism to achieve so much on the merits of his own work, would never forgive me for “checking the box” to grab a race-based advantage.

And no matter what happens, I can never forgive the National Institutes of Health for reinjecting racism into medical research ( Williams, 2024 ).

Funding agencies attempt to circumvent the laws prohibiting them from basing funding decisions on race or ethnicity by cloaking DEI requirements in nebulous language (NIH, 2019; Renoe, 2023) and by disguising racial preferences and even quotas as “diversity of backgrounds” and unequal treatment as “broadening participation of underrepresented groups.” The determination of which groups to treat as underrepresented and worthy of special treatment is highly subjective, as Americans hold many identities and can be split up in a multitude of ways. In practice, implementing equity-focused DEI programs means preferring members of some groups over others (Kendi, 2019). To paraphrase Orwell, all groups are equal, but some groups are more equal than others (Orwell, 1945).

The evaluations of submitted DEI plans are not open to public scrutiny. Agencies run diversity-focused programs but refuse to give guidance on how to determine eligibility for them; they are careful to state that compliance with all applicable employment laws is the responsibility of the host institution. However, DEI metrics, which must be reported annually to the funding agency, are criteria for renewal (NIH, 2023b). It remains unclear how a principal investigator is supposed to be non-discriminatory in hiring and at the same time fulfill de facto DEI quotas for renewal. In this way, programs are developed that are de jure “open to everyone” but de facto allocated according to identity metrics, reminiscent of the pre-civil rights era in the U.S.

The extensive collection of demographic information by the funding agencies is also concerning. For example, the portal for managing NSF grants (grants.gov) asks users (prospective and funded PIs) to report their demographic details (see Figure 3). The stated purpose of this request is “to gauge whether our programs and other opportunities in science and technology are fairly reaching and benefiting everyone regardless of demographic category; and to ensure that those in under-represented groups have the same knowledge of and access to programs, meetings, vacancies, and other research and educational opportunities as everyone else.” However, the cited Privacy Act statement (Plimpton, 2020) does not indicate that the collected information will be used only in aggregated form or that demographic data of the PIs will be hidden from program officers. On the contrary, it states, “The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process” (Plimpton, 2020). The NSF Privacy Act notice (NSF, 2018) clarifies that the purpose of the online portal system is to “provide a dashboard for administrators to easily manage NSF system roles for their organizations” and to “provide demographic data that NSF may track over time in order to review and evaluate NSF programs.”


Figure 3. Screenshot of the grants.gov website used to submit proposals to NSF and to manage reporting of funded grants. PIs are required to complete their demographic profile before they are allowed to use the website.

2.4 Politicization of science erodes public trust

DEI politicizes science, which erodes public trust in scientific institutions, scientists, and the entire scientific enterprise.

Public trust in institutions is essential for a functioning democracy. Science funding ultimately depends on the goodwill of the voting and tax-paying public. When federal funding agencies infuse political agendas into their function, they contribute to public mistrust of the process by which science is funded. When universities become complicit by subjugating their mission of truth seeking to ideologically driven DEI programs, they contribute to public mistrust of scientific institutions (Kennedy and Tyson, 2023). Should the public withdraw its support for science, loss of funding will ultimately ensue, with attendant detrimental consequences for the nation.

The politicization of science by DEI also erodes the trust in scientists and the scientific enterprise itself (Kahan, 2010, 2015) that is required for experts, the public, and legislators to work together effectively to solve pressing problems, such as climate, energy, and pandemics. Mistrust of science also provides fertile ground for science denial, conspiracy theories, and political opportunism.

2.5 Administrative overload

Scientists are already overwhelmed with administrative tasks. Time spent writing a winning DEI statement is time not spent thinking about how to solve a difficult scientific problem. Mervin Kelly, Director of Bell Labs, one of the most innovative and impactful US research institutions of the 20th century, famously stated about the work environment:

We give much attention to the maintenance of an atmosphere of freedom and an environment stimulating to scholarship and scientific research interest. It is most important to limit [the scientist's] work to that of research (Georgescu, 2022).

Yet, by some estimates, top researchers today spend more than 50% of their time writing grant applications (Belluz et al., 2016), many of which comprise hundreds of pages of federally required documents and forms, most of them unrelated to the science being proposed. DEI statements and reporting requirements add to that burden. These statements and reports are difficult and time-consuming to write, as the applicant is expected to use prescribed, often convoluted DEI terminology, reflecting, for example, on how “inclusion” differs from “diversity” (Nahm and Watkins, 2023). An example of a successful DEI plan from a funded NIH proposal gives an idea of what is expected of PIs (Bamman, (n.d.)).

For some NIH training grants, multilayered diversity plans (Nelson, 2022) are required for the mentoring faculty, for the leadership team, and for the program's directors. In addition, NIH requires DEI plans for recruitment of scholars (NIH, 2022, 2023b) and mandates diversity training for mentors—“training the trainers”—and a plan for how the scholars themselves can mentor other students to propagate DEI [NIH, (n.d.(a)), 2024b; Nelson, 2022]. Even scholars who are underrepresented themselves, and who sometimes have overcome significant hardship, must articulate how they plan to promote DEI, at the expense of focusing on their own research plans.

2.6 Why is this happening?

Why are funding agencies participating in activities that are arguably unrelated to their stated missions, and in important respects even undermining them? The DEI movement has undeniable grassroots components, comprising both sincere, activist scholars and cynical opportunists who use DEI to advance their careers. But these elements alone cannot explain why funding agencies have so radically veered from their original missions.

In fact, the mandate that funding agencies implement DEI comes directly from the White House. Executive Order 13985, titled “Advancing Racial Equity and Support for Underserved Communities Through the Federal Government,” directed all federal agencies to allocate resources to DEI and to incorporate “equity” into their decision making as a matter of principle (EO 13985).

EO 13985 begins by acknowledging that equal opportunity is a bedrock of American democracy and that historic injustices have denied this equal opportunity to certain groups and individuals. It cites existing gaps and inequalities:

For purposes of this order: (a) The term “equity” means the consistent and systematic fair, just, and impartial treatment of all individuals, including individuals who belong to underserved communities that have been denied such treatment, such as Black, Latino, and Indigenous and Native American persons, Asian Americans and Pacific Islanders and other persons of color; members of religious minorities; lesbian, gay, bisexual, transgender, and queer (LGBTQ+) persons; persons with disabilities; persons who live in rural areas; and persons otherwise adversely affected by persistent poverty.

If “consistent and systematic fair, just, and impartial treatment of all individuals” means equality of opportunity and equitable treatment of people's accomplishments based on their merit, we're all for it. However, the Order goes on to make clear that the goal is not to achieve equal opportunity and equitable treatment, but to achieve equal outcomes for identity groups. The Order conflates racism in the past with disparities in the present and equitable treatment with equal outcomes. It attributes unequal participation in the present to alleged discrimination in the present. It charges the Domestic Policy Council with the task “[of] remov[ing] systemic barriers,” thus implicitly asserting the existence of such barriers in the present. It calls for “redress[ing] inequities,” “affirmatively advancing equity,” and “allocating Federal resources in a manner that increases investment in underserved communities, as well as individuals from those communities.” Whatever is to be said about such goals in relation to, say, social welfare programs, we question their value and appropriateness for science funding.

The American public generally agrees with equal opportunity as a goal, but not with identity-group-based preferences (Gramlich, 2023). The courts have held the same: equal opportunity has been mandated by civil rights law since the 1970s, and identity-group-based preferences in college admissions were struck down by the Supreme Court in 2023. But EO 13985 turns the argument that such preferences are unfair on its head, claiming instead that they are a prerequisite for “equal opportunity.” In this framing, supporting equal opportunity tacitly requires supporting race- or identity-based preferential treatment.

The words “merit” and “excellence,” and related words such as scientific “achievement” and “accomplishment,” are conspicuously absent from the six-page Order (EO 13985). It is clear that the EO does not call for equal recognition of equal merit (achievement, accomplishment, promise) but aims to give preferential treatment to specific identity groups (listed above) in an attempt to close achievement gaps and redress past injustices.

The theme that “bias, discrimination, and harassment plague the science and technology ecosystem, from school to workforce and beyond” is continued in the official statement “Equity and Excellence: A Vision to Transform and Enhance the U.S. STEMM Ecosystem,” issued by the Office of Science and Technology Policy (OSTP, 2022). Citing unequal outcomes in the distribution of research funding—e.g., that black PIs were funded at a lower rate than white PIs—the document calls for sustained intervention to “close the [research] funding gap and support researchers and communities who have been historically excluded from access to key resources.” The document is replete with DEI vocabulary (“equitable access and outcomes,” “holistic support,” “systemic barriers—including bias, racism, sexism, ableism, exclusion, discrimination, cultural disincentives,” “structural barriers”). Again, there is no mention of merit or the like.

The goal of promoting “equity” in science is reinforced in Executive Order 14091 (EO 14091). Titled “Further Advancing Racial Equity and Support for Underserved Communities Through the Federal Government,” it explains how equity is to be implemented in various domains and specifically calls for the “promot[ion] [of] equity in science.” It lays out specific DEI requirements for federal agencies, including NASA and NSF, such as the following:

The Administrator of the National Aeronautics and Space Administration, the Director of the National Science Foundation [...] (agency heads) shall, within 30 days of the date of this order, ensure that they have in place an Agency Equity Team within their respective agencies to coordinate the implementation of equity initiatives and ensure that their respective agencies are delivering equitable outcomes for the American people.

These orders do not mandate equitable treatment of applicants based on achievement and promise, i.e., merit; rather, they mandate that research funding be distributed “equitably”—i.e., proportionally to demographic representation—among identity groups.

The foregoing EOs are enforced by the Office of Management and Budget (OMB), which conducts DEI audits “in partnership” with each agency and makes recommendations to address DEI concerns. At the same time, OMB plays a role in determining both the level of funding for each agency and how funds are allocated to programs within each agency [NSF, (n.d.(a))]. OMB thus has the power to allocate science funding on the basis of DEI compliance, potentially at the expense of scientific merit.

As ordered, the funding agencies have responded to these government mandates by establishing DEI offices, rolling out DEI initiatives, instituting DEI plans, and introducing DEI reporting requirements [NIH, (n.d.(b)); NSF, (n.d.(b)), 2022; NASA, 2022; DOE, 2023a, c]. The DOE Equity Action Plan (DOE, 2023a), for example, pledges to “increase participation by individuals and institutions that are underrepresented in DOE's research and development (R&D) programs supported through financial assistance.” It proposes to “expand Tribal engagement and stakeholder engagement across DOE.” In a truly Orwellian manner, DOE promises to “update the DOE Merit Review Program to improve equitable outcomes for DOE awards” (DOE, 2023c).

The NSF Equity Plan [NSF, (n.d.(b))] includes activities and steps “addressing sexual and other forms of harassment, optimizing demographic data collection in support of equity assessments, increasing participation of disadvantaged entities [including Minority Serving Institutions (MSIs)], on Federal Acquisition Regulation-based solicitation and awards, and removing barriers to enhanced participation by indigenous and Native American communities.” Among specific steps, NSF now requires researchers who use selected astronomical facilities to include land acknowledgments in their work [NSF, (n.d.(b))]. As an example of how the plan is being implemented, teams applying for Centers for Chemical Innovation grants must now include a Diversity and Inclusion Plan (recommended length 2 pages) “to ensure a diverse and inclusive center environment, including researchers at all levels, leadership groups, and advisory groups,” in addition to a mandatory broader impact section (recommended length 8 pages) that should include plans for broadening participation by underrepresented groups (NSF, 2023).

NIH's activities toward advancing racial equity [NIH, (n.d.(b))] include an invitation to “Take the Pledge,” which includes committing to an idea that “equity, diversity, and inclusion drives success,” “setting up a consultation with an EDI [DEI] liaison,” and “ordering the ‘EDI Pledge Poster' (or … creat[ing] your own) for your space and hav[ing] your team sign it” [NIH, (n.d.(c))].

2.7 Who or what does DEI benefit?

It is not clear how science is supposed to benefit from the imposition of DEI ideology and programs on funding decisions. To our knowledge, there have been no quantitative studies demonstrating that any DEI intervention has increased the quality or quantity of scientific output (see text footnote 2). On the other hand, one group that clearly benefits is the DEI bureaucracy itself: specialized hires within agencies 6 and universities, and highly paid DEI experts and consultants. 7 Not unlike lobbyists, DEI experts advise agency staff to create positions for themselves or their colleagues in winning grant proposals (Nahm and Watkins, 2023). Since DEI statement requirements are nebulous and confusing, the solution, unsurprisingly, is to hire a well-paid consultant [NIH, (n.d.(c)); Nahm and Watkins, 2023]. Some agencies, such as NASA, even make the inclusion of paid professional DEI consultants in the project mandatory (Nahm and Watkins, 2023; see Figure 1, “pay them well”). These highly paid consultants often have no expertise in the conduct of science. Hiring them requires administrative effort and diverts significant funding away from science. NSF also recommends allocating 5–10% of the total budget to “broader impact” activities, which heavily emphasize DEI (Renoe, 2023). This continues an unfortunate trend in federally funded research of prioritizing documentation, compliance, and activities that do not contribute to actual scientific work over innovation. Quoting from a brief submitted to the Canadian House of Commons by 40 university professors, DEI “is self-perpetuating, has no end goal, and uses flawed metrics” (Horsman et al., 2024).

2.8 DEI in perspective

Although our commentary focuses narrowly on the current U.S. funding scene, the spread of DEI ideology and its intrusion into science has been limited neither geographically nor temporally. Abbot et al. (2023) provided examples of current DEI-informed policies in Europe, the United Kingdom, Canada, Australia, and New Zealand [see also a brief submitted to the Canadian House of Commons by 40 academics calling for the abolition of “costly and inequitable” DEI initiatives (Horsman et al., 2024)]. Historically, ideological control of the scientific enterprise has been practiced by totalitarian regimes, notably Maoist China and Soviet Russia, with disastrous consequences for science and technology in both countries (Graham, 1987; Josephson, 2005; Krylov, 2021). Identity-based social engineering was practiced to the extreme in Nazi Germany, where universities were purged of non-Aryans by government decree (Deichmann, 1999, 2023). In the USSR, professional advancement was conditioned on class (favoring the proletariat and disfavoring the educated “intelligentsia”), ethnicity (e.g., participation of Jews was limited by quotas), and ideological compliance (Graham, 1987; Alexandrov, 2002; Shifman, 2005; Gruntman, 2022; Krylov, 2022).

To illustrate the pernicious effect of ideological control of science, consider Lysenkoism, a dark period in Soviet science when the field of genetics was denounced as a “bourgeois pseudoscience” and scores of scientists were fired and jailed, with many, including the brilliant biologist Nikolai Vavilov, perishing in the Gulag (Graham, 1987; Kolchinsky, 2014; Reznik, 2017; Deichmann, 2023). The key player in this devastating attack on science was Trofim Lysenko, a poorly educated agronomist who promoted a number of scientifically flawed ideas, such as the rejection of genes and the belief in the complete malleability of phenotypes. He succeeded in destroying his scientific opponents not by beating them in scientific debate, but by securing the support of the Communist Party, including Stalin himself. Lysenko's rise to power was due in part to his pedigree: because he came from a family of poor peasants, he was the poster child of a “people's scientist.” He did not learn to read until age 14, and the Soviet press lovingly called him the “barefoot scientist.” In contrast, his main opponent, Nikolai Vavilov, was suspect because of his class (the “intelligentsia”). This was official Party policy: to rapidly promote members of the proletariat into leadership positions in agriculture, science, and industry. Lysenko also formed an alliance with a Marxist ideologue, Isaak Prezent, who cleverly used philosophical arguments, such as claiming that genetics contradicted Marxist-Leninist doctrine.

Lysenko and his supporters destroyed the entire field of genetics in the USSR, suppressing research there for more than a decade. Lysenko's bogus science was used to introduce flawed agricultural practices on a large scale, causing devastating famines in the USSR and China.

3 Vision/summary

Science is a national and, indeed, global public good that has afforded us an unprecedented standard of living and wellbeing, the eradication of diseases, healthier lives, and increased lifespans. Although the benefits have not been shared equally, scientific progress has benefitted all members of society, including members of marginalized groups.

Science is also an essential component of a well-functioning democracy. By providing a foundation for technological developments, science is instrumental in maintaining America's global competitiveness and its national security (Deift et al., 2021; Eaglen, 2023; Luckenbauch, 2023). Funding for science is limited: about 3% of GDP in the US. It is imperative that this funding be used efficiently and effectively to advance knowledge, produce innovation, maintain national security, improve health, deal with unforeseen crises and challenges, and find solutions to environmental and other problems.

Science and ideology are fundamentally different things. Science, by its nature, questions assumptions; to flourish, it requires freedom of inquiry and the free exchange of ideas. Ideology is hostile to such freedoms. Historically, when ideology has invaded science, stagnation and collapse have ensued (Graham, 1987; Josephson, 2005; Krylov, 2021).

In order to be effective in advancing knowledge, delivering innovation, and serving humankind, science must:

• Operate free from ideology and politicization in a climate of open inquiry.

• Center the autonomy of researchers, acknowledging their expert role, allowing them to do what they do best, and minimizing time spent on administrative, compliance, and reporting duties and so-called “broader impact” activities.

Systemic disparities in opportunity, especially those related to socio-economic status, are real and well-documented. A solid family structure, access to healthcare, good nutrition, an environment free from violence and drugs, and high-quality preschool and K−12 education are necessary to nurture the next generation of scientists, but they are not equally available to all Americans. Rather than attempt to institute “equity” by mandating proportional participation through the manipulation of grant funding, we believe that increased efforts should be made to promote equality of opportunity as early in people's lives as possible, so that young people who aspire to a place in any field, including scientific fields, can succeed on merit (Abbot et al., 2023, 2024; Loury, 2024).

DEI initiatives such as those related to grant funding have taken the place of efforts to investigate and solve the underlying issues leading to inequities: the root causes that prevent all Americans from achieving their potential. DEI is based on the fallacy that a fair and equitable society can be achieved by mandating proportional participation in a highly competitive, achievement-based activity such as science (Sowell, 2023). Indeed, some DEI efforts have been outright harmful to the very groups they purport to uplift. For example, in the name of “equity,” public school K−12 math curricula have been systematically dismantled (Deift et al., 2021; Evers and Hofer, 2023), and this most strongly disadvantages high-achieving students from families that cannot afford private schools (McWhorter, 2021): potential future scientists with minority backgrounds, precisely those whom DEI efforts in science purportedly aim to help. Likewise, identity-group preference programs have harmed minority students admitted to universities that did not match their level of preparation (Heriot and Schwarzchild, 2021; Sowell, 2023).

Observed disparities in participation in the scientific enterprise should be systematically investigated and analyzed for their root causes before concluding that they are the result of discrimination (Sowell, 2019). When discrimination is identified, it should be remedied by enforcing existing civil rights laws. Attempting to fix disparities by social engineering is ineffective, unfair, and potentially illegal. As Chief Justice Roberts stated:

The way to stop discrimination on the basis of race is to stop discriminating on the basis of race,

a view fundamentally opposite to the CSJ approach, as succinctly expressed by author Ibram X. Kendi:

The only remedy to racist discrimination is antiracist discrimination. The only remedy to past discrimination is present discrimination. The only remedy to present discrimination is future discrimination (Kendi, 2019).

We recommend that all federal funding agencies focus on their primary role of generating the maximum public good for the public funds spent, and not take on the role of promoting any ideological agenda. Funding agencies should abolish DEI requirements and focus instead on funding proposals based on their merits. Strengthening K−12 education and merit-based practices is the path toward equal opportunity, fair distribution of resources, and the best science, to the benefit of all (McCloskey, 2016; Wooldridge, 2021; Abbot et al., 2023, 2024).

Author contributions

IE: Investigation, Writing – review & editing. JF: Conceptualization, Investigation, Writing – review & editing. RG: Investigation, Writing – review & editing. AK: Conceptualization, Investigation, Writing – original draft, Writing – review & editing. LM: Investigation, Writing – review & editing. JS: Conceptualization, Investigation, Writing – original draft, Writing – review & editing. JT: Conceptualization, Investigation, Writing – original draft, Writing – review & editing. AT: Investigation, Writing – review & editing.

Funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. JS received funding from the Foundation for Individual Rights and Expression and from Heterodox Academy.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

1. ^ CSJ is rooted in Critical Theories and postmodernism (Lyotard, 1984; Delgado and Stefancic, 2001; Aylesworth, 2015; Celikates and Flynn, 2023; Britannica, 2024).

2. ^ We make this statement with some authority, as several of us are involved in a comprehensive review of the literature relevant to the claim that diverse research teams produce better science.

3. ^ From the program description: “The NIH Common Fund's Faculty Institutional Recruitment for Sustainable Transformation (FIRST) program aims to enhance and maintain cultures of inclusive excellence in the biomedical research community. ‘Inclusive excellence' refers to cultures that establish and sustain scientific environments that cultivate and benefit from a full range of talent. NIH aims to facilitate institutions in their building a self-reinforcing community of scientists, through recruitment of a critical mass of early-career faculty who have a demonstrated commitment to inclusive excellence. The program also seeks to have a positive impact on faculty development, retention, progression, and eventual promotion, as well as develop inclusive environments that are sustainable.” (NIH, 2024a; emphasis ours).

4. ^ From NURTURE: Northwestern University Recruitment to Transform Under-Representation and achieve Equity : “NURTURE aims to disrupt systemic barriers that impede full participation of biomedical research scientists from underrepresented groups (URG) by investing in inclusive cultural change within our institution. We acknowledge that systemic racism has persisted in biomedical science, including at Northwestern. We are committed to dismantling the structures that have allowed racism and bias to persist and impeded the scientific careers of too many URG scholars. NURTURE proposes to transform siloed fiefdom structures to transdisciplinary Scientific Neighborhoods that will foster growth and accomplishment in the research, career, and personal trajectories of URG faculty.” (Simon, (n.d.)).

5. ^ See, for example, statements from the abstracts of proposals funded by the NIH FIRST program that explicitly plan hiring based on minority status (NIH, 2024a): “Cornell FIRST will support the hiring and retention of 10 new assistant professors from groups underrepresented in their fields, while transforming institutional climate into a culture of inclusive excellence.... Cornell's FIRST program features interdisciplinary hiring of faculty underrepresented in their fields.” (Cornell FIRST). “The overall goal of the University of California San Diego (UCSD) NIH... FIRST... Program is to:... promote institutional excellence by hiring a diverse cohort of faculty.... The objectives are to:... conduct recruitment of new faculty, outline institutional commitments, and develop recruitment committees based on commitments to diversity, equity and inclusion.” (UCSD FIRST). “This new model will seek to hire three cohorts of underrepresented scholars, who are committed to diversity in the academy.” (University of Maryland FIRST).

6. ^ To “embed and integrate Diversity, Equity, Inclusion, and Accessibility (DEIA) into NSF's policies, practices, and culture,” NSF allocated two million dollars and created a DEIA implementation team of 25 staffers, including a newly created top-level position of Chief Diversity Officer (see Appendix B in NSF, 2022).

7. ^ According to salary.com, the average salary of a DEI director in California is $227,773. DEI consultant hourly rates vary, with a median of $53/h (according to ZipRecruiter.com). The NOVA Collective, a Chicago-based DEI company, lists the prices for its services on its website: single instructor-led training sessions cost $500–10,000, e-learning modules $200–5,000, and keynotes $1,000–30,000. Monthly consulting retainers cost between $2,000 and $20,000, and single consulting deliverables cost $8,000–50,000.

AAAS (n.d.). Rights to do Science: FAQs . Washington, DC: American Association for the Advancement of Science. Available online at: https://www.aaas.org/programs/scientific-responsibility-human-rights-law/resources/faqs (accessed April 10, 2024).


Abbot, D., Bikfalvi, A., Bleske-Rechek, A. L., Bodmer, W., Boghossian, P., Carvalho, C. M., et al. (2023). In defense of merit in science. J. Controver. Ideas 3, 1–26. doi: 10.35995/jci03010001


Abbot, D. S., Marinovic, I., Lowery, R., and Carvalho, C. (2024). “Merit, fairness, and equality” in Book II: How to Keep Free Inquiry Alive of the Free Inquiry Papers , eds. R. Maranto, S. Satel, C. Salmon, and L. Jussim (Washington, DC: AEI Press).

AFA (2022). AFA Calls for an End to Required Diversity Statements . Princeton, NJ: Academic Freedom Alliance. Available online at: https://academicfreedom.org/afa-calls-for-an-end-to-required-diversity-statements/ (accessed June 1, 2024).

AIP (n.d.). Federal Science Budget Tracker . Washington, DC: American Institute of Physics. Available online at: https://ww2.aip.org/fyi/federal-science-budget-tracker


Alexandrov, D. A. (2002). “Sovietization of higher education and establishment of the Soviet system of research institutes” in Behind the Iron Curtain: Myths and Reality of Soviet Science , eds. E. I. Kolchinsky and M. Heinemann (St. Petersburg: Bulanin), 528 p. In Russian.

Aylesworth, G. (2015). “Postmodernism” in The Stanford Encyclopedia of Philosophy (Spring 2015 Edn) , ed. E. N. Zalta (Department of Philosophy; Stanford University; Library of Congress Catalog Data). Available online at: https://plato.stanford.edu/entries/postmodernism

Bamman, M. M. (n.d.). Recruitment and Retention Plan to Enhance Diversity . Birmingham, AL: University of Alabama at Birmingham. Available online at: https://www.uab.edu/ccts/images/T_Awards/Renamed_Files/T32-NICHD-RECRUITMENT_RETENTION_DIVERSITY-BAMMAN.pdf

Barabino, G. A., Fiske, S. T., Scherer, L. A., and Vargas, E. A., (eds). (2023). Advancing Antiracism, Diversity, Equity, and Inclusion in STEMM Organizations . Washington, DC: National Academies Press. Available online at: https://nap.nationalacademies.org/catalog/26803/advancing-antiracism-diversity-equity-and-inclusion-in-stemm-organizations-beyond

Belluz, J., Plumer, B., and Resnick, B. (2016). The 7 Biggest Problems Facing Science, According to 270 Scientists . Vox. Available online at: https://www.vox.com/2016/7/14/12016710/science-challeges-research-funding-peer-review-process

Brint, S. (2023). Response to commentaries offered on the CSHE ROPS Is the University of California Drifting Toward Conformism? The Challenges of Representation and the Climate for Academic Freedom by Steven Brint and Komi Frey. Research & Occasional Paper Series: CSHE.12.2023 . Berkeley, CA: Center for Studies in Higher Education. Available online at: https://escholarship.org/uc/item/4s88j3t6/ (accessed May 6, 2024).

Brint, S., and Frey, K. (2023). Is the University of California Drifting Toward Conformism? The Challenges of Representation and the Climate for Academic Freedom. Research & Occasional Paper Series: CSHE.5.2023 . Berkeley, CA: Center for Studies in Higher Education. Available online at: https://escholarship.org/uc/item/3pt9m168 (accessed May 6, 2024).

Britannica (2024). “Critical race theory” in The Encyclopedia Britannica . Chicago, IL: The Britannica Group. Available online at: https://www.britannica.com/topic/critical-race-theory

Ceci, S. J., Kahn, S., and Williams, W. M. (2021). Stewart-Williams and Halsey argue persuasively that gender bias is just one of many causes of women's underrepresentation in science. Eur. J. Pers. 35:40. doi: 10.1177/0890207020976778

Ceci, S. J., Kahn, S., and Williams, W. M. (2023). Exploring gender bias in six key domains of academic science: an adversarial collaboration. Psychol. Sci. Public Int. 24, 15–73. doi: 10.1177/15291006231163179


Ceci, S. J., and Williams, W. M. (2024). Are claims of fairness toward women in the academy “manufactured”? The risk of basing arguments on incomplete data. Sex. Cult. 28, 1–20. doi: 10.1007/s12119-023-10133-8

Celikates, R., and Flynn, J. (2023). “Critical theory (Frankfurt school)” in The Stanford Encyclopedia of Philosophy, Winter 2023 Edn , eds. E. N. Zalta, and U. Nodelman (Palo Alto, CA: Department of Philosophy; Stanford University; Library of Congress Catalog Data). Available online at: https://plato.stanford.edu/entries/critical-theory/

Deichmann, U. (1999). The expulsion of Jewish chemists and biochemists from academia in Nazi Germany. Perspect. Sci. 7, 1–86. doi: 10.1162/posc.1999.7.1.1

Deichmann, U. (2023). Science, race, and scientific truth, past and present. Eur. Rev. 31, 459–478. doi: 10.1017/S1062798723000200

Deift, P., Jitomirskaya, S., and Klainerman, S. (2021). As US Schools Prioritize Diversity Over Merit, China is Becoming the World's STEM Leader. Sydney, NSW: Quillette. Available online at: https://quillette.com/2021/08/19/as-us-schools-prioritize-diversity-over-merit-china-is-becoming-the-worlds-stem-leader/

Delgado, R., and Stefancic, J. (2001). Critical Race Theory: An Introduction . New York, NY: New York University Press.

DOE (2022). Everyone has a Role to Play in Making Science More Equitable and Inclusive . Washington, DC: Office of Science, U.S. Department of Energy. Available online at: https://www.energy.gov/science/articles/everyone-has-role-play-making-science-more-equitable-and-inclusive

DOE (2023a). Promoting Inclusive and Equitable Research (PIER) Plans . Washington, DC: Office of Science, U.S. Department of Energy. Available online at: https://science.osti.gov/grants/Applicant-and-Awardee-Resources/PIER-Plans

DOE (2023b). Things to Consider When Developing a PIER Plan . Washington, DC: Office of Science, U.S. Department of Energy. Available online at: https://science.osti.gov/grants/Applicant-and-Awardee-Resources/PIER-Plans/Things-to-Consider-When-Developing-a-PIER-Plan

DOE (2023c). DOE Equity Action Plan . Washington, DC: Office of Energy Justice and Equity, U.S. Department of Energy. Available online at: https://www.energy.gov/justice/doe-equity-action-plan

Eaglen, M. (2023). 10 Ways the US is Falling Behind China in National Security . Washington, DC: American Enterprise Institute. Available online at: https://www.aei.org/research-products/report/10-ways-the-us-is-falling-behind-china-in-national-security/

EEOC (n.d.). Prohibited Employment Policies/Practices . Washington, DC: U.S. Equal Employment Opportunity Commission. Available online at: https://www.eeoc.gov/prohibited-employment-policiespractices

EO 13985 (Exec. Order No. 13985). Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. 86 Fed. Reg. 7009. Available online at: https://www.federalregister.gov/documents/2021/01/25/2021-01753/advancing-racial-equity-and-support-for-underserved-communities-through-the-federal-government (accessed January 20 2021).

EO 14091 (Exec. Order No. 14091). Further Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. 88 Fed. Reg. 10825. Available online at: https://www.federalregister.gov/documents/2023/02/22/2023-03779/further-advancing-racial-equity-and-support-for-underserved-communities-through-the-federal (accessed February 16 2023).

Evers, W. M., and Hofer, J. (2023). Unraveling the Algebra II-Data Science Debate in California K−12 Schools . Oakland, CA: The Beacon. Available online at: https://blog.independent.org/2023/07/19/data-science-algebra-ii-debate-california-k-12-schools/

Geiger, R. L. (1993). Research and Relevant Knowledge: American Research Universities since World War II . New York, NY: Oxford University Press.

Georgescu, I. (2022). Bringing back the golden days of Bell Labs. Nat. Rev. Phys. 4, 76–78. doi: 10.1038/s42254-022-00426-6

Graham, L. R. (1987). Science, Philosophy, and Human Behavior in the Soviet Union . New York, NY: Columbia University Press.

Graham, L. R., and Diamond, N. (1996). The Rise of the American Research Universities . Baltimore, MD: Johns Hopkins University Press.

Gramlich, J. (2023). Americans and Affirmative action: How The Public Sees the Consideration of Race in College Admissions, Hiring . Washington, DC: Pew Research Center. Available online at: https://www.pewresearch.org/short-reads/2023/06/16/americans-and-affirmative-action-how-the-public-sees-the-consideration-of-race-in-college-admissions-hiring/

Gruntman, M. (2022). My 15 Years at IKI, the Space Research Institute . California: Interstellar Trail Press.

Heriot, G., and Schwarzchild, M., (eds.). (2021). A Dubious Expediency: How Race Preferences Damage Higher Education . New York, NY: Encounter Books.

Honeycutt, N. (2022). Manifestations of Political Bias in the Academy (Ph. D. Thesis). Rutgers University Department of Psychology. Rutgers University, New Brunswick, NJ.

Horsman, G., Haskell, D., Patterson, Z., Lupker, S., Krauss, L. M., Kramar, K., et al. (2024). Submission to the Standing Committee on Science and Research . Available online at: https://www.ourcommons.ca/Content/Committee/441/SRSR/Brief/BR13156171/br-external/Jointly02-067-240604-004-e.pdf

Josephson, P. R. (2005). Totalitarian Science and Technology, 2nd Edn . New York, NY: Humanity Books, an imprint of Prometheus Books.

Kahan, D. M. (2010). Fixing the communications failure. Nature 463, 296–297. doi: 10.1038/463296a

Kahan, D. M. (2015). What is the “Science of Science Communication”? J. Sci. Comm. 14, 1–10. doi: 10.22323/2.14030404

Kaiser, J. (2021a). NIH institutes try new approach to supporting Black scientists. Science 374:15. doi: 10.1126/science.acx9216

Kaiser, J. (2021b). NIH pulls notice aimed at encouraging applications from Black scientists. Science. doi: 10.1126/science.acx9522

Kendi, I. X. (2019). How to Be an Antiracist . New York, NY: One World.

Kennedy, B., and Tyson, A. (2023). Americans' Trust in Scientists, Positive Views of Science Continue to Decline . Washington, DC: Pew Research Center. Available online at: https://www.pewresearch.org/science/2023/11/14/americans-trust-in-scientists-positive-views-of-science-continue-to-decline/

Kennedy, R. L. (2024). Mandatory DEI Statements are Ideological Pledges of Allegiance. Time to Abandon Them . Cambridge, MA: The Harvard Crimson. Available online at: https://www.thecrimson.com/column/council-on-academic-freedom-at-harvard/article/2024/4/2/kennedy-abandon-dei-statements/

Kolchinsky, E. I. (2014). Nikolai Vavilov in the years of Stalin's ‘Revolution from Above' (1929–1932). Centaurus 56:330. doi: 10.1111/1600-0498.12059

Krylov, A. I. (2021). The peril of politicizing science. J. Phys. Chem. Lett. 12, 5371–5376. doi: 10.1021/acs.jpclett.1c01475

Krylov, A. I. (2022). From Russia with Love: Science and Ideology Then and Now . Heterodox STEM. Available online at: https://hxstem.substack.com/p/from-russia-with-love-science-and

Krylov, A. I., and Tanzman, J. (2023). Critical social justice subverts scientific publishing. Eur. Rev. 31, 527–546. doi: 10.1017/S1062798723000327

Krylov, A. I., and Tanzman, J. (2024). “Fighting the good fight in an age of unreason—A new dissident guide” in Book II: How to Keep Free Inquiry Alive of the Free Inquiry Papers , eds. R. Maranto, S. Satel, C. Salmon, and L. Jussim (Washington, DC: AEI Press).

Loury, G. (2024). Culture Matters . Glenn Loury substack. Available online at: https://glennloury.substack.com/p/culture-matters

Luckenbauch, J. (2023). U.S. Falling Behind China in Critical Tech Race, Report Finds. National Defense. Available online at: https://www.nationaldefensemagazine.org/articles/2023/7/17/us-falling-behind-china-in-critical-tech-race-report-finds

Lyotard, J. F. (1984). The Postmodern Condition: A Report on Knowledge . Minneapolis, MN: University of Minnesota Press.

McCloskey, D. N. (2016). The great enrichment: a humanistic and social scientific account. Soc. Sci. Hist. 40, 583–98. doi: 10.1017/ssh.2016.23

McWhorter, J. (2021). Is it Racist to Expect Black Kids to do Math for Real? It Bears Mentioning. Available online at: https://johnmcwhorter.substack.com/p/is-it-racist-to-expect-black-kids

Morin, A. (1993). Science Policy and Politics . Prentice Hall.

Nahm, A. L., and Watkins, R. N. (2023). Explore Science: Inclusion Plans in Research Proposals. Slide Presentation . Washington, DC: NASA. Available online at: https://smd-cms.nasa.gov/wp-content/uploads/2023/08/12-inclusionplans-nahm-watkins.pdf (accessed June 22, 2023).

NASA (2022). NASA Releases Equity Action Plan to Make Space More Accessible to All . Available online at: https://www.nasa.gov/news-release/nasa-releases-equity-action-plan-to-make-space-more-accessible-to-all/

Nelson, S. (2022). Building Mentoring and Community Platforms to Support DEI in Your Scientific Network: Overview of NIH/NIGMS Programs to Enhance Diversity of the Biomedical Research Workforce. Slide Presentation . Bethesda, MD: National Institute of General Medical Sciences; U.S. National Institutes of Health. Available online at: https://www.healthra.org/wp-content/uploads/2022/04/Shakira-Nelson-DEI-Spring-2022-presentation.pdf

NIH (2019). Notice of NIH's Interest in Diversity (NIH Notice No. NOT-OD-20-031) . Bethesda, MD: U.S. National Institutes of Health. Available online at: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-20-031.html

NIH (2021). The BRAIN Initiative Plan for Enhancing Diverse Perspectives (PEDP) (NIH Notice No. NOT-MH-21-310) . Bethesda, MD: U.S. National Institutes of Health. Available online at: https://grants.nih.gov/grants/guide/notice-files/NOT-MH-21-310.html

NIH (2022). Recruitment and Retention Plan to Enhance Diversity into Ruth L. Kirschstein Institutional National Research Service Award (T32) . Bethesda, MD: National Heart, Lung, and Blood Institute, U.S. National Institutes of Health. Available online at: https://www.nhlbi.nih.gov/grants-and-training/diversity/recruitment-retention-plan-for-diversity

NIH (2023a). PEDP Key Elements and Examples . Bethesda, MD: The BRAIN Initiative, U.S. National Institutes of Health. Available online at: https://braininitiative.nih.gov/vision/plan-enhancing-diverse-perspectives/pedp-key-elements-and-examples

NIH (2023b). Programs for Inclusion and Diversity Among Individuals Engaged in Health-Related Research (PRIDE) (NIH FOA No. RFA-HL-24-004) . Bethesda, MD: U.S. National Institutes of Health. Available online at: https://grants.nih.gov/grants/guide/rfa-files/RFA-HL-24-004.html

NIH (2024a). Faculty Institutional Recruitment for Sustainable Transformation (FIRST). Funded Research . Bethesda, MD: Office of Strategic Coordination—The Common Fund, U.S. National Institutes of Health. Available online at: https://commonfund.nih.gov/first/fundedresearch

NIH (2024b). Enhancing Diversity in Training Programs . Bethesda, MD: National Institute of General Medical Science, U.S. National Institutes of Health. Available online at: https://www.nigms.nih.gov/training/diversity

NIH (n.d.(a)). Mandatory Anti-Harassment Training. Bethesda, MD: Office of Equity, Diversity, and Inclusion, U.S. National Institutes of Health. Available online at: https://www.edi.nih.gov/training/mandatory-training

NIH (n.d.(b)). EDI 365. Bethesda, MD: Office of Equity, Diversity, and Inclusion, U.S. National Institutes of Health. Available online at: https://www.edi.nih.gov/people/edi-365

NIH (n.d.(c)). EDI 365 - Take the Pledge. Bethesda, MD: Office of Equity, Diversity, and Inclusion, U.S. National Institutes of Health. Available online at: https://www.edi.nih.gov/people/the-pledge

NSF (2018). Privacy Act of 1974; System of Records. 83 Fed. Reg. 6884 (February 15, 2018). See also Federal Register/Vol. 83, No. 32/Thursday, February 15, 2018/Notices. Alexandria, VA: U.S. National Science Foundation. Available online at: https://www.federalregister.gov/documents/2018/02/15/2018-03145/privacy-act-of-1974-system-of-records

NSF (2022). Diversity, Equity, Inclusion, and Accessibility (DEIA) Strategic Plan . Alexandria, VA: U.S. National Science Foundation. Available online at: https://www.nsf.gov/od/oecr/reports/DEIA_Strategic_Plan_2022.pdf

NSF (2023). NSF 23-575: Centers for Chemical Innovation (CCI) . Alexandria, VA: U.S. National Science Foundation. Available online at: https://new.nsf.gov/funding/opportunities/centers-chemical-innovation-cci/nsf23-575/solicitation

NSF (n.d.(a)). Federal Budgeting and Appropriations Process. Alexandria, VA: U.S. National Science Foundation. Available online at: https://new.nsf.gov/about/budget/process

NSF (n.d.(b)). Equity at NSF (includes link to downloadable NSF 2022 Equity Action Plan). Alexandria, VA: U.S. National Science Foundation. Available online at: https://www.nsf.gov/equity/index.jsp ; https://www.nsf.gov/equity/NSF_Agency_Equity_Action_Plan_2022.pdf

Orwell, G. (1945). Animal Farm. London: Secker & Warburg.

OSTP (2022). Equity and Excellence: A Vision to Transform and Enhance the U.S. STEMM Ecosystem. Washington, DC: White House Office of Science and Technology Policy. Available online at: https://www.whitehouse.gov/ostp/news-updates/2022/12/12/equity-and-excellence-a-vision-to-transform-and-enhance-the-u-s-stemm-ecosystem

Plimpton, S. H. (2020). Privacy Act and Public Burden Statements. Alexandria, VA: Office of the General Counsel; U.S. National Science Foundation. Available online at: https://www.nsf.gov/pubs/policydocs/pappg20_1/privacy_burden.jsp

Pluckrose, H. (2021). What Do We Mean by Critical Social Justice? Counterweight. Available online at: https://archive.is/TjTSW

Renoe, S. D. (2023). Let's Talk Broader Impacts (slide presentation). Alexandria, VA: Division of Molecular and Cellular Biosciences; U.S. National Science Foundation. Available online at: https://www.nsf.gov/bio/mcb/June7_2023_BIwSusanRenoe_ARIS.pdf

Reznik, S. E. (2017). This Short Life: Nikolai Vavilov and His Time [Эта короткая жизнь. Николай Вавилов и его время]. Moscow: Zakharov.

Sailer, J. (2023a). Inside Ohio State's DEI Factory. New York, NY: Wall Street Journal.

Sailer, J. (2023b). How 'Diversity' Policing Fails Science. New York, NY: Wall Street Journal.

Sailer, J. (2024). The NIH Sacrifices Scientific Rigor for DEI. New York, NY: Wall Street Journal.

Shifman, M. (2005). You Failed Your Math Test, Comrade Einstein. Singapore: World Scientific.

Simon, M. A. (n.d.). NURTURE: Northwestern University Recruitment to Transform Under-Representation and achieve Equity. NIH project no. 1U54CA272163-01. Bethesda, MD: U.S. National Institutes of Health. Available online at: https://reporter.nih.gov/project-details/10493892

Sowell, T. (2019). Discrimination and Disparities. New York, NY: Basic Books.

Sowell, T. (2023). Social Justice Fallacies. New York, NY: Basic Books.

Supreme Court (2013). Agency for International Development v. Alliance for Open Society Int'l, 570 US 205 - Supreme Court of United States. Available online at: https://scholar.google.com/scholar_case?case=18247484204517718329&inst=7943137549644843768

Title IX. Title IX and Sex Discrimination. U.S. Department of Education. Available online at: https://www2.ed.gov/about/offices/list/ocr/docs/tix_dis.html (revised August 2021).

Title VI. Title VI of the Civil Rights Act of 1964. Civil Rights Division; U.S. Department of Justice. Available online at: https://www.justice.gov/crt/fcs/TitleVI (updated February 29, 2024).

UC Berkeley (n.d.). Sample Rubric for Assessing Candidate Contributions to Diversity, Equity, Inclusion, and Belonging. Berkeley, CA: Office for Equity and Welfare; University of California. Available online at: https://ofew.berkeley.edu/academic-recruitment/contributions-deib/sample-rubric-assessing-candidate-contributions-diversity

Urquiola, M. (2020). Markets, Minds, and Money: Why America Leads the World in University Research. Cambridge, MA: Harvard University Press.

Williams, K. J. (2024). Why I'm Saying No to NIH's Racial Preferences. New York, NY: Wall Street Journal.

Wooldridge, A. (2021). The Aristocracy of Talent: How Meritocracy Made the Modern World. Penguin Random House.

Keywords: politicization of science, Critical Social Justice, STEMM funding, Meritocracy, DEI

Citation: Efimov IR, Flier JS, George RP, Krylov AI, Maroja LS, Schaletzky J, Tanzman J and Thompson A (2024) Politicizing science funding undermines public trust in science, academic freedom, and the unbiased generation of knowledge. Front. Res. Metr. Anal. 9:1418065. doi: 10.3389/frma.2024.1418065

Received: 23 April 2024; Accepted: 25 June 2024; Published: 23 July 2024.

Copyright © 2024 Efimov, Flier, George, Krylov, Maroja, Schaletzky, Tanzman and Thompson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Anna I. Krylov, krylov@usc.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
