How To Write A Lab Report | Step-by-Step Guide & Examples

Published on May 20, 2021 by Pritha Bhandari. Revised on July 23, 2023.

A lab report conveys the aim, methods, results, and conclusions of a scientific experiment. The main purpose of a lab report is to demonstrate your understanding of the scientific method by performing and evaluating a hands-on lab experiment. This type of assignment is usually shorter than a research paper.

Lab reports are commonly used in science, technology, engineering, and mathematics (STEM) fields. This article focuses on how to structure and write a lab report.


Table of contents

  • Structuring a lab report
  • Introduction
  • Other interesting articles
  • Frequently asked questions about lab reports

The sections of a lab report can vary between scientific fields and course requirements, but they usually contain the purpose, methods, and findings of a lab experiment.

Each section of a lab report has its own purpose.

  • Title: expresses the topic of your study
  • Abstract: summarizes your research aims, methods, results, and conclusions
  • Introduction: establishes the context needed to understand the topic
  • Method: describes the materials and procedures used in the experiment
  • Results: reports all descriptive and inferential statistical analyses
  • Discussion: interprets and evaluates results and identifies limitations
  • Conclusion: sums up the main findings of your experiment
  • References: lists all sources cited, using a specific style (e.g., APA)
  • Appendices: contain lengthy materials, procedures, tables, or figures

Although most lab reports contain these sections, some sections can be omitted or combined with others. For example, some lab reports contain a brief section on research aims instead of an introduction, and a separate conclusion is not always required.

If you’re not sure, it’s best to check your lab report requirements with your instructor.


Your title provides the first impression of your lab report – effective titles communicate the topic and/or the findings of your study in specific terms.

Create a title that directly conveys the main focus or purpose of your study. It doesn’t need to be creative or thought-provoking, but it should be informative.

  • The effects of varying nitrogen levels on tomato plant height.
  • Testing the universality of the McGurk effect.
  • Comparing the viscosity of common liquids found in kitchens.

An abstract condenses a lab report into a brief overview of about 150–300 words. It should provide readers with a compact version of the research aims, the methods and materials used, the main results, and the final conclusion.

Think of it as a way of giving readers a preview of your full lab report. Write the abstract last, in the past tense, after you’ve drafted all the other sections of your report, so you’ll be able to succinctly summarize each section.

To write a lab report abstract, use these guiding questions:

  • What is the wider context of your study?
  • What research question were you trying to answer?
  • How did you perform the experiment?
  • What did your results show?
  • How did you interpret your results?
  • What is the importance of your findings?

Nitrogen is a necessary nutrient for high-quality plants. Tomatoes, one of the most consumed fruits worldwide, rely on nitrogen for healthy leaves and stems to grow fruit. This experiment tested whether nitrogen levels affected tomato plant height in a controlled setting. It was expected that higher levels of nitrogen fertilizer would yield taller tomato plants.

Levels of nitrogen fertilizer were varied between three groups of tomato plants. The control group did not receive any nitrogen fertilizer, while one experimental group received low levels of nitrogen fertilizer, and a second experimental group received high levels of nitrogen fertilizer. All plants were grown from seeds, and heights were measured 50 days into the experiment.

The effects of nitrogen levels on plant height were tested between groups using an ANOVA. The plants with the highest level of nitrogen fertilizer were the tallest, while the plants with low levels of nitrogen exceeded the control group plants in height. In line with expectations and previous findings, the effects of nitrogen levels on plant height were statistically significant. These results reinforce the importance of nitrogen as a nutrient for tomato plant growth.

Your lab report introduction should set the scene for your experiment. One way to write your introduction is with a funnel (an inverted triangle) structure:

  • Start with the broad, general research topic
  • Narrow your topic down to your specific study focus
  • End with a clear research question

Begin by providing background information on your research topic and explaining why it’s important in a broad real-world or theoretical context. Describe relevant previous research on your topic and note how your study may confirm it or expand it, or fill a gap in the research field.

This lab experiment builds on previous research from Haque, Paul, and Sarker (2011), who demonstrated that tomato plant yield increased at higher levels of nitrogen. However, the present research focuses on plant height as a growth indicator and uses a lab-controlled setting instead.

Next, go into detail on the theoretical basis for your study and describe any directly relevant laws or equations that you’ll be using. State your main research aims and expectations by outlining your hypotheses.

Based on the importance of nitrogen for tomato plants, the primary hypothesis was that the plants with high levels of nitrogen would grow the tallest. The secondary hypothesis was that plants with low levels of nitrogen would grow taller than plants with no nitrogen.

Your introduction doesn’t need to be long, but you may need to organize it into a few paragraphs or with subheadings such as “Research Context” or “Research Aims.”


A lab report Method section details the steps you took to gather and analyze data. Give enough detail so that others can follow or evaluate your procedures. Write this section in the past tense. If you need to include any long lists of procedural steps or materials, place them in the Appendices section but refer to them in the text here.

You should describe your experimental design, your subjects, materials, and specific procedures used for data collection and analysis.

Experimental design

Briefly note whether your experiment is a within-subjects  or between-subjects design, and describe how your sample units were assigned to conditions if relevant.

A between-subjects design with three groups of tomato plants was used. The control group did not receive any nitrogen fertilizer. The first experimental group received a low level of nitrogen fertilizer, while the second experimental group received a high level of nitrogen fertilizer.

Describe human subjects in terms of demographic characteristics, and animal or plant subjects in terms of genetic background. Note the total number of subjects as well as the number of subjects per condition or per group. You should also state how you recruited subjects for your study.

List the equipment or materials you used to gather data and state the model names for any specialized equipment.

List of materials

  • 35 tomato seeds
  • 15 plant pots (15 cm tall)
  • Light lamps (50,000 lux)
  • Nitrogen fertilizer
  • Measuring tape

Describe your experimental settings and conditions in detail. You can provide labelled diagrams or images of the exact set-up necessary for experimental equipment. State how extraneous variables were controlled through restriction or by fixing them at a certain level (e.g., keeping the lab at room temperature).

Light levels were fixed throughout the experiment, and the plants were exposed to 12 hours of light a day. Temperature was restricted to between 23 and 25℃. The pH and carbon levels of the soil were also held constant throughout the experiment as these variables could influence plant height. The plants were grown in rooms free of insects or other pests, and they were spaced out adequately.

Your experimental procedure should describe the exact steps you took to gather data in chronological order. You’ll need to provide enough information so that someone else can replicate your procedure, but you should also be concise. Place detailed information in the appendices where appropriate.

In a lab experiment, you’ll often closely follow a lab manual to gather data. Some instructors will allow you to simply reference the manual and state whether you changed any steps based on practical considerations. Other instructors may want you to rewrite the lab manual procedures as complete sentences in coherent paragraphs, while noting any changes to the steps that you applied in practice.

If you’re performing extensive data analysis, be sure to state your planned analysis methods as well. This includes the types of tests you’ll perform and any programs or software you’ll use for calculations (if relevant).

First, tomato seeds were sown about 2 cm below the surface in wooden flats containing soil. Seeds were spaced 3-5 cm apart. The flats were covered to keep the soil moist until germination. The seedlings were removed and transplanted to pots 8 days later, with a maximum of 2 plants per pot. Each pot was watered once a day to keep the soil moist.

The nitrogen fertilizer treatment was applied to the plant pots 12 days after transplantation. The control group received no treatment, while the first experimental group received a low concentration, and the second experimental group received a high concentration. There were 5 pots in each group, and each plant pot was labelled to indicate the group the plants belonged to.

Fifty days after the start of the experiment, plant height was measured for all plants. A measuring tape was used to record the length of each plant from ground level to the top of the tallest leaf.

In your results section, you should report the results of any statistical analysis procedures that you undertook. You should clearly state how the results of statistical tests support or refute your initial hypotheses.

The main results to report include:

  • any descriptive statistics
  • statistical test results
  • the significance of the test results
  • estimates of standard error or confidence intervals
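The last item can be illustrated with a short calculation. The sketch below computes a group mean, its standard error, and a 95% confidence interval in plain Python; the five heights are hypothetical values invented for illustration, and the t critical value (2.776 for 4 degrees of freedom) is taken from a standard t-table:

```python
# 95% confidence interval for a group's mean plant height.
# The heights are hypothetical values used only for illustration.
from math import sqrt
from statistics import mean, stdev

heights = [19.1, 20.5, 21.0, 19.8, 21.1]  # one group's heights (cm)

n = len(heights)
m = mean(heights)                  # sample mean
se = stdev(heights) / sqrt(n)      # standard error of the mean
t_crit = 2.776                     # two-tailed 95% t value for n - 1 = 4 df
ci = (m - t_crit * se, m + t_crit * se)

print(f"mean = {m:.1f} cm, SE = {se:.2f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f}) cm")
```

Reporting the interval alongside the mean tells readers how precisely the mean was estimated, not just what it was.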

The mean heights of the plants in the control group, low nitrogen group, and high nitrogen group were 20.3, 25.1, and 29.6 cm, respectively. A one-way ANOVA was applied to test the effect of nitrogen fertilizer level on plant height. The results demonstrated statistically significant (p = .03) height differences between groups.

Next, post-hoc tests were performed to assess the primary and secondary hypotheses. In support of the primary hypothesis, the high nitrogen group plants were significantly taller than the low nitrogen group and the control group plants. Similarly, the results supported the secondary hypothesis: the low nitrogen plants were taller than the control group plants.
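A one-way ANOVA like the one in the example boils down to comparing between-group variation with within-group variation. As an illustrative sketch, the plain-Python code below recomputes the F statistic from raw data; the individual heights are invented values chosen to reproduce the reported group means (20.3, 25.1, and 29.6 cm), so the resulting F, and hence its p value, will not match the p = .03 reported in the example:

```python
# One-way ANOVA F statistic, computed by hand for three fertilizer groups.
# The individual plant heights are hypothetical illustration values.
from statistics import mean

groups = {
    "control": [19.1, 20.5, 21.0, 19.8, 21.1],
    "low nitrogen": [24.0, 25.5, 24.8, 25.9, 25.3],
    "high nitrogen": [28.8, 30.1, 29.5, 29.9, 29.7],
}

all_values = [x for g in groups.values() for x in g]
grand_mean = mean(all_values)
k = len(groups)      # number of groups
n = len(all_values)  # total number of plants

# Between-group sum of squares: how far each group mean sits from the grand mean.
ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups.values())
# Within-group sum of squares: how far each plant sits from its own group mean.
ss_within = sum((x - mean(g)) ** 2 for g in groups.values() for x in g)

# F is the ratio of the two variance estimates (mean squares).
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F({k - 1}, {n - k}) = {f_stat:.1f}")
```

In practice you would hand this off to statistical software; for example, `scipy.stats.f_oneway(*groups.values())` runs the same test in one line and also returns the p value.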

These results can be reported in the text or in tables and figures. Use text for highlighting a few key results, but present large sets of numbers in tables, or show relationships between variables with graphs.

You should also include sample calculations in the Results section for complex experiments. For each sample calculation, provide a brief description of what it does and use clear symbols. Present your raw data in the Appendices section and refer to it to highlight any outliers or trends.

The Discussion section will help demonstrate your understanding of the experimental process and your critical thinking skills.

In this section, you can:

  • Interpret your results
  • Compare your findings with your expectations
  • Identify any sources of experimental error
  • Explain any unexpected results
  • Suggest possible improvements for further studies

Interpreting your results involves clarifying how your results help you answer your main research question. Report whether your results support your hypotheses.

  • Did you measure what you set out to measure?
  • Were your analysis procedures appropriate for this type of data?

Compare your findings with other research and explain any key differences in findings.

  • Are your results in line with those from previous studies or your classmates’ results? Why or why not?

An effective Discussion section will also highlight the strengths and limitations of a study.

  • Did you have high internal validity or reliability?
  • How did you establish these aspects of your study?

When describing limitations, use specific examples. For example, if random error contributed substantially to the measurements in your study, state the particular sources of error (e.g., imprecise apparatus) and explain ways to improve them.

The results support the hypothesis that nitrogen levels affect plant height, with increasing levels producing taller plants. These statistically significant results are taken together with previous research to support the importance of nitrogen as a nutrient for tomato plant growth.

However, unlike previous studies, this study focused on plant height as an indicator of plant growth. Importantly, plant height may not always reflect plant health or fruit yield, so measuring other indicators would have strengthened the study findings.

Another limitation of the study is the plant height measurement technique, as the measuring tape was not suitable for plants with extreme curvature. Future studies may focus on measuring plant height in different ways.

The main strengths of this study were the controls for extraneous variables, such as pH and carbon levels of the soil. All other factors that could affect plant height were tightly controlled to isolate the effects of nitrogen levels, resulting in high internal validity for this study.

Your conclusion should be the final section of your lab report. Here, you’ll summarize the findings of your experiment, with a brief overview of the strengths and limitations, and implications of your study for further research.

Some lab reports may omit a Conclusion section because it overlaps with the Discussion section, but you should check with your instructor before doing so.


A lab report conveys the aim, methods, results, and conclusions of a scientific experiment . Lab reports are commonly assigned in science, technology, engineering, and mathematics (STEM) fields.

The purpose of a lab report is to demonstrate your understanding of the scientific method with a hands-on lab experiment. Course instructors will often provide you with an experimental design and procedure. Your task is to write up how you actually performed the experiment and evaluate the outcome.

In contrast, a research paper requires you to independently develop an original argument. It involves more in-depth research and interpretation of sources and data.

A lab report is usually shorter than a research paper.

The sections of a lab report can vary between scientific fields and course requirements, but a lab report usually contains the following:

  • Title: expresses the topic of your study
  • Abstract: summarizes your research aims, methods, results, and conclusions
  • Introduction: establishes the context needed to understand the topic
  • Method: describes the materials and procedures used in the experiment
  • Results: reports all descriptive and inferential statistical analyses
  • Discussion: interprets and evaluates results and identifies limitations
  • Conclusion: sums up the main findings of your experiment
  • References: lists all sources cited, using a specific style (e.g., APA)
  • Appendices: contain lengthy materials, procedures, tables, or figures

The results chapter or section simply and objectively reports what you found, without speculating on why you found these results. The discussion interprets the meaning of the results, puts them in context, and explains why they matter.

In qualitative research , results and discussion are sometimes combined. But in quantitative research , it’s considered important to separate the objective results from your interpretation of them.


National Academies Press: OpenBook

America's Lab Report: Investigations in High School Science (2006)

Chapter 3: Laboratory Experiences and Student Learning


In this chapter, the committee first identifies and clarifies the learning goals of laboratory experiences and then discusses research evidence on attainment of those goals. The review of research evidence draws on three major strands of research: (1) cognitive research illuminating how students learn; (2) studies that examine laboratory experiences that stand alone, separate from the flow of classroom science instruction; and (3) research projects that sequence laboratory experiences with other forms of science instruction. 1 We propose the phrase “integrated instructional units” to describe these research and design projects that integrate laboratory experiences within a sequence of science instruction. In the following section of this chapter, we present design principles for laboratory experiences derived from our analysis of these multiple strands of research and suggest that laboratory experiences designed according to these principles are most likely to accomplish their learning goals. Next we consider the role of technology in supporting student learning from laboratory experiences. The chapter concludes with a summary.

GOALS FOR LABORATORY EXPERIENCES

Laboratories have been purported to promote a number of goals for students, most of which are also the goals of science education in general (Lunetta, 1998; Hofstein and Lunetta, 1982). The committee commissioned a paper to examine the definition and goals of laboratory experiences (Millar, 2004) and also considered research reviews on laboratory education that have identified and discussed learning goals (Anderson, 1976; Hofstein and Lunetta, 1982; Lazarowitz and Tamir, 1994; Shulman and Tamir, 1973). While these inventories of goals vary somewhat, a core set remains fairly consistent. Building on these commonly stated goals, the committee developed a comprehensive list of goals for or desired outcomes of laboratory experiences:

Enhancing mastery of subject matter. Laboratory experiences may enhance student understanding of specific scientific facts and concepts and of the way in which these facts and concepts are organized in the scientific disciplines.

Developing scientific reasoning. Laboratory experiences may promote a student’s ability to identify questions and concepts that guide scientific investigations; to design and conduct scientific investigations; to develop and revise scientific explanations and models; to recognize and analyze alternative explanations and models; and to make and defend a scientific argument. Making a scientific argument includes such abilities as writing, reviewing information, using scientific language appropriately, constructing a reasoned argument, and responding to critical comments.

[1] There is a larger body of research on how students learn science that is not considered in depth here because the committee’s focus is science learning through laboratory experiences. The larger body of research is discussed in the National Research Council (2005) report, ; it is also considered in an ongoing National Research Council study of science learning in grades K-8.

Understanding the complexity and ambiguity of empirical work. Interacting with the unconstrained environment of the material world in laboratory experiences may help students concretely understand the inherent complexity and ambiguity of natural phenomena. Laboratory experiences may help students learn to address the challenges inherent in directly observing and manipulating the material world, including troubleshooting equipment used to make observations, understanding measurement error, and interpreting and aggregating the resulting data.

Developing practical skills. In laboratory experiences, students may learn to use the tools and conventions of science. For example, they may develop skills in using scientific equipment correctly and safely, making observations, taking measurements, and carrying out well-defined scientific procedures.

Understanding the nature of science. Laboratory experiences may help students to understand the values and assumptions inherent in the development and interpretation of scientific knowledge, such as the idea that science is a human endeavor that seeks to understand the material world and that scientific theories, models, and explanations change over time on the basis of new evidence.

Cultivating interest in science and interest in learning science. As a result of laboratory experiences that make science “come alive,” students may become interested in learning more about science and see it as relevant to everyday life.

Developing teamwork abilities. Laboratory experiences may also promote a student’s ability to collaborate effectively with others in carrying out complex tasks, to share the work of the task, to assume different roles at different times, and to contribute and respond to ideas.

Although most of these goals were derived from previous research on laboratory experiences and student learning, the committee identified the new goal of “understanding the complexity and ambiguity of empirical work” to reflect the unique nature of laboratory experiences. Students’ direct encounters with natural phenomena in laboratory science courses are inherently more ambiguous and messy than the representations of these phenomena in science lectures, textbooks, and mathematical formulas (Millar, 2004). The committee thinks that developing students’ ability to recognize this complexity and develop strategies for sorting through it is an essential goal of laboratory experiences. Unlike the other goals, which coincide with the goals of science education more broadly and may be advanced through lectures, reading, or other forms of science instruction, laboratory experiences may be the only way to advance the goal of helping students understand the complexity and ambiguity of empirical work.

RECENT DEVELOPMENTS IN RESEARCH AND DESIGN OF LABORATORY EXPERIENCES

In reviewing evidence on the extent to which students may attain the goals of laboratory experiences listed above, the committee identified a recent shift in the research. Historically, laboratory experiences have been separate from the flow of classroom science instruction and often lacked clear learning goals. Because this approach remains common today, we refer to these isolated interactions with natural phenomena as “typical” laboratory experiences. 2 Reflecting this separation, researchers often engaged students in one or two experiments or other science activities and then conducted assessments to determine whether their understanding of the science concept underlying the activity had increased. Some studies directly compared measures of student learning following laboratory experiences with measures of student learning following lectures, discussions, videotapes, or other methods of science instruction in an effort to determine which modes of instruction were most effective.

Over the past 10 years, some researchers have shifted their focus. Assuming that the study of the natural world requires opportunities to directly encounter that world, investigators are integrating laboratory experiences and other forms of instruction into instructional sequences in order to help students progress toward science learning goals. These studies draw on principles of learning derived from the rapid growth in knowledge from cognitive research to address the question of how to design science instruction, including laboratory experiences, in order to support student learning.

Given the complexity of these teaching and learning sequences, the committee struggled with how best to describe them. Initially, the committee used the term “science curriculum units.” However, that term failed to convey the importance of integration in this approach to sequencing laboratory experiences with other forms of teaching and learning. The research reviewed by the committee indicated that these curricula not only integrate laboratory experiences in the flow of science instruction, but also integrate student learning about both the concepts and processes of science. To reflect these aspects of the new approach, the committee settled on the term “integrated instructional units” in this report.

[2] In , we argue that most U.S. high school students currently engage in these typical laboratory experiences.

The following sections briefly describe principles of learning derived from recent research in the cognitive sciences and their application in design of integrated instructional units.

Principles of Learning Informing Integrated Instructional Units

Recent research and development of integrated instructional units that incorporate laboratory experiences are based on a large and growing body of cognitive research. This research has led to development of a coherent and multifaceted theory of learning that recognizes that prior knowledge, context, language, and social processes play critical roles in cognitive development and learning (National Research Council, 1999). Taking each of these factors into account, the National Research Council (NRC) report How People Learn identifies four critical principles that support effective learning environments (Glaser, 1994; National Research Council, 1999), and a more recent NRC report, How Students Learn , considers these principles as they relate specifically to science (National Research Council, 2005). These four principles are summarized below.

Learner-Centered Environments

The emerging integrated instructional units are designed to be learner-centered. This principle is based on research showing that effective instruction begins with what learners bring to the setting, including cultural practices and beliefs, as well as knowledge of academic content. Taking students’ preconceptions into account is particularly critical in science instruction. Students come to the classroom with conceptions of natural phenomena that are based on their everyday experiences in the world. Although these conceptions are often reasonable and can provide satisfactory everyday explanations to students, they do not always match scientific explanations and break down in ways that students often fail to notice. Teachers face the challenge of engaging with these intuitive ideas, some of which are more firmly rooted than others, in order to help students move toward a more scientific understanding. In this way, understanding scientific knowledge often requires a change in—not just an addition to—what students notice and understand about the world (National Research Council, 2005).

Knowledge-Centered Environments

The developing integrated instructional units are based on the principle that learning is enhanced when the environment is knowledge-centered. That is, the laboratory experiences and other instruction included in integrated instructional units are designed to help students learn with understanding, rather than simply acquiring sets of disconnected facts and skills (National Research Council, 1999).

In science, the body of knowledge with which students must engage includes accepted scientific ideas about natural phenomena as well as an understanding of what it means to “do science.” These two aspects of science are reflected in the goals of laboratory experiences, which include mastery of subject matter (accepted scientific ideas about phenomena) and several goals related to the processes of science (understanding the complexity of empirical work, development of scientific reasoning). Research on student thinking about science shows a progression of ideas about scientific knowledge and how it is justified. At the first stage, students perceive scientific knowledge as right or wrong. Later, students characterize discrepant ideas and evidence as “mere opinion.” Eventually, students recognize scientific knowledge as being justified by evidence derived through rigorous research. Several studies have shown that a large proportion of high school students are at the first stage in their views of scientific knowledge (National Research Council, 2005).

Knowledge-centered environments encourage students to reflect on their own learning progress (metacognition). Learning is facilitated when individuals identify, monitor, and regulate their own thinking and learning. To be effective problem solvers and learners, students need to determine what they already know and what else they need to know in any given situation, including when things are not going as expected. For example, students with better developed metacognitive strategies will abandon an unproductive problem-solving strategy very quickly and substitute a more productive one, whereas students with less effective metacognitive skills will continue to use the same strategy long after it has failed to produce results (Gobert and Clement, 1999). The basic metacognitive strategies include: (1) connecting new information to former knowledge, (2) selecting thinking strategies deliberately, and (3) monitoring one’s progress during problem solving.

A final aspect of knowledge-centered learning, which may be particularly relevant to integrated instructional units, is that the practices and activities in which people engage while learning shape what they learn. Transfer (the ability to apply learning in varying situations) is made possible to the extent that knowledge and learning are grounded in multiple contexts. Transfer is more difficult when a concept is taught in a limited set of contexts or through a limited set of activities. By encountering the same concept at work in multiple contexts (such as in laboratory experiences and in discussion), students can develop a deeper understanding of the concept and how it can be used as well as the ability to transfer what has been learned in one context to others (Bransford and Schwartz, 2001).

Assessment to Support Learning

Another important principle of learning that has informed development of integrated instructional units is that assessment can be used to support learning. Cognitive research has shown that feedback is fundamental to learning, but feedback opportunities are scarce in most classrooms. This research indicates that formative assessments provide students with opportunities to revise and improve the quality of their thinking while also making their thinking apparent to teachers, who can then plan instruction accordingly. Assessments must reflect the learning goals of the learning environment. If the goal is to enhance understanding and the applicability of knowledge, it is not sufficient to provide assessments that focus primarily on memory for facts and formulas. The ThinkerTools science instructional unit discussed in the following section incorporates this principle, including formative self-assessment tools that help students advance toward several of the goals of laboratory experiences.

Community-Centered Environments

Research has shown that learning is enhanced in a community setting, when students and teachers share norms that value knowledge and participation (see Cobb et al., 2001). Such norms increase people’s opportunities and motivation to interact, receive feedback, and learn. Learning is enhanced when students have multiple opportunities to articulate their ideas to peers and to hear and discuss others’ ideas. A community-centered classroom environment may not be organized in traditional ways. For example, in science classrooms, the teacher is often the sole authority and arbiter of scientific knowledge, placing students in a relatively passive role (Lemke, 1990). Such an organization may promote students’ view that scientific knowledge is a collection of facts about the world, authorized by expert scientists and irrelevant to students’ own experience. The instructional units discussed below have attempted to restructure the social organization of the classroom and encourage students and the teacher to interact and learn from each other.

Design of Integrated Instructional Units

The learning principles outlined above have begun to inform design of integrated instructional units that include laboratory experiences with other types of science learning activities. These integrated instructional units were developed through research programs that tightly couple research, design, and implementation in an iterative process. The research programs are beginning to document the details of student learning, development, and interaction when students are given systematic support—or scaffolding—in carefully structured social and cognitive activities. Scaffolding helps to guide students’ thinking, so that they can gradually take on more autonomy in carrying out various parts of the activities. Emerging research on these integrated instructional units provides guidance about how to design effective learning environments for real-world educational settings (see Linn, Davis, and Bell, 2004a; Cobb et al., 2003; Design-Based Research Collective, 2003).

Integrated instructional units interweave laboratory experiences with other types of science learning activities, including lectures, reading, and discussion. Students are engaged in framing research questions, designing and executing experiments, gathering and analyzing data, and constructing arguments and conclusions as they carry out investigations. Diagnostic, formative assessments are embedded into the instructional sequences and can be used to gauge students’ developing understanding and to promote their self-reflection on their thinking.

With respect to laboratory experiences, these instructional units share two key features. The first is that specific laboratory experiences are carefully selected on the basis of research-based ideas of what students are likely to learn from them. For example, any particular laboratory activity is likely to contribute to learning only if it engages students’ current thinking about the target phenomena and is likely to make them critically evaluate their ideas in relation to what they see during the activity. The second is that laboratory experiences are explicitly linked to and integrated with other learning activities in the unit. The assumption behind this second feature is that students do not necessarily understand what they have done simply because they have carried out a laboratory activity. Nascent research on integrated instructional units suggests that both framing a particular laboratory experience ahead of time and following it with activities that help students make sense of the experience are crucial in using a laboratory experience to support science learning. This “integration” approach draws on earlier research showing that intervention and negotiation with an authority, usually a teacher, was essential to help students make meaning out of their laboratory activities (Driver, 1995).

Examples of Integrated Instructional Units

Scaling Up Chemistry That Applies

Chemistry That Applies (CTA) is a 6-8 week integrated instructional unit designed to help students in grades 8-10 understand the law of conservation of matter. Created by researchers at the Michigan Department of Education (Blakeslee et al., 1993), this instructional unit was one of only a few curricula that were highly rated by the American Association for the Advancement of Science’s Project 2061 in its study of middle school science curricula (Kesidou and Roseman, 2002). Student groups explore four chemical reactions—burning, rusting, the decomposition of water, and the volcanic reaction of baking soda and vinegar. They cause these reactions to happen, obtain and record data in individual notebooks, analyze the data, and use evidence-based arguments to explain the data.

The instructional unit engages the students in a carefully structured sequence of hands-on laboratory investigations interwoven with other forms of instruction (Lynch, 2004). Student understanding is “pressed” through many experiences with the reactions and by group and individual pressures to make meaning of these reactions. For example, video transcripts indicate that students engaged in “science talk” during teacher demonstrations and during student experiments.

Researchers at George Washington University, in a partnership with Montgomery County public schools in Maryland, are currently conducting a five-year study of the feasibility of scaling up effective integrated instructional units, including CTA (Lynch, Kuipers, Pyke, and Szesze, in press). In 2001-2002, CTA was implemented in five highly diverse middle schools that were matched with five comparison schools using traditional curriculum materials in a quasi-experimental research design. All 8th graders in the five CTA schools, a total of about 1,500 students, participated in the CTA curriculum, while all 8th graders in the matched schools used the science curriculum materials normally available. Students were given pre- and posttests.

In 2002-2003, the study was replicated in the same five pairs of schools. In both years, students who participated in the CTA curriculum scored significantly higher than comparison students on a posttest. Average scores of students who participated in the CTA curriculum showed higher levels of fluency with the concept of conservation of matter (Lynch, 2004). However, because the concept is so difficult, most students in both the treatment and control groups still held misconceptions, and few had a flexible, fully scientific understanding of the conservation of matter. All subgroups of students who were engaged in the CTA curriculum—including low-income students (eligible for free and reduced-price meals), black and Hispanic students, English language learners, and students eligible for special educational services—scored significantly higher than students in the control group on the posttest (Lynch and O’Donnell, 2005). The effect sizes were largest among three subgroups considered at risk for low science achievement: Hispanic students, low-income students, and English language learners.

Based on these encouraging results, CTA was scaled up to include about 6,000 8th graders in 20 schools in 2003-2004 and 12,000 8th graders in 37 schools in 2004-2005 (Lynch and O’Donnell, 2005).

ThinkerTools

The ThinkerTools instructional unit is a sequence of laboratory experiences and other learning activities that, in its initial version, yielded substantial gains in students’ understanding of Newton’s laws of motion (White, 1993). Building on these positive results, ThinkerTools was expanded to focus not only on mastery of these laws of motion but also on scientific reasoning and understanding of the nature of science (White and Frederiksen, 1998). In the 10-week unit, students were guided to reflect on their own thinking and learning while they carried out a series of investigations. The integrated instructional unit was designed to help them learn about science processes as well as about the subject of force and motion. The instructional unit supports students as they formulate hypotheses, conduct empirical investigations, work with conceptually analogous computer simulations, and refine a conceptual model for the phenomena. Across the series of investigations, the integrated instructional unit introduces increasingly complex concepts. Formative assessments are integrated throughout the instructional sequence in ways that allow students to self-assess and reflect on core aspects of inquiry and epistemological dimensions of learning.

Researchers investigated the impact of ThinkerTools in twelve 7th, 8th, and 9th grade classrooms with 3 teachers and 343 students. The researchers evaluated students’ developing understanding of scientific investigations using a pre-post inquiry test. In this assessment, students were engaged in a thought experiment that asked them to conceptualize, design, and think through a hypothetical research study. Gains in scores for students in the reflective self-assessment classes and control classrooms were compared. Results were also broken out by students categorized as high and low achieving, based on performance on a standardized test conducted before the intervention. Students in the reflective self-assessment classes exhibited greater gains on a test of investigative skills. This was especially true for low-achieving students. The researchers further analyzed specific components of the associated scientific processes—formulation of hypotheses, designing an experiment, predicting results, drawing conclusions from made-up results, and relating those conclusions back to the original hypotheses. Students in the reflective self-assessment classes did better on all of these components than those in control classrooms, especially on the more difficult components (drawing conclusions and relating them to the original hypotheses).

Computer as Learning Partner

Beginning in 1980, a large group of technologists, classroom teachers, and education researchers developed the Computer as Learning Partner (CLP) integrated instructional unit. Over 10 years, the team developed and tested eight versions of a 12-week unit on thermodynamics. Each year, a cohort of about 300 8th grade students participated in a sequence of teaching and learning activities focused primarily on a specific learning goal—enhancing students’ understanding of the difference between heat and temperature (Linn, 1997). The project engaged students in a sequence of laboratory experiences supported by computers, discussions, and other forms of science instruction. For example, computer images and words prompted students to make predictions about heat and conductivity and perform experiments using temperature-sensitive probes to confirm or refute their predictions. Students were given tasks related to scientific phenomena affecting their daily lives—such as how to keep a drink cold for lunch or selecting appropriate clothing for hiking in the mountains—as a way to motivate their interest and curiosity. Teachers played an important role in carrying out the curriculum, asking students to critique their own and each other’s investigations and encouraging them to reflect on their own thinking.

Over 10 years of study and revision, the integrated instructional unit proved increasingly effective in achieving its stated learning goals. Before the sequenced instruction was introduced, only 3 percent of middle school students could adequately explain the difference between heat and temperature. Eight versions later, about half of the students participating in CLP could explain this difference, representing a 400 percent increase in achievement. In addition, nearly 100 percent of students who participated in the final version of the instructional unit demonstrated understanding of conductors (Linn and Songer, 1991). By comparison, only 25 percent of a group of undergraduate chemistry students at the University of California at Berkeley could adequately explain the difference between heat and temperature. A longitudinal study comparing high school seniors who participated in the thermodynamics unit in middle school with seniors who had received more traditional middle school science instruction found a 50 percent improvement in CLP students’ performance in distinguishing between heat and temperature (Linn and Hsi, 2000).

Participating in the CLP instructional unit also increased students’ interest in science. Longitudinal studies of CLP participants revealed that, among those who went on to take high school physics, over 90 percent thought science was relevant to their lives. And 60 percent could provide examples of scientific phenomena in their daily lives. By comparison, only 60 percent of high school physics students who had not participated in the unit during middle school thought science was relevant to their lives, and only 30 percent could give examples in their daily lives (Linn and Hsi, 2000).

EFFECTIVENESS OF LABORATORY EXPERIENCES

Description of the Literature Review

The committee’s review of the literature on the effectiveness of laboratory experiences considered studies of typical laboratory experiences and emerging research focusing on integrated instructional units. In reviewing both bodies of research, we aim to specify how laboratory experiences can further each of the science learning goals outlined at the beginning of this chapter.

Limitations of the Research

Our review was complicated by weaknesses in the earlier research on typical laboratory experiences, isolated from the stream of instruction (Hofstein and Lunetta, 1982). First, the investigators do not agree on a precise definition of the “laboratory” experiences under study. Second, many studies were weak in the selection and control of variables. Investigators failed to examine or report important variables relating to student abilities and attitudes. For example, they failed to note students’ prior laboratory experiences. They also did not give enough attention to extraneous factors that might affect student outcomes, such as instruction outside the laboratory. Third, the studies of typical laboratory experiences usually involved a small group of students with little diversity, making it difficult to generalize the results to the large, diverse population of U.S. high schools today. Fourth, investigators did not give enough attention to the adequacy of the instruments used to measure student outcomes. As an example, paper and pencil tests that focus on testing mastery of subject matter, the most frequently used assessment, do not capture student attainment of all of the goals we have identified. Such tests are not able to measure student progress toward goals that may be unique to laboratory experiences, such as developing scientific reasoning, understanding the complexity and ambiguity of empirical work, and development of practical skills.

Finally, most of the available research on typical laboratory experiences does not fully describe these activities. Few studies have examined teacher behavior, the classroom learning environment, or variables identifying teacher-student interaction. In addition, few recent studies have focused on laboratory manuals—both what is in them and how they are used. Research on the intended design of laboratory experiences, their implementation, and whether the implementation resembles the initial design would provide the understanding needed to guide improvements in laboratory instruction. However, only a few studies of typical laboratory experiences have measured the effectiveness of particular laboratory experiences in terms of both the extent to which their activities match those that the teacher intended and the extent to which the students’ learning matches the learning objectives of the activity (Tiberghien, Veillard, Le Marchal, Buty, and Millar, 2000).

We also found weaknesses in the evolving research on integrated instructional units. First, these new units tend to be hothouse projects; researchers work intensively with teachers to construct atypical learning environments. While some have been developed and studied over a number of years and iterations, they usually involve relatively small samples of students. Only now are some of these efforts expanding to a scale that will allow robust generalizations about their value and how best to implement them. Second, these integrated instructional units have not been designed specifically to contrast some version of laboratory or practical experience with a lack of such experience. Rather, they assume that educational interventions are complex, systemic “packages” (Salomon, 1996) involving many interactions that may influence specific outcomes, and that science learning requires some opportunities for direct engagement with natural phenomena. Researchers commonly aim to document the complex interactions between and among students, teachers, laboratory materials, and equipment in an effort to develop profiles of successful interventions (Cobb et al., 2003; Collins, Joseph, and Bielaczyc, 2004; Design-Based Research Collective, 2003). These newer studies focus on how to sequence laboratory experiences and other forms of science instruction to support students’ science learning.

Scope of the Literature Search

A final note on the review of research: the scope of our study did not allow for an in-depth review of all of the individual studies of laboratory education conducted over the past 30 years. Fortunately, three major reviews of the literature from the 1970s, 1980s, and 1990s are available (Lazarowitz and Tamir, 1994; Lunetta, 1998; Hofstein and Lunetta, 2004). The committee relied on these reviews in our analysis of studies published before 1994. To identify studies published between 1994 and 2004, the committee searched electronic databases.

To supplement the database search, the committee commissioned three experts to review the nascent body of research on integrated instructional units (Bell, 2005; Duschl, 2004; Millar, 2004). We also invited researchers who are currently developing, revising, and studying the effectiveness of integrated instructional units to present their findings at committee meetings (Linn, 2004; Lynch, 2004).

All of these activities yielded few studies that focused on the high school level and were conducted in the United States. For this reason, the committee expanded the range of the literature considered to include some studies targeted at middle school and some international studies. We included studies at the elementary through postsecondary levels as well as studies of teachers’ learning in our analysis. In drawing conclusions from studies that were not conducted at the high school level, the committee took into consideration the extent to which laboratory experiences in high school differ from those in elementary and postsecondary education. Developmental differences among students, the organizational structure of schools, and the preparation of teachers are a few of the many factors that vary by school level and that the committee considered in making inferences from the available research. Similarly, when deliberating on studies conducted outside the United States, we considered differences in the science curriculum, the organization of schools, and other factors that might influence the outcomes of laboratory education.

Mastery of Subject Matter

Evidence from Research on Typical Laboratory Experiences

Claims that typical laboratory experiences help students master science content rest largely on the argument that opportunities to directly interact with, observe, and manipulate materials will help students to better grasp difficult scientific concepts. It is believed that these experiences will force students to confront their misunderstandings about phenomena and shift toward more scientific understanding.

Despite these claims, there is almost no direct evidence that typical laboratory experiences that are isolated from the flow of science instruction are particularly valuable for learning specific scientific content (Hofstein and Lunetta, 1982, 2004; Lazarowitz and Tamir, 1994). White (1996) points out that many major reviews of science education from the 1960s and 1970s indicate that laboratory work does little to improve understanding of science content as measured by paper and pencil tests, and later studies from the 1980s and early 1990s do not challenge this view. Other studies indicate that typical laboratory experiences are no more effective in helping students master science subject matter than demonstrations in high school biology (Coulter, 1966), demonstration and discussion (Yager, Engen, and Snider, 1969), and viewing filmed experiments in chemistry (Ben-Zvi, Hofstein, Kempa, and Samuel, 1976). In contrast to most of the research, a single comparative study (Freedman, 2002) found that students who received regular laboratory instruction over the course of a school year performed better on a test of physical science knowledge than a control group of students who took a similar physical science course without laboratory activities.

Clearly, most of the evidence does not support the argument that typical laboratory experiences lead to improved learning of science content. More specifically, concrete experiences with phenomena alone do not appear to force students to confront their misunderstandings and reevaluate their own assumptions. For example, VandenBerg, Katu, and Lunetta (1994) reported, on the basis of clinical studies with individual students, that hands-on activities with introductory electricity materials facilitated students’ understanding of the relationships among circuit elements and variables. The carefully selected practical activities created conceptual conflict in students’ minds—a first step toward changing their naïve ideas about electricity. However, the students remained unable to develop a fully scientific mental model of a circuit system. The authors suggested that greater engagement with conceptual organizers, such as analogies and concept maps, could have helped students develop more scientific understandings of basic electricity. Several researchers, including Dupin and Joshua (1987), have reported similar findings. Studies indicate that students often hold beliefs so intensely that even their observations in the laboratory are strongly influenced by those beliefs (Champagne, Gunstone, and Klopfer, 1985, cited in Lunetta, 1998; Linn, 1997). Students tend to adjust their observations to fit their current beliefs rather than change their beliefs in the face of conflicting observations.

Evidence from Research on Integrated Instructional Units

Current integrated instructional units build on earlier studies that found integration of laboratory experiences with other instructional activities enhanced mastery of subject matter (Dupin and Joshua, 1987; White and Gunstone, 1992, cited in Lunetta, 1998). A recent review of these and other studies concluded (Hofstein and Lunetta, 2004, p. 33):

When laboratory experiences are integrated with other metacognitive learning experiences such as “predict-observe-explain” demonstrations (White and Gunstone, 1992) and when they incorporate the manipulation of ideas instead of simply materials and procedures, they can promote the learning of science.

Integrated instructional units often focus on complex science topics that are difficult for students to understand. Their design is based on research on students’ intuitive conceptions of a science topic and how those conceptions differ from scientific conceptions. Students’ ideas often do not match the scientific understanding of a phenomenon and, as noted previously, these intuitive notions are resistant to change. For this reason, the sequenced units incorporate instructional activities specifically designed to confront intuitive conceptions and provide an environment in which students can construct normative conceptions. The role of laboratory experiences is to emphasize the discrepancies between students’ intuitive ideas about the topic and scientific ideas, as well as to support their construction of normative understanding. In order to help students link formal, scientific concepts to real phenomena, these units include a sequence of experiences that will push them to question their intuitive and often inaccurate ideas.

Emerging studies indicate that exposure to these integrated instructional units leads to demonstrable gains in student mastery of a number of science topics in comparison to more traditional approaches. In physics, these subjects include Newtonian mechanics (Wells, Hestenes, and Swackhamer, 1995; White, 1993); thermodynamics (Songer and Linn, 1991); electricity (Shaffer and McDermott, 1992); optics (Bell and Linn, 2000; Reiner, Pea, and Shulman, 1995); and matter (Lehrer, Schauble, Strom, and Pligge, 2001; Smith, Maclin, Grosslight, and Davis, 1997; Snir, Smith, and Raz, 2003). Integrated instructional units in biology have enhanced student mastery of genetics (Hickey, Kindfield, Horwitz, and Christie, 2003) and natural selection (Reiser et al., 2001). A chemistry unit has led to gains in student understanding of stoichiometry (Lynch, 2004). Many, but not all, of these instructional units combine computer-based simulations of the phenomena under study with direct interactions with these phenomena. The role of technology in providing laboratory experiences is described later in this chapter.

Developing Scientific Reasoning

While philosophers of science now agree that there is no single scientific method, they do agree that a number of reasoning skills are critical to research across the natural sciences. These reasoning skills include identifying questions and concepts that guide scientific investigations, designing and conducting scientific investigations, developing and revising scientific explanations and models, recognizing and analyzing alternative explanations and models, and making and defending a scientific argument. It is not necessarily the case that these skills are sequenced in a particular way or used in every scientific investigation. Instead, they are representative of the abilities that both scientists and students need to investigate the material world and make meaning out of those investigations. Research on children’s and adults’ scientific reasoning (see the review by Zimmerman, 2000) suggests that effective experimentation is difficult for most people and not learned without instructional support.

Early research on the development of investigative skills suggested that students could learn aspects of scientific reasoning through typical laboratory instruction in college-level physics (Reif and St. John, 1979, cited in Hofstein and Lunetta, 1982) and in high school and college biology (Raghubir, 1979; Wheatley, 1975, cited in Hofstein and Lunetta, 1982).

More recent research, however, suggests that high school and college science teachers often emphasize laboratory procedures, leaving little time for discussion of how to plan an investigation or interpret its results (Tobin, 1987; see Chapter 4). Taken as a whole, the evidence indicates that typical laboratory work promotes only a few aspects of the full process of scientific reasoning—making observations and organizing, communicating, and interpreting data gathered from these observations. Typical laboratory experiences appear to have little effect on more complex aspects of scientific reasoning, such as the capacity to formulate research questions, design experiments, draw conclusions from observational data, and make inferences (Klopfer, 1990, cited in White, 1996).

Research developing from studies of integrated instructional units indicates that laboratory experiences can play an important role in developing all aspects of scientific reasoning, including the more complex aspects, if the laboratory experiences are integrated with small group discussion, lectures, and other forms of science instruction. With carefully designed instruction that incorporates opportunities to conduct investigations and reflect on the results, students as young as 4th and 5th grade can develop sophisticated scientific thinking (Lehrer and Schauble, 2004; Metz, 2004). Kuhn and colleagues have shown that 5th graders can learn to experiment effectively, albeit in carefully controlled domains and with extended supervised practice (Kuhn, Schauble, and Garcia-Mila, 1992). Explicit instruction on the purposes of experiments appears necessary to help 6th grade students design them well (Schauble, Glaser, Duschl, Schulze, and John, 1995). These studies suggest that laboratory experiences must be carefully designed to support the development of scientific reasoning.

Given the difficulty most students have with reasoning scientifically, a number of instructional units have focused on this goal. Evidence from several studies indicates that, with the appropriate scaffolding provided in these units, students can successfully reason scientifically. They can learn to design experiments (Schauble et al., 1995; White and Frederiksen, 1998), make predictions (Friedler, Nachmias, and Linn, 1990), and interpret and explain data (Bell and Linn, 2000; Coleman, 1998; Hatano and Inagaki, 1991; Meyer and Woodruff, 1997; Millar, 1998; Rosebery, Warren, and Conant, 1992; Sandoval and Millwood, 2005). Engagement with these instructional units has been shown to improve students’ abilities to recognize discrepancies between predicted and observed outcomes (Friedler et al., 1990) and to design good experiments (Dunbar, 1993; Kuhn et al., 1992; Schauble et al., 1995; Schauble, Klopfer, and Raghavan, 1991).

Integrated instructional units seem especially beneficial in developing scientific reasoning skills among lower ability students (White and Frederiksen, 1998).

Recently, research has focused on an important element of scientific reasoning—the ability to construct scientific arguments. Developing, revising, and communicating scientific arguments is now recognized as a core scientific practice (Driver, Newton, and Osborne, 2000; Duschl and Osborne, 2002). Laboratory experiences play a key role in instructional units designed to enhance students’ argumentation abilities, because they provide both the impetus and the data for constructing scientific arguments. Such efforts have taken many forms. For example, researchers working with young Haitian-speaking students in Boston used the students’ own interests to develop scientific investigations. Students designed an investigation to determine which school drinking fountain had the best-tasting water. The students designed data collection protocols, collected and analyzed their data, and then argued about their findings (Rosebery et al., 1992). The Knowledge Integration Environment project asked middle school students to examine a common set of evidence to debate competing hypotheses about light propagation. Overall, most students learned the scientific concept (that light goes on forever), although those who made better arguments learned more than their peers (Bell and Linn, 2000). These and other examples (e.g., Sandoval and Millwood, 2005) show that students in middle and high school can learn to argue scientifically, by learning to coordinate theoretical claims with evidence taken from their laboratory investigations.

Developing Practical Skills

Science educators and researchers have long claimed that learning practical laboratory skills is one of the important goals for laboratory experiences and that such skills may be attainable only through such experiences (White, 1996; Woolnough, 1983). However, development of practical skills has been measured less frequently in research than mastery of subject matter or scientific reasoning. Such practical outcomes deserve more attention, especially for laboratory experiences that are a critical part of vocational or technical training in some high school programs. When a primary goal of a program or course is to train students for jobs in laboratory settings, they must have the opportunity to learn to use and read sophisticated instruments and carry out standardized experimental procedures. The critical question about acquiring these skills through laboratory experiences may not be whether laboratory experiences help students learn them, but how the experiences can be constructed so as to be most effective in teaching such skills.

Some research indicates that typical laboratory experiences specifically focused on learning practical skills can help students progress toward other goals. For example, one study found that students were often deficient in the simple skills needed to successfully carry out typical laboratory activities, such as using instruments to make measurements and collect accurate data (Bryce and Robertson, 1985). Other studies indicate that helping students to develop relevant instrumentation skills in controlled “prelab” activities can reduce the probability that important measurements in a laboratory experience will be compromised due to students’ lack of expertise with the apparatus (Beasley, 1985; Singer, 1977). This research suggests that development of practical skills may increase the probability that students will achieve the intended results in laboratory experiences. Achieving the intended results of a laboratory activity is a necessary, though not sufficient, step toward effectiveness in helping students attain laboratory learning goals.

Some research on typical laboratory experiences indicates that girls handle laboratory equipment less frequently than boys, and that this tendency is associated with less interest in science and less self-confidence in science ability among girls (Jovanovic and King, 1998). It is possible that helping girls to develop instrumentation skills may help them to participate more actively and enhance their interest in learning science.

Studies of integrated instructional units have not examined the extent to which engagement with these units may enhance practical skills in using laboratory materials and equipment. This reflects an instructional emphasis on helping students to learn scientific ideas with real understanding and on developing their skills at investigating scientific phenomena, rather than on particular laboratory techniques, such as taking accurate measurements or manipulating equipment. There is no evidence to suggest that students do not learn practical skills through integrated instructional units, but to date researchers have not assessed such practical skills.

Understanding the Nature of Science

Throughout the past 50 years, studies of students’ epistemological beliefs about science consistently show that most of them have naïve views about the nature of scientific knowledge and how such knowledge is constructed and evaluated by scientists over time (Driver, Leach, Millar, and Scott, 1996; Lederman, 1992). The general public understanding of science is similarly inaccurate. Firsthand experience with science is often seen as a key way to advance students’ understanding of and appreciation for the conventions of science. Laboratory experiences are considered the primary mechanism for providing firsthand experience and are therefore assumed to improve students’ understanding of the nature of science.

Research on student understanding of the nature of science provides little evidence of improvement with science instruction (Lederman, 1992; Driver et al., 1996). Although much of this research historically did not examine details of students’ laboratory experiences, it often included very large samples of science students and thus arguably captured typical laboratory experiences (research from the late 1950s through the 1980s is reviewed by Lederman, 1992). There appear to be developmental trends in students’ understanding of the relations between experimentation and theory-building. Younger students tend to believe that experiments yield direct answers to questions; during middle and high school, students shift to a vague notion of experiments being tests of ideas. Only a small number of students appear to leave high school with a notion of science as model-building and experimentation, in an ongoing process of testing and revision (Driver et al., 1996; Carey and Smith, 1993; Smith et al., 2000). The conclusion that most experts draw from these results is that the isolated nature and rote procedural focus of typical laboratory experiences inhibits students from developing robust conceptions of the nature of science. Consequently, some have argued that the nature of science must be an explicit target of instruction (Khishfe and Abd-El-Khalick, 2002; Lederman, Abd-El-Khalick, Bell, and Schwartz, 2002).

As discussed above, there is reasonable evidence that integrated instructional units help students to learn processes of scientific inquiry. However, such instructional units do not appear, on their own, to help students develop robust conceptions of the nature of science. One large-scale study of a widely available inquiry-oriented curriculum, in which integrated instructional units were an explicit feature, showed no significant change in students’ ideas about the nature of science after a year’s instruction (Meichtry, 1993). Students engaged in the BGuILE science instructional unit showed no gains in understanding the nature of science from their participation, and they seemed not even to see their experience in the unit as necessarily related to professional science (Sandoval and Morrison, 2003). These findings and others have led to the suggestion that the nature of science must be an explicit target of instruction (Lederman et al., 2002).

There is evidence from the ThinkerTools science instructional unit that by engaging in reflective self-assessment on their own scientific investigations, students gained a more sophisticated understanding of the nature of science than matched control classes who used the curriculum without the ongoing monitoring and evaluation of their own and others’ research (White and Frederiksen, 1998). Students who engaged in the reflective assessment process “acquire knowledge of the forms that scientific laws, models, and theories can take, and of how the development of scientific theories is related to empirical evidence” (White and Frederiksen, 1998, p. 92). Students who participated in the laboratory experiences and other learning activities in this unit using the reflective assessment process were less likely to “view scientific theories as immutable and never subject to revision” (White and Frederiksen, 1998, p. 72). Instead, they saw science as meaningful and explicable. The ThinkerTools findings support the idea that attention to nature of science issues should be an explicit part of integrated instructional units, although even with such attention it remains difficult to change students’ ideas (Khishfe and Abd-El-Khalick, 2002).

A survey of several integrated instructional units found that they seem to bridge the “language gap” between science in school and scientific practice (Duschl, 2004). The units give students “extended opportunities to explore the relationship between evidence and explanation,” helping them not only to develop new knowledge (mastery of subject matter), but also to evaluate claims of scientific knowledge, reflecting a deeper understanding of the nature of science (Duschl, 2004). The available research leaves open the question of whether or not these experiences help students to develop an explicit, reflective conceptual framework about the nature of science.

Cultivating Interest in Science and Interest in Learning Science

Studies of the effect of typical laboratory experiences on student interest are much rarer than those focusing on student achievement or other cognitive outcomes (Hofstein and Lunetta, 2004; White, 1996). The number of studies that address interest, attitudes, and other affective outcomes has decreased over the past decade, as researchers have focused almost exclusively on cognitive outcomes (Hofstein and Lunetta, 2004). Among the few studies available, the evidence is mixed. Some studies indicate that laboratory experiences lead to more positive attitudes (Renner, Abraham, and Birnie, 1985; Denny and Chennell, 1986). Other studies show no relation between laboratory experiences and affect (Ato and Wilkinson, 1986; Freedman, 2002), and still others report laboratory experiences turned students away from science (Holden, 1990; Shepardson and Pizzini, 1993).

There are, however, two apparent weaknesses in studies of interest and attitude (Hofstein and Lunetta, 1982). One is that researchers often do not carefully define interest and how it should be measured. Consequently, it is unclear if students simply reported liking laboratory activities more than other classroom activities, or if laboratory activities engendered more interest in science as a field, or in taking science courses, or something else. Similarly, studies may report increased positive attitudes toward science from students’ participation in laboratory experiences, without clear description of what attitudes were measured, how large the changes were, or whether changes persisted over time.

Student Perceptions of Typical Laboratory Experiences

Students’ perceptions of laboratory experiences may affect their interest and engagement in science, and some studies have examined those perceptions. Researchers have found that students often do not have clear ideas about the general or specific purposes of their work in typical science laboratory activities (Chang and Lederman, 1994) and that their understanding of the goals of lessons frequently does not match their teachers’ goals for the same lessons (Hodson, 1993; Osborne and Freyberg, 1985; Wilkenson and Ward, 1997). When students do not understand the goals of experiments or laboratory investigations, negative consequences for learning occur (Schauble et al., 1995). In fact, students often do not make important connections between the purpose of a typical laboratory investigation and the design of the experiments. They do not connect the experiment with what they have done earlier, and they do not note the discrepancies among their own concepts, the concepts of their peers, and those of the science community (Champagne et al., 1985; Eylon and Linn, 1988; Tasker, 1981). As White (1998) notes, “to many students, a ‘lab’ means manipulating equipment but not manipulating ideas.” Thus, in considering how laboratory experiences may contribute to students’ interest in science and to other learning goals, their perceptions of those experiences must be considered.

A series of studies using the Science Laboratory Environment Inventory (SLEI) has demonstrated links between students’ perceptions of laboratory experiences and student outcomes (Fraser, McRobbie, and Giddings, 1993; Fraser, Giddings, and McRobbie, 1995; Henderson, Fisher, and Fraser, 2000; Wong and Fraser, 1995). The SLEI, which has been validated cross-nationally, measures five dimensions of the laboratory environment: student cohesiveness, open-endedness, integration, rule clarity, and material environment (see Table 3-1 for a description of each scale). Using the SLEI, researchers have studied students’ perceptions of chemistry and biology laboratories in several countries, including the United States. All five dimensions appear to be positively related with student attitudes, although the relation of open-endedness with attitudes seems to vary with student population. In some populations, there is a negative relation to attitudes (Fraser et al., 1995) and to some cognitive outcomes (Henderson et al., 2000).

TABLE 3-1 Descriptive Information for the Science Laboratory Environment Inventory

Student cohesiveness: Extent to which students know, help, and are supportive of one another
Open-endedness: Extent to which the laboratory activities emphasize an open-ended, divergent approach to experimentation
Integration: Extent to which laboratory activities are integrated with nonlaboratory and theory classes
Rule clarity: Extent to which behavior in the laboratory is guided by formal rules
Material environment: Extent to which the laboratory equipment and materials are adequate

SOURCE: Henderson, Fisher, and Fraser (2000). Reprinted with permission of Wiley-Liss, Inc., a subsidiary of John Wiley & Sons, Inc.

Research using the SLEI indicates that positive student attitudes are particularly strongly associated with cohesiveness (the extent to which students know, help, and are supportive of one another) and integration (the extent to which laboratory activities are integrated with nonlaboratory and theory classes) (Fraser et al., 1995; Wong and Fraser, 1995). Integration also shows a positive relation to students’ cognitive outcomes (Henderson et al., 2000; McRobbie and Fraser, 1993).

Students’ interest and attitudes have been measured less often than other goals of laboratory experiences in studies of integrated instructional units. When evidence is available, it suggests that students who participate in these units show greater interest in and more positive attitudes toward science. For example, in a study of ThinkerTools, completion of projects was used as a measure of student interest. The rate of submitting completed projects was higher for students in the ThinkerTools curriculum than for those in traditional instruction. This was true for all grades and ability levels (White and Frederiksen, 1998). This study also found that students’ ongoing evaluation of their own and other students’ thinking increased motivation and self-confidence in their individual ability: students who participated in this ongoing evaluation not only turned in their final project reports more frequently, but they were also less likely to turn in reports that were identical to their research partner’s.

Participation in the ThinkerTools instructional unit appears to change students’ attitudes toward learning science. After completing the integrated instructional unit, fewer students indicated that “being good at science” was a result of inherited traits, and fewer agreed with the statement, “In general, boys tend to be naturally better at science than girls.” In addition, more students indicated that they preferred taking an active role in learning science, rather than simply being told the correct answer by the teacher (White and Frederiksen, 1998).

Researchers measured students’ engagement and motivation to master the complex topic of conservation of matter as part of the study of CTA. Students who participated in the CTA curriculum had higher levels of basic engagement (active participation in activities) and were more likely to focus on learning from the activities than students in the control group (Lynch et al., in press). This positive effect on engagement was especially strong among low-income students. The researchers speculate, “perhaps as a result of these changes in engagement and motivation, they learned more than if they had received the standard curriculum” (Lynch et al., in press).

Students who participated in CLP during middle school, when surveyed years later as high school seniors, were more likely to report that science is relevant to their lives than students who did not participate (Linn and Hsi, 2000). Further research is needed to illuminate which aspects of this instructional unit contribute to increased interest.

Developing Teamwork Abilities

Teamwork and collaboration appear in research on typical laboratory experiences in two ways. First, working in groups is seen as a way to enhance student learning, usually with reference to literature on cooperative learning or to the importance of providing opportunities for students to discuss their ideas. Second and more recently, attention has focused on the ability to work in groups as an outcome itself, with laboratory experiences seen as an ideal opportunity to develop these skills. The focus on teamwork as an outcome is usually linked to arguments that this is an essential skill for workers in the 21st century (Partnership for 21st Century Skills, 2003).

There is considerable evidence that collaborative work can help students learn, especially if students with high ability work with students with low ability (Webb and Palincsar, 1996). Collaboration seems especially helpful to lower ability students, but only when they work with more knowledgeable peers (Webb, Nemer, Chizhik, and Sugrue, 1998). Building on this research, integrated instructional units engage students in small-group collaboration as a way to encourage them to connect what they know (either from their own experiences or from prior instruction) to their laboratory experiences. Often, individual students disagree about prospective answers to the questions under investigation or the best way to approach them, and collaboration encourages students to articulate and explain their reasoning. A number of studies suggest that such collaborative investigation is effective in helping students to learn targeted scientific concepts (Coleman, 1998; Roschelle, 1992).

Extant research lacks specific assessment of the kinds of collaborative skills that might be learned by individual students through laboratory work. The assumption appears to be that if students collaborate and such collaborations are effective in supporting their conceptual learning, then they are probably learning collaborative skills, too.

Overall Effectiveness of Laboratory Experiences

The two bodies of research—the earlier research on typical laboratory experiences and the emerging research on integrated instructional units—yield different findings about the effectiveness of laboratory experiences in advancing the goals identified by the committee. In general, the nascent body of research on integrated instructional units offers the promise that laboratory experiences embedded in a larger stream of science instruction can be more effective in advancing these goals than are typical laboratory experiences (see Table 3-2).

Research on the effectiveness of typical laboratory experiences is methodologically weak and fragmented. The limited evidence available suggests that typical laboratory experiences, by themselves, are neither better nor worse than other methods of science instruction for helping students master science subject matter. However, more recent research indicates that integrated instructional units enhance students’ mastery of subject matter. Studies have demonstrated increases in student mastery of complex topics in physics, chemistry, and biology.

Typical laboratory experiences appear, based on the limited research available, to support some aspects of scientific reasoning; however, typical laboratory experiences alone are not sufficient for promoting more sophisticated scientific reasoning abilities, such as asking appropriate questions, designing experiments, and drawing inferences. Research on integrated instructional units provides evidence that the laboratory experiences and other forms of instruction they include promote development of several aspects of scientific reasoning, including the ability to ask appropriate questions, design experiments, and draw inferences.

TABLE 3-2 Attainment of Educational Goals in Typical Laboratory Experiences and Integrated Instructional Units

Mastery of subject matter
  Typical laboratory experiences: No better or worse than other modes of instruction
  Integrated instructional units: Increased mastery compared with other modes of instruction

Scientific reasoning
  Typical laboratory experiences: Aids development of some aspects
  Integrated instructional units: Aids development of more sophisticated aspects

Understanding of the nature of science
  Typical laboratory experiences: Little improvement
  Integrated instructional units: Some improvement when explicitly targeted at this goal

Interest in science
  Typical laboratory experiences: Some evidence of increased interest
  Integrated instructional units: Greater evidence of increased interest

Understanding the complexity and ambiguity of empirical work
  Typical laboratory experiences: Inadequate evidence
  Integrated instructional units: Inadequate evidence

Development of practical skills
  Typical laboratory experiences: Inadequate evidence
  Integrated instructional units: Inadequate evidence

Development of teamwork skills
  Typical laboratory experiences: Inadequate evidence
  Integrated instructional units: Inadequate evidence

The evidence indicates that typical laboratory experiences do little to increase students’ understanding of the nature of science. In contrast, some studies find that participating in integrated instructional units that are designed specifically with this goal in mind enhances understanding of the nature of science.

The available research suggests that typical laboratory experiences can play a role in enhancing students’ interest in science and in learning science. There is evidence that engagement with the laboratory experiences and other learning activities included in integrated instructional units enhances students’ interest in science and motivation to learn science.

In sum, the evolving research on integrated instructional units provides evidence of increases in students’ understanding of subject matter, development of scientific reasoning, and interest in science, compared with students who received more traditional forms of science instruction. Studies conducted to date also suggest that the units are effective in helping diverse groups of students attain these three learning goals. In contrast, the earlier research on typical laboratory experiences indicates that such typical laboratory experiences are neither better nor worse than other forms of science instruction in supporting student mastery of subject matter. Typical laboratory experiences appear to aid in development of only some aspects of scientific reasoning, and they appear to play a role in enhancing students’ interest in science and in learning science.

Due to a lack of available studies, the committee was unable to draw conclusions about the extent to which either typical laboratory experiences or laboratory experiences incorporated into integrated instructional units might advance the other goals identified at the beginning of this chapter—enhancing understanding of the complexity and ambiguity of empirical work, acquiring practical skills, and developing teamwork skills.

PRINCIPLES FOR DESIGN OF EFFECTIVE LABORATORY EXPERIENCES

The three bodies of research we have discussed—research on how people learn, research on typical laboratory experiences, and developing research on how students learn in integrated instructional units—yield information that promises to inform the design of more effective laboratory experiences.

The committee considers the emerging evidence sufficient to suggest four general principles that can help laboratory experiences achieve the goals outlined above. It must be stressed, however, that research to date has not described in much detail how these principles can be implemented nor how each principle might relate to each of the educational goals of laboratory experiences.

Clearly Communicated Purposes

Effective laboratory experiences have clear learning goals that guide the design of the experience. Ideally these goals are clearly communicated to students. Without a clear understanding of the purposes of a laboratory activity, students seem not to get much from it. Conversely, when the purposes of a laboratory activity are clearly communicated by teachers to students, then students seem capable of understanding them and carrying them out. There seems to be no compelling evidence that particular purposes are more understandable to students than others.

Sequenced into the Flow of Instruction

Effective laboratory experiences are thoughtfully sequenced into the flow of classroom science instruction. That is, they are explicitly linked to what has come before and what will come after. A common theme in reviews of laboratory practice in the United States is that laboratory experiences are presented to students as isolated events, unconnected with other aspects of classroom work. In contrast, integrated instructional units embed laboratory experiences with other activities that build on the laboratory experiences and push students to reflect on and better understand these experiences. The way a particular laboratory experience is integrated into a flow of activities should be guided by the goals of the overall sequence of instruction and of the particular laboratory experience.

Integrated Learning of Science Concepts and Processes

Research in the learning sciences (National Research Council, 1999, 2001) strongly implies that conceptual understanding, scientific reasoning, and practical skills are three capabilities that are not mutually exclusive. An educational program that partitions the teaching and learning of content from the teaching and learning of process is likely to be ineffective in helping students develop scientific reasoning skills and an understanding of science as a way of knowing. The research on integrated instructional units, all of which intertwine exploration of content with process through laboratory experiences, suggests that integration of content and process promotes attainment of several goals identified by the committee.

Ongoing Discussion and Reflection

Laboratory experiences are more likely to be effective when they focus students more on discussing the activities they have done during their laboratory experiences and reflecting on the meaning they can make from them, than on the laboratory activities themselves. Crucially, the focus of laboratory experiences and the surrounding instructional activities should not simply be on confirming presented ideas, but on developing explanations to make sense of patterns of data. Teaching strategies that encourage students to articulate their hypotheses about phenomena prior to experimentation and to then reflect on their ideas after experimentation are demonstrably more successful at supporting student attainment of the goals of mastery of subject matter, developing scientific reasoning, and increasing interest in science and science learning. At the same time, opportunities for ongoing discussion and reflection could potentially support students in developing teamwork skills.

COMPUTER TECHNOLOGIES AND LABORATORY EXPERIENCES

From scales to microscopes, technology in many forms plays an integral role in most high school laboratory experiences. Over the past two decades, personal computers have enabled the development of software specifically designed to help students learn science, and the Internet is an increasingly used tool for science learning and for science itself. This section examines the role that computer technologies now play, and may someday play, in science learning in relation to laboratory experiences. Certain uses of computer technology can be seen as laboratory experiences themselves, according to the committee’s definition, to the extent that they allow students to interact with data drawn directly from the world. Other uses, less clearly laboratory experiences in themselves, provide certain features that aid science learning.

Computer Technologies Designed to Support Learning

Researchers and science educators have developed a number of software programs to support science learning in various ways. In this section, we summarize what we see as the main ways in which computer software can support science learning through providing or augmenting laboratory experiences.

Scaffolded Representations of Natural Phenomena

Perhaps the most common form of science education software consists of programs that enable students to interact with carefully crafted models of natural phenomena that are difficult to see and understand in the real world and that have proven historically difficult for students to understand. Such programs are able to show conceptual interrelationships and connections between theoretical constructs and natural phenomena through the use of multiple, linked representations. For example, velocity can be linked to acceleration and position in ways that make the interrelationships understandable to students (Roschelle, Kaput, and Stroup, 2000). Chromosome genetics can be linked to changes in pedigrees and populations (Horwitz, 1996). Molecular chemical representations can be linked to chemical equations (Kozma, 2003).

In the ThinkerTools integrated instructional unit, abstracted representations of force and motion are provided for students to help them “see” such ideas as force, acceleration, and velocity in two dimensions (White, 1993; White and Frederiksen, 1998). Objects in the ThinkerTools microworld are represented as simple, uniformly sized “dots” to avoid students becoming confused about the idea of center of mass. Students use the microworld to solve various problems of motion in one or two dimensions, using the computer keyboard to apply forces to dots to move them along specified paths. Part of the key to the software’s guidance is that it provides representations of forces and accelerations in which students can see change in response to their actions. A “dot trace,” for example, shows students how applying more force affects an object’s acceleration in a predictable way. A “vector cross” represents the individual components of forces applied in two dimensions in a way that helps students to link those forces to an object’s motion.
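The feedback a dot trace provides can be sketched in a few lines of code. The following is a simplified, hypothetical model in the spirit of that representation, not the actual ThinkerTools implementation; the function names, the unit-mass assumption, and the fixed time step are all illustrative choices:

```python
# Hypothetical microworld sketch inspired by the dot-trace representation
# described above; this is illustrative code, not ThinkerTools itself.

def apply_impulse(velocity, impulse):
    """A keypress-style impulse changes the velocity of a unit-mass dot."""
    return velocity + impulse

def dot_trace(position, velocity, impulses, dt=1.0):
    """Record the dot's position at equal time intervals as impulses are applied."""
    positions = [position]
    for impulse in impulses:
        velocity = apply_impulse(velocity, impulse)
        position += velocity * dt
        positions.append(position)
    return positions

# A constant rightward impulse at every step yields a trace whose spacing
# grows step by step (1, 2, 3, 4): uniform acceleration made visible.
print(dot_trace(0.0, 0.0, [1.0, 1.0, 1.0, 1.0]))  # [0.0, 1.0, 3.0, 6.0, 10.0]
```

Equal spacing in such a trace would indicate constant velocity, while growing spacing signals acceleration, which is precisely the visual inference the dot trace is meant to support.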

ThinkerTools is but one example of this type of interactive, representational software. Others have been developed to help students reason about motion (Roschelle, 1992), electricity (Gutwill, Frederiksen, and White, 1999), heat and temperature (Linn, Bell, and Hsi, 1998), genetics (Horwitz and Christie, 2000), and chemical reactions (Kozma, 2003). These programs differ substantially from one another in how they represent their target phenomena, as there are substantial differences in the topics themselves and in the problems that students are known to have in understanding them. They share, however, a common approach to a similar set of problems: how to represent natural phenomena that are otherwise invisible in ways that help students make their own thinking explicit and guide them to normative scientific understanding.

When used as a supplement to hands-on laboratory experiences within integrated instructional units, these representations can support students’ conceptual change (e.g., Linn et al., 1998; White and Frederiksen, 1998). For example, students working through the ThinkerTools curriculum always experiment with objects in the real world before they work with the computer tools. The goals of the laboratory experiences are to provide some experience with the phenomena under study and some initial ideas that can then be explored on the computer.

Structured Simulations of Inaccessible Phenomena

Various types of simulations of phenomena represent another form of technology for science learning. These simulations allow students to explore and observe phenomena that are too expensive, infeasible, or even dangerous to interact with directly. Strictly speaking, a computer simulation is a program that simulates a particular phenomenon by running a computational model whose behavior can sometimes be changed by modifying input parameters to the model. For example, the GenScope program provides a set of linked representations of genetics and genetics phenomena that would otherwise be unavailable for study to most students (Horwitz and Christie, 2000). The software represents alleles, chromosomes, family pedigrees, and the like and links representations across levels in ways that enable students to trace inherited traits to specific genetic differences. The software uses an underlying Mendelian model of genetic inheritance to govern its behavior. As with the representations described above, embedding the use of the software in a carefully thought-out curriculum sequence is crucial to supporting student learning (Hickey et al., 2000).
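The idea of a simulation as a computational model driven by user-adjustable input parameters can be sketched in a few lines. The example below is a hypothetical illustration only, not GenScope's actual implementation; the function names, the two-allele genotype encoding, and the brood-size parameter are all assumptions made for the sketch.

```python
import random

def cross(parent1, parent2, n_offspring=100, seed=0):
    """Simulate a Mendelian monohybrid cross (illustrative model only).

    Each parent is a genotype string of two alleles, e.g. "Aa". Each
    offspring inherits one allele drawn at random from each parent.
    The genotypes and brood size are the input parameters a learner
    can change to alter the model's behavior.
    """
    rng = random.Random(seed)
    offspring = []
    for _ in range(n_offspring):
        genotype = "".join(sorted(rng.choice(parent1) + rng.choice(parent2)))
        offspring.append(genotype)
    return offspring

def phenotype_ratio(offspring, dominant="A"):
    """Count dominant-phenotype vs. recessive-phenotype offspring."""
    dom = sum(1 for g in offspring if dominant in g)
    return dom, len(offspring) - dom

# Crossing two heterozygotes approaches the classic 3:1 phenotype ratio.
brood = cross("Aa", "Aa", n_offspring=1000)
dom, rec = phenotype_ratio(brood)
```

Changing the parameters (say, crossing "Aa" with "aa") changes the model's output, which is the sense in which such software lets students experiment with otherwise invisible mechanisms.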

Another example in biology is the BGuILE project (Reiser et al., 2001). The investigators created a series of structured simulations allowing students to investigate problems of evolution by natural selection. In the Galapagos finch environment, for example, students can examine a carefully selected set of data from the island of Daphne Major to explain a historical case of natural selection. The BGuILE software does not, strictly speaking, consist of simulations because it does not “run” a model; from a student’s perspective, it simulates either Daphne Major or laboratory experiments on tuberculosis bacteria. Studies show that students can learn from the BGuILE environments when these environments are embedded in a well-organized curriculum (Sandoval and Reiser, 2004). They also show that successful implementation of such technology-supported curricula relies heavily on teachers (Tabak, 2004).

Structured Interactions with Complex Phenomena and Ideas

The examples discussed here share a crucial feature. The representations built into the software and the interface tools provided for learners are intended to help them learn in very specific ways. A great many such tools have been developed over the past quarter-century, and many have been shown to produce impressive learning gains for students at the secondary level. Besides the ones mentioned, other tools are designed to structure specific scientific reasoning skills, such as prediction (Friedler et al., 1990) and the coordination of claims with evidence (Bell and Linn, 2000; Sandoval, 2003). Most of these efforts integrate students’ work on the computer with more direct laboratory experiences. Rather than thinking of these representations and simulations as a way to replace laboratory experiences, the most successful instructional sequences integrate them with a series of empirical laboratory investigations. These sequences of science instruction focus students’ attention on developing a shared interpretation of both the representations and the real laboratory experiences in small groups (Bell, 2005).

Computer Technologies Designed to Support Science

Advances in computer technologies have had a tremendous impact on how science is done and on what scientists can study. These changes are vast, and summarizing them is well beyond the scope of the committee’s charge. We found, however, that some innovations in scientific practice, especially uses of the Internet, are beginning to be applied to secondary science education. With respect to future laboratory experiences, perhaps the most significant advance in many scientific fields is the aggregation of large, varied data sets into Internet-accessible databases. These databases are most commonly built for specific scientific communities, but some researchers are creating and studying new, learner-centered interfaces to allow access by teachers and schools. These research projects build on instructional design principles illuminated by the integrated instructional units discussed above.

One example is the Center for Embedded Networked Sensing (CENS), a National Science Foundation Science and Technology Center investigating the development and deployment of large-scale sensor networks embedded in physical environments. CENS is currently working on ecosystem monitoring, seismology, contaminant flow transport, and marine microbiology. As sensor networks come on line, making data available, science educators at the center are developing middle school curricula that include web-based tools to enable students to explore the same data sets that the professional scientists are exploring (Pea, Mills, and Takeuchi, 2004).

The interfaces professional scientists use to access such databases tend to be too inflexible and technical for students to use successfully (Bell, 2005). Bounding the space of possible data under consideration, supporting appropriate considerations of theory, and promoting understanding of the norms used in the visualization can help support students in developing a shared understanding of the data. With such support, students can develop both conceptual understanding and understanding of the data analysis process. Focusing students on causal explanation and argumentation based on the data analysis process can help them move from a descriptive, phenomenological view of science to one that considers theoretical issues of cause (Bell, 2005).

Further research and evaluation of the educational benefit of student interaction with large scientific databases are absolutely necessary. Still, the development of such efforts will certainly expand over time, and, as they change notions of what it means to conduct scientific experiments, they are also likely to change what it means to conduct a school laboratory.

Summary

The committee identified a number of science learning goals that have been attributed to laboratory experiences. Our review of the evidence on attainment of these goals revealed a recent shift in research, reflecting some movement in laboratory instruction. Historically, laboratory experiences have been disconnected from the flow of classroom science lessons. We refer to these separate laboratory experiences as typical laboratory experiences. Reflecting this separation, researchers often engaged students in one or two experiments or other science activities and then conducted assessments to determine whether their understanding of the science concept underlying the activity had increased. Some studies compared the outcomes of these separate laboratory experiences with the outcomes of other forms of science instruction, such as lectures or discussions.

Over the past 10 years, researchers studying laboratory education have shifted their focus. Drawing on principles of learning derived from the cognitive sciences, they have asked how to sequence science instruction, including laboratory experiences, in order to support students’ science learning. We refer to these instructional sequences as “integrated instructional units.” Integrated instructional units connect laboratory experiences with other types of science learning activities, including lectures, reading, and discussion. Students are engaged in framing research questions, making observations, designing and executing experiments, gathering and analyzing data, and constructing scientific arguments and explanations.

The two bodies of research on typical laboratory experiences and integrated instructional units, including laboratory experiences, yield different findings about the effectiveness of laboratory experiences in advancing the science learning goals identified by the committee. The earlier research on typical laboratory experiences is weak and fragmented, making it difficult to draw precise conclusions. The weight of the evidence from research focused on the goals of developing scientific reasoning and enhancing student interest in science showed slight improvements in both after students participated in typical laboratory experiences. Research focused on the goal of student mastery of subject matter indicates that typical laboratory experiences are no more or less effective than other forms of science instruction (such as reading, lectures, or discussion).

Studies conducted to date on integrated instructional units indicate that the laboratory experiences, together with the other forms of instruction included in these units, show greater effectiveness for these same three goals (compared with students who received more traditional forms of science instruction): improving students’ mastery of subject matter, increasing development of scientific reasoning, and enhancing interest in science. Integrated instructional units also appear to be effective in helping diverse groups of students progress toward these three learning goals. A major limitation of the research on integrated instructional units, however, is that most of the units have been used in small numbers of science classrooms. Only a few studies have addressed the challenge of implementing—and studying the effectiveness of—integrated instructional units on a wide scale.

Due to a lack of available studies, the committee was unable to draw conclusions about the extent to which either typical laboratory experiences or integrated instructional units might advance the other goals identified at the beginning of this chapter—enhancing understanding of the complexity and ambiguity of empirical work, acquiring practical skills, and developing teamwork skills. Further research is needed to clarify how laboratory experiences might be designed to promote attainment of these goals.

The committee considers the evidence sufficient to identify four general principles that can help laboratory experiences achieve the learning goals we have outlined. Laboratory experiences are more likely to achieve their intended learning goals if (1) they are designed with clear learning outcomes in mind, (2) they are thoughtfully sequenced into the flow of classroom science instruction, (3) they are designed to integrate learning of science content with learning about the processes of science, and (4) they incorporate ongoing student reflection and discussion.

Computer software and the Internet have enabled development of several tools that can support students’ science learning, including representations of complex phenomena, simulations, and student interaction with large scientific databases. Representations and simulations are most successful in supporting student learning when they are integrated in an instructional sequence that also includes laboratory experiences. Researchers are currently developing tools to support student interaction with—and learning from—large scientific databases.

References

Anderson, R.O. (1976). The experience of science: A new perspective for laboratory teaching . New York: Columbia University, Teachers College Press.

Ato, T., and Wilkinson, W. (1986). Relationships between the availability and use of science equipment and attitudes to both science and sources of scientific information in Benue State, Nigeria. Research in Science and Technological Education , 4 , 19-28.

Beasley, W.F. (1985). Improving student laboratory performance: How much practice makes perfect? Science Education , 69 , 567-576.

Bell, P. (2005). The school science laboratory: Considerations of learning, technology, and scientific practice . Paper prepared for the Committee on High School Science Laboratories: Role and Vision. Available at: http://www7.nationalacademies.org/bose/July_12-13_2004_High_School_Labs_Meeting_Agenda.html [accessed June 2005].

Bell, P., and Linn, M.C. (2000). Scientific arguments as learning artifacts: Designing for learning from the web with KIE. International Journal of Science Education , 22 (8), 797-817.

Ben-Zvi, R., Hofstein, A., Kampa, R.F., and Samuel, D. (1976). The effectiveness of filmed experiments in high school chemical education. Journal of Chemical Education , 53 , 518-520.

Blakeslee, T., Bronstein, L., Chapin, M., Hesbitt, D., Peek, Y., Thiele, E., and Vellanti, J. (1993). Chemistry that applies . Lansing: Michigan Department of Education. Available at: http://www.ed-web2.educ.msu.edu/CCMS/secmod/Cluster3.pdf [accessed Feb. 2005].

Bransford, J.D., and Schwartz, D.L. (2001). Rethinking transfer: A simple proposal with multiple implications. In A. Iran-Nejad, and P.D. Pearson (Eds.), Review of research in education (pp. 61-100). Washington, DC: American Educational Research Association.

Bryce, T.G.K., and Robertson, I.J. (1985). What can they do: A review of practical assessment in science. Studies in Science Education , 12 , 1-24.

Carey, S., and Smith, C. (1993). On understanding the nature of scientific knowledge. Educational Psychologist , 28 , 235-251.

Champagne, A.B., Gunstone, R.F., and Klopfer, L.E. (1985). Instructional consequences of students’ knowledge about physical phenomena. In L.H.T. West and A.L. Pines (Eds.), Cognitive structure and conceptual change (pp. 61-68). New York: Academic Press.

Chang, H.P., and Lederman, N.G. (1994). The effect of levels of co-operation within physical science laboratory groups on physical science achievement. Journal of Research in Science Teaching , 31 , 167-181.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., and Schauble, L. (2003). Design experiments in educational research. Educational Researcher , 32 (1), 9-13.

Cobb, P., Stephan, M., McClain, K., and Gavemeijer, K. (2001). Participating in classroom mathematical practices. Journal of the Learning Sciences , 10 , 113-164.

Coleman, E.B. (1998). Using explanatory knowledge during collaborative problem solving in science. Journal of the Learning Sciences , 7 (3, 4), 387-427.

Collins, A., Joseph, D., and Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences , 13 (1), 15-42.

Coulter, J.C. (1966). The effectiveness of inductive laboratory demonstration and deductive laboratory in biology. Journal of Research in Science Teaching , 4 , 185-186.

Denny, M., and Chennell, F. (1986). Exploring pupils’ views and feelings about their school science practicals: Use of letter-writing and drawing exercises. Educational Studies , 12 , 73-86.

Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher , 32 (1), 5-8.

Driver, R. (1995). Constructivist approaches to science teaching. In L.P. Steffe and J. Gale (Eds.), Constructivism in education (pp. 385-400). Hillsdale, NJ: Lawrence Erlbaum.

Driver, R., Leach, J., Millar, R., and Scott, P. (1996). Young people’s images of science . Buckingham, UK: Open University Press.

Driver, R., Newton, P., and Osborne, J. (2000). Establishing the norms of scientific argumentation in classrooms. Science Education , 84 , 287-312.

Dunbar, K. (1993). Concept discovery in a scientific domain. Cognitive Science , 17 , 397-434.

Dupin, J.J., and Joshua, S. (1987). Analogies and “modeling analogies” in teaching: Some examples in basic electricity. Science Education , 73 , 791-806.

Duschl, R.A. (2004). The HS lab experience: Reconsidering the role of evidence, explanation and the language of science . Paper prepared for the Committee on High School Science Laboratories: Role and Vision, July 12-13, National Research Council, Washington, DC. Available at: http://www7.nationalacademies.org/bose/July_12-13_2004_High_School_Labs_Meeting_Agenda.html [accessed July 2005].

Duschl, R.A., and Osborne, J. (2002). Supporting and promoting argumentation discourse in science education. Studies in Science Education , 38 , 39-72.

Eylon, B., and Linn, M.C. (1988). Learning and instruction: An examination of four research perspectives in science education. Review of Educational Research , 58 (3), 251-301.

Fraser, B.J., Giddings, G.J., and McRobbie, C.J. (1995). Evolution and validation of a personal form of an instrument for assessing science laboratory classroom environments. Journal of Research in Science Teaching , 32 , 399-422.

Fraser, B.J., McRobbie, C.J., and Giddings, G.J. (1993). Development and cross-national validation of a laboratory classroom environment instrument for senior high school science. Science Education , 77 , 1-24.

Freedman, M.P. (2002). The influence of laboratory instruction on science achievement and attitude toward science across gender differences. Journal of Women and Minorities in Science and Engineering , 8 , 191-200.

Friedler, Y., Nachmias, R., and Linn, M.C. (1990). Learning scientific reasoning skills in microcomputer-based laboratories. Journal of Research in Science Teaching , 27 (2), 173-192.

Glaser, R. (1994). Learning theory and instruction. In G. d’Ydewalle, P. Eelen, and P. Bertelson (Eds.), International perspectives on science, volume 2: The state of the art (pp. 341-357). Hove, England: Erlbaum.

Gobert, J., and Clement, J. (1999). The effects of student-generated diagrams versus student-generated summaries on conceptual understanding of spatial, causal, and dynamic knowledge in plate tectonics. Journal of Research in Science Teaching , 36 (1), 39-53.

Gutwill, J.P., Fredericksen, J.R., and White, B.Y. (1999). Making their own connections: Students’ understanding of multiple models in basic electricity. Cognition and Instruction , 17 (3), 249-282.

Hatano, G., and Inagaki, K. (1991). Sharing cognition through collective comprehension activity. In L.B. Resnick, J.M. Levine, and S.D. Teasley (Eds.), Perspectives on socially shared cognition (pp. 331-348). Washington, DC: American Psychological Association.

Henderson, D., Fisher, D., and Fraser, B. (2000). Interpersonal behavior, laboratory learning environments, and student outcomes in senior biology classes. Journal of Research in Science Teaching , 37 , 26-43.

Hickey, D.T., Kindfield, A.C.H., Horwitz, P., and Christie, M.A. (2000). Integrating instruction, assessment, and evaluation in a technology-based genetics environment: The GenScope follow-up study. In B.J. Fishman and S.F. O’Connor-Divelbiss (Eds.), Proceedings of the International Conference of the Learning Sciences (pp. 6-13). Mahwah, NJ: Lawrence Erlbaum.

Hickey, D.T., Kindfield, A.C., Horwitz, P., and Christie, M.A. (2003). Integrating curriculum, instruction, assessment, and evaluation in a technology-supported genetics environment. American Educational Research Journal , 40 (2), 495-538.

Hodson, D. (1993). Philosophic stance of secondary school science teachers, curriculum experiences, and children’s understanding of science: Some preliminary findings. Interchange , 24 , 41-52.

Hofstein, A., and Lunetta, V.N. (1982). The role of the laboratory in science teaching: Neglected aspects of research. Review of Educational Research , 52 (2), 201-217.

Hofstein, A., and Lunetta, V.N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education , 88 , 28-54.

Holden, C. (1990). Animal rights activism threatens dissection. Science , 250 , 751.

Horwitz, P. (1996). Linking models to data: Hypermodels for science education. High School Journal , 79 (2), 148-156.

Horwitz, P., and Christie, M.A. (2000). Computer-based manipulatives for teaching scientific reasoning: An example. In M.J. Jacobson and R.B. Kozma (Eds.), Innovations in science and mathematics education: Advanced designs for technologies of learning (pp. 163-191). Mahwah, NJ: Lawrence Erlbaum.

Jovanovic, J., and King, S.S. (1998). Boys and girls in the performance-based science classroom: Who’s doing the performing? American Educational Research Journal , 35 (3), 477-496.

Kesidou, S., and Roseman, J. (2002). How well do middle school science programs measure up? Findings from Project 2061’s curriculum review. Journal of Research in Science Teaching , 39 (6), 522-549.

Khishfe, R., and Abd-El-Khalick, F. (2002). Influence of explicit and reflective versus implicit inquiry-oriented instruction on sixth graders’ views of nature of science. Journal of Research in Science Teaching , 39 (7), 551-578.

Klopfer, L.E. (1990). Learning scientific enquiry in the student laboratory. In E. Hegarty-Hazel (Ed.), The student laboratory and the science curriculum (pp. 95-118). London, England: Routledge.

Kozma, R.B. (2003). The material features of multiple representations and their cognitive and social affordances for science understanding. Learning and Instruction , 13 , 205-226.

Kuhn, D., Schauble, L., and Garcia-Mila, M. (1992). Cross-domain development of scientific reasoning. Cognition and Instruction , 9 (4), 285-327.

Lazarowitz, R., and Tamir, P. (1994). Research on using laboratory instruction in science. In D.L. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 94-130). New York: Macmillan.

Lederman, N.G. (1992). Students’ and teachers’ conceptions of the nature of science: A review of the research. Journal of Research in Science Teaching , 29 (4), 331-359.

Lederman, N.G., Abd-El-Khalick, F., Bell, R.L., and Schwartz, R.S. (2002). Views of nature of science questionnaire: Toward valid and meaningful assessment of learners’ conceptions of nature of science. Journal of Research in Science Teaching , 39 (6), 497-521.

Lehrer, R., and Schauble, L. (2004). Scientific thinking and science literacy: Supporting development in learning contexts. In W. Damon, R. Lerner, K. Anne Renninger, and E. Sigel (Eds.), Handbook of child psychology, sixth edition, volume four: Child psychology in practice . Hoboken, NJ: John Wiley & Sons.

Lehrer, R., Schauble, L., Strom, D., and Pligge, M. (2001). Similarity of form and substance: Modeling material kind. In S.M. Carver and D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress . Mahwah, NJ: Lawrence Erlbaum.

Lemke, J. (1990). Talking science: Language, learning, and values . Norwood, NJ: Ablex.

Linn, M.C. (1997). The role of the laboratory in science learning. Elementary School Journal , 97 , 401-417.

Linn, M.C. (2004). High school science laboratories: How can technology contribute? Presentation to the Committee on High School Science Laboratories: Role and Vision. June. Available at: http://www7.nationalacademies.org/bose/June_3-4_2004_High_School_Labs_Meeting_Agenda.html [accessed April 2005].

Linn, M.C., Bell, P., and Hsi, S. (1998). Using the Internet to enhance student understanding of science: The knowledge integration environment. Interactive Learning Environments , 6 (1-2), 4-38.

Linn, M.C., Davis, E., and Bell, P. (2004a). Inquiry and technology. In M.C. Linn, E. Davis, and P. Bell, (Eds.), Internet environments for science education . Mahwah, NJ: Lawrence Erlbaum.

Linn, M.C., Davis, E., and Bell, P. (Eds.). (2004b). Internet environments for science education . Mahwah, NJ: Lawrence Erlbaum.

Linn, M.C., and Hsi, S. (2000). Computers, teachers, peers . Mahwah, NJ: Lawrence Erlbaum.

Linn, M.C., and Songer, B. (1991). Teaching thermodynamics to middle school children: What are appropriate cognitive demands? Journal of Research in Science Teaching , 28 (10), 885-918.

Lunetta, V.N. (1998). The school science laboratory. In B.J. Fraser and K.G. Tobin (Eds.), International handbook of science education (pp. 249-262). London, England: Kluwer Academic.

Lynch, S. (2004). What are the effects of highly rated, lab-based curriculum materials on diverse learners? Presentation to the Committee on High School Science Laboratories: Role and Vision. July 12. Available at: http://www7.nationalacademies.org/bose/July_12-13_2004_High_School_Labs_Meeting_Agenda.html [accessed Oct. 2004].

Lynch, S., Kuipers, J., Pyke, C., and Szesze, M. (In press). Examining the effects of a highly rated science curriculum unit on diverse populations: Results from a planning grant. Journal of Research in Science Teaching .

Lynch, S., and O’Donnell, C. (2005). The evolving definition, measurement, and conceptualization of fidelity of implementation in scale-up of highly rated science curriculum units in diverse middle schools . Paper presented at the annual meeting of the American Educational Research Association, April 7, Montreal, Canada.

McRobbie, C.J., and Fraser, B.J. (1993). Associations between student outcomes and psychosocial science environment. Journal of Educational Research , 87 , 78-85.

Meichtry, Y.J. (1993). The impact of science curricula on student views about the nature of science. Journal of Research in Science Teaching , 30 (5), 429-443.

Metz, K.E. (2004). Children’s understanding of scientific inquiry: Their conceptualization of uncertainty in investigations of their own design. Cognition and Instruction , 22 (2), 219-290.

Meyer, K., and Woodruff, E. (1997). Consensually driven explanation in science teaching. Science Education , 80 , 173-192.

Millar, R. (1998). Rhetoric and reality: What practical work in science education is really for. In J. Wellington (Ed.), Practical work in school science: Which way now? (pp. 16-31). London, England: Routledge.

Millar, R. (2004). The role of practical work in the teaching and learning of science . Paper prepared for the Committee on High School Science Laboratories: Role and Vision. Available at: http://www7.nationalacademies.org/bose/June3-4_2004_High_School_Labs_Meeting_Agenda.html [accessed April 2005].

National Research Council. (1999). How people learn: Brain, mind, experience, and school . Committee on Developments in the Science of Learning, J.D. Bransford, A.L. Brown, and R.R. Cocking (Eds.). Washington, DC: National Academy Press.

National Research Council. (2001). Eager to learn: Educating our preschoolers . Committee on Early Childhood Pedagogy. B.T. Bowman, M.S. Donovan, and M.S. Burns (Eds.). Commission on Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

National Research Council. (2005). Systems for state science assessment . Committee on Test Design for K-12 Science Achievement, M.R. Wilson and M.W. Bertenthal (Eds.). Board on Testing and Assessment, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

Osborne, R., and Freyberg, P. (1985). Learning in science: The implications of children’s science . London, England: Heinemann.

Partnership for 21st Century Skills. (2003). Learning for the 21st century . Washington, DC: Author. Available at: http://www.21stcenturyskills.org/reports/learning.asp [accessed April 2005].

Pea, R., Mills, M., and Takeuchi, L. (Eds.). (2004). Making SENS: Science education networks of sensors . Report from an OMRON-sponsored workshop of the Media-X Program at Stanford University, October 3. Stanford, CA: Stanford Center for Innovations in Learning. Available at: http://www.makingsens.stanford.edu/index.html [accessed May 2005].

Raghubir, K.P. (1979). The laboratory investigative approach to science instruction. Journal of Research in Science Teaching , 16 , 13-18.

Reif, F., and St. John, M. (1979) Teaching physicists thinking skills in the laboratory. American Journal of Physics , 47 (11), 950-957.

Reiner, M., Pea, R.D., and Shulman, D.J. (1995). Impact of simulator-based instruction on diagramming in geometrical optics by introductory physics students. Journal of Science Education and Technology , 4 (3), 199-225.

Reiser, B.J., Tabak, I., Sandoval, W.A., Smith, B.K., Steinmuller, F., and Leone, A.J. (2001). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In S.M. Carver and D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263-305). Mahwah, NJ: Lawrence Erlbaum.

Renner, J.W., Abraham, M.R., and Birnie, H.H. (1985). Secondary school students’ beliefs about the physics laboratory. Science Education , 69 , 649-663.

Roschelle, J. (1992). Learning by collaborating: Convergent conceptual change. Journal of the Learning Sciences , 2 (3), 235-276.

Roschelle, J., Kaput, J., and Stroup, W. (2000). SimCalc: Accelerating students’ engagement with the mathematics of change. In M.J. Jacobsen and R.B. Kozma (Eds.), Learning the sciences of the 21st century: Research, design, and implementing advanced technology learning environments (pp. 47-75). Hillsdale, NJ: Lawrence Erlbaum.

Rosebery, A.S., Warren, B., and Conant, F.R. (1992). Appropriating scientific discourse: Findings from language minority classrooms. Journal of the Learning Sciences , 2 (1), 61-94.

Salomon, G. (1996). Studying novel learning environments as patterns of change. In S. Vosniadou, E. De Corte, R. Glaser, and H. Mandl (Eds.), International perspectives on the design of technology-supported learning environments (pp. 363-377). Mahwah, NJ: Lawrence Erlbaum.

Sandoval, W.A. (2003). Conceptual and epistemic aspects of students’ scientific explanations. Journal of the Learning Sciences , 12 (1), 5-51.

Sandoval, W.A., and Millwood, K.A. (2005). The quality of students’ use of evidence in written scientific explanations. Cognition and Instruction , 23 (1), 23-55.

Sandoval, W.A., and Morrison, K. (2003). High school students’ ideas about theories and theory change after a biological inquiry unit. Journal of Research in Science Teaching , 40 (4), 369-392.

Sandoval, W.A., and Reiser, B.J. (2004). Explanation-driven inquiry: Integrating conceptual and epistemic supports for science inquiry. Science Education , 88 , 345-372.

Schauble, L., Glaser, R., Duschl, R.A., Schulze, S., and John, J. (1995). Students’ understanding of the objectives and procedures of experimentation in the science classroom. Journal of the Learning Sciences , 4 (2), 131-166.

Schauble, L., Klopfer, L.E., and Raghavan, K. (1991). Students’ transition from an engineering model to a science model of experimentation. Journal of Research in Science Teaching , 28 (9), 859-882.

Shaffer, P.S., and McDermott, L.C. (1992). Research as a guide for curriculum development: An example from introductory electricity. Part II: Design of instructional strategies. American Journal of Physics , 60 (11), 1003-1013.

Shepardson, D.P., and Pizzini, E.L. (1993). A comparison of student perceptions of science activities within three instructional approaches. School Science and Mathematics , 93 , 127-131.

Shulman, L.S., and Tamir, P. (1973). Research on teaching in the natural sciences. In R.M.W. Travers (Ed.), Second handbook of research on teaching . Chicago: Rand-McNally.

Singer, R.N. (1977). To err or not to err: A question for the instruction of psychomotor skills. Review of Educational Research , 47 , 479-489.

Smith, C.L., Maclin, D., Grosslight, L., and Davis, H. (1997). Teaching for understanding: A study of students’ pre-instruction theories of matter and a comparison of the effectiveness of two approaches to teaching about matter and density. Cognition and Instruction , 15 , 317-394.

Smith, C.L., Maclin, D., Houghton, C., and Hennessey, M. (2000). Sixth-grade students’ epistemologies of science: The impact of school science experiences on epistemological development. Cognition and Instruction , 18 , 349-422.

Snir, J., Smith, C.L., and Raz, G. (2003). Linking phenomena with competing underlying models: A software tool for introducing students to the particulate model of matter. Science Education , 87 (6), 794-830.

Songer, N.B., and Linn, M.C. (1991). How do students’ views of science influence knowledge integration? Journal of Research in Science Teaching , 28 (9), 761-784.

Tabak, I. (2004). Synergy: a complement to emerging patterns of distributed scaffolding. Journal of the Learning Sciences , 13 (3), 305-335.

Tasker, R. (1981). Children’s views and classroom experiences. Australian Science Teachers’ Journal , 27 , 33-37.

Tiberghien, A., Veillard, L., Le Marechal, J.-F., Buty, C., and Millar, R. (2000). An analysis of labwork tasks used in science teaching at upper secondary school and university levels in several European countries. Science Education , 85 , 483-508.

Tobin, K. (1987). Forces which shape the implemented curriculum in high school science and mathematics. Teaching and Teacher Education , 3 (4), 287-298.

VandenBerg, E., Katu, N., and Lunetta, V.N. (1994). The role of “experiments” in conceptual change . Paper presented at the annual meeting of the National Association for Research in Science Teaching, Anaheim, CA.

Webb, N.M., Nemer, K.M., Chizhik, A.W., and Sugrue, B. (1998). Equity issues in collaborative group assessment: Group composition and performance. American Educational Research Journal , 35 (4), 607-652.

Webb, N.M., and Palincsar, A.S. (1996). Group processes in the classroom. In D.C. Berliner and R.C. Calfee (Eds.), Handbook of educational psychology (pp. 841-873). New York: Macmillan.

Wells, M., Hestenes, D., and Swackhamer, G. (1995). A modeling method for high school physics instruction. American Journal of Physics , 63 (7), 606-619.

Wheatley, J.H. (1975).Evaluating cognitive learning in the college science laboratory. Journal of Research in Science Teaching , 12 , 101-109.

White, B.Y. (1993). ThinkerTools: Causal models, conceptual change, and science education. Cognition and Instruction , 10 (1), 1-100.

White, B.Y., and Frederiksen, J.R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction , 16 (1), 3-118.

White, R.T. (1996). The link between the laboratory and learning. International Journal of Science Education , 18 , 761-774.

White, R.T., and Gunstone, R.F. (1992). Probing understanding . London, England: Falmer.

Wilkenson, J.W., and Ward, M. (1997). The purpose and perceived effectiveness of laboratory work in secondary schools. Australian Science Teachers’ Journal , 43-55.

Wong, A.F.L., and Fraser, B.J. (1995). Cross-validation in Singapore of the science laboratory environment inventory. Psychological Reports , 76 , 907-911.

Woolnough, B.E. (1983). Exercises, investigations and experiences. Physics Education , 18 , 60-63.

Yager, R.E., Engen, J.B., and Snider, C.F. (1969). Effects of the laboratory and demonstration method upon the outcomes of instruction in secondary biology. Journal of Research in Science Teaching , 5 , 76-86.

Zimmerman, C. (2000). The development of scientific reasoning skills. Developmental Review , 20 , 99-149.

Laboratory experiences as a part of most U.S. high school science curricula have been taken for granted for decades, but they have rarely been carefully examined. What do they contribute to science learning? What can they contribute to science learning? What is the current status of labs in our nation's high schools as a context for learning science? This book looks at a range of questions about how laboratory experiences fit into U.S. high schools:

  • What is effective laboratory teaching?
  • What does research tell us about learning in high school science labs?
  • How should student learning in laboratory experiences be assessed?
  • Do all students have access to laboratory experiences?
  • What changes need to be made to improve laboratory experiences for high school students?
  • How can school organization contribute to effective laboratory teaching?

With increased attention to the U.S. education system and student outcomes, no part of the high school curriculum should escape scrutiny. This timely book investigates factors that influence a high school laboratory experience, looking closely at what currently takes place and what the goals of those experiences are and should be. Science educators, school administrators, policy makers, and parents will all benefit from a better understanding of the need for laboratory experiences to be an integral part of the science curriculum—and how that can be accomplished.



Sweetland Center for Writing

How Do I Present Findings From My Experiment in a Report?


Many believe that a scientist’s most difficult job is not conducting an experiment but presenting the results in an effective and coherent way. Even when your methods and technique are sound and your notes are comprehensive, writing a report can be a challenge because organizing and communicating scientific findings requires patience and a thorough grasp of certain conventions. Having a clear understanding of the typical goals and strategies for writing an effective lab report can make the process much less troubling.

General Considerations

It is useful to note that effective scientific writing serves the same purpose that your lab report should. Good scientific writing explains:

  • The goal(s) of your experiment
  • How you performed the experiment
  • The results you obtained
  • Why these results are important

While it’s unlikely that you’re going to win the Nobel Prize for your work in an undergraduate laboratory course, tailoring your writing strategies in imitation of professional journals is easier than you might think, since they all follow a consistent pattern. However, your instructor has the final say in determining how your report should be structured and what should appear in each section. Please use the following explanations only to supplement your given writing criteria, rather than thinking of them as an indication of how all lab reports must be written.

In Practice

The Structure of a Report

The traditional experimental report is structured using the acronym "IMRAD," which stands for Introduction, Methods, Results, and Discussion. The "A" is sometimes used to stand for Abstract. For help writing abstracts, please see Sweetland's resource entitled "What is an abstract, and how do I write one?"

Introduction: “What am I doing here?”

The introduction should accomplish what any good introduction does: draw the reader into the paper. To simplify things, follow the “inverted pyramid” structure, which involves narrowing information from the most broad (providing context for your experiment’s place in science) to the most specific (what exactly your experiment is about). Consider the example below.

Most broad: “Caffeine is a mild stimulant that is found in many common beverages, including coffee.”

Less broad: “Common reactions to caffeine use include increased heart rate and increased respiratory rate.”

Slightly more specific (moving closer to your experiment): "Previous research has shown that people who consume multiple caffeinated beverages per day are also more likely to be irritable."

Most specific (your experiment): "This study examines the emotional states of college students (ages 18-22) after they have consumed three cups of coffee each day."

See how that worked? Each idea became slightly more focused, ending with a brief description of your particular experiment. Here are a couple more tips to keep in mind when writing an introduction:

  • Include an overview of the topic in question, including relevant literature. A good example: "In 1991, Rogers and Hammerstein concluded that drinking coffee improves alertness and mental focus (citation 1991)."
  • Explain what your experiment might contribute to past findings. A good example: "Despite these established benefits, coffee may negatively impact mood and behavior. This study aims to investigate the emotions of college coffee drinkers during finals week."
  • Keep the introduction brief. There's no real advantage to writing a long introduction. Most people reading your paper already know what coffee is, and where it comes from, so what's the point of giving them a detailed history of the coffee bean? A good example: "Caffeine is a psychoactive stimulant, much like nicotine." (Appropriate information, because it gives context to caffeine—the molecule of study.) A bad example: "Some of the more popular coffee drinks in America include cappuccinos, lattés, and espresso." (Inappropriate for your introduction. This information is useless for your audience, because not only is it already familiar, but it doesn't mention anything about caffeine or its effects, which is the reason that you're doing the experiment.)
  • Avoid giving away the detailed technique and data you gathered in your experiment. A good example: "A sample of coffee-drinking college students was observed during end-of-semester exams." (Appropriate for an introduction.) A bad example: "25 college students were studied, and each given 10 oz of premium dark roast coffee (containing 175 mg caffeine/serving, except for Folgers, which has significantly lower caffeine content) three times a day through a plastic straw, with intervals of two hours, for three weeks." (Too detailed for an intro. More in-depth information should appear in your "Methods" or "Results" sections.)

Methods: “Where am I going to get all that coffee…?”

A “methods” section should include all the information necessary for someone else to recreate your experiment. Your experimental notes will be very useful for this section of the report. More or less, this section will resemble a recipe for your experiment. Don’t concern yourself with writing clever, engaging prose. Just say what you did, as clearly as possible. Address the types of questions listed below:

  • Where did you perform the experiment? (This one is especially important in field research— work done outside the laboratory.)
  • What materials and subjects did you use?
  • How much did you use? (Be precise.)
  • Did you change anything about them? (e.g., each 5 oz of coffee was diluted with 2 oz distilled water.)
  • Did you use any special method for recording data? (e.g., after drinking coffee, students' happiness was measured using the Walter Gumdrop Rating System, on a scale of 1-10.)
  • Did you use any techniques/methods that are significant for the research? (e.g., maybe you did a double-blind experiment with X and Y as controls. Was your control a placebo? Be specific.)
  • Any unusual/unique methods for collecting data? If so, why did you use them?

After you have determined the basic content for your “methods” section, consider these other tips:

  • Decide between using active or passive voice

There has been much debate over the use of passive voice in scientific writing. “Passive voice” is when the subject of a sentence is the recipient of the action.

  • For example: Coffee was given to the students.

“Active voice” is when the subject of a sentence performs the action.

  • For example: I gave coffee to the students.

The merits of using passive voice are obvious in some cases. For instance, scientific reports are about what is being studied, and not about YOU. Using too many personal pronouns can make your writing sound more like a narrative and less like a report. For that reason, many people recommend using passive voice to create a more objective, professional tone, emphasizing what was done TO your subject. However, active voice is becoming increasingly common in scientific writing, especially in social sciences, so the ultimate decision of passive vs. active voice is up to you (and whoever is grading your report).

  • Units are important. When using numbers, it is important to always list units and keep them consistent throughout the section. There is a big difference between giving someone 150 milligrams of caffeine and 150 grams of caffeine—the first will keep you awake for a while, and the latter will put you to sleep indefinitely. So make sure you're consistent in this regard.
  • Don't needlessly explain common techniques. If you're working in a chemistry lab, for example, and you want to take the melting point of caffeine, there's no point saying "I used the 'Melting point-ometer 3000' to take a melting point of caffeine. First I plugged it in…then I turned it on…" Your reader can extrapolate these techniques for him or herself, so a simple "Melting point was recorded" will work just fine.
  • If it isn't important to your results, don't include it. No one cares if you bought the coffee for your experiment on "3 dollar latte day." The price of the coffee won't affect the outcome of your experiment, so don't bore your reader with it. Simply record all the things that WILL affect your results (e.g., masses, volumes, numbers of trials).
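To see why unit consistency matters in practice, here is a minimal Python sketch that normalizes caffeine doses recorded in mixed units before they are reported. The drink names and values are hypothetical, purely for illustration:

```python
# Hypothetical sketch: keeping units consistent in a "Methods" section.
# Doses recorded in mixed units are normalized to milligrams before reporting.
doses = [("espresso", 75, "mg"), ("dark roast", 0.175, "g")]

def to_mg(value, unit):
    """Convert a caffeine dose to milligrams."""
    factors = {"mg": 1, "g": 1000}
    return value * factors[unit]

normalized = {name: to_mg(value, unit) for name, value, unit in doses}
print(normalized)  # → {'espresso': 75, 'dark roast': 175.0}
```

Converting everything to one unit up front means a reader never has to wonder whether "175" is a mild dose or a lethal one.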

Results: The only thing worth reading?

The “results” section is the place to tell your reader what you observed. However, don’t do anything more than “tell.” Things like explaining and analyzing belong in your discussion section. If you find yourself using words like “because” or “which suggests” in your results section, then STOP! You’re giving too much analysis.

A good example: “In this study, 50% of subjects exhibited symptoms of increased anger and annoyance in response to hearing Celine Dion music.” ( Appropriate for a “results” section—it doesn’t get caught up in explaining WHY they were annoyed. )

In your “results” section, you should:

  • Display facts and figures in tables and graphs whenever possible. Avoid listing results like “In trial one, there were 5 students out of 10 who showed irritable behavior in response to caffeine. In trial two…” Instead, make a graph or table. Just be sure to label it so you can refer to it in your writing (i.e. “As Table 1 shows, the number of swear words spoken by students increased in proportion to the amount of coffee consumed.”) Likewise, be sure to label every axis/heading on a chart or graph (a good visual representation can be understood on its own without any textual explanation). The following example clearly shows what happened during each trial of an experiment, making the trends visually apparent, and thus saving the experimenter from having to explain each trial with words.
    Amount of coffee consumed (mg)    Response to being poked with a pencil (number of expletives uttered)
    50                                0
    75                                1
    100                               3
    125                               4
    150                               7½
  • Identify only the most significant trends. Don’t try to include every single bit of data in this section, because much of it won’t be relevant to your hypothesis. Just pick out the biggest trends, or what is most significant to your goals.
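When a trend like the one in the table above needs a number to back it, a correlation coefficient is one option. Below is a minimal Python sketch; the data are the hypothetical values from the table, not results from any real study:

```python
# Hypothetical data from the table above: caffeine dose (mg) vs. expletives uttered.
coffee_mg = [50, 75, 100, 125, 150]
expletives = [0, 1, 3, 4, 7.5]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson_r(coffee_mg, expletives)
print(f"r = {r:.3f}")  # r = 0.973
```

A value near 1 supports the claim that expletives increased in proportion to coffee consumed, which is exactly the kind of "most significant trend" worth reporting.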

Discussion: “What does it all mean?”

The “discussion” section is intended to explain to your reader what your data can be interpreted to mean. As with all science, the goal for your report is simply to provide evidence that something might be true or untrue—not to prove it unequivocally. The following questions should be addressed in your “discussion” section:

  • Is your hypothesis supported? If you didn’t have a specific hypothesis, then were the results consistent with what previous studies have suggested? A good example: “Consistent with caffeine’s observed effects on heart rate, students’ tendency to react strongly to the popping of a balloon strongly suggests that caffeine’s ability to heighten alertness may also increase nervousness.”
  • Was there any data that surprised you? Outliers are seldom significant, and mentioning them is largely useless. However, if you see another cluster of points on a graph that establish their own trend, this is worth mentioning.
  • Are the results useful? If you have no significant findings, then just say that. Don’t try to make wild claims about the meanings of your work if there is no statistical/observational basis for these claims—doing so is dishonest and unhelpful to other scientists reading your work. Similarly, try to avoid using the word “proof” or “proves.” Your work is merely suggesting evidence for new ideas. Just because things worked out one way in your trials, that doesn’t mean these results will always be repeatable or true.
  • What are the implications of your work? Here are some examples of the types of questions that can begin to show how your study can be significant outside of this one particular experiment: Why should anyone care about what you’re saying? How might these findings affect coffee drinkers? Do your findings suggest that drinking coffee is more harmful than previously thought? Less harmful? How might these findings affect other fields of science? What about the effects of caffeine on people with emotional disorders? Do your findings suggest that they should or should not drink coffee?
  • Any shortcomings of your work? Were there any flaws in your experimental design? How should future studies in this field account for these complications? Does your research raise any new questions? What other areas of science should be explored as a result of your work?

Hogg, Alan. "Tutoring Scientific Writing." Sweetland Center for Writing. University of Michigan, Ann Arbor. 3/15/2011. Lecture.

Gopen, George D., and Judith A. Swan. "The Science of Scientific Writing." American Scientist 78 (1990): 550-558. Print.

"Scientific Reports." The Writing Center . University of North Carolina, n.d. Web. 5 May 2011. http://www.unc.edu/depts/wcweb/handouts/lab_report_complete.html


Experimental Method In Psychology

Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


The experimental method involves the manipulation of variables to establish cause-and-effect relationships. The key features are controlled methods and the random allocation of participants into control and experimental groups .

What is an Experiment?

An experiment is an investigation in which a hypothesis is scientifically tested. An independent variable (the cause) is manipulated in an experiment, and the dependent variable (the effect) is measured; any extraneous variables are controlled.

An advantage is that experiments should be objective. The researcher’s views and opinions should not affect a study’s results. This is good as it makes the data more valid  and less biased.

There are three types of experiments you need to know:

1. Lab Experiment

A laboratory experiment in psychology is a research method in which the experimenter manipulates one or more independent variables and measures the effects on the dependent variable under controlled conditions.

A laboratory experiment is conducted under highly controlled conditions (not necessarily a laboratory) where accurate measurements are possible.

The researcher uses a standardized procedure to determine where the experiment will take place, at what time, with which participants, and in what circumstances.

Participants are randomly allocated to each independent variable group.

Examples are Milgram’s experiment on obedience and  Loftus and Palmer’s car crash study .

  • Strength: It is easier to replicate (i.e., copy) a laboratory experiment. This is because a standardized procedure is used.
  • Strength: It allows for precise control of extraneous and independent variables. This allows a cause-and-effect relationship to be established.
  • Limitation: The artificiality of the setting may produce unnatural behavior that does not reflect real life, i.e., low ecological validity. This means it would not be possible to generalize the findings to a real-life setting.
  • Limitation: Demand characteristics or experimenter effects may bias the results and become confounding variables .

2. Field Experiment

A field experiment is a research method in psychology that takes place in a natural, real-world setting. It is similar to a laboratory experiment in that the experimenter manipulates one or more independent variables and measures the effects on the dependent variable.

However, in a field experiment, the participants are unaware they are being studied, and the experimenter has less control over the extraneous variables .

Field experiments are often used to study social phenomena, such as altruism, obedience, and persuasion. They are also used to test the effectiveness of interventions in real-world settings, such as educational programs and public health campaigns.

An example is Hofling’s hospital study on obedience .

  • Strength: Behavior in a field experiment is more likely to reflect real life because of its natural setting, i.e., higher ecological validity than a lab experiment.
  • Strength: Demand characteristics are less likely to affect the results, as participants may not know they are being studied. This occurs when the study is covert.
  • Limitation: There is less control over extraneous variables that might bias the results. This makes it difficult for another researcher to replicate the study in exactly the same way.

3. Natural Experiment

A natural experiment in psychology is a research method in which the experimenter observes the effects of a naturally occurring event or situation on the dependent variable without manipulating any variables.

Natural experiments are conducted in the everyday (i.e., real-life) environment of the participants, but here the experimenter has no control over the independent variable as it occurs naturally in real life.

Natural experiments are often used to study psychological phenomena that would be difficult or unethical to study in a laboratory setting, such as the effects of natural disasters, policy changes, or social movements.

For example, Hodges and Tizard’s attachment research (1989) compared the long-term development of children who have been adopted, fostered, or returned to their mothers with a control group of children who had spent all their lives in their biological families.

Here is a fictional example of a natural experiment in psychology:

Researchers might compare academic achievement rates among students born before and after a major policy change that increased funding for education.

In this case, the independent variable is the timing of the policy change, and the dependent variable is academic achievement. The researchers would not be able to manipulate the independent variable, but they could observe its effects on the dependent variable.

  • Strength: Behavior in a natural experiment is more likely to reflect real life because of its natural setting, i.e., very high ecological validity.
  • Strength: Demand characteristics are less likely to affect the results, as participants may not know they are being studied.
  • Strength: It can be used in situations in which it would be ethically unacceptable to manipulate the independent variable, e.g., researching stress .
  • Limitation: They may be more expensive and time-consuming than lab experiments.
  • Limitation: There is no control over extraneous variables that might bias the results. This makes it difficult for another researcher to replicate the study in exactly the same way.

Key Terminology

Ecological validity

The degree to which an investigation represents real-life experiences.

Experimenter effects

These are the ways that the experimenter can accidentally influence the participant through their appearance or behavior.

Demand characteristics

Clues in an experiment that lead the participants to think they know what the researcher is looking for (e.g., the experimenter’s body language).

Independent variable (IV)

The variable the experimenter manipulates (i.e., changes). It is assumed to have a direct effect on the dependent variable.

Dependent variable (DV)

The variable the experimenter measures. This is the outcome (i.e., the result) of a study.

Extraneous variables (EV)

All variables which are not independent variables but could affect the results (DV) of the experiment. EVs should be controlled where possible.

Confounding variables

Variable(s) that have affected the results (DV), apart from the IV. A confounding variable could be an extraneous variable that has not been controlled.

Random Allocation

Randomly allocating participants to independent variable conditions means that all participants should have an equal chance of participating in each condition.

The principle of random allocation is to avoid bias in how the experiment is carried out and limit the effects of participant variables.
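As an illustration, random allocation can be as simple as shuffling the participant list and splitting it in half. The sketch below is hypothetical: the participant IDs and the fixed seed are invented for the example.

```python
import random

# Illustrative sketch: randomly allocating participants to two conditions.
participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

random.seed(42)               # fixed seed so the allocation can be reproduced
random.shuffle(participants)  # every participant gets an equal chance of each group

midpoint = len(participants) // 2
experimental_group = participants[:midpoint]
control_group = participants[midpoint:]

print(experimental_group, control_group)
```

Because the shuffle, not the researcher, decides who ends up in which condition, participant variables are spread across groups by chance rather than by bias.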

Order effects

Changes in participants’ performance due to their repeating the same or similar test more than once. Examples of order effects include:

(i) practice effect: an improvement in performance on a task due to repetition, for example, because of familiarity with the task;

(ii) fatigue effect: a decrease in performance of a task due to repetition, for example, because of boredom or tiredness.


Optional Lab Activities

Lab Objectives

At the conclusion of the lab, the student should be able to:

  • define the following terms: metabolism, reactant, product, substrate, enzyme, denature
  • describe what the active site of an enzyme is (be sure to include information regarding the relationship of the active site to the substrate)
  • describe the specific action of the enzyme catalase, include the substrate and products of the reaction
  • list what organelle catalase can be found in every plant or animal cell
  • list the factors that can affect the rate of a chemical reaction and enzyme activity
  • explain why enzymes have an optimal pH and temperature to ensure greatest activity (greatest functioning) of the enzyme (be sure to consider how virtually all enzymes are proteins and the impact that temperature and pH may have on protein function)
  • explain why the same type of chemical reaction performed at different temperatures revealed different results/enzyme activity
  • explain why warm temperatures (but not boiling) typically promote enzyme activity but cold temperatures typically decrease enzyme activity
  • explain why increasing enzyme concentration promotes enzyme activity
  • explain why the optimal pH of a particular enzyme promotes its activity
  • if given the optimal conditions for a particular enzyme, indicate which experimental conditions using that particular enzyme would show the greatest and least enzyme activity

Introduction

Hydrogen peroxide is a toxic product of many chemical reactions that occur in living things. Although it is produced in small amounts, living things must detoxify it by breaking hydrogen peroxide down into water and oxygen, two harmless molecules. The organelle responsible for destroying hydrogen peroxide is the peroxisome, using the enzyme catalase. Both plants and animals have peroxisomes with catalase. The catalase sample for today’s lab will be from a potato.

Enzymes speed the rate of chemical reactions. A catalyst is a chemical involved in, but not consumed in, a chemical reaction. Enzymes are proteins that catalyze biochemical reactions by lowering the activation energy necessary to break the chemical bonds in reactants and form new chemical bonds in the products. Catalysts bring reactants closer together in the appropriate orientation and weaken bonds, increasing the reaction rate. Without enzymes, chemical reactions would occur too slowly to sustain life.

The functionality of an enzyme is determined by the shape of the enzyme. The area in which bonds of the reactant(s) are broken is known as the active site. The reactants of enzyme catalyzed reactions are called substrates. The active site of an enzyme recognizes, confines, and orients the substrate in a particular direction.

Enzymes are substrate specific, meaning that they catalyze only specific reactions. For example, proteases (enzymes that break peptide bonds in proteins) will not work on starch (which is broken down by the enzyme amylase). Notice that both of these enzymes end in the suffix -ase. This suffix indicates that a molecule is an enzyme.

Environmental factors may affect the ability of enzymes to function. You will design a set of experiments to examine the effects of temperature, pH, and substrate concentration on the ability of enzymes to catalyze chemical reactions. In particular, you will be examining the effects of these environmental factors on the ability of catalase to convert H₂O₂ into H₂O and O₂.
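For reference, the decomposition that catalase accelerates can be written as a balanced equation (standard chemistry, not taken from the lab handout):

```latex
2\,\mathrm{H_2O_2} \;\xrightarrow{\text{catalase}}\; 2\,\mathrm{H_2O} + \mathrm{O_2}
```

Two molecules of hydrogen peroxide yield two molecules of water and one molecule of oxygen gas; that oxygen accounts for the bubbling you will observe when catalase meets hydrogen peroxide.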

The Scientific Method

As scientists, biologists apply the scientific method. Science is not simply a list of facts, but is an approach to understanding the world around us. It is the use of the scientific method that differentiates science from other fields of study that attempt to improve our understanding of the world.

The scientific method is a systematic approach to problem solving. Although some argue that there is not one single scientific method but rather a variety of methods, each of these approaches, whether explicit or not, tends to incorporate a few fundamental steps: observing, questioning, hypothesizing, predicting, testing, and interpreting results of the test. Sometimes the distinction between these steps is not always clear. This is particularly the case with hypotheses and predictions. But for our purposes, we will differentiate each of these steps in our applications of the scientific method.

You are already familiar with the steps of the scientific method from previous lab experiences. You will need to use your scientific method knowledge in today’s lab in creating hypotheses for each experiment, devising a protocol to test your hypothesis, and analyzing the results. Within the experimentation process it will be important to identify the independent variable, the dependent variable, and standardized variables for each experiment.

Part 1: Observe the Effects of Catalase

  • Obtain two test tubes and label one as A and one as B.
  • Use your ruler to measure and mark on each test tube 1 cm from the bottom.
  • Fill each test tube with catalase (from the potato) to the 1 cm mark.
  • Add 10 drops of hydrogen peroxide to the tube marked A.
  • Add 10 drops of distilled water to the tube marked B.
  • Bubbling height, tube A:
  • Bubbling height, tube B:
  • What happened when H 2 O 2 was added to the potato in test tube A?
  • What caused this to happen?
  • What happened in test tube B?
  • What was the purpose of the water in tube B?
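One way to organize the observations from Part 1 is a simple data record; the sketch below (Python) uses hypothetical bubbling heights, since your own readings will differ:

```python
# Record the bubbling height (cm above the 1 cm catalase mark) for each tube.
# Tube A received hydrogen peroxide (the substrate); tube B received water (the control).
# The values below are hypothetical example readings, not real data.
observations = {
    "A (catalase + H2O2)": 2.5,
    "B (catalase + water)": 0.0,
}

# The difference between treatment and control estimates the bubbling
# attributable to catalase breaking H2O2 down into water and oxygen.
effect = observations["A (catalase + H2O2)"] - observations["B (catalase + water)"]
print(f"Bubbling attributable to catalase activity: {effect} cm")
```

Comparing the treatment tube against the water control is what lets you attribute the bubbling to the enzyme-substrate reaction rather than to the potato extract alone.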

Part 2: Effects of pH, Temperature, and Substrate Concentration

Observations.

From the introduction and your reading, you have some background knowledge on enzyme structure and function. You also just observed the effects of catalase on the reaction in which hydrogen peroxide breaks down into water and oxygen.

From the objectives of this lab, our questions are as follows:

  • How does temperature affect the ability of enzymes to catalyze chemical reactions?
  • How does pH affect the ability of enzymes to catalyze chemical reactions?
  • What is the effect of substrate concentration on the rate of enzyme catalyzed reactions?

Based on the questions above, come up with some possible hypotheses. These should be general, not specific, statements that are possible answers to your questions.

  • Temperature hypothesis
  • pH hypothesis
  • Substrate concentration hypothesis

Test Your Hypotheses

Based on your hypotheses, design a set of experiments to test them. Use your original experiment to shape your ideas. You have the following materials available:

  • Catalase (from potato)
  • Hydrogen peroxide
  • Distilled water
  • Hot plate (for boiling water)
  • Acidic pH solution
  • Basic pH solution
  • Thermometer
  • Ruler and wax pencil

Write your procedure to test each hypothesis. You should have three procedures, one for each hypothesis. Make sure your instructor checks your procedures before you continue.

  • Procedure 1: Temperature
  • Procedure 2: pH
  • Procedure 3: Concentration

Record your results—you may want to draw tables. Also record any observations you make. Interpret your results to draw conclusions.

  • Do your results match your hypothesis for each experiment?
  • Do the results reject or fail to reject your hypothesis and why?
  • What might explain your results? If your results are different from your hypothesis, why might they differ? If the results matched your predictions, hypothesize some mechanisms behind what you have observed.
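As one way to tabulate your results, the sketch below (Python) lays out a hypothetical results table for the temperature experiment; every temperature and bubbling height here is a made-up placeholder for your own measurements:

```python
# Hypothetical results for the temperature experiment: bubbling height (cm)
# of the catalase/H2O2 reaction at each tested temperature (degrees C).
temperature_results = [
    (4, 0.5),    # refrigerated
    (22, 2.0),   # room temperature
    (37, 2.8),   # warm water bath
    (100, 0.0),  # boiled (enzyme likely denatured)
]

print(f"{'Temp (C)':>8}  {'Bubbling (cm)':>13}")
for temp_c, height_cm in temperature_results:
    print(f"{temp_c:>8}  {height_cm:>13}")

# Identify the temperature with the greatest activity.
best_temp, best_height = max(temperature_results, key=lambda row: row[1])
print(f"Greatest activity at {best_temp} C ({best_height} cm)")
```

The same table layout works for the pH and substrate-concentration experiments: put the independent variable in the first column and the measured response in the second.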

Communicating Your Findings

Scientists generally communicate their research findings in written reports. Save the work that you have done above; you will use it to write a lab report later in the course.

Sections of a Lab Report

  • Title Page:  The title describes the focus of the research. The title page should also include the student’s name, the lab instructor’s name, and the lab section.
  • Introduction:  The introduction provides the reader with background information about the problem and provides the rationale for conducting the research. The introduction should incorporate and cite outside sources. You should avoid using websites and encyclopedias for this background information. The introduction should start with more broad and general statements that frame the research and become more specific, clearly stating your hypotheses near the end.
  • Methods:  The methods section describes how the study was designed to test your hypotheses. This section should provide enough detail for someone to repeat your study; it explains what you did. It should not be a bulleted list of steps and materials, nor should it read like a recipe for the reader to follow. Typically, this section is written in the first person and past tense, in paragraph form, since you conducted the experiment.
  • Results:  This section provides a written description of the data in paragraph form. Which treatment produced the greatest reaction? Which produced the least? This section should also include numbered graphs or tables with descriptive titles. The objective is to present the data, not interpret the data. Do not discuss why something occurred; just state what occurred.
  • Discussion:  In this section you interpret and critically evaluate your results. Generally, this section begins by reviewing your hypotheses and whether your data support your hypotheses. In describing conclusions that can be drawn from your research, it is important to include outside studies that help clarify your results. You should cite outside resources. What is most important about the research? What is the take-home message? The discussion section also includes ideas for further research and talks about potential sources of error. What could you improve if you conducted this experiment a second time?
  • Biology 101 Labs. Authored by : Lynette Hauser. Provided by : Tidewater Community College. Located at : http://www.tcc.edu/ . License : CC BY: Attribution
  • BIOL 160 - General Biology with Lab. Authored by : Scott Rollins. Provided by : Open Course Library. Located at : http://opencourselibrary.org/biol-160-general-biology-with-lab/ . License : CC BY: Attribution


A guideline for reporting experimental protocols in life sciences

Olga Giraldo

1 Ontology Engineering Group, Campus de Montegancedo, Boadilla del Monte, Universidad Politécnica de Madrid, Madrid, Spain

Alexander Garcia

2 Technische Universität Graz, Graz, Austria

Oscar Corcho

Associated data.

  • Dryad 2017. [7 July 2017]. Dryad homepage. http://datadryad.org/
  • Figshare 2017. [7 July 2017]. Figshare. http://figshare.com
  • Giraldo O, Garcia A, Corcho O. 2018a. Corpus of protocols [Data set]. Zenodo. [ CrossRef ]
  • Giraldo O, Garcia A, Corcho O. 2018b. Guidelines for reporting experimental protocols [Data set]. Zenodo. [ CrossRef ]
  • Giraldo O, Garcia A, Corcho O. 2018c. Survey—reporting an experimental protocol [Data set]. Zenodo. [ CrossRef ]
  • Gómez FL, Garcia A, Giraldo O. 2018. SMARTProtocols/SMARTProtocols.github.io: first release of SMARTProtocols.github.io. Zenodo. [ CrossRef ]

The following information was supplied regarding data availability:

Federico López Gómez, Alexander Garcia & Olga Giraldo. (2018, March 26). SMARTProtocols/SMARTProtocols.github.io: First release of SMARTProtocols.github.io (Version v1.0.0). Zenodo. http://doi.org/10.5281/zenodo.1207846 .

Olga Giraldo. (2018, March 22). oxgiraldo/SMART-Protocols: First release of SMART-Protocols repository (Version v1.0.0). Zenodo. http://doi.org/10.5281/zenodo.1205247 .

Olga Giraldo, Alexander Garcia, & Oscar Corcho. (2018). Survey - reporting an experimental protocol [Data set]. Zenodo. http://doi.org/10.5281/zenodo.1204916 .

Olga Giraldo, Alexander Garcia, & Oscar Corcho. (2018). Guidelines for reporting experimental protocols [Data set]. Zenodo. http://doi.org/10.5281/zenodo.1204887 .

Olga Giraldo, Alexander Garcia, & Oscar Corcho. (2018). Corpus of protocols [Data set]. Zenodo. http://doi.org/10.5281/zenodo.1204838 .

Experimental protocols are key when planning, performing and publishing research in many disciplines, especially in relation to the reporting of materials and methods. However, they vary in their content, structure and associated data elements. This article presents a guideline for describing key content for reporting experimental protocols in the domain of life sciences, together with the methodology followed in order to develop such a guideline. As part of our work, we propose a checklist that contains 17 data elements that we consider fundamental to facilitate the execution of the protocol. These data elements are formally described in the SMART Protocols ontology. By providing guidance for the key content to be reported, we aim (1) to make it easier for authors to report experimental protocols with the necessary and sufficient information that allows others to reproduce an experiment, (2) to promote consistency across laboratories by delivering an adaptable set of data elements, and (3) to make it easier for reviewers and editors to measure the quality of submitted manuscripts against established criteria. Our checklist focuses on content: what should be included. Rather than advocating a specific format for protocols in life sciences, the checklist includes a full description of the key data elements that facilitate the execution of the protocol.

Introduction

Experimental protocols are fundamental information structures that support the description of the processes by means of which results are generated in experimental research ( Giraldo et al., 2017 ; Freedman, Venugopalan & Wisman, 2017 ). Experimental protocols, often part of “Materials and Methods” in scientific publications, are central for reproducibility; they should include all the necessary information for obtaining consistent results ( Casadevall & Fang, 2010 ; Festing & Altman, 2002 ). Although protocols are an important component when reporting experimental activities, their descriptions are often incomplete and vary across publishers and laboratories. For instance, when reporting reagents and equipment, researchers sometimes include catalog numbers and experimental parameters; they may also refer to these items in a generic manner, e.g., “ Dextran sulfate, Sigma-Aldrich ” ( Karlgren et al., 2009 ). Having this information is important because reagents usually vary in terms of purity, yield, pH, hydration state, grade, and possibly additional biochemical or biophysical features. Similarly, experimental protocols often include ambiguities such as “ Store the samples at room temperature until sample digestion ” ( Brandenburg et al., 2002 ); but how many degrees Celsius? What is the estimated time for digesting the sample? Having this information available not only saves time and effort but also makes it easier for researchers to reproduce experimental results; adequate and comprehensive reporting facilitates reproducibility ( Freedman, Venugopalan & Wisman, 2017 ; Baker, 2016 ).

Several efforts focus on building data storage infrastructures, e.g., 3TU. Datacentrum ( 4TU, 2017 ), CSIRO Data Access Portal ( CSIRO, 2017 ), Dryad ( Dryad, 2017 ), figshare ( Figshare, 2017 ), Dataverse ( King, 2007 ) and Zenodo ( Zenodo, 2017 ). These data repositories make it possible to review the data and evaluate whether the analysis and conclusions drawn are accurate. However, they do little to validate the quality and accuracy of the data itself. Evaluating research implies being able to obtain similar, if not identical results. Journals and funders are now asking for datasets to be publicly available for reuse and validation. Fully meeting this goal requires datasets to be endowed with auxiliary data providing contextual information e.g., methods used to derive such data ( Assante et al., 2016 ; Simmhan, Plale & Gannon, 2005 ). If data must be public and available, shouldn’t methods be equally public and available?

Illustrating the problem of adequate reporting, Moher et al. (2015) have pointed out that fewer than 20% of highly-cited publications have adequate descriptions of study design and analytic methods. In a similar vein, Vasilevsky et al. (2013) showed that 54% of biomedical research resources such as model organisms, antibodies, knockdown reagents (morpholinos or RNAi), constructs, and cell lines are not uniquely identifiable in the biomedical literature, regardless of journal Impact Factor. Accurate and comprehensive documentation for experimental activities is critical for patenting, as well as in cases of scientific misconduct. Having data available is important; knowing how the data were produced is just as important. Part of the problem lies in the heterogeneity of reporting structures; these may vary across laboratories in the same domain. Despite this variability, we want to know which data elements are common and uncommon across protocols; we use these elements as the basis for suggesting our guideline for reporting protocols. We have analyzed over 500 published and non-published experimental protocols, as well as guidelines for authors from journals publishing protocols. From this analysis we have derived a practical adaptable checklist for reporting experimental protocols.

Efforts such as the Structured, Transparent, Accessible Reporting (STAR) initiative ( Marcus, 2016 ; Cell Press, 2017 ) address the problem of structure and standardization when reporting methods. In a similar manner, The Minimum Information about a Cellular Assay (MIACA) ( MIACA, 2017 ), The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) ( Lee et al., 2008 ) and many other “minimal information” efforts deliver minimal data elements describing specific types of experiments. Soldatova et al. (2008) and Soldatova et al. (2014) propose the EXACT ontology for representing experimental actions in experimental protocols; similarly, Giraldo et al. (2017) propose the SeMAntic RepresenTation of Protocols ontology (henceforth SMART Protocols Ontology), an ontology for reporting experimental protocols and the corresponding workflows. These approaches are not minimal; they aim to be comprehensive in the description of the workflow, parameters, sample, instruments, reagents, hints, troubleshooting, and all the data elements that help to reproduce an experiment and describe experimental actions.

There are also complementary efforts addressing the problem of identifiers for reagents and equipment; for instance, the Resource Identification Initiative (RII) ( Force11, 2017 ) aims to help researchers sufficiently cite the key resources used to produce the scientific findings. In a similar vein, the Global Unique Device Identification Database (GUDID) ( NIH, 2018 ) has key device identification information for medical devices that have Unique Device Identifiers (UDI); the Antibody Registry ( Antibody Registry, 2018 ) gives researchers a way to universally identify antibodies used in their research, and the Addgene web application ( Addgene, 2018 ) makes it easy for researchers to identify plasmids. Having identifiers makes it possible for researchers to be more accurate in their reporting by unequivocally pointing to the resource used or produced. The Resource Identification Portal ( RIP, 2018 ) makes it easier to navigate through available identifiers; researchers can search across all the sources from a single location.

In this paper, we present a guideline for reporting experimental protocols; we complement our guideline with a machine-processable checklist that helps researchers, reviewers and editors to measure the completeness of a protocol. Each data element in our guideline is represented in the SMART Protocols Ontology. This paper is organized as follows: we start by describing the materials and methods used to derive the resulting guidelines. In the “Results” section, we present examples indicating how to report each data element; a machine readable checklist in the JavaScript Object Notation (JSON) format is also presented in this section. We then discuss our work and present the conclusions.

Materials and Methods

We have analyzed: (i) guidelines for authors from journals publishing protocols ( Giraldo, Garcia & Corcho, 2018b ), (ii) our corpus of protocols ( Giraldo, Garcia & Corcho, 2018a ), (iii) a set of reporting structures proposed by minimal information projects available in the FairSharing catalog ( McQuilton et al., 2016 ), and (iv) relevant biomedical ontologies available in BioPortal ( Whetzel et al., 2011 ) and Ontobee ( Xiang et al., 2011 ). Our analysis was carried out by a domain expert, Olga Giraldo; she is an expert in text mining and biomedical ontologies with over ten years of experience in laboratory techniques. All the documents were read, and then data elements, subject areas, materials (e.g., sample, kits, solutions, reagents, etc.), and workflow information were identified. As a result of this activity, we established a baseline terminology, common and uncommon data elements, as well as patterns in the description of the workflows (e.g., information describing the steps and the order for the execution of the workflow).

Instructions for authors from analyzed journals

Publishers usually have instructions for prospective authors; these indications tell authors what to include, the information that should be provided, and how it should be reported in the manuscript. In Table 1 we present the list of guidelines that were analyzed.

Journals whose guidelines for authors were analyzed:
  • BioTechniques (BioTech)
  • CSH protocols (CSH)
  • Current Protocols (CP)
  • Journal of Visualized Experiments (JoVE)
  • Nature Protocols (NP)
  • Springer Protocols (SP)
  • MethodsX
  • Bio-protocols (BP)
  • Journal of Biological Methods (JBM)

Corpus of protocols

Our corpus includes 530 published and unpublished protocols. Unpublished protocols (75 in total) were collected from four laboratories located at the International Center for Tropical Agriculture (CIAT) ( CIAT, 2017 ). The published protocols (455 in total) were gathered from the repository “Nature Protocol Exchange” ( NPE, 2017 ) and from 11 journals, namely: BioTechniques, Cold Spring Harbor Protocols, Current Protocols, Genetics and Molecular Research ( GMR, 2017 ), JoVE, Plant Methods ( BioMed Central, 2017 ), Plos One ( PLOS ONE, 2017 ), Springer Protocols, MethodsX, Bio-Protocol and the Journal of Biological Methods. The analyzed protocols comprise areas such as cell biology, molecular biology, immunology, and virology. The number of protocols from each journal is presented in Table 2 .

Source: Number of protocols
  • BioTechniques (BioTech): 16
  • CSH protocols (CSH): 267
  • Current Protocols (CP): 31
  • Genetics and Molecular Research (GMR): 5
  • Journal of Visualized Experiments (JoVE): 21
  • Nature Protocols Exchange (NPE): 39
  • Plant Methods (PM): 12
  • Plos One (PO): 5
  • Springer Protocols (SP): 5
  • MethodsX: 7
  • Bio-protocols (BP): 40
  • Journal of Biological Methods (JBM): 7
  • Non-published protocols from CIAT: 75

Minimum information standards and ontologies

We analyzed minimum information standards from the FairSharing catalog, e.g., MIAPPE ( MIAPPE, 2017 ), MIARE ( MIARE, 2017 ) and MIQE ( Bustin et al., 2009 ). See Table 3 for the complete list of minimum information models that we analyzed.

Standard: Description
  • Minimum Information about Plant Phenotyping Experiment (MIAPPE): A reporting guideline for plant phenotyping experiments.
  • CIMR: Plant Biology Context ( ): A standard for reporting metabolomics experiments.
  • The Gel Electrophoresis Markup Language (GelML): A standard for representing gel electrophoresis experiments performed in proteomics investigations.
  • Minimum Information about a Cellular Assay (MIACA): A standardized description of cell-based functional assay projects.
  • Minimum Information About an RNAi Experiment (MIARE): A checklist describing the information that should be reported for an RNA interference experiment.
  • The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt): This guideline describes the minimum information required to report flow cytometry (FCM) experiments.
  • Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE): This guideline describes the minimum information necessary for evaluating qPCR experiments.
  • ARRIVE (Animal Research: Reporting of Experiments) ( ): Initiative to improve the standard of reporting of research using animals.

We paid special attention to the recommendations indicating how to describe specimens, reagents, instruments, software and other entities participating in different types of experiments. Ontologies available at Bioportal and Ontobee were also considered; we focused on ontologies modeling domains, e.g., bioassays (BAO), protocols (EXACT), experiments and investigations (OBI). We also focused on those modeling specific entities, e.g., organisms (NCBI Taxon), anatomical parts (UBERON), reagents or chemical compounds (ERO, ChEBI), instruments (OBI, BAO, EFO). The list of analyzed ontologies is presented in Table 4 .

Ontology: Description
  • The Ontology for Biomedical Investigations (OBI) ( ): An ontology for the description of life-science and clinical investigations.
  • The Information Artifact Ontology (IAO) ( ): An ontology of information entities.
  • The ontology of experiments (EXPO) ( ): An ontology about scientific experiments.
  • The ontology of experimental actions (EXACT): An ontology representing experimental actions.
  • The BioAssay Ontology (BAO) ( ): An ontology describing biological assays.
  • The Experimental Factor Ontology (EFO) ( ): The ontology includes aspects of disease, anatomy, cell type, cell lines, chemical compounds and assay information.
  • eagle-i resource ontology (ERO): An ontology of research resources such as instruments, protocols, reagents, animal models and biospecimens.
  • NCBI taxonomy (NCBITaxon) ( ): An ontology representation of the NCBI organismal taxonomy.
  • Chemical Entities of Biological Interest (ChEBI) ( ): Classification of molecular entities of biological interest focusing on ‘small’ chemical compounds.
  • Uberon multi-species anatomy ontology (UBERON) ( ): A cross-species anatomy ontology covering animals and bridging multiple species-specific ontologies.
  • Cell Line Ontology (CLO) ( ; ): The ontology was developed to standardize and integrate cell line information.

Methods for developing this guideline

Developing the guideline entailed a series of activities; these were organized into the following stages: (i) analysis of guidelines for authors, (ii) analysis of protocols, (iii) analysis of Minimum Information (MI) standards and ontologies, and (iv) evaluation of the data elements from our guideline. For a detailed representation of our workflow, see Fig. 1.

[Figure 1: workflow followed to develop the guideline (file: peerj-06-4795-g001.jpg).]

Analyzing guidelines for authors

We manually reviewed instructions for authors from nine journals as presented in Table 1 . In this stage (step A in Fig. 1 ), we identified bibliographic data elements classified as “desirable information” in the analyzed guidelines. See Table 5 .

Bibliographic data elements (BioTech / NP / CP / JoVE / CSH / SP / BP / MethodsX / JBM):
  • title/name: Y / Y / Y / Y / Y / Y / Y / Y / Y
  • author name: Y / Y / Y / Y / Y / Y / Y / Y / Y
  • author identifier (e.g., orcid): N / N / N / N / N / N / N / N / N
  • protocol identifier (DOI): Y / Y / Y / Y / Y / Y / Y / Y / Y
  • protocol source (retrieved from, modified from): N / Y / N / N / N / N / N / N / N
  • updates (corrections, retractions or other revisions): N / N / N / N / N / N / N / N / N
  • references/related publications: Y / Y / Y / Y / Y / Y / Y / Y / Y
  • categories or keywords: Y / Y / Y / Y / Y / Y / Y / Y / Y

In addition, we identified the rhetorical elements. These have been categorized in the guidelines for authors as: (i) required information (R), must be submitted with the manuscript; (ii) desirable information (D), should be submitted if available; and (iii) optional (O) or extra information. See Table 6 for more details.

Rhetorical/discourse elements (BioTech / NP / CP / JoVE / CSH / SP / BP / MethodsX / JBM):
  • Description of the protocol (objective, range of applications where the protocol can be used, advantages, limitations): D / D / D / D / D / D / D / D / D
  • Description of the sample tested (name; ID; strain, line or ecotype; developmental stage; organism part; growth conditions; treatment type; size): NC / NC / D / NC / NC / NC / NC / NC / NC
  • Reagents (name, vendor, catalog number): R / D / D / D / R / D / R / NC / D
  • Equipment (name, vendor, catalog number): R / D / D / D / R / D / R / NC / D
  • Recipes for solutions (name, final concentration, volume): R / D / D / D / D / D / R / NC / D
  • Procedure description: R / R / R / D / R / R / R / R / D
  • Alternatives to performing specific steps: NC / NC / D / D / NC / D / NC / NC / NC
  • Critical steps: R / NC / D / NC / NC / NC / NC / NC / NC
  • Pause point: R / NC / NC / O / D / NC / NC / NC / NC
  • Troubleshooting: R / O / R / O / D / D / NC / NC / D
  • Caution/warnings: NC / NC / R / O / NC / D / NC / NC / D
  • Execution time: NC / O / D / NC / NC / D / NC / NC / NC
  • Storage conditions (reagents, recipes, samples): R / NC / R / D / D / D / NC / NC / NC
  • Results (figure, tables): R / NC / R / R / D / R / D / NC / D

Analyzing the protocols

In 2014, we started by manually reviewing 175 published and unpublished protocols; these were from domains such as cell biology, biotechnology, virology, biochemistry and pathology. Of this collection, 75 are unpublished protocols and thus not available in the dataset for this paper. These unpublished protocols were collected from four laboratories located at CIAT. In 2015, our corpus grew to 530; we included 355 published protocols gathered from one repository and eleven journals, as listed in Table 2 . Our corpus of published protocols is: (i) identifiable, i.e., each document has a Digital Object Identifier (DOI); and (ii) in disciplines and areas related to the expertise provided by our domain experts, e.g., virology, pathology, biochemistry, biotechnology, plant biotechnology, cell biology, molecular and developmental biology and microbiology. In this stage (step B in Fig. 1 ), we analyzed the content of the protocols; theory vs. practice was our main concern. We manually verified whether published protocols were following the guidelines; if not, what was missing, and what additional information was included? We also reviewed common data elements in unpublished protocols.

Analyzing minimum information standards and ontologies

Biomedical sciences have an extensive body of work related to minimum information standards and reporting structures, e.g., those from the FairSharing initiative. We were interested in determining whether our data elements bore any relation to these resources. Our checklist includes the data elements that are common across these resources. We manually analyzed standards such as MIQE, used to describe qPCR assays; we also looked into MIACA, which provides guidelines for reporting cellular assays; ARRIVE, which provides detailed descriptions of experiments on animal models; and MIAPPE, which addresses the description of experiments for plant phenotyping. See Table 3 for a complete list of the standards that we analyzed. Metadata, data, and reporting structures in biomedical documents are frequently related to ontological concepts. We also looked into relations between data elements and biomedical ontologies available in BioPortal and Ontobee. We focused on ontologies representing materials that are often found in protocols; for instance, organisms and anatomical parts (e.g., CLO, UBERON, NCBI Taxon), reagents or chemical compounds (e.g., ChEBI, ERO), and equipment (e.g., OBI, BAO, EFO). The complete list of the ontologies that we analyzed is presented in Table 4 .

Generating the first draft

The first draft is the main output from the initial analysis of instructions for authors, experimental protocols, MI standards and ontologies (step D in Fig. 1 ). The data elements were organized into four categories: bibliographic data elements, such as title and authors; descriptive data elements, such as purpose and application; data elements for materials, e.g., sample, reagents, equipment; and data elements for procedures, e.g., critical steps and troubleshooting. The role of the authors, provenance, and properties describing the sample (e.g., organism part, amount of the sample, etc.) were considered in this first draft. In addition, properties such as “name”, “manufacturer or vendor” and “identifier” were proposed to describe equipment, reagents and kits.

Evaluation of data elements by domain experts

This stage entailed three activities. The first activity was carried out at CIAT with the participation of 19 domain experts in areas such as virology, pathology, biochemistry, and plant biotechnology. The input to this activity was the checklist V. 0.1 (see step E in Fig. 1 ). This evaluation focused on “ What information is necessary and sufficient for reporting an experimental protocol? ”; the discussion also addressed data elements that were not initially part of guidelines for authors, e.g., consumables. The result of this activity was version 0.2 of the checklist; domain experts suggested using an online survey for further validation. This survey was designed to enrich and validate the checklist V. 0.2. We used a Google survey that was circulated over mailing lists; participants did not have to disclose their identity (see step F in Fig. 1 ). A final meeting was organized with those who had participated in the workshops, as well as in the survey (23 in total), to discuss the results of the online poll. The discussion focused on the question: should the checklist include data elements not considered by the majority of participants? Participants were presented with use cases where infrequent data elements are relevant in their working areas. It was decided to include all infrequent data elements; domain experts concluded that this guideline was a comprehensive checklist, as opposed to a minimal information set. After discussing infrequent data elements, it was also concluded that the importance of a data element should not bear a direct relation to its popularity. The analogy used was that of an editorial council: some data elements needed to be included, regardless of their popularity, as an editorial decision. The output of this activity was the checklist V. 1.0. The survey and its responses are available at ( Giraldo, Garcia & Corcho, 2018c ).
The current version includes a new bibliographic element, “license of the protocol”, as well as the property “equipment configuration”, associated with the equipment datum. The properties alternative, optional, and parallel steps were added to describe the procedure. In addition, the datum “PCR primers” was removed from the checklist; it is too specific and should therefore be the product of a community specialization rather than part of a generic guideline.

Our results are summarized in Table 7 ; it includes all the data elements resulting from the process illustrated in Fig. 1 . We have also implemented our checklist as an online tool that generates data in the JSON format and presents an indicator of completeness based on the checked data elements; the tool is available at https://smartprotocols.github.io/checklist1.0 ( Gómez, Garcia & Giraldo, 2018 ). Below, we present a complete description of the data elements in our checklist. We have organized the data elements into four categories, namely: (i) bibliographic data elements, (ii) discourse data elements, (iii) data elements for materials, and (iv) data elements for the procedure. Ours is a comprehensive checklist; the data elements must be reported whenever applicable.

Data element (with associated properties, where applicable):

  • Title of the protocol
  • Author: Name; Identifier
  • Version number
  • License of the protocol
  • Provenance of the protocol
  • Overall objective or purpose
  • Application of the protocol
  • Advantage(s) of the protocol
  • Limitation(s) of the protocol
  • Organism: Whole organism / Organism part; Sample/organism identifier; Strain, genotype or line; Amount of Bio-Source; Developmental stage; Bio-source supplier; Growth substrates; Growth environment; Growth time; Sample pre-treatment or sample preparation
  • Laboratory equipment: Name; Manufacturer or vendor (including homepage); Identifier (catalog number or model); Equipment configuration
  • Laboratory consumable: Name; Manufacturer or vendor (including homepage); Identifier (catalog number)
  • Reagent: Name; Manufacturer or vendor (including homepage); Identifier (catalog number)
  • Kit: Name; Manufacturer or vendor (including homepage); Identifier (catalog number)
  • Recipe for solution: Name; Reagent or chemical compound name; Initial concentration of a chemical compound; Final concentration of chemical compound; Storage conditions; Cautions; Hints
  • Software: Name; Version number; Homepage
  • Procedure: List of steps in numerical order; Alternative/Optional/Parallel steps; Critical steps; Pause point; Timing; Hints; Troubleshooting
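To illustrate the completeness indicator of the online tool, the following is a minimal sketch; the element names and the weighting (one point per element) are assumptions for illustration, not the actual implementation behind https://smartprotocols.github.io/checklist1.0:

```python
import json

# Hypothetical subset of the checklist data elements in Table 7; the
# real tool tracks the full set across all four categories.
CHECKLIST = [
    "title", "author_name", "author_identifier", "version_number",
    "license", "provenance", "overall_objective", "limitations",
    "procedure_steps", "troubleshooting",
]

def completeness(checked):
    """Percentage of checklist elements that have been reported."""
    reported = [e for e in CHECKLIST if e in checked]
    return round(100 * len(reported) / len(CHECKLIST), 1)

def export_report(checked):
    """Serialize the checked elements and the indicator as JSON."""
    return json.dumps(
        {"elements": sorted(checked), "completeness": completeness(checked)},
        indent=2,
    )

print(export_report({"title", "author_name", "license", "provenance", "procedure_steps"}))
```

A laboratory adapting the tool would replace `CHECKLIST` with the full set of data elements relevant to its protocols.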

Bibliographic data elements

From the guidelines for authors, the datum “author identifier” was not considered, nor was this data element found in the analyzed protocols. The “provenance” is proposed as “desirable information” in only two of the guidelines (Nature Protocols and Bio-protocols), as is “updates of the protocol” (Cold Spring Harbor Protocols and Bio-protocols). A total of 72.5% (29) of the protocols in our Bio-protocols collection and 61.5% (24) of the protocols in our Nature Protocols Exchange collection reported the provenance (Fig. 2). None of the protocols collected from Cold Spring Harbor Protocols or Bio-protocols had been updated (last checked December 2017).

[Figure 2: peerj-06-4795-g002.jpg]

NC, Not Considered in guidelines; D, Desirable information if this is available.

As a result of the workshops, domain experts stressed the importance of including these three data elements in our checklist. For instance, readers sometimes need to contact the authors to ask about specific information (quantity of the sample used, the storage conditions of a solution prepared in the lab, etc.); occasionally, the corresponding author does not respond because their email address has changed, and searching for the full name could retrieve multiple results. By using author IDs, this situation could be resolved. The experts asserted that well-documented provenance helps them to know where the protocol comes from and whether it has changed. For example, domain experts expressed their interest in knowing where a particular protocol was published for the first time, who has reused it, how many research papers have used it, how many people have modified it, etc. Similarly, domain experts also expressed the need for a version control system that could help them to know and understand how, where, and why the protocol has changed. For example, researchers are interested in tracking changes in quantities, reagents, instruments, hints, etc. For a complete description of the bibliographic data elements proposed in our checklist, see below.

Title. The title should be informative, explicit, and concise (50 words or fewer). The use of ambiguous terminology and trivial adjectives or adverbs (e.g., novel, rapid, efficient, inexpensive, or their synonyms) should be avoided. The use of numerical values, abbreviations, acronyms, and trademarked or copyrighted product names is discouraged. This definition was adapted from BioTechniques ( Giraldo, Garcia & Corcho, 2018b ). In Table 8 , we present examples illustrating how to define the title.

  • Ambiguous title: “A protocol for extraction of  from bacteria and yeast.” (Protocol available at )
  • Comprehensible title: “Extraction of nucleic acids from yeast cells and plant tissues using ethanol as medium for sample preservation and cell disruption.” (Protocol available at )

Issues in the ambiguous title:

Author name and author identifier. The full name(s) of the author(s) must be provided, together with an author ID, e.g., ORCID (ORCID, 2017) or ResearcherID (ResearcherID, 2017). The role of each author is also required; depending on the domain, there may be several roles. It is important to use a simple word that describes who did what. Publishers, laboratories, and authors should enforce the use of an “author contribution section” to identify the role of each author. We have identified two roles that are common across our corpus of documents.

  • Creator of the protocol: This is the person or team responsible for the development or adaptation of a protocol.
  • Laboratory-validation scientist: Protocols should be validated in order to certify that the processes are clearly described; it must be possible for others to follow the described processes. If applicable, statistical validation should also be addressed. The validation may be procedural (related to the process) or statistical (related to the statistics). According to the Food and Drug Administration (FDA) (FDA, 2017), validation is “establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes” (Das, 2011).

Updating the protocol. The peer-reviewed and non-peer-reviewed repositories of protocols should encourage authors to submit updated versions of their protocols; these may be corrections, retractions, or other revisions. Extensive modifications to existing protocols could be published as adapted versions and should be linked to the original protocol. We recommend promoting the use of a version control system; in this paper we suggest the version control guidelines proposed by the National Institutes of Health (NIH) (NIH, 2017).

  • Document dates: Suitable for unpublished protocols. The date indicating when the protocol was generated should be on the first page and, whenever possible, incorporated into the header or footer of each page in the document.
    – Draft document version number: Suitable for unpublished protocols. The first draft of a document will be Version 0.1. Subsequent drafts will have an increase of “0.1” in the version number, e.g., 0.2, 0.3, 0.4, ... 0.9, 0.10, 0.11.
    – Final document version number and date: Suitable for unpublished and published protocols. The author (or investigator) will deem a protocol final after all reviewers have provided final comments and these have been addressed. The first final version of a document will be Version 1.0; the date when the document becomes final should also be included. Subsequent final documents will have an increase of “1.0” in the version number (1.0, 2.0, etc.).
  • Documenting substantive changes: Suitable for unpublished and published protocols. A list of changes from the previous drafts or final documents will be kept. The list will be cumulative and identify the changes from the preceding document versions so that the evolution of the document can be seen. The list of changes and consent/assent documents should be kept with the final protocol.
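The NIH-style numbering described above can be sketched as two small helpers (a simplified illustration; the actual NIH guidance covers dates and documentation of changes as well):

```python
def next_draft(version: str) -> str:
    """Draft versions increment by 0.1: 0.1 -> 0.2 ... 0.9 -> 0.10 -> 0.11."""
    major, minor = version.split(".")
    return f"{major}.{int(minor) + 1}"

def next_final(version: str) -> str:
    """Final versions increment by 1.0: 1.0 -> 2.0 -> 3.0 ..."""
    major, _ = version.split(".")
    return f"{int(major) + 1}.0"

print(next_draft("0.9"))   # 0.10
print(next_final("1.0"))   # 2.0
```

Note that draft numbers are not decimal fractions: 0.9 is followed by 0.10, so the minor component must be treated as an integer, not rounded.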

Provenance of the protocol. The provenance indicates whether or not the protocol results from modifying a previous one; it also indicates whether the protocol comes from a repository, e.g., Nature Protocols Exchange or protocols.io (Teytelman et al., 2016), or a journal like JoVE, MethodsX, or Bio-Protocols. The former refers to adaptations of the protocol; the latter indicates where the protocol comes from. See Table 9.

  • Example: (Protocol available at )

License of the protocol. Protocols should include a license. Whether as part of a publication or just as an internal document, researchers share, adapt, and reuse protocols; the terms of the license should make clear the legal framework for these activities.

Data elements of the discourse

Here, we present the elements considered necessary to understand the suitability of a protocol: the “overall objective or purpose”, “applications”, “advantages”, and “limitations”. All of the analyzed guidelines for authors suggest including these four elements in the abstract or introduction section. In practice, however, one or more of these four elements were often not reported. For example, “limitations” was reported in only 20% of the protocols from Genetic and Molecular Research and PLOS One, and in 40% of the protocols from Springer. See Fig. 3.

[Figure 3: peerj-06-4795-g003.jpg]

Interestingly, 83% of the respondents considered “limitations” a necessary data element when reporting a protocol. In the last meeting, participants noted that “limitations” represents an opportunity to make suggestions for further improvements. Another data element discussed was “advantages”; 43% of the respondents considered it necessary to report in a protocol. In the last meeting, all participants agreed that “advantages” (where applicable) could help to compare a protocol with other alternatives commonly used to achieve the same result. For a complete description of the discourse data elements proposed in our checklist, see below.

Overall objective or Purpose. The description of the objective should make it possible for readers to decide on the suitability of the protocol for their experimental problem. See Table 10 .

Discourse data element — Example — Source:
  • Overall objective/Purpose: Reagent or columns.” (Protocol available at )
  • Application: (Protocol available at )
  • Advantage(s): (Protocol available at )
  • Limitation(s): (Protocol available at )

Application of the protocol. This information should indicate the range of techniques where the protocol could be applied. See Table 10 .

Advantage(s) of the protocol. Here, the advantages of a protocol compared to other alternatives should be discussed. See Table 10 . Where applicable, references should be made to alternative methods that are commonly used to achieve the same result.

Limitation(s) of the protocol. This datum includes a discussion of the limitations of the protocol. This should also indicate the situations in which the protocol could be unreliable or unsuccessful. See Table 10 .

Data elements for materials

From the analyzed guidelines for authors, the datum “sample description” was considered only in the Current Protocols guidelines. The “laboratory consumables or supplies” datum was not included in any of the analyzed guidelines. See Fig. 4 .

[Figure 4: peerj-06-4795-g004.jpg]

NC, Not Considered in guidelines; D, Desirable information if this is available; R, Required information.

Our Current Protocols collection includes documents about toxicology, microbiology, magnetic resonance imaging, cytometry, chemistry, cell biology, human genetics, neuroscience, immunology, pharmacology, protein, and biochemistry; for these protocols the input is a biological or biochemical sample. This collection also includes protocols in bioinformatics with data as the input. 100% of the protocols from our Current Protocols collection include information about the input of the protocol (biological/biochemical sample or data). In addition, 87% of the protocols from this collection include a list of materials or resources (reagents, equipment, consumables, software, etc.).

We also analyzed the protocols from our MethodsX collection. We found that, despite the exclusion of the sample description from the guidelines for authors, the authors included this information in their protocols. Unfortunately, these protocols do not include a complete list of materials; only 29% of the protocols reported a partial list of materials. For example, the protocol published by Vingataramin & Frost (2015) includes a list of recommended equipment but does not list any of the reagents, consumables, or other resources mentioned in the protocol instructions. See Fig. 5.

[Figure 5: peerj-06-4795-g005.jpg]

Domain experts considered that the input of the protocol (biological/biochemical sample or data) needs an accurate description; the granularity of the description varies depending on the domain. If such a description is not available, reproducibility could be affected. In addition, domain experts strongly suggested including consumables in the checklist; it was a general surprise not to find these data elements in the guidelines for authors that we analyzed. Domain experts shared with us bad experiences caused by the lack of information about the type of consumables. Some of the incidents that may arise from the lack of this information include: (i) cross contamination, when no information suggesting the use of filtered pipette tips is available; (ii) misuse of containers, when no information about the use of containers resistant to extreme temperatures and/or impacts is available; and (iii) misuse of containers, when a container made of a specific material should be used, e.g., glass vs. plastic vs. metal. This is critical information; researchers need to know whether reagents or solutions prepared in the laboratory require specific types of containers in order to avoid unnecessary reactions altering the result of the assay. Presented below is the set of data elements related to the materials or resources used in executing a protocol.

Sample. This is the role played by a biological substance; the sample is an experimental input to a protocol. The information required depends on the type of sample being described and the requirements from different communities. Here, we present the data elements for samples commonly used across the protocols and guidelines that we analyzed.

  • Strain, genotype or line: This datum is about subspecies such as ecotype, cultivar, accession, or line. In the case of crosses or breeding results, pedigree information should also be provided.
    – Whole organism: Typical examples are multicellular animals, plants, and fungi; or unicellular microorganisms such as protists, bacteria, and archaea.
    – Organism part: Typical examples of an organism part include a cell line, a tissue, an organ, bodily fluids, protoplasts, nucleic acids, proteins, etc.
    – Organism/sample identifier: This is the unique identifier assigned to an organism. The NCBI taxonomy ID, also known as “taxid”, is commonly used to identify an organism; the Taxonomy Database is a curated classification and nomenclature for all organisms in the public sequence databases. Public identification systems, e.g., the Taxonomy Database, should be used whenever possible. Identifiers may be internal; for instance, laboratories often have their own coding system for generating identifiers. When reporting internal identifiers it is important to also state the source and the nature (private or public) of the identifier, e.g., A0928873874, barcode (CIAT-DAPA internal identifier) of a specimen or sample.
  • Amount of Bio-Source: This datum is about mass (mg fresh weight or mg dry weight), number of cells, or other measurable bulk numbers (e.g., protein content).
  • Developmental stage: This datum includes age and gender (if applicable) of the organism.
  • Bio-source supplier: This datum is defined as a person, company, laboratory, or entity that offers a variety of biosamples or biospecimens.
  • Growth substrates: This datum refers to a hydroponic system (type, supplier, nutrients, concentrations), soil (type, supplier), agar (type, supplier), and cell culture (media, volume, cell number per volume).
  • Growth environment: This datum includes, but is not limited to, controlled environments such as a greenhouse (details on accuracy of control of light, humidity, and temperature), housing conditions (light/dark cycle), and non-controlled environments such as the location of the field trial.
  • Growth time: This datum refers to the growth time of the sample prior to the treatment.
  • Sample pre-treatment or sample preparation: This datum refers to collection, transport, storage, preparation (e.g., drying, sieving, grinding, etc.), and preservation of the sample.

Laboratory equipment. The laboratory equipment includes apparatus and instruments that are used in diagnostic, surgical, therapeutic, and experimental procedures. In this subsection, all necessary equipment should be listed; manufacturer name or vendor (including the homepage), catalog number (or model), and configuration of the equipment should be part of this data element. See Table 11 .

Example: (Protocol available at )
  • Laboratory equipment name: This datum refers to the name of the equipment as it is given by the manufacturer (e.g., FocalCheck fluorescence microscope test slide).
  • Manufacturer name: This datum is defined as a person, company, or entity that produces finished goods (e.g., Life Technologies, Zeiss).
  • Laboratory equipment ID (model or catalog number): This datum refers to an identifier provided by the manufacturer or vendor (e.g., F36909, the catalog number for the FocalCheck fluorescence microscope test slide from Life Technologies).
  • Equipment configuration: This datum should explain the configuration of the equipment and the parameters that make it possible to carry out an operation, procedure, or task (e.g., the configuration of an inverted confocal microscope).

Laboratory consumables or supplies. The laboratory consumables include, amongst others, disposable pipettes, beakers, funnels, test tubes for accurate and precise measurement, disposable gloves, and face masks for safety in the laboratory. In this subsection, a list with all the consumables necessary to carry out the protocol should be presented with manufacturer name (including the homepage) and catalog number. See Table 12 .

Example (without identifier): Filter paper (Protocol available at )
Example (with manufacturer and catalog number): Filter paper (GE, catalog number: 10311611) (Protocol available at )
  • Laboratory consumable name: This datum refers to the name of the laboratory consumable as it is given by the manufacturer, e.g., Cryogenic Tube, sterile, 1.2 ml.
  • Manufacturer name: This datum is defined as a person, enterprise, or entity that produces finished goods (e.g., Nalgene, Thermo Scientific, Eppendorf, Falcon).
  • Laboratory consumable ID (catalog number): This datum refers to an identifier provided by the manufacturer or vendor; for instance, 5000-0012 (catalog number for Cryogenic Tube, sterile, 1.2 ml from Nalgene).

Recipe for solutions. A recipe for solutions is a set of instructions for preparing a particular solution, media, buffer, etc. The recipe for solutions should include the list of all necessary ingredients (chemical compounds, substances, etc.), initial and final concentrations, pH, storage conditions, cautions, and hints. Ready-to-use reagents do not need to be listed in this category; all purchased reagents that require modification (e.g., a dilution or addition of β-mercaptoethanol) should be listed. See Table 13 for more information.

Example (reference to a recipe): See recipe 1 (PBS) in the recipes section (Protocol available at )
Example (named recipe): Phosphate-buffered saline (PBS) recipe (Protocol available at )
  • Solution name: This is the name of a preparation that has at least two chemical substances, one of them playing the role of solvent and the other playing the role of solute. If applicable, the name should include the following information: concentration of the solution, final volume, and final pH. For instance, Ammonium bicarbonate (NH4HCO3), 50 mM, 10 ml, pH 7.8.
  • Chemical compound name or reagent name: This is the name of a drug, solvent, chemical, etc.; for instance, agarose, dimethyl sulfoxide (DMSO), phenol, sodium hydroxide. If applicable, a measurable property, e.g., concentration, should be included.
  • Initial concentration of a chemical compound: This is the first measured concentration of a compound in a substance.
  • Final concentration of a chemical compound: This is the last measured concentration of a compound in a substance.
  • Storage conditions: This datum includes, among others, shelf life (maximum storage time) and storage temperature for the solutions, e.g., “Store the solution at room temperature”, “maximum storage time, 6 months”. Specify whether or not the solutions must be prepared fresh.
  • Cautions: Toxic or harmful chemical compounds should be identified by the word ‘CAUTION’ followed by a brief explanation of the hazard and the precautions that should be taken when handling, e.g., “CAUTION: NaOH is a very strong base. Can seriously burn skin and eyes. Wear protective clothing when handling. Make in fume hood”.
  • Hints: The “hints” are commentaries or “tips” that help the researcher to correctly prepare the recipe, e.g., “Add NaOH to water to avoid splashing”.
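A recipe covering these properties could be checked for completeness with a small validator; the field names below are illustrative assumptions, not those of the SMART Protocols tool, and the example values are taken from the text above:

```python
# Required properties of a hypothetical recipe record, following the
# recipe data elements described above.
REQUIRED = ("solution_name", "ingredients", "storage_conditions")

def missing_fields(recipe: dict) -> list:
    """Return the required recipe properties that are absent or empty."""
    return [f for f in REQUIRED if not recipe.get(f)]

recipe = {
    "solution_name": "Ammonium bicarbonate (NH4HCO3), 50 mM, 10 ml, pH 7.8",
    "ingredients": [{
        "compound": "NH4HCO3",
        "final_concentration": "50 mM",
    }],
    "storage_conditions": "Store at room temperature; maximum storage time, 6 months",
}

print(missing_fields(recipe))  # []
```

Optional properties such as cautions and hints would be reported whenever applicable rather than enforced by the validator.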

Reagents. A reagent is a substance used in a chemical reaction to detect, measure, examine, or produce other substances. List all the reagents used when performing the protocol, the vendor name (including homepage), and catalog number. Reagents that are purchased ready-to-use should be listed in this section. See Table 14 .

Example (incomplete): Dextran sulfate, Sigma-Aldrich (Protocol available at )
Example (complete): Dextran sulfate sodium salt from , Sigma-Aldrich, D8906-5G (Protocol available at )
  • Reagent name: This datum refers to the name of the reagent or chemical compound. For instance, “Taq DNA Polymerase from Thermus aquaticus with 10X reaction buffer without MgCl2”.
  • Reagent vendor or manufacturer: This is the person, enterprise, or entity that produces chemical reagents, e.g., Sigma-Aldrich.
  • Reagent ID (catalog number): This is an identifier provided by the manufacturer or vendor. For instance, D4545-250UN (catalog number for Taq DNA Polymerase from Thermus aquaticus with 10X reaction buffer without MgCl2 from Sigma-Aldrich).

Kits. A kit is a set of articles or tools assembled for a specific purpose. List all the kits used when carrying out the protocol, the vendor name (including homepage), and catalog number.

  • Kit name: This datum refers to the name of the kit as it is given by the manufacturer, e.g., Spectrum Plant Total RNA Kit, sufficient for 50 purifications.
  • Kit vendor or manufacturer: This is the person, enterprise, or entity that produces the kit, e.g., Sigma-Aldrich.
  • Kit ID (catalog number): This is an identifier provided by the manufacturer or vendor, e.g., STRN50, catalog number for the Spectrum™ Plant Total RNA Kit, sufficient for 50 purifications.

Software. Software is a series of instructions that can be interpreted or directly executed by a processing unit. In this subsection, list the software used in the experiment, including the version, as well as where to obtain it.

  • Software name: This datum refers to the name of the software, for instance, “LightCycler 480 Software”.
  • Software version: A software version number is an attribute that represents the version of the software, e.g., Version 1.5.
  • Software availability: This datum should indicate where the software can be downloaded from. If possible, license information should also be included; for instance, https://github.com/MRCIE-U/ariesmqtl, GPL3.0.

Data elements for the procedure

All the analyzed guidelines include recommendations about how to document the instructions; for example, list the steps in numerical order, use the active voice, organize the procedures in major stages, etc. However, information about documenting alternative, optional, or parallel steps (where applicable) and alert messages such as critical steps, pause points, and execution time was infrequent (available in fewer than 40% of the guidelines). See Fig. 6.

[Figure 6: peerj-06-4795-g006.jpg]

NC, Not Considered in guidelines; O, Optional information; D, Desirable information if this is available; R, Required information.

We chose a subset of protocols (12 from our Plant Methods collection, 7 from our Biotechniques collection, and 5 unpublished protocols from CIAT) to review which data elements about the procedure were documented. 100% of the protocols have steps organized in major stages. 100% of the unpublished protocols list the steps in numerical order, and nearly 60% of the protocols from Plant Methods and Biotechniques followed this recommendation. Alert messages were included in 67% of the Plant Methods protocols and in 14% of the Biotechniques protocols. None of the five unpublished protocols included alert messages. Troubleshooting was reported in just a few protocols; this datum was available in 8% of the Plant Methods protocols and in 14% of the Biotechniques protocols. See Fig. 7.

[Figure 7: peerj-06-4795-g007.jpg]

At this stage, the discussion with domain experts started with the description of steps. In some protocols, the steps are poorly described; for instance, some of them mention working temperatures such as cold room, on ice, or room temperature, but what exactly do these mean? Steps involving centrifugation, incubation, washing, etc., should specify conditions, e.g., time, temperature, speed (rpm or g), number of washes, etc. For the experts, alert messages and troubleshooting (where applicable) complement the description of steps and facilitate correct execution. This opinion coincides with the results of the survey, where troubleshooting and alert messages such as critical steps, pause points, and timing were considered relevant by 83%–87% of the respondents. The set of data elements related to the procedure is presented below.

  • Recommendation 1. Whenever possible, list the steps in numerical order; use the active voice. For example: “Pipette 20 ml of buffer A into the flask,” as opposed to “20 ml of buffer A are/were pipetted into the flask” (Nature Protocols, 2012).
  • Recommendation 2. Whenever there are two or more alternatives, these should be numbered as sets of consecutive steps (Wiley’s Current Protocols, 2012). For example: “Choose procedure A (steps 1–10) or procedure B (steps 11–20); then continue with step 21 . . .”. Optional steps or steps to be executed in parallel should also be included.
  • Recommendation 3. For techniques comprising a number of individual procedures, organize these in the exact order in which they should be executed (Nature Protocols, 2012). Instead of vague temperature terms, specify the intended range:
    – Frozen/deep-freeze temperature (−20 °C to −15 °C)
    – Refrigerator, cold room or cold temperature (2 °C to 8 °C)
    – Cool temperature (8 °C to 15 °C)
    – Room/Ambient temperature (15 °C to 25 °C)
    – Warm/Lukewarm temperature (30 °C to 40 °C)
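The temperature ranges listed above can be captured in a simple lookup so that vague terms resolve to explicit ranges (a sketch; the ranges are taken from the list, while the simplified keys are assumptions):

```python
# Temperature ranges in degrees Celsius for common vague terms,
# per the list above; keys are simplified labels.
TEMPERATURE_RANGES = {
    "frozen": (-20, -15),
    "cold room": (2, 8),
    "cool": (8, 15),
    "room temperature": (15, 25),
    "warm": (30, 40),
}

def explicit_range(term: str) -> str:
    """Rewrite a vague temperature term as an explicit range in deg C."""
    low, high = TEMPERATURE_RANGES[term]
    return f"{low} degC to {high} degC"

print(explicit_range("cold room"))  # 2 degC to 8 degC
```

A protocol template could use such a lookup to flag or expand vague temperature terms before publication.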

For centrifugation steps, specify time, temperature, and speed (rpm or g). Always state whether to discard/keep the supernatant/pellet. For incubations, specify time, temperature, and type of incubator. For washes, specify conditions e.g., temperature, washing solution and volume, specific number of washes, etc.

Useful auxiliary information should be included in the form of “alert messages”. The goal is to remind or alert the user of a protocol about issues that may arise when executing a step. These messages may cover special tips or hints for performing a step successfully, alternate ways to perform the step, warnings regarding hazardous materials or other safety conditions, and time considerations, for instance, pause points, the speed at which the step must be performed, and storage information (temperature, maximum duration) (Wiley’s Current Protocols, 2012).

Alert message — Step — Note — Source:
  • Critical step (Protocol available at )
  • Pause point (Protocol available at )
  • Timing (Protocol available at )
  • Hint (Protocol available at )
  • Pause point: This datum is appropriate after steps in the protocol where the procedure can be stopped, i.e., when the experiment can be stopped and resumed at a later point in time. Any PAUSE POINTS should be indicated with a brief description of the options available. See Table 15.
  • Timing: This datum is used to include the approximate time of execution of a step or set of steps. Timing could also be indicated at the beginning of the protocol. See Table 15.
  • Hints: Provide any commentary, note, or hints that will help the researcher to correctly perform the protocol. See Table 15.
  • Troubleshooting: This datum is used to list common problems, possible causes, and solutions/methods of correction. This can be submitted as a 3-column table or listed in the text. An example is presented in “Table 1. Troubleshooting table”, available at Rohland & Hofreiter (2007).

Data Elements Represented in the SMART Protocols Ontology

The data elements proposed in our guideline are represented in the SMART Protocols Ontology. This ontology was developed to facilitate the semantic representation of experimental protocols. Our ontology reuses the Basic Formal Ontology (BFO) (IFOMIS, 2018) and the Relation Ontology (RO) (Smith et al., 2005) to characterize concepts. In addition, each term in the SMART Protocols ontology is represented with annotation properties imported from the OBI minimal metadata. The classes and properties are represented by their respective labels to facilitate readability; the prefix indicates the provenance of each term. Our ontology is organized in two modules. The document module represents the metadata necessary and sufficient for reporting a protocol. The workflow module represents the executable elements of a protocol to be carried out and maintained by humans. Figure 8 presents the hierarchical organization of data elements in the SMART Protocols Ontology.

[Figure 8: peerj-06-4795-g008.jpg]

In this paper, we have described 17 data elements that can be used to improve the reporting structure of protocols. Our work is based on the analysis of 530 published and unpublished protocols, guidelines for authors, and suggested reporting structures. We examined guidelines for authors from journals that specialize in publishing experimental protocols, e.g., Bio-protocols, Cold Spring Harbor Protocols, MethodsX, Nature Protocols, and Plant Methods (Methodology). Although JoVE (JoVE, 2017) is a video methods journal, its guidelines for authors were also considered. Online repositories were also studied; these resources deliver an innovative approach to the publication of protocols by offering platforms tailored for this kind of document. For instance, protocols.io (protocols.io, 2018) structures the protocol by using specific data elements and treats the protocol as a social object, thus facilitating sharing; it also makes it possible to have version control over the document. Protocol Exchange from Nature Protocols is an open repository where users upload, organize, comment on, and share their protocols. Our guideline has also benefited from the input of a group of researchers whose primary interest is having reproducible protocols. By analyzing reporting structures and guidelines for authors, we are contributing to the homogenization of the data elements that should be reported as part of experimental protocols. Improving the reporting structure of experimental protocols will add the necessary layer of information that should accompany the data currently being deposited into data repositories.

Ours was an iterative development process; drafts were reviewed and analyzed, and then improved versions were produced. This made it easier for us to make effective use of the time that domain experts had available. Working with experimental protocols that were known by our group of domain experts helped us to engage them in the iterations. Also, for the domain experts who worked with us during the workshops, there was a pre-existing interest in standardizing their reporting structures. Reporting guidelines are not an accepted norm in biology (MIBBI, 2017); however, experimental protocols are part of the daily activities of most biologists. They are familiar with these documents, and the benefits of standardization are easy for them to understand. From our experience at CIAT, once researchers were presented with a standardized format that they could extend and manage with minimal overhead, they adopted it. The early engagement of domain experts in the development process eased the initial adoption; they were familiar with the outcome and aware of the advantages of implementing this practice. However, maintaining the use of the guideline requires more than just its availability; the long-term use of these instruments requires an institutional policy on data stewardship. Our approach builds upon previous experiences; in our case, the guidelines presented in this paper are a tool that was conceived by researchers as part of their reporting workflow, thus adding a minimal burden to their workload. As domain experts were working with the guideline, they were also gaining familiarity with the Minimum Information for Biological and Biomedical Investigations (MIBBI) (MIBBI, 2017) guidelines that were applicable to their experiments. This made it possible for us to also discuss the relation between MIBBIs and the content of the experimental protocols.

The quality of the information reported in experimental protocols and methods is a general cause for concern. Poorly described methods generate poorly reproducible research. In a study of Trypanosoma experiments, Flórez-Vargas et al. (2014) report that none of the investigated articles met all the criteria that should be reported in these kinds of experiments. The study by Kilkenny et al. (2009) reaches similar conclusions: key metadata elements are not always reported by researchers. The widespread availability of key metadata elements in ontologies, guidelines, minimal information models, and reporting structures was discussed with the domain experts. These were, from the onset, understood as reusable sources of information. Domain experts understood that they were building on previous experiences; having examples of use is helpful in understanding how to adapt or reuse existing resources. This helped them to understand the rationale of each data element within the context of their own practice. For us, being able to consult previous experiences was also an advantage. Sharing protocols is a common practice amongst researchers working in the same laboratories or collaborating on the same experiments or projects. However, there are limitations on sharing protocols that are not necessarily related to the lack of reporting standards. They are, for instance, related to patenting and intellectual property issues, as well as to giving away competitive advantages implicit in the method.

During our development process, we considered the SMART Protocols ontology (Giraldo et al., 2017); it reuses terminology from OBI, IAO, EXACT, ChEBI, the NCBI taxonomy, and other ontologies. Our metadata elements have been mapped to the SMART Protocols ontology; the metadata elements in our guideline could also be mapped to resources on the web such as PubChem (Kim et al., 2016; Wang et al., 2017) and the taxonomy database from UniProt (UniProt, 2017). Our implementation of the checklist illustrates how it could be used as an online tool to generate a complement to the metadata that is usually available with published protocols. The content of the protocol does not need to be displayed; key metadata elements are made available together with the standard bibliographic metadata. Laboratories could adapt the online tool to their specific reporting structures. Having a checklist made it easier for the domain experts to validate their protocols. Machine validation would be preferable, but such mechanisms require documents to be machine-processable beyond what our domain experts were able to generate; the domain experts were using the guideline to implement simple Microsoft Word reporting templates. Our checklist does not include aspects inherent to each possible type of experiment, such as those available in the MIBBIs, which are based on the minimal common denominator for specific experiments. The two approaches complement each other; where MIBBIs offer specificity, our guideline provides a context that is general enough to facilitate reproducibility and adequate reporting without interfering with records such as those commonly managed by Laboratory Information Management Systems.
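The kind of machine validation described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual checklist tool; the element names below are hypothetical placeholders rather than the guideline's real 17 data elements.

```python
# Sketch of machine-checking a protocol record against a checklist.
# REQUIRED_ELEMENTS is illustrative, not the guideline's actual element list.
REQUIRED_ELEMENTS = {"title", "purpose", "reagents", "equipment", "procedure"}

def validate(protocol_metadata: dict) -> list[str]:
    """Return the required data elements missing (absent or empty) from a record."""
    present = {key for key, value in protocol_metadata.items() if value}
    return sorted(REQUIRED_ELEMENTS - present)

protocol = {
    "title": "DNA extraction from cassava leaf tissue",
    "purpose": "Obtain PCR-quality genomic DNA",
    "reagents": ["CTAB buffer", "chloroform"],
    "equipment": [],          # recorded but empty, so it counts as missing
    "procedure": "Grind tissue, extract, precipitate.",
}
print(validate(protocol))  # -> ['equipment']
```

A checklist encoded this way could back the online tool mentioned above, flagging incomplete records before a protocol is shared.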

In laboratories, experimental protocols are released and periodically undergo revisions until they are released again. These documents follow the publication model put forward by Carole Goble, “Don’t publish, release”, with strict versioning, changes, and forks (Goble, 2017). Experimental protocols are essentially executable workflows for which identifiers for equipment, reagents, and samples need to be resolved against the Web. The importance of unique identifiers for adequate reporting cannot be overstated; identifiers remove ambiguity from key resources and make it possible for software agents to resolve and enrich these entities. The workflows in protocols are mostly followed by humans, but in the future robots may be executing experiments (Yachie, Consortium & Natsume, 2017); it therefore makes sense to investigate other publication paradigms for these documents. The workflow nature of these documents makes them well suited to a fully machine-processable, or even machine-actionable, representation. The workflows should be intelligible to humans and processable by machines, thus facilitating the transition to fully automated laboratory paradigms. Entities and executable elements should be declared and characterized from the onset. The document should be “born semantic” and thus interoperable with the larger web of data. In this way, post-publication and linguistic processing activities, such as Named Entity Recognition and annotation, could be more focused.
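Resolving identifiers against the Web can be illustrated with a small helper that expands compact identifiers (prefix:accession) into resolver URLs. The identifiers.org resolution pattern is real; the helper function and the sample identifiers are illustrative assumptions, not part of the guideline.

```python
# Expand a compact identifier ("prefix:accession") into a resolvable URL,
# following the identifiers.org resolution pattern.
def resolve(curie: str) -> str:
    prefix, accession = curie.split(":", 1)
    return f"https://identifiers.org/{prefix}:{accession}"

# e.g. a reagent (water, in ChEBI) and an organism (E. coli, NCBI taxonomy)
print(resolve("CHEBI:15377"))   # https://identifiers.org/CHEBI:15377
print(resolve("taxonomy:562"))  # https://identifiers.org/taxonomy:562
```

With identifiers in this form, a software agent can dereference each reagent or sample mentioned in a protocol instead of guessing from free text.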

Currently, when protocols are published, they are treated like any other scientific publication. Little attention is paid to the workflow nature implicit in this kind of document, or to the chain of provenance indicating where it comes from and how it has changed. The protocol is understood as a text-based narrative instead of a self-descriptive Findable Accessible Interoperable and Reusable (FAIR) (Wilkinson et al., 2016) compliant document. There are differences across the examined publications, e.g., JoVE builds the narrative around video, whereas Bio-protocols, MethodsX, Nature Protocols, and Plant Methods primarily rely on a text-based narrative. The protocol is, however, a particular type of publication; it is slightly different from other scientific articles. An experimental protocol is a document that is kept “alive” after it has been published. The protocols are routinely used in laboratory activities, and researchers often improve and adapt them, for instance, by extending the type of samples that can be tested, reducing timing, minimizing the quantity of certain reagents without altering the results, adding new recipes, etc. The issues found in reporting methods probably stem, at least in part, from the current structure of scientific publishing, which is not adequate to effectively communicate complex experimental methods (Flórez-Vargas et al., 2014).

Experimental research should be reproducible whenever possible. Having precise descriptions of the protocols is a step in that direction. Our work addresses the problem of adequate reporting for experimental protocols. It builds upon previous work, as well as on an exhaustive analysis of published and unpublished protocols and guidelines for authors. There is value in guidelines because they indicate how to report; having examples of use shows how to adapt them. The guideline we present in this paper can be adapted to address the needs of specific communities. Improving reporting structures requires collective efforts from authors, peer reviewers, editors, and funding bodies. There is no “one size that fits all.” The improvement will be incremental; as guidelines and minimal information models are presented, they will be evaluated, adapted, and re-deployed.

Authors should be aware of the importance of experimental protocols in the research life-cycle. Experimental protocols ought to be reused and modified, and derivative works are to be expected. Authors should consider this before publishing their protocols; the terms of use and licenses are the choice of the publisher, but where to publish is the choice of the author. Terms of use and licenses forbidding users to “reuse”, “reproduce”, “modify”, or “make derivative works based upon” a protocol should be avoided. Such restrictions impede the ability of researchers to use protocols in their most natural way, which is adapting and reusing them for different purposes, not to mention sharing them, which is a common practice among researchers. Protocols represent concrete “know-how” in the biomedical domain. Publishers, for their part, should encourage authors to make protocols available, for instance as preprints or in repositories or journals for protocols, and should enforce this practice just as they require or encourage data to be available. Experimental protocols are essential when reproducing or replicating an experiment; data is not contextualized unless the protocols used to derive the data are available.

This work is related to the SMART Protocols project. Ultimately, we want: (1) to enable authors to report experimental protocols with the necessary and sufficient information that allows others to reproduce an experiment; (2) to ensure that every data item is resolvable against resources in the web of data; and (3) to make protocols available in RDF, JSON, and HTML as web-native objects. We are currently working on a publication platform based on linked data for experimental protocols. Our approach is simple: we consider that protocols should be born semantic and FAIR.
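As a rough illustration of what a “born semantic” protocol record could look like, the sketch below serializes key metadata as JSON-LD using only the standard library. The vocabulary is an assumption chosen for illustration (schema.org's HowTo type); the actual SMART Protocols platform would use its own ontology terms.

```python
import json

# A minimal JSON-LD record for a protocol. The property names come from
# schema.org and are illustrative; they are not the SMART Protocols ontology.
protocol_record = {
    "@context": "https://schema.org",
    "@type": "HowTo",
    "name": "DNA extraction from cassava leaf tissue",
    "version": "2.1",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "step": [
        {"@type": "HowToStep", "text": "Grind 100 mg of leaf tissue in liquid nitrogen."},
        {"@type": "HowToStep", "text": "Add 500 µL CTAB buffer and incubate at 65 °C."},
    ],
}

print(json.dumps(protocol_record, indent=2, ensure_ascii=False))
```

Because the record is structured data rather than free text, the same content can be rendered as HTML for humans or consumed directly by software agents.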

Acknowledgments

Special thanks to the research staff at CIAT; in particular, we want to express our gratitude to those who participated in the workshops, survey and discussions. We also want to thank Melissa Carrion for her useful comments and proof-reading. Finally, we would like to thank the editor and reviewers (Leonid Teytelman, Philippe Rocca-Serra and Tom Gillespie) for their valuable comments and suggestions to improve the manuscript.

Funding Statement

This work was supported by the EU project Datos4.0 (No. C161046002). Olga Giraldo has been funded by a predoctoral grant from the I+D+i program of the Universidad Politécnica de Madrid (UPM). Alexander Garcia has been funded by the KOPAR project, H2020-MSCA-IF-2014, Grant Agreement No. 655009. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Additional Information and Declarations

The authors declare there are no competing interests.

Olga Giraldo conceived and designed the experiments, performed the experiments, analyzed the data, contributed reagents/materials/analysis tools, prepared figures and/or tables, authored or reviewed drafts of the paper, approved the final draft.

Alexander Garcia contributed reagents/materials/analysis tools, prepared figures and/or tables, authored or reviewed drafts of the paper, approved the final draft, supervised the research, and was a constant springboard for discussion and ideas with respect to the checklist and methods.

Oscar Corcho reviewed drafts of the paper, and approved the final draft.


CHEYENNE — On Sept. 3, the Wyoming Game and Fish Department, through the Wyoming State Veterinary Laboratory, confirmed a case of anthrax in a dead moose in Carbon County. The Wyoming Livestock Board recently informed Game and Fish that cattle near Elk Mountain have tested positive for anthrax.

Anthrax is a naturally occurring bacterial disease that can be transmitted between livestock, wildlife and humans. It is most commonly seen in herbivores, including cattle, deer and bison (elk, moose and pronghorn are also susceptible). Carnivores tend to be less at risk and may display higher resilience to the disease. The spores can persist in the ground for decades and emerge when the ground is disturbed or flooded. Disturbance is common in summer months when conditions may alternate between rain and hot, dry weather, allowing spores to be released from contaminated soil and ingested by livestock or wildlife.  

This recent detection in a moose is the only documented case reported in wildlife at this time. The last confirmed case of anthrax in wildlife in Wyoming was in 1956 in Sublette County. 

Game and Fish is advising hunters and the public to take the following precautions:

  • If you encounter dead cattle or wildlife, do not approach, handle or move carcasses.
  • Do not harvest an animal that looks sick. Early signs of anthrax can include respiratory difficulty and disorientation. After death, infected animals tend to bloat very quickly and you may see black, tarry blood coming out of natural body openings (e.g., nose, mouth, anus).
  • It is always recommended to wear gloves while field dressing or handling harvested animals.
  • Do not pick up roadkill or fresh deadheads in the Elk Mountain area.
  • Keep dogs, horses and other pets away from animal carcasses you come across in the field.
  • If hunters encounter deceased wildlife, note the location or take a GPS pin and report findings to Game and Fish. You can  report a wildlife disease incident online or by calling the Game and Fish Wildlife Health Laboratory at 307-745-5865.
  • Human cases are rare but precautions are warranted. If you have concerns that you may have come into contact with an anthrax-infected animal, please contact the Wyoming Department of Health and seek medical attention.

Game and Fish will continue to monitor the situation and assess impacts to wildlife. If changes occur that require further action, hunters will receive updates through emails and posts on the Game and Fish website. 

For questions, please consult the following list of resources:

  • Wyoming Livestock Board (occurrences in cattle and area affected): 307-777-7515
  • Wyoming Department of Health (human health and safety concerns): 307-777-7656
  • Wyoming Game and Fish Department (wildlife and hunting concerns): Wildlife Health Lab 307-745-5865; Regional Office 307-745-4046



Pandora Era Lab-grown Diamond Pavé Bar Necklace 0.26 carat tw 14k Gold


What is a carat weight?

Each lab-grown diamond in our collection is individually crafted, with exacting standards for remarkable cut, color and clarity. Read more about them below.

A diamond's carat number refers to its weight. A carat is a metric measurement equivalent to 200 milligrams. Diamonds are often measured in one hundredths of a carat – each one of these units is known as a "point." Our current collections feature lab-grown diamonds ranging between 0.009 and 2.00 carats. The most important thing to know is: the higher the number, the heavier (and bigger) the diamond.
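The arithmetic behind these units (1 carat = 200 milligrams; 1 point = 1/100 carat) can be sketched as a couple of small conversion helpers; the function names are illustrative.

```python
# Carat arithmetic: 1 carat = 200 mg, and 1 point = 1/100 carat.
def carats_to_milligrams(carats: float) -> float:
    return carats * 200

def carats_to_points(carats: float) -> int:
    return round(carats * 100)

# A 0.26 carat total weight works out to:
print(carats_to_milligrams(0.26))  # 52.0 milligrams
print(carats_to_points(0.26))      # 26 points
```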


Berkeley Lab Careers


NESAP for Programming Environments and Models Postdoc

Bay Area, California, United States

The National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab seeks a highly motivated postdoctoral fellow to join the Programming Environments and Models team as part of NERSC’s Exascale Science Acceleration Program (NESAP). NESAP postdocs collaborate with scientific teams to enable the solution of deep, meaningful problems across all program areas funded by the Department of Energy Office of Science.

The Perlmutter Supercomputer is NERSC’s first production GPU-based system. Many workflows running at NERSC need to be adapted or optimized to run efficiently on GPUs. At the same time, solutions that put GPU performance in users’ hands need to be portable. In this role, as part of a multidisciplinary team composed of computational and domain scientists, you’ll develop and apply cutting-edge computer science, advanced performance analysis and software engineering to meet these challenges. In order to carry out this work you will collaborate closely with one or more NESAP teams.

What You Will Do:

  • Apply knowledge of HPC programming and programming models.
  • Contribute to the identification, implementation and application of new or advanced methods and technologies to improve the performance, portability or productivity of scientific software.
  • Publish and present results in peer reviewed journals and conferences.

Required Qualifications:

  • Ph.D. in Computer Science, Computational Science, Applied Mathematics or an equivalent/related field awarded within the last five years.
  • Experience programming in one or more of Python, C++, Fortran or Julia.
  • Experience with one or more of Kokkos, OpenMP offload/OpenACC, CUDA/HIP, JAX, SYCL, or MPI.
  • Ability to work in an interdisciplinary team.
  • Demonstrated track record of written and verbal communication of candidate-led results.

Desired Qualifications:

  • Knowledge of GPU architecture.
  • Experience with performance analysis and profiling tools.
  • Experience debugging distributed memory parallel applications.
  • Experience with container technologies (e.g. Docker/ Podman, shifter, or equivalent).
  • Knowledge of git and modern software development practices (e.g. branches, CI/CD, Pull requests).
Notes:

  • This is a full-time, 2 year, postdoctoral appointment with the possibility of renewal based upon satisfactory job performance, continuing availability of funds and ongoing operational needs. You must have less than 3 years of paid postdoctoral experience.
  • The salary range for this position is $77,172 - $103,704 and is expected to start at $94,116 or above. Postdoctoral positions are paid on a step schedule per union contract and salaries will be predetermined based on postdoctoral step rates. Each step represents one full year of completed post-Ph.D. postdoctoral experience.
  • This position is represented by a union for collective bargaining purposes.
  • This position may be subject to a background check. Any convictions will be evaluated to determine if they directly relate to the responsibilities and requirements of the position. Having a conviction history will not automatically disqualify an applicant from being considered for employment.
  • Work will be primarily performed at Lawrence Berkeley National Lab, 1 Cyclotron Road, Berkeley, CA. Position is eligible for an on-site or hybrid work schedule.

Learn About Us:

Berkeley Lab is a U.S. Department of Energy national laboratory managed by the University of California and designated a Federally Funded Research and Development Center. Located in the San Francisco Bay Area, we have a close relationship with UC Berkeley, as well as robust partnerships with other academic institutions and industries, including those in Silicon Valley. The Laboratory conducts world-class research that supports clean energy, a healthy planet, and solution-inspired discovery science. Berkeley Lab is defined by our deeply felt sense of stewardship , which we describe as a commitment to taking care of the Laboratory's research, people, and resources that are entrusted to us. Our values of team science, innovation, service, trust, and respect knit us together as a community. We practice these values and prioritize our principles of inclusion, diversity, equity, and accountability ( IDEA ) to build highly effective teams that produce world-class science and technology and where all individuals, regardless of their backgrounds, disciplines, and experiences, can thrive.

Working at Berkeley Lab has many rewards including a competitive compensation program, excellent health and welfare programs, a retirement program that is second to none, and outstanding development opportunities. To view information about the many rewards that are offered at Berkeley Lab- Click Here .

Berkeley Lab is an Equal Opportunity and Affirmative Action Employer. We heartily welcome applications from women, minorities, veterans, and all who would contribute to the Lab’s mission of leading scientific discovery, inclusion, and professionalism. In support of our diverse global community, all qualified applicants will be considered for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, or protected veteran status. 

Equal Opportunity and IDEA Information Links: Know your rights, click here for the supplement: "Equal Employment Opportunity is the Law" and the Pay Transparency Nondiscrimination Provision under 41 CFR 60-1.40.


Berkeley Lab is committed to Inclusion, Diversity, Equity and Accountability (IDEA) and strives to continue building community with these shared values and commitments.


Privacy and Security Notice | LBNL is an E-Verify Employer | Contact Us

The Lawrence Berkeley National Laboratory provides accommodation to otherwise qualified internal and external applicants who are disabled or become disabled and need assistance with the application process. Internal and external applicants that need such assistance may contact the Lawrence Berkeley National Laboratory to request accommodation by telephone at 510-486-7635, by email to [email protected] or by U.S. mail at EEO/AA Office, One Cyclotron Road, MS90R-2121, Berkeley, CA 94720. These methods of contact have been put in place ONLY to be used by those internal and external applicants requesting accommodation.

HCC Careers


Lab Assistant I (ITP/ASL) Part-time - Staff Pool

Houston, Texas, System Wide

Responsible for assisting with the setup of lab equipment, instructional materials, and lab resources which are utilized in the Department.

ESSENTIAL DUTIES AND RESPONSIBILITIES include the following.  Other duties may be assigned.

  • Assist instructors, as needed, in operating laboratory classes in the department.
  • Organize, maintain, and control inventory.
  • Assist the instructor or senior lab personnel in preparing lab experiments and demonstrations.
  • Comply with all applicable health and safety regulations, policies, and established work practices.
  • Maintain lab equipment by cleaning and performing basic repairs, contacting the supervisor for more complicated repairs.
  • Perform clerical assistance, as needed for lab maintenance and attendance purposes.
  • May assist/tutor students with using lab software and equipment.

QUALIFICATIONS

To perform this job successfully, an individual must be able to perform the essential duties and responsibilities listed above.  The qualifications listed below are representative of the education, experience, knowledge, skills, and/or abilities required.

High School Diploma or GED required.

1 year experience in an educational lab setting or related area preferred.

KNOWLEDGE, SKILLS AND ABILITIES

  • Excellent verbal and written communication skills.
  • Proficiency in word processing, spreadsheet and web page applications.
  • Able to work effectively with persons from diverse backgrounds.
  • Must be able to receive, organize, maintain, test, and set up lab equipment, materials, or hardware supplies.
  • Familiar with academic lab setup, configuration, and testing of related equipment.
  • Must be able to lift 25 lbs.

This job description in no way states or implies that these are the only duties to be performed by the employee occupying this position.  Employees will be required to follow any other job-related instructions and to perform any other job-related duties requested by their supervisor.

This job description may be revised upon development of other duties and changes in responsibilities.

The Organization

Houston Community College (HCC) is an open-admission, public institution of higher education offering a high-quality, affordable education for academic advancement, workforce training, career development and lifelong learning to prepare individuals in our diverse communities for life and work in a global and technological society. We’re proud to say that 98 percent of our graduates step into a job in their field of study immediately upon graduation. One of the largest community colleges in the nation, HCC has served the Greater Houston area for over four decades. Accredited by the Southern Association of Colleges and Schools Commission on Colleges, we offer 300+ associate degree and certificate programs to 75,000+ students across 13 Centers of Excellence and online each semester. We are proud to be No. 1 among all community colleges in the nation in providing associate degrees to minorities and No. 1 in educating international students, with 10.4 percent of our student population from outside the USA. Our vision is to become the Employer of Choice in support of our mission for Student Success by attracting, retaining and motivating the best employees.

The Team

Some of the brightest minds in academics and business are choosing HCC as their home. When you join our talented team, you’ll play a special role as teacher, mentor and academic advisor. We’ll support you in your professional development as you contribute your knowledge and expertise to HCC, our students and the community.

Houston is a city with limitless possibilities:

  • Fourth-largest city in the U.S. and home to 54 Fortune 500 companies, second only to New York City’s 55.
  • Approximately 145 languages are spoken here.
  • Overall after-taxes living costs are 5.6 percent below the average for all 308 urban areas recently surveyed.
  • Houston is a major-league sports town, and don’t forget the annual Houston Livestock Show & Rodeo.
  • The weather is great! Mild winters ensure that outdoor activities can be enjoyed year-round.
  • World-renowned medical care. The Houston metro area has long been known for its first-rate healthcare system, with many Houston area hospitals consistently ranking among the nation’s top institutions.
  • With over 150 museums and cultural institutions in the Greater Houston area, museums are a large part of Houston’s cultural scene. 
  • Houston is the Culture & Culinary Capital of Texas with more than 7,500 restaurants and eating establishments covering 60+ cuisines. 

EEO Statement

Houston Community College does not discriminate on the bases of race, color, religion, sex, gender identity and expression, national origin, age, disability, sexual orientation or veteran’s status.  The following person has been designated to handle inquiries regarding the non-discrimination policies:

David Cross, Director EEO/Compliance, Title IX Coordinator Office of Institutional Equity

PO Box 667517                                   

Houston TX, 77266

713-718-8271 or [email protected]

HCC values its employees and their contributions, promotes opportunities for their professional growth and development, and provides a positive working and learning environment that encourages diversity, innovation and creativity, and inclusion.

Individuals with disabilities, who require special accommodations to interview, should contact (713) 718-8565.


Agenda-setting intelligence, analysis and advice for the global fashion community.


Why H&M and Inditex Are Betting on Lab-Grown Cotton

A pair of hands hold a handful of cotton buds.

Ginning, blowing, carding, drawing, roving, spinning, weaving, dyeing, cutting, sewing, ironing, shipping and trucking — all the steps it took to turn some cotton bolls into your T-shirt. Those processes also contribute the most to the planet-warming impact of the clothing fibre.

Growing cotton bolls itself sucks up huge amounts of water, pesticides and fertilisers. For all the water you’ll ever use to wash your cotton T-shirt over its entire lifetime, it will have taken 50 times as much water to grow the cotton that went into it. Cotton uses about 2.3 percent of global arable land and accounts for 16 percent of all insecticide sales. And the fashion industry has been forced to reckon with allegations of forced labor and poor working conditions in certain cotton-harvesting regions.

Boston-based startup Galy says it has found an alternative that avoids all of these problems by growing cotton in a lab. The company shared an evaluation by environmental consultancy Quantis showing that, at an industrial scale, its process reduces water use by 99 percent, land use by 97 percent and the negative impact of fertilisers by 91 percent compared with conventional cotton.

Brazil-born Luciano Bueno, chief executive officer of Galy, founded the company in 2019. But cotton has featured in his business life for much longer. “I started selling T-shirts door-to-door just to pay my bills in high school,” he said. His first job at Deloitte involved working for textile companies. His first company Horvath Co., which he founded in 2015, tried to develop sweat-resistant shirts.


But after Horvath got stuck in an exclusivity deal, he took a break and studied entrepreneurship in Silicon Valley. It was during the heyday of fundraising for lab-grown meat startups that Bueno thought he should apply the same idea to cotton. It’s taken Galy a few years, but now the startup has shown enough progress to secure investments from huge cotton consumers: Hennes & Mauritz AB and Zara-owner Inditex SA.

Galy takes cells from a cotton plant, adds them to a large vat and feeds them sugar. After the cells have sufficiently multiplied, Galy technicians use their genetic understanding of the plant — which has been developed over decades of research — to activate certain genes and deactivate others. As a result, the cells transform and elongate into cotton fibres.

So far, Galy has only been able to make a few kilograms of vat-grown cotton. If it can produce at scale, the company also has big dreams of making lab-grown cocoa and coffee powders. At its stall at the Breakthrough Energy Summit in London in June, Galy showed off all three products.

Buyers of cotton care about strand length, strength and purity. Galy already delivers on purity, given that the process happens inside a vat and not in the open. That has helped it secure a $50-million deal with Suzuran Medical Inc. for medical-grade cotton, which Galy plans to supply over 10 years once it starts producing at industrial scale.

For clothing, Bueno says that Galy still needs to improve on strand length. That development will need investments in further research. In an announcement today, Galy said it had raised $33 million from Bill Gates-led Breakthrough Energy Ventures, H&M and Inditex — bringing the company’s total raise so far to $65 million.

Martin Ekenbark, lead of H&M’s circular innovation lab, said that the fast-fashion retailer is seeing a rise in the demand for cotton. “Customers prefer the hand feel of fabrics made with cotton,” he said.

After H&M stopped using cotton from China’s Xinjiang region in early 2021, following allegations of forced labor, it faced Chinese boycotts. H&M and other cotton consumers are keen to find solutions that can produce cotton without these risks.

Inditex has invested in more than 300 startups with the goal of finding new materials that have a lower impact on the environment, a company spokesperson said, and it’s now working with Galy to “enhance fiber quality through various proof-of-concept tests.”

Pound for pound, cotton is much cheaper than meat and has a smaller market. The global cotton market is about $60 billion and cotton sells for a little more than $1 per kilogram, whereas the meat market is worth more than $1 trillion. That’s why Bueno’s focus is not just scaling up production, but also doing so at a tiny fraction of the cost of the process used for lab-grown meat.

There are a few things that help Galy. The plant cells only need sugar to multiply, rather than the complex growth media used for meat. And given that people aren’t going to be eating the cotton, Galy can use reactors that don’t have to meet the same strict hygiene standards.

The hurdles that remain aren’t small. Despite plenty of funding and investor enthusiasm, lab-grown companies have struggled to grow because of the finicky nature of biology and the difficulty of selling products at a much higher cost than traditional alternatives. Galy will face the same problem, and it’s raising money at a time when climate-tech investments have been shrinking. That’s one reason why Galy isn’t currently facing any major commercial competitors in lab-grown cotton.

Peter Turner, a partner at Breakthrough Energy Ventures, points out that Galy’s cotton today is at the same point lab-grown meat was in 2013. That’s when the Dutch researcher Mark Post made a 5-ounce burger that reportedly cost €250,000. That led to rapid growth in the number of startups chasing the prize, with funding for the sector peaking in 2021. Galy wouldn’t say what its cotton costs today.

“We fully expect competition to follow,” said Turner.

By Akshat Rathi


© 2024 The Business of Fashion. All rights reserved.


Why Rocket Lab Stock Shot Higher in August

  • Rocket Lab demonstrated solid growth in its most recent quarter, and highlighted a significant Mars mission.
  • The company is still in its early days and significant risks remain, but management is executing well on its plan.

The company's business is progressing as planned.

Rocket Lab USA (NASDAQ: RKLB) plans to go to Mars, and investors seem to think the stock could go along for the ride.

Shares of Rocket Lab climbed 19.7% in August, according to data provided by S&P Global Market Intelligence, on solid earnings and news about a high-profile mission.

To Mars and beyond!

Rocket Lab is part of a new generation of space companies reshaping the industry and taking share from incumbents. The company has quickly established itself as a leader, and is now trying to boldly go where few have gone before.

In August, the company reported strong year-over-year growth in the second quarter and a backlog of more than $1 billion in future business. Rocket Lab also announced it had built two spacecraft for the University of California Berkeley's Space Science Laboratory and NASA. The mission, which will travel to Mars, is part of a demonstration by Rocket Lab that it can do space missions at about one-tenth the cost of legacy space contractors. If successful, Rocket Lab should see significant demand for its services at these lower price points.

Is Rocket Lab stock a buy?

It is important to note that Rocket Lab is an early-stage, money-losing company with significant competition not just from large defense contractors but other start-ups as well. Rocket science is notoriously hard, and a lot can go wrong.

That said, Rocket Lab deserves plaudits for the progress management has made developing the business. Rocket Lab hopes to be a one-stop shop for government and corporate customers, able to design and manufacture satellites, launch them into space, and then maintain and control them from Earth.

A lot of the projected growth in the space industry assumes that companies that in the past had no space presence will eventually see value in having their own satellites in orbit. That requires lower costs, and would likely favor Rocket Lab’s approach over customers developing their own expertise in-house or dealing with a large number of vendors.

So, although this stock is not yet a sure thing, the company is well on its way toward proving out its concept. Demonstrating its value to NASA with this Mars mission is part of that process. The company continues to make progress in other areas as well, including the development of its new Neutron rocket, which it hopes to launch in 2025 and which could eventually allow it to take larger payloads into space.

Given the risks involved in space, investors should remain cautious. But for those with a taste for high-risk, high-reward investments, Rocket Lab deserves consideration as part of a well-diversified portfolio.

Lou Whiteman has positions in Rocket Lab USA. The Motley Fool recommends Rocket Lab USA. The Motley Fool has a disclosure policy.


4 reasons you should set up Power over Ethernet for your home lab


Key Takeaways

  • Less cable clutter: Reduce the number of cables by half with PoE technology.
  • More flexibility: No need to search for power outlets, making device placement easier.
  • Centralized power delivery: Manage devices remotely and avoid blackout issues with PoE switches.

As its name implies, PoE allows you to send both power and data to your devices over a single Ethernet cable. On paper, it may sound like a niche feature, as most servers, PCs, laptops, and NAS can’t be powered via an Ethernet cable. But if your computing environment consists of smart gadgets, Power over Ethernet can go from somewhat useful to an absolute game-changer – and here are four reasons why you should outfit your home lab with PoE technology.
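Each PoE standard caps how much power a switch port can source, so it's worth sanity-checking a planned setup against the switch's total PoE budget before buying. A minimal sketch (the helper and the device names are hypothetical; the per-port maxima are the PSE-side limits defined by the IEEE 802.3af/at/bt standards):

```python
# Worst-case power a switch port must source (PSE side), per IEEE standard.
PSE_PORT_MAX_W = {
    "802.3af": 15.4,     # "PoE" (Type 1)
    "802.3at": 30.0,     # "PoE+" (Type 2)
    "802.3bt-t3": 60.0,  # "PoE++" (Type 3)
    "802.3bt-t4": 90.0,  # "PoE++" (Type 4)
}

def fits_budget(devices, switch_budget_w):
    """devices: (name, standard) pairs. Returns (fits?, worst-case draw in W)."""
    draw = sum(PSE_PORT_MAX_W[std] for _, std in devices)
    return draw <= switch_budget_w, draw

# Example: three gadgets on a switch with a 75 W total PoE budget.
ok, draw = fits_budget(
    [("ip-camera", "802.3af"), ("wifi-ap", "802.3at"), ("pi-node", "802.3at")],
    switch_budget_w=75.0,
)
# Worst case is 15.4 + 30 + 30 = 75.4 W, slightly over the 75 W budget.
```

In practice devices negotiate a power class and rarely draw their full maximum, so many switches oversubscribe safely; this worst-case check is just a conservative starting point.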


4 Less cable clutter

Why use two cables when you can go with just one?


The most obvious advantage of a PoE setup is that you’ll have fewer cables running around the house. Since most IoT and smart gadgets require both power and a data signal to operate, integrating PoE switches or injectors into your home lab can get rid of the barrel or USB connections required for power. As such, Power over Ethernet halves the number of connections in your home – and this can significantly reduce the cable clutter when you have multiple smart gadgets scattered all over the place.

3 More flexibility in device placement

No need to hunt for power outlets.


Even if your house has wall sockets around every corner, powering IP cameras and access points can be quite troublesome. Since they’re often fixed in hard-to-reach places, you’d otherwise have to invest in long power adapters for these devices. With PoE, the same Ethernet run that carries data also delivers power, so if you’re a fan of building cool projects with the Raspberry Pi and other SBCs, you won’t have to constantly seek out power outlets when testing your creations.

2 Ability to remotely manage all devices

An easy way to reduce the legwork required to manage a home lab.


If you have multiple IoT devices powering your smart home, you’ll have to go through the tedious ritual of flipping the power switches when you wish to power them on. PoE switches can make this process a lot less painful, as you can turn your devices on or off over a remote connection. Plus, some of the more advanced PoE switches can even let you monitor the power consumption, voltage, and other aspects of the IoT devices connected to them.
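As a concrete sketch of what that remote control looks like: managed PoE switches that implement the standard POWER-ETHERNET-MIB (RFC 3621) expose a per-port `pethPsePortAdminEnable` object that can be set over SNMP. The helper below only builds the OID and a net-snmp command line; the host address, community string, and port indices are made-up examples, and actual support varies by switch.

```python
# Base OID of pethPsePortAdminEnable from RFC 3621 (POWER-ETHERNET-MIB);
# the object is indexed by pethPsePortGroupIndex and pethPsePortIndex.
PETH_PSE_PORT_ADMIN_ENABLE = "1.3.6.1.2.1.105.1.1.1.3"

def admin_enable_oid(group, port):
    """Full OID for the admin-enable object of one PoE port."""
    return f"{PETH_PSE_PORT_ADMIN_ENABLE}.{group}.{port}"

def snmpset_command(host, community, group, port, enable=True):
    """Build the net-snmp command that powers a port on (1) or off (2)."""
    value = "1" if enable else "2"  # true(1) / false(2) per the MIB
    return ["snmpset", "-v2c", "-c", community, host,
            admin_enable_oid(group, port), "i", value]

# e.g. cut power to port 4 in group 1 on a (hypothetical) switch at 192.0.2.10:
cmd = snmpset_command("192.0.2.10", "private", 1, 4, enable=False)
```

Many vendors also offer their own web or CLI controls for the same thing; SNMP is just the most portable route across brands.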

1 Centralized power delivery

Throw in a UPS, and you won't have to worry about power outages anymore.


Besides adding to the mess of cables, one of the biggest drawbacks of using separate power connections is that you’ll be in a world of trouble if a particular outlet stops working. In case your house is as prone to power outages as mine, you’ll have to add a backup solution involving even more cables if you need constant access to your smart devices.

PoE provides an easy alternative to your blackout woes. All you have to do is plug an Uninterruptible Power Supply (UPS) into your switch and voilà, you can continue using all devices compatible with PoE technology even during outages.
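To get a rough idea of how long such a setup would ride out an outage, you can estimate runtime from the UPS battery capacity and the combined PoE load. A back-of-the-envelope sketch (all numbers are illustrative, and real runtimes are shorter at high loads because of battery discharge behaviour):

```python
def estimated_runtime_h(battery_wh, load_w, inverter_efficiency=0.85):
    """Rough UPS runtime in hours: usable stored energy divided by the load."""
    return battery_wh * inverter_efficiency / load_w

# e.g. a 500 Wh UPS feeding a switch whose PoE devices draw 40 W in total:
hours = estimated_runtime_h(500, 40)  # about 10.6 hours
```

Since the switch is the single point of power delivery, one modest UPS can keep every PoE device in the house alive at once, which is the whole appeal of centralizing power this way.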

Taking your home lab’s prowess to the next level with PoE devices

With all the pros of PoE technology, you might be worried about the extra cost of integrating Power over Ethernet provisions into your home lab. Sadly, PoE-compatible switches can cost a fortune. If you’re the proud owner of an ultra-fast home lab, you may have to drop thousands of dollars on a 10GbE PoE switch.


Thankfully, there’s also a cheaper way to do things: if you don’t mind placing your switch near multiple power outlets and are willing to give up remote monitoring provisions for the IoT devices, PoE injectors might be more up your alley. Sure, they aren’t as feature-laden as PoE switches, but you won’t have to worry about blowing a hole in your wallet when buying PoE injectors.


Learn About Chemistry


A Complete List of Chemistry Lab Equipment and their Uses

Performing experiments in the laboratory requires skill and precision, so I have composed a list of 50 pieces of chemistry lab equipment and their uses in the laboratory.

This is a guide to laboratory equipment and its uses, which is the same as saying the uses of apparatus in Chemistry.

There are different tasks one could carry out in the chemistry laboratory, and each will require specific lab equipment or apparatus. As a matter of fact, in Chemistry we often take accurate measurements of mass, time, and temperature.

Mass is measured using a balance. A top-pan balance measures in kilograms and weighs chemicals in the laboratory to two decimal places.

Temperature is usually measured using a thermometer. The reading should be taken while the thermometer is inserted in the liquid, to an accuracy of one-tenth of a degree.

Time is measured accurately using electronic stop clocks. They measure seconds to two decimal places.

For accurate and precise measurements, a specific apparatus may be needed. Your ability to deduce the apparatus that would give you accurate measurement is invaluable.

What is Accuracy? This is the closeness of a measurement to the actual value.

What is Precision? This is the closeness of measurements to each other.
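The two definitions are easy to confuse, so here is a small numerical illustration (the readings are invented for the example): accuracy compares the mean of repeated measurements to the true value, while precision looks only at how tightly the readings cluster together.

```python
from statistics import mean, stdev

def accuracy_error(readings, true_value):
    """Accuracy: how close the mean measurement is to the actual value."""
    return abs(mean(readings) - true_value)

def precision_spread(readings):
    """Precision: how close the measurements are to each other (std. dev.)."""
    return stdev(readings)

# Weighing a 10.00 g standard mass three times on two balances:
balance_a = [10.01, 9.99, 10.00]   # accurate AND precise
balance_b = [10.49, 10.51, 10.50]  # precise, but inaccurate
```

Balance B's readings agree with each other to about ±0.01 g (high precision) yet all sit roughly 0.5 g from the true value (poor accuracy), a classic sign of a systematic error such as an uncalibrated balance.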

50 Chemistry Lab Apparatus and their Uses

This is the list of chemistry laboratory apparatus and their uses.

1. Evaporating dish : This is usually made of porcelain to withstand high heat.

   Uses : used for evaporation or heating a solution to dryness or saturation point.


2. Beaker : usually made of Pyrex glass, though some are made of plastic. It is cylindrical with a lip for easy pouring of liquids.

   Uses : used to hold solids or liquids during experiments in the laboratory.


3. Bell Jar : A cylindrical apparatus made of thick glass.

    Uses : used in combustion experiments

4. Burette : a long cylindrical glass tube with a tap or stopcock.

    Uses : measures or delivers accurate volumes of liquids up to 50 cm³.


5. Bunsen Burner : A metal tube with a wide metal base

   Uses: A source of heat for experiments that require heating.


6. Crucible : made of porcelain with a matching lid. It can withstand high temperatures.

    Uses: used for heating substances at high temperatures until they decompose.

7. Crucible Tongs : The apparatus is made of iron.

    Uses: This is used for holding hot crucibles.

8. Desiccator : The apparatus is made of thick glass with a lid.

   Uses : used for drying and keeping dry solids.


10. Distillation Flask : This is a round bottom flask with a slanting long side arm.

   Uses : used during simple distillation for the passage of the vapours into the condenser.


11. Conical Flask : This is a glassware apparatus with a narrow mouth.

    Uses: used for holding or collecting liquids during experiments especially titration.


12. Filter Funnel : Can be made of glass or plastic, with a narrow stem.

     Uses: used during filtration and for filling the burette.

13. Gauze wire : made of iron mesh with an asbestos centre.

      Uses: Usually placed on a tripod stand to support the flask during heating.

14. Gas Jar: Made of glass with a flat glass cover.

     Uses: used for collecting gases

15. Measuring cylinder : made of glass or plastic, graduated, and with a lip.

     Uses: used to measure volumes of liquids, though not with high accuracy.


16. Gas Syringe : Laboratory equipment made of glass or plastic.

     Uses: used to collect and measure volumes of gases during experiments in the laboratory.


17. Graduated Pipette : An apparatus made of glass or plastic

     Uses: used to deliver small volumes of liquids accurately.


18. Bulb Pipette : glassware with needle-like ends

     Uses: used to transfer accurately definite or specific volumes of liquids.

19. Dropping Pipette : An apparatus with a rubber teat.

  Uses: used for dropwise addition of a solution or reagent.


20. Pipette Filler : made of flexible rubber.

  Uses: it is used in filling the pipette with the solution or base.


21. Specimen tube or bottle : an apparatus made of glass or plastic.

  Uses: This is used for keeping or storing a small amount of solids.

21. Spatula: A piece of laboratory equipment made of plastic or steel that is chisel-shaped at the end.

   Uses: used for transferring small quantities of solids into test tubes or during flame tests.

22. Deflagrating spoon : This is a long stainless steel wire with a cup at the end.

  Uses: it is used in introducing small quantities of chemicals into the gas jar.

23. Retort Stand : a long cylindrical rod that can be coupled with a clamp.

   Uses: it is used to support or hold burettes upright.


24. Burette Stand : This is usually a wooden apparatus used mainly during titration.

   Uses: This is used to hold the burette upright and steady during titration.

25. Test Tubes : This is a piece of apparatus made of glass sealed at one end.

  Uses: used in qualitative analysis to hold reactants.

26. Test Tube holder : This piece of apparatus could be made of wood, brass, or stainless steel.

 Uses: Commonly used to hold test tubes during experiments.

27. Thermometer : This is one of the very important chemistry lab equipment. It is made of glass with a bulb at one end.

  Uses: it is used in measuring the temperature of liquid or solution.

28. Delivery Tube : it is made of glass and can be bent into any shape.

Uses: it is used to bubble a gas through a liquid or to deliver gases into a gas jar.

29. Tripod stand : An apparatus made of iron, usually with three legs and a triangular or circular top.

Uses: usually used to support flasks when heating in experiments.

30. Pipette stands : This is a piece of apparatus made of wood or even plastic.

     Uses: usually for keeping pipettes.

31. Capillary Tube : this is a laboratory apparatus, a tiny glass tube with a very small diameter.

   Uses: applied in the determination of the melting point of solids.

Chemistry fact: The purity of a substance is determined by its melting point, boiling point, and a single spot on a chromatogram (for coloured substances).

32. Tile : this is an apparatus made of white plastic or ceramic tile.

  Uses: used in titration for clear and sharp observation of colour change.

33. Pneumatic Trough: This is an apparatus made of thick glass, cylindrical or rectangular in shape but usually rimless.

Uses: used when collecting gases over water.

34. Stopwatch : Very accurate timing apparatus

Uses: used in timing during experiments

35. Hofmann’s Voltameter : A glassware apparatus designed for the electrolysis of water.

Uses: used in the determination of the chemical composition of water

36. Watch glass : it is circular glassware with a round or flat bottom.

   Uses: it is used for holding solids you want to dry in air or in a desiccator.

37. Separating Funnel : A glassware apparatus with a short stem and stopcock.

   Uses: used in separating immiscible liquids like oil and water.

38. Thistle Funnel : a glassware apparatus with a thistle head and a long stem.

   Uses: used in experiments for intermittent addition of reagents.

39. Kipp’s Apparatus : A piece of lab equipment made of thick glassware.

   Uses: used in the intermittent supply of gases

40. Buchner Funnel : the apparatus is made of porcelain and has a fixed perforated plate.

   Uses: it enables solids to be sucked dry and is also used for suction filtration.

41. Centrifuge : a manually or electrically operated machine.

 Uses: used in separating solid particles from a liquid.

42. Liebig Condenser: made of glass tubes fixed together, with openings for water inlet and water outlet.

  Uses: it is used for cooling and condensing vapours into liquids. Usually used in distillation experiments

43. Combustion Boat/tube : usually made of aluminum and is boat-like in shape.

Uses: used in combustion experiments

44. Filter Pump: made of glass, plastic or nickel-plated brass.

 Uses: it provides a vacuum suitable for filtration by suction.

45. Flat bottom flask : A flask with a flat bottom base that can stand on its own.

  Uses: it is used for boiling liquids in the laboratory.

46. Round bottom Flask : A flask made of glassware that has a round bottom base.

  Uses: used for heating or boiling liquids; it cannot stand on its own and must be supported.

47. Beehive Shelf : Made of porcelain or glassware with openings for insertion.

  Uses: serves as a support for the gas jar during preparation and collection of gases over water.

48. Chromatographic jar or Tank : made of thick glass but rectangular in shape.

  Uses: used in developing thin-layer chromatography plates.

49. Cork : a stopper shaped to fit the mouth of a flask or tube.

 Uses: for supporting round bottom flasks (as a cork ring), and as a stopper with one or two standard holes for connecting delivery tubes.

50. Water bath : made of metal and filled with water.

 Uses: used for the gentle heating of liquids and for slow evaporation.

These are the different laboratory apparatus, each used for specific purposes.

I have also written an alternative version of this post, a tabular list of the common apparatus used in chemistry.




