Sunday, March 29, 2015

3/30 Jenna - Sense-making through models and argumentation

In McClain et al.'s "Supporting Students' Ways of Reasoning about Data," the researchers designed a unit that would lead students to develop a "single, multifaceted notion... of distribution" (2000). Over the course of several inquiries - about battery life, traffic flow, and medical treatment - the students relied upon their intuitive ideas about data to construct knowledge about statistics and data analysis. The researchers designed two computational tools (capable of manipulating data sets in different ways) to support this process. The data analyses enabled students to make arguments that led to recommendations for decision-making. 
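The kind of comparison the students made in the battery-life inquiry can be sketched in a few lines of code. This is a hedged illustration with invented numbers, not the researchers' actual tools or data: two hypothetical brands have similar average lifetimes but very different spreads, which is exactly the sort of pattern that pushes students toward a multifaceted notion of distribution.

```python
# Hypothetical battery lifetimes in hours (invented data, two brands).
brand_a = [98, 102, 105, 110, 95, 101, 99, 104, 100, 103]
brand_b = [80, 125, 90, 130, 70, 135, 85, 120, 75, 140]

def summarize(hours):
    """Return the mean lifetime and the range (max - min) of a data set."""
    mean = sum(hours) / len(hours)
    spread = max(hours) - min(hours)
    return mean, spread

for name, data in [("Brand A", brand_a), ("Brand B", brand_b)]:
    mean, spread = summarize(data)
    print(f"{name}: mean {mean:.1f} h, range {spread} h")
# The means are close (101.7 vs 105.0), but the ranges differ sharply
# (15 vs 70): an argument about which brand to recommend has to appeal
# to the shape of the distribution, not just an average.
```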

I found the authors' argument for statistical literacy to be a convincing one. I was especially impressed that the students had so much agency in what they could do with the data to form their argument and that their questions predicted their need for new tools and interpretive strategies to make sense of the data. I couldn't help but think of two of Harlow's pedagogical resources: that children are creative and that guiding students is less certain than telling; I thought both were applied superbly by the researchers in this context. Not only did the students come up with rich interpretations of their data, but they also appropriated the computational tools for use in ways that the researchers had not intended! These are the outcomes that every teacher hopes for. Furthermore, it seems likely that students will transfer this learning successfully to other classes and topics.

I did have some lingering questions after reading this article. Mainly, I wonder how successful this program was and how well it might generalize. I would have liked to see some quantitative data accompany the authors' narrative about the participants, about how the desired outcomes were defined, and about how many students achieved those outcomes. Only six students are named in the narrative, and no information is given about the size or context of the class. 

In relation to this course, I think that this article fits in nicely with the argument-driven inquiry readings from last week as another example of classroom activities that engage students in scientific practices. The unit described in this article illustrates, in a way, how I interpreted the recent announcement about Finland's interdisciplinary teaching; the designed unit could be enlarged to include inquiries about the topic of the data and thus integrate more scientific knowledge with the mathematical practices of statistics. I also think this article enriches our conversations on modeling by raising questions about how students use models to generate data as well as how they construct models from data. 

Sunday, March 22, 2015

3/23 Jenna - Supporting Argumentation

As I have been working on my model in NetLogo recently, the recurring thought on my mind has been how the modeling experience is balanced with the other NGSS practices we discussed in the beginning of the semester. I don't have much experience with coding, and I've often found myself stuck in the middle of writing my code, unsure of how to proceed. Using the model to generate evidence and explanations goes out the window at that point - I'm sure K-12 students would fall into this trap too.

I really liked that the researchers from both articles integrated all of the NGSS practices into cohesive curricular units and that they were invested in doing so to improve students' abilities to reason and argue within the scientific domains. From a design perspective, I was really impressed with BGuILE's embedded discourse supports within the inquiry software. The ExplanationConstructor, for instance, definitely speaks to the concern I started this blog with; this tool helps keep students on task by organizing the explanation structure into manageable and clearly defined pieces. I also liked that BGuILE lets students choose what data to plot when looking for patterns, and how the researchers demonstrated that the software helped students retain the distinction between evidence and theory.

NetLogo seems to diverge from BGuILE in light of these design features, especially when students use NetLogo to build their own models. With the programs in NetLogo's Models Library, the program acts as a simulation: the student manipulates the initial conditions, and the program responds. The data visible in that response were pre-decided by the program's author. So these models (to me at least) seem pretty superficial in their exploratory use; it's only when students start manipulating the code that this expands. On the other hand, building code from scratch in NetLogo is a daunting task, and the student can only seek the teacher or peers for immediate guidance, and must turn to experts (via scientific papers) for support for initial assumptions and modifications. In BGuILE, embedded tools mediate the entire inquiry experience in a process that is a complete reversal of NetLogo's. BGuILE presents huge data sets to the students, which they sift through in search of meaningful patterns, which they then use to construct explanations and models; in NetLogo, the model building is the primary process, and meaningful patterns can be substantiated later.
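The contrast I am drawing can be made concrete with a toy simulation. This is a minimal sketch in Python rather than NetLogo, with invented parameters: as in a Models Library model, the program's author has pre-decided what data gets reported, and the student can only vary the initial conditions.

```python
import random

def run_simulation(num_agents, steps, seed=0):
    """Random walkers on a line: each agent steps -1 or +1 each tick."""
    random.seed(seed)
    positions = [0] * num_agents              # initial conditions
    for _ in range(steps):
        positions = [p + random.choice([-1, 1]) for p in positions]
    # The "author" chose to report only the mean position, just as a
    # NetLogo plot shows only what its author decided to plot.
    return sum(positions) / num_agents

# A "student" can rerun with different initial conditions...
print(run_simulation(num_agents=50, steps=100))
# ...but cannot see other quantities (say, the spread of the walkers)
# without opening up and editing the code itself.
```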

What tools would Netlogo need to provide an experience that is more like BGuILE? What affordances and constraints does each have? Which would you prefer to use in your classroom, and why?

3/23 - Kim K. - Beguiled by Science!

The BGuILE software discussed in Reiser et al.'s paper makes me swoon.  They have created a program that incorporates writing and explanations within it, rather than a separate worksheet a teacher might create to go along with the program, and it makes me really wish that NetLogo could have a similar feature.  I especially like that students can copy statements directly from their data analysis as evidence support for their explanations.  Also, the BGuILE model follows the ADI instructional model discussed in Sampson and Gleim's article quite well.
            I was concerned about Reiser et al.'s opening statement about needing science class to be more rigorous, because we have read many other articles discussing the need for covering topics in a depth-versus-breadth way.  However, they saved themselves with this gem, “these design efforts should focus more on maximizing the breadth of conceptual and material approaches, rather than on maximizing a breadth of content topics,” and I think that is exactly how science teachers need to be teaching their students.  Students need the tools for thinking about various scientific problems so that when they encounter unfamiliar topics, either on tests or in their college careers, they will not be struggling to reason about and evaluate the new content.  I think the scaffolding apparent in BGuILE will help students slow down, rather than rushing through the science program as quickly as they can.  The ExplanationConstructor is especially great for promoting scientific literacy and scientific writing skills.

My Question is:

Is there a way we can incorporate the ideas behind ExplanationConstructor with NetLogo? (I feel that NetLogo in itself does not require students to associate the scientific question with explanations.)

3/23 Davio Bergsmith Inquiry in Science Classrooms

            Both articles advocated scientific inquiry as a means to build scientific knowledge. Also, both articles stressed using evidence to draw conclusions. Students should explain and argue scientific content to build knowledge, including vocabulary and theories, and also to address misconceptions. Reiser says explanations should describe concepts with detail and should include observations that the students have made. These experiences of sharing and talking about science, whether between peers or between peers and instructor, help to construct a shared science knowledge among peers. Sampson and Gleim used this approach in their Argument-Driven Inquiry strategy. This strategy should be applied throughout science teaching, especially as students are engaging in scientific practices. The ADI approach has some similarities to the eight science practices laid out by the Next Gen Science Standards. Both involve identifying a task or problem by asking questions, using models or experiments to develop a theory or argument, and then explaining the argument to peers or instructors and revising it. Reiser also talks about scaffolding computational use in the classroom. This should be done so that students can develop a use for computational models and gain the most from them. Throughout our class, we have discussed how to include things students want to learn about in our curriculum. When students are interested in the things they study in class, they will take ownership of the inquiry practices that are offered. The importance of students' ideas and interests also has value in classroom ecology as well as in scientific inquiry.


            How difficult would it be to construct a NetLogo model from scratch? I cannot remember if this has been discussed in class or not yet. When reading about inquiry in the science classroom this week, I thought about what concepts do not have a model in the library and how difficult it would be for my students to try to develop a model from scratch. It would be disappointing to students who showed a high level of interest in a particular concept but then did not have a model available to learn more about the concept. I think that if I devoted some time to NetLogo I could write a program (using some other models for help), but how difficult would this be for high school students? And would students be willing to put in the time to learn a program so that they may learn about one particular concept?

3/23 Laura: Science is Social

I really enjoyed both readings this week as they highlighted the communicative element of scientific practice, a factor which I think is integral and often under-valued.  I think that both Reiser et al. and Sampson and Gleim offered powerful examples of learning communities and the importance of developing and revising explanations with the help of peers, the teacher, and diverse resources.  The ability to create sound explanations, which articulate causal mechanisms and account for observed data, is a powerful tool in all subjects.  By supporting our students in developing strong logic in the sciences, we can help them become better writers and more critical consumers of information, goals which I think we all have as teachers.  To develop this logic, Reiser et al. suggest highlighting the distinction between observation and interpretation.  I think this is an astute decision, as here you can get to the root of students’ misunderstandings of what is ‘knowable’ from an observation, and gently steer them away from assuming, projecting, or anthropomorphizing by encouraging metacognition.  The ExplanationConstructor journal is a valuable tool in this metacognition as it allows both the student and the teacher to trace the student’s thought process through notes and the decision tree.  This journal is reminiscent of my lab notebooks from working in marine science labs, and I hope to offer my students a similar space, either through BGuILE or through written observation journals, where they can work through inquiry and data and be able to visualize the experience later, from which they could glean more sophisticated and personal correlations and theories.
            I was also very excited that Reiser et al. envisioned their program working seamlessly into existing curriculum and side by side with physical representations and experiences.  I think that traditional and computational modeling and experiences can and should be complementary, to utilize the power and scope of computational models as well as the familiarity and experiential nature of physical models.  Reiser et al. make an excellent point about unfamiliarity: you have to be aware of the level of transfer your student is going to experience when using a new tool, and optimally prepare them, or scaffold with the familiar, to help make the experience less dissonant.
             I am concerned that it is very challenging to know what to pull from a large dataset, even for professional scientists, and that could be a limitation for students’ experience.  Reiser et al. suggest doing one exercise beforehand with a smaller data set and then having ‘strategic conversations with students about what they are trying to achieve and what they will learn from particular queries of the data’.  However, I think these conversations would need to be heavily scaffolded, as I can imagine the blank stares when asking students what they would learn from different statistical queries; a whole-class lesson or discussion on what is possible to ask might help.
            I am disappointed with the teaching practice section of Reiser et al., as it did not really indicate good practices for ‘creating and sustaining a climate of inquiry’, but rather just said to ‘augment’ the software, which I think minimizes the role the teacher plays in prompting inquiry.

Questions:
Is the timeline presented by Reiser et al. realistic?  With 36 total school weeks and 6-7 week units, you could cover at most six concepts. 
What is the best way to scaffold students before/while introducing them to large data sets where correlations may not be readily apparent?
Is Sampson and Gleim’s suggestion to allow students to invent their own method too open ended?

3/23-Elizabeth-Inspecting Inquiry

Both articles this week discussed the importance of student agency and inquiry, and the importance of a deeper understanding of scientific concepts over barely touching upon all topics. 
The Sampson and Gleim article focused on the Argument-Driven Inquiry approach, which emphasizes collaborative explanation and argumentation along with the incorporation of other subjects, such as reading and writing.  Included within this article are the eight steps of ADI, from the initial identification of a scientific phenomenon, through the collaborative work of forming and refining explanations through argumentation, to final student reflection.  According to the researchers, by collaborating and making each other’s explanations and arguments visible, students look at their own reasoning and ways of knowing to revise their arguments and help lead to a consensus. 
In his article on BGuILE, Reiser focuses on the importance of authentic, student-explanation-driven inquiry through the use of technology-supported learning activities.  One comment of Reiser’s that I thought was critical was “constructing a technology-infused curriculum requires designing both classroom-based activities that prepare students for complex software investigations and off-computer activities, interspersed with students’ work on software, that set the student interactions with the technology in a broader set of social interactions.”  Oftentimes when I have thought about technology within the classroom, I have seen the use of technology as a completely separate entity.  However, Reiser does a good job in creating clear scaffolds (which slowly build up to the computer programs) and showing how one can include multiple media in a lesson in a logical way to benefit the students.  In what he calls “staging” activities, he tells his readers to use familiar learning strategies and materials, such as worksheets and data sheets, where students can analyze the data in a more comfortable setting and be more willing to begin asking those deep-thinking questions before moving on to the more technology-based instruction and learning.  I really liked this part of his article and found it very useful.  Reiser also talked about generality versus specificity in the supportive tools used.  While he does not push for one or the other, I think it is a topic that hasn’t been discussed thus far in the class but one that is very important.  While there is a benefit to having a general scaffolding tool, such as a common computer program, there are also benefits to having more tailored programs where students can manipulate more things and observe more outcomes.  Thus, at this point in school, what should possible computer models look like?  Should schools strive for a common program across domains or more detailed, tailored programs for individual domains? 
In a sense, the model that Caitlin and I are modifying supports generality, as it can be used across the domains of Chemistry and Biology. 

Major themes:
·      The importance of inquiry, argumentation and explanation within the learning of science, their interdependence, and their use to make science more engaging.
·      The importance of collaboration in fostering this development of scientific knowledge and the need for peers in helping critique and reconstruct consensus explanations.
·      Making science meaningful and applicable for students is critical for student success, participation, and interest in the sciences

·      Classroom environment is key for success: create an environment where students will not be afraid to come up with faulty explanations, not just the “right” one, and where “the intimidation factor,” as Sampson states, does not become an issue. (468)

3/23 Dan - Inquiry and Investigation

One of the connections between the BGuILE and ADI practices that I found interesting was that both advocated deeper exploration into topics at the expense of covering a wider range of topics.  I thought that this was interesting because many current curricula treat subjects more as survey courses, especially at the AP level.  Deeper exploration into fewer topics allows teachers and students to work on the scientific practices that both articles are arguing are more important than the content.  I agree that the practices need to be a much bigger emphasis in science classrooms.  Equations and definitions can be learned from almost anywhere, but learning how to ask appropriate questions, investigate, collect and interpret data, and defend those interpretations with evidence is a set of practices unique to the scientific community and needs to be at the core of any science course.  My concern with ADI is that it is an extremely time-consuming process that may be difficult to replicate several times with the same group of students.  Perhaps it is a process that is not used with every topic within a course, but is just used a handful of times throughout a school year.  In the BGuILE classroom, I like the idea of having software that can help students construct arguments by directly connecting claims and evidence.  I know this is a relatively old article, and I would be curious to see what similar software would look like today, and across what subjects it would be useful.  Both of these articles seem to fit with the themes of other articles we have read about leading an inquiry-based classroom.  Both emphasize the importance of evidence in support of claims.

Questions: How important is it that these practices are fully employed for every topic?  For the sake of classroom efficiency, could these practices be used only on selected topics?

3/23 Caitlin BGuILE and Inquire

This week's readings, by Reiser and Sampson, are all about teaching scientific inquiry. Both authors argued that science classes should focus on science practice and that curricula should include more depth than breadth. The NGSS standards are trying to follow this ideal. The standards have fewer science concepts and more technology and modeling performance requirements. These requirements include observation, articulation (explanation), reflection (revision), defending (argumentation), and modeling. Scientists and engineers use all of these practices as they research and solve problems. Reiser, Sampson, and many of the other authors we have read agree that science classrooms should focus on these practices. The question is how to do this while still teaching students the content they need to know for exams.

One of the components of scientific practice and inquiry that the articles found important was the social practices in scientific communities. Sampson, with the multi-step lesson, and Reiser, with the problem-based learning instruction strategy and the idea of a social friendly classroom culture, support the idea of authentic practice. The performance standards are based on how an actual scientific community works. In the classroom, students, in groups or individually, can explore a problem they specifically had in mind, and then come together as a class (or community) to share what they have learned, and critique and discuss what they have learned as a whole before revising their models and explanations. Other authors we have read, such as Harlow, Nersessian, diSessa, and Shwarz, believe in making what is learned in science class meaningful. Reiser and Sampson’s different instructional strategies can be successful options to make science class meaningful.


Reiser also supports ‘infusing’ technology into science classrooms. Technology makes modeling less time-consuming, which is a challenge in most science classrooms. It also makes modeling the link between all of the different stages and parts of science practice. According to many of the authors, and to our class discussions, a computational model can be the observation, the explanation, and the product (or answer) that is the goal of the activity. In our own class, we are creating products to explain and explore scientific concepts that can be useful in our own future classrooms. Each class, we share what we have done and what we are hoping to do, and we have a chance to hear feedback and possible revisions from others in the class. Modeling can give our students a deeper understanding of science content, while teaching them scientific inquiry and practice. Reiser and Sampson argued that, while modeling is important, explanation and argumentation are also important practices students should master. How much time, or emphasis, should be placed on these practices in the classroom? Or, if there is little time, how could this be made up for? 

Science question: 
In cell membranes, would other structures, such as cholesterol or transport proteins, stick to the hydrophobic or hydrophilic part of the lipid, or even the link between the two (this is what I'm trying to figure out for the model)?