People often ask how Ruth Chabay and I came to develop the Matter & Interactions curriculum, and particularly why we felt it essential to include computational modeling.

Starting in the late 1980s at Carnegie Mellon University, Ruth Chabay and I worked first on E&M. We chose E&M as a domain for study partly because most Physics Education Research (PER) work was on mechanics, and E&M was significantly less studied. Because E&M is significantly more abstract than mechanics, we thought it likely that students would benefit from the infusion of a healthy slug of qualitative and conceptual reasoning, and from simple experiments designed to let students observe some of the basic phenomena that E&M tries to explain. To do this, we developed the E&M experiment kits now distributed by PASCO, and a variety of computer visualizations (Ruth’s “Electric Field Hockey” game came from that period).

We quickly came to see that the traditional sequence of E&M topics lacked coherence and made it difficult for students to develop any conceptual understanding. A large number of abstract and highly mathematical concepts (charge, force, field, flux, Gauss’s Law) were introduced very rapidly at the beginning of the course. Before many students had begun to understand the difference between charge and field, all of these concepts were abandoned, and students were required to deal with potential and conventional current, which seemed to them unconnected to any previous concepts. At the end of a semester, even the best students had trouble remembering anything about the electric field of a point charge!

To address some of the difficulties students were having, we started writing supplements to a traditional textbook, and by 1995, with the encouragement of our students, we had backed into writing a full textbook on E&M, published by Wiley.

With the publication of our E&M textbook it became very clear that people interested in this treatment felt a strong need for a compatible prequel on mechanics. At that point, it wasn’t clear to us how to proceed, but when we reflected on which elements of our E&M course we liked best, we realized that it was the problems and experiments that involved modeling complicated real-world situations: idealization, approximation, refining the model, etc. (at CMU we were working with unusually strong students). An example is problem 15.P.48 on page 621 of the 4th edition of our textbook (16.P.49 on page 658 of the 3rd edition): rub a plastic pen, then predict how close the pen must come to a small piece of aluminum foil to pick it up. We were encouraged in this view by David Hestenes and others in the Arizona State modeling community. Given an emphasis on modeling the real world, Ruth reasoned that computational modeling should be a prominent component of what would become M&I, in part because of the centrality of computer modeling to contemporary physics, and in part because it was the only way to let beginning students see how fundamental principles apply to systems other than simple toy systems. It is easy to refine a computational model and very difficult to refine an analytical model, especially at this level: classical perturbation theory is not accessible to intro physics students.
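
As a toy illustration of that last point (a sketch of my own, not code from the textbook, with made-up numbers): in a computational model, refining the ideal projectile to include air resistance is a one-line change to the force, whereas the analytic solution changes character entirely.

```python
def projectile_range(drag_coeff=0.0, dt=0.01):
    """Horizontal range of a projectile via the Euler-Cromer method.

    drag_coeff=0 gives the ideal model; a nonzero value refines the
    model with quadratic air resistance. Illustrative values, SI units.
    """
    m, g = 0.1, 9.8                   # mass (kg), gravitational field (N/kg)
    x, y = 0.0, 0.0
    vx, vy = 20.0, 20.0               # initial velocity components (m/s)
    while y >= 0.0:
        speed = (vx**2 + vy**2) ** 0.5
        # Refining the model is one line: add the drag force -c*|v|*v
        Fx = -drag_coeff * speed * vx
        Fy = -m * g - drag_coeff * speed * vy
        vx += (Fx / m) * dt           # update velocity from the net force,
        vy += (Fy / m) * dt
        x += vx * dt                  # then position (Euler-Cromer)
        y += vy * dt
    return x

print(projectile_range())                 # ideal model
print(projectile_range(drag_coeff=1e-3))  # refined model: shorter range
```

Comparing the two runs makes the effect of the refinement visible immediately, with no new mathematics required.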

Sometime in 1996 or early 1997 we met with CMU colleagues and outlined our thoughts on “Modern Mechanics”, integrating macro and micro, mechanics and thermal physics, and including having students write computer programs to model physical systems. One of our colleagues said, “You can’t possibly do what you’re proposing to do, but if you do, I very much want to teach it!”

We were pondering how best to deal with the thermal aspects of modern mechanics when we read the excellent January 1997 AJP article by Tom Moore and Dan Schroeder, on how to use the Einstein model (isolated-atom version of the ball-and-spring model of a solid) to do quantum stat mech in a way that is easily accessible to intro physics students. The key point is that the simple evenly spaced energy levels of the quantized harmonic oscillator, and the lack of spatial contributions to the entropy, make the central issues highly salient and easy to compute.
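The computational side of that approach is easy to sketch. Here is a minimal illustration in Python (my own sketch, not code from the Moore-Schroeder article; the system sizes are made up): the multiplicity of an Einstein solid, the number of ways to distribute q quanta among N oscillators, is the binomial coefficient C(q+N-1, q), and the entropy is S = k ln Ω.

```python
from math import comb, log

def multiplicity(N, q):
    """Ways to distribute q energy quanta among N quantized oscillators."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """Entropy in units of Boltzmann's constant k: S/k = ln(Omega)."""
    return log(multiplicity(N, q))

# Two Einstein solids in thermal contact share q_total quanta; the most
# probable division maximizes the combined multiplicity.
N_A, N_B, q_total = 300, 200, 100
omega = [multiplicity(N_A, q) * multiplicity(N_B, q_total - q)
         for q in range(q_total + 1)]
q_A_equilibrium = omega.index(max(omega))
print(q_A_equilibrium)   # solid A's share of the quanta at equilibrium
```

A few lines like these let a student discover for themselves that the most probable macrostate is overwhelmingly dominant, which is the heart of the statistical view of entropy and temperature.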

In the late 1980s at CMU I had created the cT programming language, a descendant of the TUTOR programming language of the PLATO computer-based education system that Ruth and I had helped develop at the University of Illinois at Urbana-Champaign (see videos of the 2010 conference on the history of PLATO). cT was somewhat similar to a good BASIC, with easy 2D graphics built in, and cT programs ran in windows without modification on PCs, Macs, and Unix workstations. In our first offering of modern mechanics, in the fall of 1997, we taught students a minimal set of cT features and they were able to write computational models of physical systems, with 2D animations and graphs. An example of a computational problem from that time is problem 3.P.71 on p. 128 of the 4th edition (3.P.80 on p. 136 of the 3rd edition), the Ranger mission to the Moon.
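
To give the flavor of such an exercise, here is a sketch in modern plain Python rather than cT (my own code and illustrative numbers, not the textbook's): a spacecraft coasting from Earth toward the Moon along the line joining them, updated step by step from the net gravitational force.

```python
# Sketch of a Ranger-style computation (illustrative numbers, SI units):
# 1D motion along the Earth-Moon line, Moon at x = d, Euler-Cromer updates.
G = 6.7e-11
m_earth, m_moon = 6e24, 7e22        # masses (kg)
m_craft = 173.0                     # hypothetical spacecraft mass (kg)
d = 4e8                             # Earth-Moon distance (m)

x = 6.4e6                           # start at Earth's surface (m)
v = 1.3e4                           # initial speed toward the Moon (m/s)
dt = 10.0                           # time step (s)
t = 0.0

while x < d - 1.75e6:               # stop at the Moon's surface
    F = (-G * m_earth * m_craft / x**2          # Earth pulls backward
         + G * m_moon * m_craft / (d - x)**2)   # Moon pulls forward
    v += (F / m_craft) * dt         # update velocity from the net force,
    x += v * dt                     # then position (Euler-Cromer)
    t += dt

print(f"trip time: {t/3600:.1f} hours")
```

Varying the initial speed or the step size dt and rerunning is exactly the kind of model exploration and refinement that is easy computationally and inaccessible analytically at this level.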

In the second year of offering both mechanics and E&M, the 1998-1999 academic year, we had an extraordinary student, David Scherer, who, while in high school, had led a team of friends to create a sophisticated 3D PC game that later won a national prize. After taking our course, Scherer said he thought he saw a better scheme than cT, which would be 3D. I said, “Well, what we have is adequate; I don’t think we need something else.” But Ruth, who had already done a lot of 3D work at great effort, using hard-to-use tools, said, “3D? I will certainly use it!” In the spring of 2000 Scherer created VPython, with Ruth and me spending many hours per day with him on design and testing, and it was fully deployed in our course at CMU in fall 2000. We dropped development of cT, as VPython was far superior. (You can read about cT and even download it from the “cT archives” at vpython.org.) People who have made major contributions to the further development of VPython are Jonathan Brandmeyer, Steve Spicklemire, John Coady, Ruth Chabay, and me.

For at least ten years there has been a lot of talk about the need to greatly increase the role of computation in undergraduate STEM education. Despite all this talk, with various national conferences and financial support from NSF, even today computational modeling doesn’t really have a central role in physics departments. In many physics departments a student can major in physics without ever having done any computational modeling. Even at places with a long history of computational physics courses, computation may be relegated to one course, with no impact on other physics courses, and if the local enthusiast is away for a year, the computational course isn’t taught, in contrast to intermediate E&M, say. And yet computation in the discipline of physics is now co-equal with theory and experiment, so here is another way in which the undergraduate curriculum is not representative of what the contemporary discipline of physics is all about.

The wheels grind very slowly. But little by little more and more physicists are getting uncomfortable about the absence of computation in the formal education of physicists, so finally people wanting to bring computational modeling into the introductory course are starting to catch up to what we were doing in 1997. And when they look around for curricula that incorporate computational modeling carried out by students in the introductory course, they find Matter & Interactions. It takes a very long time to develop curricula, so it’s a really good thing that Ruth got us started on this so long ago.

#### Later developments

We moved to NCSU in 2002, where students are bright and willing to work but on average are less well prepared than students at CMU. We were able to make Matter & Interactions accessible to these students by providing more support in the textbook, and this work was incorporated into the 2nd edition (2007), with further improvements in the 3rd edition (2010) and the 4th edition (2015). The 4th edition was the first to include explicit instruction in VPython, whereas previously this was taught only by handouts in computational labs. We also designed and coded a suite of questions for the WebAssign computer homework system to accompany the textbook, which is an important resource for students in large classes. Coverage was later extended by WebAssign and WileyPlus.

Ruth and some of her PER graduate students at NCSU carried out a lot of research and development aimed at improving the computational modeling activities students do as part of their work in labs (which also includes experiments and group problem-solving). The new materials have proved to be a significant improvement. They include helpful videos on basic aspects of the VPython 3D programming environment used by students to model physical systems. They are included in the instructor resources available to adopters of the curriculum and are also available to other physics instructors.

In 2011 David Scherer and I began the development of the GlowScript browser-based programming environment (blog article), inspired by classic VPython. Since the beginning of 2012 I have been the main developer. Initially, programs had to be written in JavaScript or CoffeeScript, but in 2015, thanks to the RapydScript compiler of Alex Tsepkov, which translates Python to JavaScript, it became possible for GlowScript users to write VPython programs that run in a browser, without having to install anything. Since 2017 GlowScript VPython has used Kovid Goyal’s RapydScript-NG transpiler, which comes closer to handling true Python syntax. GlowScript uses the WebGL 3D graphics library now available in modern browsers.

The GlowScript libraries are in turn used with true Python in VPython 7: the vpython module, created by John Coady and extended by Ruth Chabay and me, is usable in Jupyter notebooks or with program launchers such as IDLE. The GlowScript libraries are also used by Brian Marks in Trinkets.

*Bruce Sherwood*