Wednesday, December 29, 2010

Molecule Sandbox


Are your children already tired of their Christmas toys and games? Or would you like to see them play with something less violent and more educational?

Vi Hart’s math doodle games can provide countless hours of fun. But another option that you may want to try is the Molecular Dynamics Applet that I created three years ago. It was originally intended for college students, but I soon discovered that small children love it.

The MD Applet is a sandbox for playing with atoms and molecules. Make up to a thousand atoms, large or small, in your favorite color. Watch them jiggle around endlessly, attracting and repelling their neighbors. Add energy to make liquid droplets boil; remove energy to make a gas condense and then freeze into a solid crystal. Start with an orderly arrangement and watch entropy increase. Connect atoms together with bonds, and even build simulated nano-scale machinery.

I don’t spend much time around small children, but Christmas is often an exception. This year I found myself entertaining (and being entertained by) a delightful seven-year-old who kept coming back, asking to play some more with the MD Applet. She asked her deepest question almost immediately: Why don’t they all just fall down and stop?

And in between molecular dynamics sessions, she learned how to draw stars with seven, eight, and even ten points!

Tuesday, December 21, 2010

Math Doodles

If you haven’t seen them already, you must watch Vi Hart’s fantastic math doodle videos on stars, squiggles, fractals, and infinite elephants. Browse the rest of her web site too, and be awe-struck at how accomplished she is at having fun.


I’m not much of a doodler, but Hart’s masterpieces reminded me of this modest Escheresque MacPaint doodle that I made soon after buying my first (original!) Macintosh computer in 1985. That was during my first year of grad school, when I should have been putting every effort into those problem sets on quantum mechanics, statistical mechanics, and solid state physics. Why are we most creative when we’re avoiding what we’re supposed to do?

(By the way, isn’t it cool that I can still open that MacPaint file in Preview? Thanks, Apple! Now please tell me how to open my old MacWrite files...)

Thanks to Charlie Trentelman for pointing me, via Facebook, to a blog post on Hart’s videos by NPR’s Robert Krulwich. And thanks to my old grad school friend Ned Gulley, whose venerable blog featured an entry last year about Hart’s Möbius music box. It’s become trendy to gripe about the Internet and Facebook, but this is the sort of thing I love about both.

Krulwich also quotes from Paul Lockhart’s magnificent tirade about math education, “A Mathematician’s Lament.” It’s not new, but I don’t think I’d ever seen it before. Read it and weep.

Sunday, October 10, 2010

Uncertainties in Science and Politics


Nate Silver is currently predicting that Republicans will gain 47 House seats in the upcoming election—plus or minus 30. In other words, he’s fairly sure that Republicans will gain between 17 and 77 seats, although there’s a slight chance the gain will be even less, or even more.

What good is a prediction when it carries such a wide margin of error? Answer: Far, far better than a prediction with no stated margin of error at all.
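To see what a stated margin of error buys you, here is a quick sketch of what “47 plus or minus 30” implies if the band is read as a 95% interval on a normal distribution. That reading is my assumption for illustration only; Silver's actual model is far more sophisticated.

```python
import math

def normal_cdf(x, mu, sigma):
    # Standard normal CDF, computed via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu = 47
sigma = 30 / 2  # if +/-30 is read as a ~95% (two-sigma) band, sigma is ~15

# Probability that the seat gain lands between 17 and 77 under this model
p_inside = normal_cdf(77, mu, sigma) - normal_cdf(17, mu, sigma)
print(round(p_inside, 3))
```

Under these assumptions the gain falls inside the 17-to-77 range about 95% of the time, which matches the "fairly sure, with a slight chance outside" reading.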

Understanding uncertainties, learning to live with uncertainties, and stubbornly insisting on knowing the uncertainties: this is the hallmark of a good scientist. A constant awareness of uncertainties may be the most important thinking pattern that distinguishes the two cultures (“techies” and “fuzzies”, we called them at Stanford) from each other.

If you study physics, we try to teach you about uncertainties through lab exercises. Suppose someone predicts that “g” (the acceleration of a freely flying projectile) should equal 9.8 meters per second squared. You measure its value and get 9.6. Have you confirmed the prediction or not? Yes, if the uncertainty in your measurement is at least 0.2. But if your uncertainty is only 0.02, you’ve disproved the prediction (and perhaps discovered a local gravitational anomaly).

Our introductory physics lab students think they’ll be graded on how close they get to the “right” answer, but they’re wrong. We actually grade them on whether they make a good uncertainty estimate, and on whether they interpret their results correctly in light of this uncertainty.

Perhaps I learned this lesson in undergraduate lab courses myself; I can’t remember. What I do remember, vividly, is my graduate school days at the Stanford Linear Accelerator Center, when I must have sat through hundreds of seminars on experimental results and their theoretical interpretation. Each speaker would spend a major portion of the talk meticulously explaining how the uncertainties had been estimated. Most of the after-talk discussion would center on whether these estimates were accurate and whether the speaker had, given the uncertainties, drawn the right conclusion. In elementary particle physics, where the experiments cost hundreds of millions of dollars and the data are inherently statistical, you had better interpret your results correctly.

But uncertainties don’t play well in politics. Whenever a climatologist admits that there’s any uncertainty at all in the predicted rise in earth’s average temperature, global warming deniers loudly yell “See! The scientists admit they don’t know what will happen!” Or to take another example from the not-so-distant past: Imagine that President Bush had said in 2002 that he actually wasn’t sure whether Iraq was developing weapons of mass destruction, but there seemed to be a 50/50 chance of it. Would Congress still have authorized the war?

Nate Silver is a hero (and an anomaly) because he’s able to look at all the data, make his best prediction, and still be honest about his uncertainties—even when the subject is politics.

I’ll end (once again) with a local political example. A respected economic consultant recently predicted that the middle segment of Ogden’s proposed streetcar system will stimulate $8.5 million of investment if it follows one proposed alignment, but only $1.5 million if it follows an alternate alignment. Ten days ago I asked her what the uncertainty range is on those numbers, and she replied, “Well, you can see that we rounded them to the nearest half million.” I’m afraid I laughed at that point, and tried unsuccessfully to convince her that the uncertainties were many times larger. I knew the numbers had been calculated from property value assessments, and that these assessments can be systematically off by 50% or even more. Worse, I knew that the lists of properties to be included in the calculations had been compiled through a subjective, undocumented process. After our conversation I looked up some of the property assessments and quickly saw that you could increase the $1.5 million prediction to over $9 million by excluding just two properties (out of several dozen) from the list. A fair estimate of the uncertainty would be much higher still.
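The sensitivity check described above is easy to reproduce. The per-property numbers below are invented for illustration (the real property list isn't reproduced here); only the totals are chosen to mimic the $1.5 million and $9 million figures:

```python
# Hypothetical per-property contributions (millions of dollars) to the
# projected investment total. Invented numbers for illustration only.
contributions = [-4.2, -3.6, 1.2, 1.4, 1.1, 1.3, 0.9, 1.0, 1.2, 1.2]

total = sum(contributions)
# Exclude the two most negative properties, as in the example above
trimmed = sorted(contributions)[2:]
print(round(total, 1), round(sum(trimmed), 1))
```

When two entries dominate the sum, a subjective decision about whether to include them swings the answer by a factor of six, which is exactly why a "rounded to the nearest half million" uncertainty estimate is absurd here.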

But economic consultants apparently aren’t in the habit of thinking about uncertainty. Undoubtedly this is because their clients don’t want to hear about it; they just want simple answers. In this case the client was the Utah Transit Authority—a government agency that supposedly represents the people. Ultimately, it is the citizens at large who need to learn to think like scientists.

Wednesday, September 1, 2010

The Fight Over NASA Continues

The fight in Washington over NASA’s future has gotten complicated and ugly, like any other legislative battle. I can’t keep up with the details, but the latest development is noteworthy.

The voices of reason have just sent an open letter to the chairman of the House Committee on Science and Technology, pleading for more support for new technology, commercial spaceflight, robotic precursor missions, and student research. These are some of the programs that our government has been scaling back in recent years, and may continue to scale back, in order to divert every available dollar to the entrenched Constellation Program contractors.

The letter is signed by 14 Nobel laureates and a list of eminent former NASA officials and astronauts. Will anyone listen to them? I have no idea.

Meanwhile, our local entrenched contractor test-fired a rocket motor yesterday, resulting in yet another article (and yet another cool photo) in the local paper. Of course, the article reminds us yet again of how many local jobs hang in the balance as Congress debates NASA’s future. And what is the purpose of this new rocket motor? All we’re told is this: “ATK hopes its motor will boost a rocket into low Earth orbit, or maybe space.”

Sunday, August 22, 2010

Textbook Prices


Today, on the eve of the start of fall semester classes, Mark Saal’s column in the Ogden Standard-Examiner appropriately takes aim at astronomical textbook prices. And although many of his examples are books for economics courses, he also lists the price of an introductory astronomy textbook!

(Feynman joke: “There are 10^11 stars in the galaxy. That used to be a huge number. But it’s only a hundred billion. It’s less than the national deficit! We used to call them astronomical numbers. Now we should call them economical numbers.”)

As Saal points out, high college textbook prices are mainly due to the fact that the people choosing the books (the professors) are never the same ones who are paying for the books (the students). Publishers bombard professors with free copies of textbooks, and in fact I doubt that most professors even know what their assigned books cost. (The sales reps certainly don’t volunteer this information.) Under this system, textbook prices have been creeping upward considerably faster than inflation for the last 25 years.

One force that tries to counteract this trend is the used book market. Students have been selling their used books to each other for a very long time. College bookstores take an “if you can’t beat ’em, join ’em” attitude, buying back used books for half price and then reselling them at 75% of the price of a new book. (The difference, 25% of the new price, happens to be the same as the bookstores’ profit margin on new books.) Students who don’t wish to keep their books can save a lot of money under this system, buying a used book for 75% of the new price and then selling it back at 50%, for a net cost of only 25%. Students who want to keep their books, though, still pay 75% of the new price.
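The arithmetic above, for a hypothetical $100 book (the round number is mine, for illustration):

```python
new_price = 100.0  # dollars; a convenient round number for illustration

buy_used = 0.75 * new_price    # bookstore's used resale price
sell_back = 0.50 * new_price   # bookstore's buy-back price

net_cost_if_sold = buy_used - sell_back  # resell it: 25% of the new price
net_cost_if_kept = buy_used              # keep it: 75% of the new price
print(net_cost_if_sold, net_cost_if_kept)
```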

Fighting back, publishers do everything they can to suppress the used book market. Mass-produced introductory books are now revised every three or four years, thereby making all used copies of the previous edition worthless. The revisions rarely add anything of value to the content. If publishers could revise books even more often, I’m sure they would—but that’s pretty much impossible. So they now publish most books in paperback, designed to self-destruct after a semester of use (while saving almost nothing in production cost). Another trick is to shrink-wrap a single-use student workbook with the main book, hoping that professors will require their students to have both. More recently, publishers have started providing online extras such as self-grading homework assignments, protected by a password that students have to pay for unless they buy a new book. The password expires after a year, and cannot be transferred to another student.

Being the half-assed crusader that I am, I’ve been fighting this system, in my own small ways, since 1990. I’ve written angry letters to publishers, posted a web article documenting the alarming trend in prices, and even made my own publisher put a clause in our contract to limit the price of my thermal physics textbook. I’ve never required my students to use the shrink-wrapped workbooks or online homework systems. For my own astronomy section, I’ve started writing a free online text (emphasis on started).

But of all the ways that professors can save money for their students, the most promising by far is simply this: Turn the publishers’ tactic against them and let your students use an earlier edition of the book. College bookstores won’t stock superseded editions, because they can’t be returned to the warehouse if they don’t sell. But the Internet makes it extremely easy for students to obtain used older editions, and the prices are rock-bottom.

Wednesday, July 28, 2010

History’s Greatest Star Map


Next time you’re out under a clear, dark sky at night, look up and pick out a star at random. Chances are, nobody knew until 15 years ago how far away that star is. Now, thanks to the European Space Agency’s Hipparcos mission, we know.

Your randomly chosen star is probably somewhere between 100 and 1000 light-years away, although there’s about a 15% chance that it’s closer, and about a 10% chance that it’s farther. If your star is among that nearest 15%, then its distance was probably known, to an accuracy of 50% or better, before Hipparcos. Otherwise, astronomers could have given you no better than a rough estimate of your star’s distance.

Direct measurements of star distances come from the method of triangulation, or parallax: Look at the star from two different locations, and measure its angular shift as you switch between them. It’s the same principle as two-eyed vision, except that in the case of stars, the two viewing locations are on opposite sides of earth’s orbit around the sun—300 million kilometers apart.

Despite this enormous baseline, the angular shifts are minuscule, even for the nearest stars. And for stars beyond 100 light-years, the angles are too small to measure with any accuracy through earth’s blurry atmosphere. So in 1989 the ESA launched the Hipparcos satellite, carrying a special-purpose telescope dedicated to making accurate measurements of the positions of 100,000 stars. By repeating the measurements over a three-year period, the instrument determined not only the parallax shifts but also the steady motions of the stars as they gradually drift across our galaxy. The catalog of results, published in 1997, gives accurate distances and motions for all but a handful of the naked-eye stars, and many, many more.
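Converting a parallax angle to a distance is a one-liner, using the standard relation d = 1/p (d in parsecs, p in arcseconds; one parsec is about 3.26 light-years):

```python
def distance_ly(parallax_arcsec):
    # d (parsecs) = 1 / p (arcseconds); one parsec ~ 3.26 light-years
    return (1.0 / parallax_arcsec) * 3.26

# At Hipparcos's milliarcsecond-scale precision, a star with a parallax
# of 10 milliarcseconds sits roughly 326 light-years away
print(round(distance_ly(0.010)))
```

This also shows why precision matters so much: the distance is the reciprocal of the measured angle, so a small absolute error in a tiny parallax becomes a large fractional error in the distance.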

You can now read about the Hipparcos mission in a new book by Michael Perryman: The Making of History’s Greatest Star Map. Perryman was Coordinating Scientist for the Hipparcos mission, and he does a masterful job at conveying what an immense undertaking it was. Hundreds of scientists spent many years of their careers on Hipparcos, while some of Europe’s most advanced industries fabricated the satellite and its unique optical system. The story also includes high drama, thanks to the failure of the booster rocket that was to put the satellite into its final orbit. That the scientists were able to recover from this disaster and still surpass all the mission’s goals was nothing short of miraculous.

Unfortunately, Perryman’s book has several shortcomings. He tries to do too much, telling not only the story of the Hipparcos mission but also the whole history of astronomy since ancient times—in fewer than 300 pages. Indeed, the main intent of this book is apparently to establish the place of Hipparcos in history, and to properly credit several dozen of the principal scientists for their respective roles. Educating the reader is secondary, and although the book tries to be accessible to non-astronomers (and to wow them with vague superlatives), I fear that most would be overwhelmed by the enormous number of technical details so superficially explained. I learned quite a bit from the book, but I’m already a professional physicist who teaches introductory astronomy. For my own part, I was disappointed that the book didn’t adequately explain how the Hipparcos optical system worked, or even point to a reference where I could learn more. I still have no idea why the system’s limiting resolution was about a thousandth of an arc-second, or how this relates to the diameter of its main mirror (30 centimeters).
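For what it's worth, the classical diffraction limit of a 30-centimeter mirror is straightforward to estimate. This doesn't answer the question of how Hipparcos reached a thousandth of an arc-second; presumably it centroided star images to a small fraction of the diffraction width over many repeated measurements, but that is my inference, not the book's explanation.

```python
import math

# Diffraction limit of a telescope: theta ~ 1.22 * lambda / D (radians)
wavelength = 550e-9   # meters, middle of the visible band (assumed)
diameter = 0.30       # meters, Hipparcos's main mirror

theta_rad = 1.22 * wavelength / diameter
theta_arcsec = math.degrees(theta_rad) * 3600
print(round(theta_arcsec, 2))
```

The diffraction width comes out near half an arc-second, roughly a thousand times coarser than the catalog's quoted precision.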

Still, the inadequacies of the book shouldn’t detract from the importance of the Hipparcos mission. Virtually every subfield of astronomy now rests upon a firmer foundation, thanks to Hipparcos.

As an American, I can’t help but notice the differences between Hipparcos and the many equally impressive science missions carried out by NASA. Hipparcos produced no pretty pictures, and made no sudden discoveries. You can’t convey its importance in a ten-second sound-bite. It was designed, built, launched, operated, and funded by people who were focused not on short-term payoffs but on the long-term advancement of science. Such a mission would never have been supported by NASA, an agency that is forced to put glamor ahead of science because its budget is continually threatened by the whims of politicians. Of course, an advantage of the American system is that NASA has become very good at making its results accessible to the general public.

Ironically, it may not be long before the importance of the Hipparcos mission is merely historical. Encouraged by its success and the progress of technology over the last two decades, the ESA is now preparing a successor mission called Gaia, scheduled for launch in late 2012. If all goes as planned, Gaia will measure the positions of a billion stars, with an accuracy a hundred times greater than that of Hipparcos. Its completed three-dimensional star map will stretch across most of the Milky Way galaxy, far beyond the most distant naked-eye stars. Gaia will also discover thousands of planets orbiting distant stars, as well as tens of thousands of asteroids within our solar system. It will gather data over a period of five years, and its results will be published by 2020.

Saturday, July 24, 2010

How to Photograph the Milky Way


This summer I’ve been making quite a few wide-angle astronomical photos, especially of the Milky Way. Here are links to a collection of photos taken in June in the San Rafael Swell, and some other miscellaneous astronomical photos.

When I show these photos to people, they often ask how to make similar photos themselves. Here’s a summary of what I’ve figured out so far. For much more advice on astrophotography, I highly recommend Jerry Lodriguss’s site.

To photograph the Milky Way, you need the following:
  • A camera. I use Canon’s cheapest digital SLR, the Rebel XS (street price $500). Any other DSLR will probably work fine, except perhaps some of the earliest models which have higher noise levels. There may now be some high-end point-and-shoot cameras that will give acceptable results, but I’m not sure of this; most point-and-shoot cameras can’t take long enough exposures, and even if they could, the noise levels would be unacceptable. Film cameras don’t work well because even the fastest readily available films aren’t as sensitive to dim light as the sensor in a DSLR.
  • A wide-angle lens. I’ve invested in a Sigma 20mm f1.8 lens ($520), although the inexpensive 18-55mm zoom lens that came with my camera was good enough to get started. If money is no object, get the Canon 24mm f1.4 ($1700), along with a full-frame Canon 5D ($2500); that’s what the pros seem to use, as far as I can tell.
  • A tripod. I got a perfectly usable one at a discount store for $29.
  • A dark site. This is the most difficult part for many people. You cannot make decent photos of the Milky Way from a light-polluted city. But here in Utah, there are some very dark sites within a one-hour drive of my urban home. Depending on where you live, you may need to travel farther.
Of course, you also need a clear sky with a view of the Milky Way. From the northern hemisphere, the best views of the Milky Way are in the summer, with the brightest parts in the southern sky.

Before heading out on a dark night, practice with the settings on your camera. Put it in fully manual mode, including manual focus. Set it for a 30-second exposure at ISO 1600, with the lens at its widest aperture (perhaps f3.5 on a zoom lens). Practice turning the display on and off, and turn its brightness down. Set the camera to store images in “raw” format, rather than jpeg. Most importantly, figure out how to manually focus the lens at infinity. Some lenses are conveniently labeled for focusing, but my zoom lens isn’t, so I had to mark the infinity setting (when zoomed out to 18mm) with white tape.

With this preparation, taking the photos should be pretty easy. Turn the display off when you’re pointing the camera (so it doesn’t ruin your eyes’ dark adaptation), then turn it back on to check the settings (30 seconds, ISO 1600, widest aperture) and fire away. It’s hard to compose a photo in the dark, but you can review the composition on the LCD and try again as needed.

After downloading the photos to your computer, use the software that came with the camera to adjust the brightness, contrast, and color balance. With “raw” images you can make some pretty dramatic adjustments without losing quality.

Speaking of quality, there are three factors that limit the amount of detail in a photo of this type:
  1. Digital noise, which gets worse at higher ISO settings;
  2. Lens aberrations, which blur and dim the edges of the image, and which get worse when the lens is opened to a wide aperture (low focal ratio);
  3. The earth’s spinning motion, which turns star images into trails and blurs the Milky Way over time. (In 30 seconds the earth turns by 1/8 of a degree.)
To lessen any one of these problems, you generally need to worsen one of the others. The trick is to make sure that no one of them is much worse than the other two. By all means, experiment with different ISO settings, apertures, and exposure times. I always stop down my Sigma lens to about f2.8 to reduce aberrations, but stopping down may not be an option if you’re using a relatively slow zoom lens. I’m happy with ISO 1600, which is the highest setting on my camera. Most of the digital noise disappears when I reduce the photos to screen size, but in long exposures there are always some “hot pixels” which can be manually fixed in Photoshop if necessary.
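To put a number on factor 3, here is a rough calculation of how far a star near the celestial equator trails during a 30-second exposure with a 20mm lens. The sensor dimensions are approximate values for an APS-C camera like the Rebel XS:

```python
import math

exposure = 30.0          # seconds
focal_length = 20.0      # mm
sensor_width_mm = 22.2   # APS-C sensor width (Canon Rebel XS), approximate
image_width_px = 3888    # Rebel XS horizontal pixel count, approximate

# Earth turns 360 degrees per sidereal day (~86164 seconds):
drift_deg = 360.0 / 86164 * exposure   # ~1/8 degree in 30 seconds
drift_rad = math.radians(drift_deg)

# Small-angle approximation: trail length on the sensor, then in pixels
trail_mm = focal_length * drift_rad
trail_px = trail_mm * image_width_px / sensor_width_mm
print(round(drift_deg, 3), round(trail_px, 1))
```

A trail several pixels long is invisible at screen size but obvious in a large print, which is consistent with the point about enlargements below.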

Even with the most expensive equipment, photos made in this way will not be sharp enough to withstand poster-size enlargements. For example, I’m a big fan of Wally Pacholka’s photos, and I have a framed 36-inch panorama of his in my living room, but it doesn’t show much more detail at that size than in the screen version on his web site.

It’s a nice touch to include foreground scenery in your photos, but if you want more than silhouettes, you’ll need to plan carefully. A small amount of artificial light, from ambient light pollution or even a flashlight, can sometimes illuminate the scenery without ruining the Milky Way. Moonlight is another option, but anything bigger than a crescent moon will brighten the sky too much for a good Milky Way photo, and there are only a few nights each month, and a few hours on each of those nights, when the crescent moon is above the horizon after dark. Even then, the moonlight won’t always be shining in the direction you want.


If you don’t want to include foreground scenery in your photos, then life becomes much easier. You can try using a tracking mount to compensate for the earth’s rotation, allowing much longer exposure times. Then you can use a smaller aperture and/or lower ISO setting to reduce problems 1 and 2 above. You can even use a film camera, which is far less expensive but requires additional skills and patience.

Wednesday, June 30, 2010

Detecting Bad Data

Political numbers geeks learned yesterday that Research 2000, one of the most prolific national political pollsters in recent years, may have been manipulating or even fabricating much of its data. This news comes less than a year after another national pollster, Strategic Vision, was exposed for probable fraud.

The evidence against these pollsters has come mainly from statistical scrutiny of their published results, performed by heroes like Nate Silver and Michael Weissman. But in most cases, you don’t have to be an accomplished sports statistician or a PhD physicist to detect bad data. You just have to care about numbers, and spend some time with them, and use a lot of common sense.

The sad thing is that in America today, hardly anybody cares about numbers except professional scientists and sports enthusiasts. Journalists, in particular, seem to think that their only job is to report both sides of the story—as if there’s no such thing as a fact. Except sports reporters, of course, who have to be extremely careful with facts and figures.

The good news, at the national level, is that the traditional media usually pick up the fraud stories after the bloggers do the actual work. The New York Times wasted no time reporting the Research 2000 accusations on its Caucus blog. If the accusations hold up, we’ll undoubtedly hear more. (Nate Silver will soon be assimilated into the New York Times. Let’s hope these kinds of stories don’t get suppressed in the process.)

Also, at the national level, there’s often enough honest fact-gathering that the frauds don’t make much difference. No single pollster had much impact on Silver’s bottom-line prediction of the outcome of the 2008 presidential election. The danger arises when everyone is relying on a single primary source, like the military or the White House.

At the local level, relying on a single authority is the rule rather than the exception. The Ogden Standard-Examiner almost always prints the word of local government officials as if it were fact, with no questions asked. Despite the detailed exposés on Weber County Forum, the Standard-Examiner has yet to report that the Ogden government manipulated its crime statistics, or that the government’s revenue projections for the Junction development were fraudulently overblown.

In science, fabricating data is the most serious of all crimes. I’ve given failing grades to astronomy students for fabricating their observations (which is usually easy to detect). There are continual allegations of fraud in medical research, where the financial stakes are incredibly high. Fortunately, the list of significant and documented cases of fraud in the physical sciences is extremely short. Although we physical scientists are just as human as everyone else, we know that our peers will tear our work apart if it doesn’t hold up to scrutiny.

Wednesday, June 16, 2010

APOD Celebrates 15 Years


Astronomy Picture of the Day, one of the very best sites on the Web, is celebrating its 15th anniversary today. My heartfelt thanks go to its devoted authors and editors, Robert Nemiroff and Jerry Bonnell—and to NASA for hosting the site.

APOD’s diversity is remarkable. The pictures include straight photographs, highly processed digital images, graphs, and even paintings. They come from professional astronomers, NASA, dedicated amateurs, scientifically inclined artists, and historical archives. The subjects go beyond pure astronomy to include the space program, earth science, and physics. Each picture comes with a short lesson, written by the editors, full of hyperlinks for those who want to learn more.

Although APOD is pitched to the general public, it’s also extremely useful to those of us who teach introductory astronomy, and to any scientist who needs a daily dose of breadth in this era of hyper-specialization.

Among Web sites, APOD is also remarkable for its simplicity: No banners, no sidebars, no drop-down menus, no fancy fonts. This is the Web as it was originally meant to be, where content takes precedence over presentation, and the hyperlinks are inserted by real human beings. The most noticeable change since 1995 is that the pictures have gotten bigger. They’ve also added JavaScript rollovers to annotate some photos, and even an occasional video. And there’s now a linked forum where you can discuss the pictures. But simplicity still prevails.

Tuesday, June 8, 2010

Collected Works from Weber County Forum, Volume 2

About a year ago I posted a list of my Weber County Forum articles (loosely defined) over the previous three years. Since then the list has approximately doubled in length, so it’s time for an update. Here, then, are my contributions since the middle of last June, in reverse chronological order:

Wednesday, May 12, 2010

Purge Your Myrtle Spurge


Today’s Salt Lake Tribune has an article about a plant that’s been invading Ogden’s foothills over the last few years. Now I finally know what it’s called: myrtle spurge, Euphorbia myrsinites.

A native of southeastern Europe and Asia Minor, myrtle spurge has made its way into Utah’s gardens as an easy-to-grow xeriscape plant. But it is also extremely invasive, spreading into natural areas and crowding out native vegetation. Now that we know the danger, we need to get rid of this enemy before it propagates any further. (Colorado has already banned myrtle spurge; Utah is still a little behind the times.)

Fortunately, myrtle spurge is easy to recognize and to uproot. Look for the low-growing succulent plant with gray-green leaves and yellow flowers and bracts at the tips of the stems. Being sure to wear gloves, gather up the multiple stems in both hands and firmly pull the plant up by its root.

You need to wear gloves, because the sap of the plant can cause a severe allergic reaction in some people. Be sure to wash your hands after touching it, and avoid touching your eyes. A reaction is especially likely in people who are allergic to latex.

This morning I made a good first dent in the myrtle spurge infestation just above the top of 27th Street. If a few others help out and we keep following up, I’m sure we can purge it from this location. I’ve also seen it growing along the Mt. Ogden Exercise Trail, and I’m told it’s widespread in Ogden Canyon. I don’t know if it’s still feasible to eradicate it from Weber County, but now is the time to try.

And whatever you do, don’t plant this noxious weed in your yard! 

Wednesday, April 28, 2010

Science and Nature Reading List


Now that school’s out, it’s time for summer reading! Here are a dozen of my favorite science and nature books, recommended to students, colleagues, and friends alike. None of them are especially recent, and in fact, many are books that I first read for fun during graduate school, when I should have been working on my thesis. They’re listed below in approximate order by difficulty, starting with the lightest reading and ending with books that require some effort. None, however, assume any specialized background. Of course there are hundreds of other good science and nature books out there, most of which I haven’t read. I can’t promise that you’ll like all of these as much as I do, but I can promise that each of them is of the very highest quality.
  • Encounters with the Archdruid by John McPhee. In this classic from the golden era of environmentalism, McPhee arranges for Sierra Club hero David Brower to spend some quality time with three of his natural enemies: a mining geologist, a resort developer, and a dam builder.
  • Desert Solitaire by Edward Abbey. Essays by a hard-nosed realist about the wonders of southern Utah: juniper trees, snakes, clouds, heat, quicksand, tourists, inhabitants, and the encroachment of industrial civilization.

  • The Cuckoo’s Egg by Cliff Stoll. My favorite mystery, and all true! A Berkeley hippie astronomer and computer geek discovers that a hacker is breaking into U.S. Government computers. Soon he’s teaching the FBI, CIA, and NSA all about internet security.
  • Voodoo Science by Bob Park. An entertaining survey of perpetual motion machines, cold fusion, human space flight, and other things that look like science but aren’t. Written in the same spirit as Martin Gardner’s classic, Fads and Fallacies.
  • Basin and Range by John McPhee. The best geology book ever written, which just happens to be about the place where I now live. Filled with clever juxtapositions of human and geologic time. The three sequels are also good: In Suspect Terrain, Rising From the Plains, and Assembling California.
  • First Light by Richard Preston. Before the author became famous for writing The Hot Zone, he spent some time hanging out at Palomar Observatory and wrote this delightful book about the astronomers working there.
  • 365 Starry Nights by Chet Raymo. Among the hundreds of guides to the night sky, this is by far my favorite. It offers a mini astronomy lesson for each night of the year, with lovingly hand-drawn illustrations. Its only deficiency is the lack of an index, so I created one years ago.
  • The Character of Physical Law by Richard Feynman. A set of seven informal lectures by the great theoretical physicist, just as relevant and insightful today as when they were first delivered in the 1960s. If you like this, you’ll also enjoy Feynman’s QED: The Strange Theory of Light and Matter, which presents four more lectures on quantum physics.

  • Guns, Germs, and Steel by Jared Diamond. The big picture of human history and prehistory.
  • The First Three Minutes by Steven Weinberg. Still the best book on cosmology, written soon after our understanding of the hot early universe became firmly established.
  • The Copernican Revolution by Thomas Kuhn. This well-crafted classic on the history of astronomy reminds us that a moving earth was once just as much a threat to some people’s belief systems as evolution and global warming are today.
  • Gödel, Escher, Bach by Douglas Hofstadter. A weighty masterpiece that interweaves art, music, logic, puzzles, puns, language, molecular biology, and artificial intelligence.

Monday, April 12, 2010

Mobile Computing Growing Pains

The experts have been saying for years that the future of computing is in mobile devices. I ignored them until recently, but now the iPhone, iPad, Kindle, and similar gadgets have gotten my attention. I own an iPhone myself, and I’m beginning to see the potential of mobile platforms for some of my own creative projects. These projects might include textbooks, educational software, and perhaps a trail guide to the local area.

I’ve already written about some of the challenges in delivering a textbook or trail guide on a mobile device. But perhaps the biggest challenge facing any of these projects would be the diversity of competing mobile platforms, and the fact that only a minority of the potential audience owns any one of them.

The obvious solution is to create content in a cross-platform format. This is trivial for a book whose formatting doesn’t matter. But a versatile and attractive electronic format for physics textbooks doesn’t seem to exist, so some custom, platform-dependent coding would probably be required. A good, practical trail guide would require much more coding. And a decent interactive simulation of molecular dynamics or the night sky requires a thousand or more lines of user-interface code.

For several years I’ve been writing these kinds of simulations in Java, which makes them portable to virtually all of today’s desktop and laptop computers--and deliverable over the web. I can’t overstate what a huge advance this is compared to the bad old days when you had to write native code that would run on only one platform. (The native Mac simulations that I wrote between 1985 and 1992 were never widely used, and now they don’t even run on the new Macs.)

Unfortunately, mobile devices don’t run Java applets, and Apple’s mobile devices don’t support Java at all. I’m not absolutely wedded to Java, but I’ve been hoping that some kind of usable cross-platform development environment for mobile devices would soon come along. Last week my hopes suffered a major setback.

Apple has now added the following sentence to its iPhone developer license agreement:
Applications must be originally written in Objective-C, C, C++ or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++ and Objective-C may compile and directly link against the Documented APIs (e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited).
The geek blogs are all abuzz over this new rule, and rightly so. It seems to prohibit virtually any sort of cross-platform development tools, and even restricts what programming languages you can use to develop iPhone apps. Bloggers are inferring that Apple isn’t merely trying to maintain the quality of apps; it’s deliberately trying to make life difficult for any developer who wishes to deploy an app on multiple mobile platforms.

For a part-time, half-assed developer like me, this move by Apple is devastating. I write software not to make money but to reach a target audience. I have no intention of writing software that can reach only the fraction of that audience that owns a particular device. And I don’t have the time or the resources to port software from one device to another.

Eventually, I suppose, the situation will improve--just like it improved for personal computers when Java came along. Until then, I’ll keep deploying physics simulations as Java applets for personal computers. And I’ll keep publishing books in the tried-and-true format that’s universally readable by all.

Friday, March 12, 2010

The Fourth Estate?

Today Utah woke up to the news that House Majority Leader Kevin Garn has been keeping a secret.

It seems that 25 years ago he had a little naked hot-tubbing encounter with a young woman. That’s no big deal in itself, but it seems that this woman was a 15-year-old girl at the time, and that Garn was approximately 30, and that she was also his employee, and that he was also married, and that when she threatened to go public during his 2002 campaign for Congress, he paid her $150,000 in hush money. Oh, and after he confessed all this to the Legislature last night, they gave him a standing ovation.

But among all the juicy details of this still-unfolding story, the one that interests me most is this: The Deseret News knew all about it 8 years ago, and never printed a word.

Their excuse is that they learned Garn’s secret shortly before the primary election in which he was defeated. They didn’t want to print something so inflammatory right before the election, when voters might not have time to hear and absorb all sides of the story. And after the election it wasn’t newsworthy because he was no longer a candidate or office holder.

They may have been right about not publishing before the election. Depends on how close to Election Day it was, and exactly how much information they had at that time. But there’s no excuse for their suppressing the story even after the election. Garn had then served in the Legislature for 12 years, and a story like this is newsworthy even when it’s about a former legislator or former candidate (just as the John Edwards scandal was newsworthy when it broke). And when Garn joined the Legislature again in 2007, the story became even more newsworthy.

Makes you wonder what else the Deseret News knows that it isn’t telling us.

The behavior of the Deseret News reminds me a lot of how our local Standard-Examiner treats Mayor Godfrey. In his case there have been no sex scandals, but there’s been plenty of lying, cheating, and illegal activity that the Standard-Examiner has done its best to ignore.

When the Press is part of the cover-up, there’s something seriously wrong with our democracy.

Saturday, March 6, 2010

iPad Textbooks


I have no immediate plans to buy an iPad, since it can’t replace either my iPhone or my laptop computer. But as a textbook author, I’m intrigued by the iPad’s possibilities as a book platform.

Following a link from the New York Times, I just read a thoughtful blog essay by Craig Mod on the future of printed and digital books. Mod wisely divides book content into two categories: “formless” (which is trivial to port from one delivery platform to another) and “definite” (which is created with a particular platform in mind, using that platform’s physical features in an essential way). Last year’s digital book platforms--Kindles and iPhones--were fine for formless content, and allow us to foresee a day when these kinds of platforms will be common enough to make most mass-market paperback books obsolete. The iPad, according to Mod, changes the picture by opening up new opportunities for digital definite content.

Mod doesn’t specifically mention textbooks, but they’re discussed in the comments below his essay. Electronic textbooks have some obvious advantages: they’re less bulky; their text can be cross-linked and searchable; they can incorporate multimedia content; and they can link to related content on the web. Also, textbooks are so expensive already that the additional cost of an electronic reading device shouldn’t be much of a barrier.

As an author, I’m attracted not only by these advantages but also by the prospect of no longer having to worry about page breaks. Both of my textbooks were created using TeX, a mathematical typesetting system that mostly frees the author from thinking about form. But inevitably, when a book is full of equations and illustrations, one of the last steps before going to press is to manually tweak the layout to minimize awkward page breaks. Even then, there will be many places where students end up flipping a page back and forth to see what’s on both sides. Electronic books on portable devices won’t show as much information at once, but at least they can (if done correctly) present an entire chapter on a single scrollable page, with no artificial discontinuities.
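For readers who haven’t wrestled with this themselves, the “manual tweaking” I’m describing usually comes down to sprinkling in commands like these near the end of production (standard LaTeX commands; the figure name here is just an invented example):

```latex
% Typical hand-tuning to tame awkward page breaks in a finished manuscript:
\enlargethispage{\baselineskip}  % squeeze one extra line onto this page
\pagebreak[3]                    % encourage (but don't force) a break here

\begin{figure}[!t]               % pin this figure to the top of a page
  \includegraphics{energy-levels} % (example filename, for illustration only)
  \caption{Energy level diagram.}
\end{figure}
```

Every such tweak is tied to a specific paper size and font, which is exactly why it all has to be redone if the layout ever changes.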

Unfortunately, the technology for good electronic physics textbooks isn’t yet where it needs to be. For one thing, there still doesn’t seem to be a good way to incorporate complex mathematical equations into electronic documents. In html pages, equations are usually rendered as ugly, low-resolution bitmap images. A pdf document can incorporate equations made of scalable fonts, but you can’t (as far as I know) create a pdf without breaking the document into pages.

Another limitation of electronic textbooks is that it’s hard to scribble notes in the margins. Reading a textbook should be an active experience, during which the student frequently jots down thoughts and questions. (When my thermal physics textbook was published, I made sure the publisher gave it wide margins for students to write in.) Perhaps, though, the iPad can help here. With the right software, a reader should be able to add text annotations to a document using the on-screen keyboard. And with the large touch-screen, it should even be possible to add graphical annotations that include math symbols and sketches.

So even though I don’t yet plan to buy an iPad, I’ll be eager to borrow one and check out the iPad book reading experience.

Thursday, March 4, 2010

The War Against Science Escalates

Yesterday’s New York Times reports that the anti-evolutionists are joining forces with the global warming deniers. I suppose this was inevitable, as both groups share the common practice of believing what they want to believe, without regard for the facts.

Here in Utah we get a strong dose of anti-science every winter during the legislative session. This year our elected leaders have officially proclaimed that global warming is a hoax. They also introduced a bill requiring the health department to produce a video of the heartbeat of an “unborn child” of three weeks gestational age, despite the fact that at that age an embryo does not have a heart.  (This bill was later modified to add another week, making the health department’s task barely possible.)  If the legislature had political reasons to dislike the law of gravity, they would undoubtedly try to repeal it.

Amidst all this, I recently received the latest Save Our Canyons newsletter, which contains a refreshing essay by SOC President Gale Dick titled “Is Science Just Another Opinion?”. Dick is also a retired physics professor from the University of Utah, so he and I naturally look at a lot of things in the same way. In the essay he insightfully lists possible reasons why so many people reject science:
  • Flaws in our education?
  • Sheer laziness?
  • Fear?
  • The inability of science to explain why so many terrible things happen to us?
  • Distrust of academic scientists who come across as arrogant and elite?
  • Belief that science is the enemy of religion?
  • Cultural aversion to mathematics?
  • Reluctance to accept the limitations that science puts on what is possible?
  • Reluctance to accept the responsibility that comes with scientific knowledge?
There are no simple antidotes to any of these understandable human shortcomings. The only cures are education, hard work, and integrity. All three of these things are part of the difficult process of growing up, when we recognize that we must accept the things we cannot change, work to change the things we can, and inform ourselves well enough to tell the difference.*

Monday, February 15, 2010

Yet Another Planetarium Simulation


Last November, as another semester of teaching Elementary Astronomy drew to a close, I finally broke down and started writing my own planetarium simulation.

On its face, this was a ridiculous waste of time. There are already dozens, if not hundreds, of planetarium simulation programs that will show you where the stars and planets appear in the sky, as viewed from your chosen location at your chosen time. They run on every microcomputer platform and many handheld devices. Some are extremely sophisticated, with databases of millions of stars, beautiful images, and even the ability to interface with a telescope.

But I wanted something a little different. To be useful to most of my students, a simulation program has to be (a) free; (b) delivered through a web browser, with nothing to download or install; (c) easy for beginners to understand; and (d) convenient for showing the motions of the stars and other objects with respect to earth’s horizon.

Sky View Cafe is a great Java applet that almost fits the bill, and I’ve been recommending it to my students for several years. It’s loaded with features and was obviously written by a pro. But its time/date control is a bit awkward to use, so it isn’t ideal for showing celestial motions. Worse, its default full-sky view can be disorienting for beginners, who often have trouble relating the circle on the screen to the domed sky overhead.


Fifty-six years ago, H. A. Rey showed the best way to draw the sky for beginning observers. The illustrations in his delightful book Find the Constellations show half the visible sky at a time, as if viewed through a huge domed window stretching 180 degrees from side to side, from the horizon at the bottom to zenith at the top. I vividly remember reading that book as a child, and I wanted a web applet that gives a similar view of the sky.

I also wanted intuitive, analog controls for changing the time and date, so the sky’s motions would be easy to explore. Inspired by the GoSkyWatch iPhone app, I settled on a circular dial for changing the time of day, with a concentric inner dial for changing the date of the year. I added similar analog controls for setting the latitude and longitude.


To make the motions even more apparent, I included a feature that WSU’s digital planetarium projector has: the ability to show “trails” that simulate long-exposure or multiple-exposure photographs. Besides the familiar star trails, you can use this feature to trace out analemmas and retrograde loops.

This project has soaked up much of my free time for the last three months, but I think it’s essentially finished for the time being. I hope my students--and others who discover it--will find it useful.

Thursday, February 11, 2010

Skiing Wheeler Creek


Without a doubt, Ogden’s best easy ski touring is on the Wheeler Creek trails, below Snowbasin. While the better skiers are risking their lives on higher, avalanche-prone slopes, I find plenty of challenge--and great exercise--on twisting, narrow trails over gentler terrain.

My favorite tour, when snow conditions allowed, was to ascend the East Fork of Wheeler Creek from the Art Nord trailhead to Green Pond. After an initial 1.5-mile climb, the terrain opens out and you’re suddenly rewarded with an expansive view of Mt. Ogden and its satellite peaks. The photo above was taken from that magical spot on my very first trip up this trail, 15 years ago. (That’s Jock up ahead.)

The final leg of the three-mile trail to Green Pond is now crossed by the new Snowbasin highway, which has ruined several of the old routes in the Green Pond area. So now the best option is to stop just short of the highway and descend the lovely trail along the Middle Fork of Wheeler Creek, closing a five-mile loop either on the old highway (now unplowed and groomed for skate skiers) or on the more challenging trail that parallels the highway just to the north. Ten of us went on a Sierra Club outing around this loop last Sunday, enjoying it as much as ever.

Fifteen years ago, navigating any of these trails in winter required good route-finding skills and a high tolerance for oak brush. But the Forest Service rebuilt all the trails shortly before the 2002 Olympics, clearing the brush and building excellent bridges across the many tributary creeks. Now the trails get packed down by snowshoers soon after each storm, so navigation is rarely a challenge.

The East Fork / Middle Fork loop outlines a small peninsula of public land, bordered by private land on the east, west, and south. It’s just a matter of time before those private lands will be developed with trophy homes and golf courses. The solitude and sense of remoteness will then be nearly gone, but at least the trails themselves will remain.

Sunday, January 10, 2010

Happy New Year!

If anyone out there has been checking this blog for the last 11 weeks, hoping to find something new, please accept my apologies. Blogging got pushed to the back burner by a combination of the November election, professional duties, holiday travels, and personal matters. In any case, Happy New Year! 

Speaking of holiday travels, I again spent New Year’s weekend in remote Boulder, Utah, with a group of accomplished environmentalists. Couldn’t ask for a better setting, or better company, to start the year off right.

Besides enjoying the scenery and getting some exercise, we did a bit of work during our day hikes. On New Year’s Day we documented the existence of a new house that someone has illegally built on BLM land in the Grand Staircase-Escalante National Monument. I suppose the builder figured it would be easier to get forgiveness than permission. Let’s hope he gets neither.


Then on Saturday we helped Bill Wolverton in his ongoing war against invasive Russian olive trees in the Escalante canyons. Bill is a seasonal employee of the Glen Canyon National Recreation Area, and I’m pleased to report that after a decade of work he is winning the war--at least within the NRA boundaries, where he has cleared the Russian olive from the Escalante River canyon and nearly all of its tributaries. Let’s hope that the BLM, which manages the upstream portions of these canyons, will soon step up its weed eradication efforts to match those of the Park Service.

Our small project was merely to cut and burn some Russian olive that Bill had already killed some time ago. This is important not just for aesthetics but also to clear out the thorny brambles for the benefit of hikers and wildlife. First we had to hike through a couple miles of weed-choked canyon down to the NRA boundary, where Bill’s past efforts were immediately apparent. Still, there’s more work to be done, and we built two splendid bonfires to dispose of the cut logs and brush.


Returning to Boulder at nightfall, we were treated to a spectacular sky full of stars. So I quickly changed into dry socks, grabbed my camera and tripod, and ran back out to the road to take a few photos before moonrise (and supper). I first shot a photo of Orion rising and another of Cygnus (and the Milky Way) setting. Then glancing over at Jupiter, I saw a distinct glow around it and realized that, for the first time in my life, I was seeing the famous zodiacal light. Boulder must be one of the few inhabited places on earth where this faint band of light is still visible. I’m already making plans to go back for a star party in warmer weather.