Slime, Memorization, and Forests

Photo by George W. Ackerman

“Creatures That Don’t Conform.” You don’t have to agree with Lucy Jones’s politics or philosophy to share her amazement at slime mold (and don’t miss Barry Webb’s photographs): “They can humble us—with their complexity which is beyond our understanding. We think we have mastered the natural world, yet we don’t know how a slime without an apparent brain can conduct itself intelligently. We think we can bend the Earth to our will, but we know barely anything about microorganisms.”

“Study Shows You’re Nobody Until Somebody Loves You.” Naomi Schaefer Riley reviews The Good Life: Lessons From the World’s Longest Scientific Study of Happiness by Robert Waldinger and Marc Schulz and finds much wisdom for leading a good life that runs counter to the dominant cultural narratives: “It is a paradox of modern liberal attitudes that on the one hand careers are supposed to provide fulfillment—all the fulfillment that maybe families and friends used to provide—but at the same time, menial jobs or jobs that are done by less educated people are considered a burden and to be avoided at all costs. Public policy seems shaped by this idea—debt forgiveness for college graduates so they can go into low-paying white-collar jobs while eliminating work requirements for the lower classes. The idea that having a set of responsibilities for supporting other people—whether it is coal-mining or caring for one’s children—and giving life meaning and purpose, does not seem to occur to these thinkers. Instead, careers are simply about freedom, autonomy, and self-fulfillment.”

“Beaver County Residents on Alert as State Leaders Stress the Dangers of Toxic Chemicals from Ohio Train Derailment.” Jordan Anderson, Hallie Lauer, and Anya Litvak report on the release of a toxic plume of chemicals from a train derailment in eastern Ohio. Residents up to two miles away were evacuated, but it’s not yet clear what the long-term damage to people and the ecosystem will be.

“Trent Reznor’s Conflicted Rust Belt Legacy.” Casey Taylor looks at the family and regional roots of Nine Inch Nails’s frontman. Whether through the slow decline of manufacturing or the sudden destruction of a flood that killed thousands of working-class people, this complicated history stands behind Reznor’s music: “Starvation in the Appalachians in the 20th century or drowning in the 19th century always comes back to the same excuse: an unavoidable calamity, or natural outcome of how things break sometimes. Never an assessment of the system that created it. Never an honest appraisal of the ways that our colonial instincts continue to manifest regardless of domestic versus international designation.” (Recommended by Jeremy Delattre.)

“You Never Know How It Falls Apart.” Addison Del Mastro muses on the consequences that ensue when “mechanical literacy” wanes and cultures lose the collective know-how to make things. As an example of a drastically simplified “product” that barely counts as something manufactured, he cites Amazon’s Echo: “The whole device is really just a motherboard and CPU chip. In other words, a very basic computer. Everything it does can be altered/disabled by software because everything it does is software. (For example, it used to play songs by request, but now it prompts you to upgrade to Amazon Music. All you actually own when you buy this thing is the piece of plastic. Amazon could render them all paperweights tomorrow.)”

“I Thought I Was Saving Trans Kids. Now I’m Blowing the Whistle.” Jamie Reed, who used to work in a pediatric gender clinic in Saint Louis, provides a tragic account of the damage caused by this arm of America’s medical industry: “By the time I departed, I was certain that the way the American medical system is treating these patients is the opposite of the promise we make to ‘do no harm.’ Instead, we are permanently harming the vulnerable patients in our care.”

“In Australia’s Outback, a Controversial Cash Crop is Booming: Carbon.” Michael E. Miller and Frances Vinall investigate the challenges that plague Australia’s lucrative carbon offset economy. Even as outside money pours into small, rural communities, divides widen and outside speculators scoop up much of the cash: “in towns such as Cunnamulla or Byrock, carbon farming is blamed for the Outback’s dwindling population as sheep and cattle stations that once employed dozens now grow mulga—although sheep and cows are huge emitters of methane, so are themselves a significant source of greenhouse gas. Land prices are a sore point. The price per acre skyrocketed from less than $20 in 2010 to more than $100 at recent auctions, people in Byrock, the closest town to Kenilworth Station, said. Some had gotten rich; others were resentful.”

“How Nepal Regenerated Its Forests.” Emily Cassidy looks at evidence that by returning forest management to local communities, Nepal managed to significantly improve their health: “Under community forest management, local forest rangers worked with the community groups to develop plans outlining how they could develop and manage the forests. People were able to extract resources from the forests (fruits, medicine, fodder) and sell forest products, but the groups often restricted grazing and tree cutting, and they limited fuelwood harvests. Community members also actively patrolled forests to ensure they were being protected.”

“There Is No Thinking without Memorizing.” Jon Schaff draws on recent discussions about state curriculum standards to pose a core educational question: “The debate playing out in South Dakota gets at fundamental disagreements about the nature of learning. One of the foremost critiques is that the standards require too much ‘rote learning’ and not enough ‘thinking skills.’ The question is whether these two phenomena, memorization and analysis, are actually at odds. What if the latter requires the former? What if ‘critical thinking’ is not a technique, but a natural outcome of learning content?”

“ChatGPT Is a Blurry JPEG of the Web.” Ted Chiang develops one of the more interesting analogies I’ve seen to understand the value and limits of large-language AI models: “Think of ChatGPT as a blurry JPEG of all the text on the Web. It retains much of the information on the Web, in the same way that a JPEG retains much of the information of a higher-resolution image, but, if you’re looking for an exact sequence of bits, you won’t find it; all you will ever get is an approximation. . . . This analogy to lossy compression is not just a way to understand ChatGPT’s facility at repackaging information found on the Web by using different words. It’s also a way to understand the ‘hallucinations,’ or nonsensical answers to factual questions, to which large-language models such as ChatGPT are all too prone.”
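For readers who want to see Chiang’s analogy in miniature: the sketch below (my own illustration, not anything from the essay) applies the simplest possible lossy compression, quantization, to a list of byte values. The reconstruction stays close to the original, but the exact sequence of values is gone for good; all you ever get back is an approximation.

```python
# A toy demonstration of lossy compression: quantize byte values into
# coarse buckets, then reconstruct each value as its bucket's midpoint.
# The round trip loses the exact bits, which is the point of the analogy.

def lossy_compress(data, step=16):
    # Keep only the bucket index for each value (far fewer distinct symbols).
    return [value // step for value in data]

def decompress(buckets, step=16):
    # Reconstruct each value as the midpoint of its bucket.
    return [bucket * step + step // 2 for bucket in buckets]

original = [3, 37, 41, 200, 255, 128]
approx = decompress(lossy_compress(original))
# approx resembles original, but the exact values cannot be recovered.
```

With a step of 16, every reconstructed value lands within 8 of the true one, so the output looks plausible at a glance even though no entry is guaranteed to match, which is a fair caricature of a model that paraphrases well but cannot quote exactly.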
