Cuisinart CSO-300 steam oven: a re-review


I’ve had the Cuisinart CSO-300 steam oven for three months now. It deserves a re-review.

This oven has earned its way into the same category of possessions as my camera: I’d panic if I suddenly didn’t have it. After three months, cooking without it would feel like a huge downgrade.

My primary reason for this is bread-baking, using the oven’s steam function. Not until I had this oven could I really make artisan breads at or near a professional level.

The next most important factor, I think, is reheating leftovers. Granted, most people use microwave ovens for that. But I abhor microwave ovens and haven’t had one for years; I don’t like what they do to food. The combination of steam and convection in this Cuisinart oven, by contrast, warms leftovers very fast and makes them taste freshly made.

Another important factor is convenience and energy savings. My big oven now seems impossibly slow and limited, and I hardly ever use it. Not only does the big oven take a long time to preheat, it also takes a long time to cool down. Its energy use is substantial, and it creates a huge heat load in the house in summer weather. The Cuisinart oven is somewhat better insulated than most countertop ovens, and it preheats very fast.

Today I gave this little oven its first good cleaning. Cuisinart recommends running the oven on the steam setting, set at 210 degrees, for 30 minutes before cleaning. This does loosen the crud somewhat, though by no means will it render the oven spotless. The interior is stainless steel and cleans up nicely, but some of those brown spots are going to be permanent.

It does take time to get used to this oven. You learn to keep tall items away from the upper heat element so that the tops of whatever you’re cooking won’t brown too fast. Another hot spot to avoid is the convection outlet. None of this is a big deal once you’re used to it. Yes, it would be nice if the oven were a little larger, but a larger oven would be hard to find counter space for in my kitchen. So I think Cuisinart probably made a good compromise on the oven’s size.

If I have complaints, they’d chiefly be about the oven’s digital controls. The bread function, regardless of total baking time, applies steam for the first seven or eight minutes of baking, and that’s not adjustable. That’s not enough steam time. Bread will continue to rise in the oven for longer than that. I get around this by restarting the steam function as soon as it stops, for a total of 14 minutes of steam. It’s also a little annoying that, while the oven is baking, the time remaining is displayed on the screen, but the temperature is not, even though there’s plenty of screen space. I’m going to guess that Cuisinart will improve its control programming in later models.

Amazon sells this oven for $225 right now, shipping included. That seems like a bargain to me. You’ll want this oven only if you’re looking for the steam function. Otherwise a simpler and less expensive oven will do the job for you. But keep in mind that the steam is useful not only for bread baking but also for reheating leftovers and for cooking rice dishes or other casseroles that tend to dry out in the oven. Many people swear by steam roasting for things like chicken. However, I don’t cook chickens, so I can’t testify to that.

My hope is that steam ovens for home use catch on. Then not only might Cuisinart come out with other models, but competitors also will get into the market. Commercial steam ovens are very expensive, but I’m now convinced that serious cooks seriously need steam baking at home.

I rarely watch cooking shows because I rarely watch television, but if Cuisinart were smart they’d get some television cooks to start using these things. Then everybody would want one, and we’d soon have a great market in steam ovens for home use.

The normal failure of CFL bulbs

[Photo: Note that the top of the base is slightly brown from heat, which may have occurred when the CFL failed. This is normal.]


I happened to be standing right underneath a compact fluorescent bulb yesterday when it failed. It failed exactly according to the book: There was a quiet pop, about the volume of a single grain of popcorn popping, and the bulb went out. When I removed it and looked at it, the white base was slightly brown from the heat of the ballast failure. This was a completely normal failure in accord with the way CFL lamps are designed to work.

The bulb was one of four in my kitchen ceiling that light the countertops. The lamp was seven years old. It was one of the brighter types — equivalent to 100 watts of incandescent light at 23 watts power consumption.

Right-wingers who believe that any kind of energy conservation is a left-wing conspiracy have done everything possible to demonize CFL bulbs. A while back, a conservative friend on Facebook shared a propaganda post about how terrified some right-winger was when a CFL bulb made a popping noise and a blackish-brown spot appeared on the base. “If I hadn’t been home it could have burned my house down!” said the Facebook post. Horse wash. Some of the earlier bulbs failed less gracefully, but all CFLs eventually fail, and the failure is usually in the power supply. As the Wikipedia article on CFL lamps points out, one of the challenges of designing CFL lamps is designing in an inoffensive failure mode. And of course nobody wants to smell smoke. My CFL failure yesterday created no odor at all.

The power supply in the base of the bulb, by the way, is a small electronics board that first converts the AC house current to direct current. Transistors then convert the direct current to very high-frequency alternating current, which is fed to the tube. It’s this circuit that normally fails, not the glass part of the bulb.

No one claims that CFLs are perfect. What we all want is cheap LED lighting with a natural sunlight color. We’re getting there.

Ancient astronomy

[An illustration from James Evans’ book on ancient astronomy]


I’ve mentioned that the sequel to Fugue in Ursa Major is going to involve time travel. The plot requires that I have an understanding of the state of the science of astronomy around 48 B.C. As a source for that, I am reading James Evans’ The History and Practice of Ancient Astronomy, which was published by the Oxford University Press in 1998. This is a beautiful, well-illustrated, and fairly expensive book. It has left me greatly impressed at just how much the ancients knew.

We generally assume that modern astronomy began with Copernicus and Galileo as the Dark Ages were coming to a close. In 1633, the church convicted Galileo for following Copernicus in saying that the earth is not at the center of the universe. But some of the ancient Greek astronomers figured out that the earth moves around the sun, though it was never a mainstream idea in ancient times.

Aristotle knew that the earth is a sphere. Heraclides of Pontus, a student of Plato, taught as early as 350 B.C. that the earth rotates and that the stars are fixed. Greek astronomers made pretty good estimates of the sizes of the earth and moon, though their estimates of the size and distance of the sun were less accurate. The Greeks understood trigonometry, and they had a pretty accurate theory of the motion of the planets.

Even before the Greeks, the ancient Babylonians were excellent astronomers who made detailed star charts and kept accurate astronomical records. Babylonian knowledge was passed down to the Greeks, who built on it, especially during the golden years of Alexandria, culminating in Ptolemy’s Almagest around 150 A.D. After Ptolemy, the Dark Ages began in the West, and Ptolemy remained authoritative for hundreds of years.

So, it’s not really true that, to the ancients, the science of astronomy was barely distinguishable from the myths of astrology. They knew a lot.

So how did they use what they knew?

For one, they wanted better calendars. The daily cycle, the lunar cycle, and the annual solar cycle don’t fit together in tidy ratios, so there is no perfect calendar. Our own Gregorian calendar, a refinement of the ancients’ Julian calendar, still requires adjustments such as leap years (and modern timekeeping piles leap seconds on top of that). In its essentials, our calendar today is the Roman calendar, which relied heavily on Greek astronomy.
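
Since I mentioned leap years, here’s the difference between the two calendars in a nutshell, as a little Python sketch (my own illustration, obviously not anything the ancients had). The Julian calendar adds a leap day every four years, which is slightly too often; the Gregorian refinement skips three of those leap days every 400 years:

```python
def is_julian_leap_year(year):
    # Julian rule: every fourth year is a leap year.
    return year % 4 == 0

def is_gregorian_leap_year(year):
    # Gregorian refinement: century years are leap years only
    # when divisible by 400 -- so 1900 was not, but 2000 was.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The calendars disagree on 1700, 1800, and 1900, which is why the
# old Julian calendar slowly drifts against the seasons.
for year in (1600, 1700, 1900, 2000):
    print(year, is_julian_leap_year(year), is_gregorian_leap_year(year))
```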

Astronomy is critical to agriculture — when to plow, when to plant. This remains true today, and I still subscribe to an almanac, as my grandparents did. Benjamin Franklin’s Poor Richard’s Almanack was a bestseller in the American colonies. People planted by it.

Astronomy also is critical to navigation, surveying, and mapmaking. Ancient sailors knew how to navigate by the stars. One of the reasons I chose Ursa Major as part of a book title was its importance to the ancients. The constellation of Ursa Major is visible all year in most of the northern hemisphere, and it includes some easily identified “pointer stars” (in the bowl of the Big Dipper) that make it easy to locate the pole star and therefore true north. An ancient sailor who wanted to sail east at night would keep Ursa Major up to his left. We know that the ancient Celts had excellent seafaring skills and excellent ships, and that they too used Ursa Major for navigation.

How about astrology? It would be easy enough to accuse the ancients of being superstitious because they tried to use the stars to predict the future and to make generalizations about human nature and human fate. But we moderns are just as guilty, since horoscopes remain important in the lives of lots of people.

It’s easy enough to reproduce the astronomical observations of the ancients with some simple instruments. A gnomon (the upright pointer of a sundial) will let you deduce and measure all sorts of information if you trace the sun’s shadow for a year. Trace the sun’s shadow for a single day, and you can very precisely locate true north. If you use a protractor or an astrolabe to measure the angle of the sun above the horizon at noon on the summer solstice, you’ll know your latitude. Looking through tubes attached to a tripod will let you measure an object’s motion from hour to hour. You’ll need some star charts. And if you want to get fancy, you’ll need to brush up on what you learned about tangents, sines, and cosines in trigonometry class.
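
To make the latitude trick concrete, here’s the arithmetic as a small Python sketch. It assumes a northern-hemisphere observer north of the Tropic of Cancer: at local noon, the sun’s altitude is 90 degrees minus your latitude plus the sun’s declination, and on the summer solstice the declination is about 23.44 degrees.

```python
SOLSTICE_DECLINATION = 23.44  # degrees: the sun's declination on the summer solstice

def latitude_from_solstice_noon(sun_altitude_deg):
    # At local noon: altitude = 90 - latitude + declination,
    # assuming you're north of the Tropic of Cancer.
    return 90.0 - sun_altitude_deg + SOLSTICE_DECLINATION

# Example: a noon altitude of 77 degrees on the solstice works out
# to a latitude of about 36.4 degrees north.
print(latitude_from_solstice_noon(77.0))  # 36.44
```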

Even today, with an astrolabe, a watch, and a view of Ursa Major, you could throw away your GPS.

How would you do that? Measuring the angle of Polaris, the north star, above the horizon gives you your latitude. That’s easy. Longitude is more difficult, and it bedeviled the ancients. But if you can determine your local time by getting a precise fix on noon (with the gnomon of a sundial, say, or the shadow of a stick stuck in the ground), and if you know what time it is at some distant place of known longitude (Greenwich is handy for that), then you can calculate your longitude. At night, you can get a pretty good fix on the time by measuring the position of a known star.
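
The arithmetic is simple, because the earth turns 15 degrees per hour. Here’s a minimal sketch of the calculation; the example time is made up:

```python
def longitude_from_noon(greenwich_time_of_local_noon_hours):
    # The earth turns 15 degrees per hour, so every hour your local
    # noon lags behind Greenwich noon puts you 15 degrees farther west.
    # Positive result = degrees west of Greenwich; negative = east.
    return (greenwich_time_of_local_noon_hours - 12.0) * 15.0

# Example: your gnomon shows local noon while the chronometer reads
# 17:20 Greenwich time, so you are (17.33 - 12) * 15 = 80 degrees west.
print(longitude_from_noon(17 + 20 / 60))
```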

To clarify the concept of longitude, keep in mind that the British navy carried accurate clocks (chronometers) on its ships not because anyone cared about the local time wherever the ship might be. Rather, the chronometer always said what time it was back in Greenwich. If you determine your local time from the sun or a star, then the difference between your local time and Greenwich time tells you how far east or west of Greenwich you are. After accurate clocks became available for ships, marine navigation greatly improved. This is why Britain’s Royal Observatory at Greenwich was commissioned by King Charles II in 1675. In the U.S., the Naval Observatory is one of the oldest scientific organizations in the country. The Naval Observatory was responsible for the “master clock” that the navy used for navigation — and it still is. The time used by GPS satellites is determined by the U.S. Naval Observatory.

But before GPS, if you were on a ship at sea carrying Thomas Jefferson from Virginia to Calais, you’d have needed a star to figure out the local time. The stars most convenient for that are in Ursa Major.

I like to think of it this way: The stars are still up there, raining information down on us day and night. All we have to do is look up, and measure.

Software review: Scrivener for Macintosh

[Screenshot: a novel outline in Scrivener’s table view]

Scrivener is an application designed particularly for writers. It has templates for fiction, non-fiction, academic work with footnotes, screenplays, and so on. It is available for Macintosh, Windows, and Linux.

I’m a nerd. I have been using Macintoshes since the 1980s. I use Adobe InDesign for publishing work, and I’m very happy with it. But InDesign isn’t for writing; it’s for setting up typography and page layout in the final steps of publishing. As I prepare to write the sequel to Fugue in Ursa Major (which has the working title Passacaglia in Ursa Major), I simply could not face writing another novel in any of the editors I’ve ever used. I’ve used many editors. I go all the way back to vi and nroff in the early Unix world. Like I said, I’m a nerd.

I started looking for outliners and editors. They all are terrible. I like many of the concepts of programs like BBEdit, because they’re clever and oriented toward streams rather than pages. But BBEdit and other programmers’ editors don’t even support bold and italics, as far as I could tell. I despise Microsoft Word and refuse to use it. I cannot stand software that tries to automate things and that thinks it knows better than I do. In Word, things are always hopping around in unpredictable ways. For every keystroke that Word’s automated features might save you, 9,462 keystrokes are required to clean up the messes it makes. LibreOffice is no better, because it comes from the Word universe. The chief virtue of LibreOffice is that it’s fully Word compatible, so one doesn’t have to support Microsoft.

Opting for simplicity and nonviolence, I was about to start the new novel in Macintosh TextEdit, using a TextEdit file as an outline. TextEdit is a simple, bare-bones editor. But then while Googling I came across Scrivener. I could scarcely believe it. There are people on the planet who think like me!

Scrivener is complicated, but that’s fine. I think I’ve figured out an acceptable way to set up the new novel. Scrivener’s corkboard metaphor for outlining threw me a bit, but I’m getting a grip on setting up my outline using the table view (pictured above). I would buy this program for typewriter mode alone. Typewriter mode keeps the cursor at the center of the text block, so that you can always see the context above and below what you’re writing or editing. It drives me absolutely crazy to always be typing at the bottom of the screen. I also abhor being forced to type on images of 8.5×11 pages, watching text hop from page to page and get entangled in unneeded and unwanted headers. A novel has nothing to do with pages until the last steps of the publishing process, in InDesign. During the writing process, a novel is just a stream of text, and putting the text on pages only gets in the way.

I wrote Fugue in Ursa Major chapter by chapter, but I had already decided that I wanted to structure the new novel as scenes. Scrivener was ahead of me on that. By default, chapters are made up of a sequence of scenes. I also wanted a way to track settings and characters. Scrivener was ahead of me on that, too. It has templates for tracking characters and settings, and the text can be tagged if you need to search a long novel for the places that involve a particular character or setting.

The concept of compiling is brilliant. Having used software compilers for 30 years, I instantly saw how the concept of compiling applies to a long stream of text. Scrivener’s compile feature will probably scare users who are unfamiliar with the concept, but if you go through the tutorial that comes with Scrivener, compiling will start to make sense.

Scrivener costs $45 for a single-user license. The software was written by a nerd in Truro, Cornwall, because he needed a program like Scrivener to help him write his Ph.D. dissertation.

Macintosh memory management


This is a nerd post. Sorry, non-nerds…

I have been using Macintoshes since the 1980s. The Macintosh operating system has gone through many changes in that time, and it’s one of the best computer operating systems in existence, though my all-time favorite operating system was Solaris, the version of Unix produced by Sun Microsystems. A few months ago, Apple released version 10.9 of the Mac OS X operating system, also known as Mavericks. I like Mavericks (not least because it was free).

If you turn your computer on and off every day, memory management is of little concern to you. However, if you leave your computer running all the time, as I do, and especially if you have an older computer without the abundance of memory in newer computers, then memory management matters.

One of the wonderful things about the Macintosh OS X operating system is that it is extremely stable. A Mac can be left running for months without needing a reboot. I’ve seen Solaris computers run for more than a year without needing a reboot. But stability is one thing, and memory management is another.

The issue is probably a new concept for non-nerds: memory leaks. Applications frequently contain memory leaks, which are generally caused by lazy programming. Programs request memory from the operating system for temporary use, and in many programming languages it is the programmer’s responsibility to release that memory when it is no longer needed. That often fails to happen. Leak piles on top of leak, and soon your computer starts to run low on RAM. When this happens on a Macintosh, you may see the spinning beach ball while the operating system comes up with the requested memory. When the computer’s physical memory (RAM) is exhausted, the computer resorts to “swap”: using disk space (which is much slower than RAM) to temporarily hold the contents of RAM, paging the contents back and forth between disk and RAM as running programs need them. The ability to swap is a sophisticated function of good operating systems, and it has been around for decades. It’s better than simply running out of RAM (which is what used to happen). But swapping is slow. You may have to wait, which is what the spinning beach ball is all about.
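
For the curious, here’s a toy sketch of what a leak looks like. In C, the classic leak is memory that is allocated but never freed; in a garbage-collected language like Python, the equivalent sin is keeping references around forever, as this hypothetical long-running program does with its ever-growing cache:

```python
# A hypothetical long-running program that leaks: every request stores
# its result in a module-level cache, and nothing ever evicts old
# entries, so the process's memory footprint grows without bound.
_cache = {}

def handle_request(request_id):
    result = b"x" * 100_000          # pretend this is real work (~100 KB)
    _cache[request_id] = result      # the "leak": a reference kept forever
    return result

for i in range(1000):
    handle_request(i)  # after this loop, ~100 MB is still referenced
```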

Apple’s OS X Mavericks made some significant changes in memory management. The concept is that unused RAM is wasted. So OS X uses almost all of the computer’s RAM all of the time. This can be misleading, because the computer may appear to be more memory-starved than it really is.

But here’s the problem. Applications leak memory, and that’s not the operating system’s fault. When an application leaks memory, the memory can be recovered only by stopping and restarting the application. Here’s an example.

For security reasons, to prevent tracking by snoopers like Facebook and Google, I always have two browsers running. In one browser (Safari), I run Facebook and Google applications such as Google Analytics. In Safari, that’s ALL I do. It doesn’t matter if Facebook and Google track me there, because I don’t go anywhere else in that browser. I do all my real browsing in Chrome, using multiple tabs in an Incognito window, which does not save cookies.

As Chrome and Safari continue to run, often for days, they leak memory. This may or may not be the browser’s own fault. Most web pages these days run some sort of god-awful programming such as JavaScript, and the filthy rotten programming on the web pages hogs memory, then leaks it. So if you leave a browser running for a long time, your computer’s memory gets leaked away, and the operating system must jump through hoops to keep shoveling RAM to the browser. Facebook’s programming is a horrible memory leaker. I often leave Facebook running, but I close the Facebook page and reopen it occasionally to release the memory it’s wasting. (The people at Facebook are lousy programmers.)

So what’s the bottom line for Mac users, especially for you non-nerds? First, upgrade to Mavericks if you haven’t already. Second, if you see a spinning beach ball, consider quitting from open applications and restarting the applications you need. It’s not really necessary to reboot the computer.

For nerds: you can monitor your computer’s memory usage with Activity Monitor. It is possible to “clean” a Mac’s memory and force all the garbage out of RAM. You probably don’t want to, though, because doing so also defeats some of the clever methods Mac OS X uses to optimize the use of available physical RAM. But if you know what you’re doing, a little app named Memory Clean will do this for you, while continuously displaying the amount of free RAM in the menu bar. The purge command, which comes with the Mac, does the same thing from a terminal window: type “sudo purge.” In Activity Monitor, watch the “memory pressure” graph. If it’s anything but all green, consider closing applications and reopening them as needed to get it back in the green.
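
And if you’d rather watch memory from a script than from Activity Monitor, here’s a minimal sketch using the third-party psutil package (pip install psutil). The package is an assumption of mine; it does not ship with the Mac:

```python
import time

import psutil  # third-party package: pip install psutil

# Print memory statistics once a minute. "Available" is the number to
# watch on a modern Mac, since OS X deliberately keeps RAM full of
# reclaimable caches.
while True:
    mem = psutil.virtual_memory()
    swap = psutil.swap_memory()
    print("available: %.0f MB (%.0f%% used), swap used: %.0f MB"
          % (mem.available / 2**20, mem.percent, swap.used / 2**20))
    time.sleep(60)
```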

My iMac is now six years old. One of these years I’ll replace it. But for now I have to live with 4 gigabytes of RAM, which is the maximum my older iMac can take.

And if you see a spinning beach ball, it’s not your Mac’s fault. It’s just that your Mac is trying heroically to deal with the crimes of lousy programmers.

Changing domains: Not for the faint of heart


About four months ago, I moved out of the crippledcollie.com domain into this domain — acornabbey.com. A domain change is a tedious and challenging process, even for a nerd, and there’s also the risk of losing readers.

Frankly, I didn’t do the best of all possible jobs. I went to quite a lot of trouble to move all the posts and photos (about 900 posts and more than 1,000 photos) from the old blog to the new one, and I assumed that Google would find, and index, all those posts at the new acornabbey.com domain. I was wrong. I recently figured out that Google would probably never find all the older material unless I submitted a “site map” through Google’s Webmaster Tools. That has now been done, and I’m waiting for Google to finish re-indexing everything.

Another necessary step was to automatically redirect traffic from the old blog to the new one. That was relatively easy: an .htaccess file on the old server maps every old URL to its equivalent at the new domain and redirects visitors there.
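
For anyone facing the same move, the rules look something like this. This is a minimal sketch for Apache with mod_rewrite enabled, not necessarily letter-for-letter what’s on my server:

```apache
# .htaccess on the old crippledcollie.com server: permanently (301)
# redirect every request to the same path at the new domain, so that
# both readers and search engines are sent to acornabbey.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?crippledcollie\.com$ [NC]
RewriteRule ^(.*)$ http://acornabbey.com/$1 [R=301,L]
```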

These changes are kicking in, and traffic to acornabbey.com has more than doubled. I expect it to increase even more once Google finishes indexing all the old posts, which go back to 2007. I’ve posted on many different subjects over the years and show up in a lot of Google searches on obscure subjects such as “Pleiades,” “iambic pentameter,” and “biscuits and gravy.”

Nerd post: Hewlett-Packard 3456a digital multimeter

[Photo: the HP 3456a, its display showing a resistance reading]

One of the tragedies of being a nerd is not being able to afford the toys one would like to have when those toys are new. But, thanks to eBay, we can go back in time and find bargains in some of the cool things we’d have liked to have many years ago.

A recent eBay acquisition is an HP 3456a digital multimeter. I believe the list price on these devices was $6,395 in 1982. They can still cost $600 to $700 on eBay if the seller can vouch for the history of the instrument and guarantee that it’s in good working order. If you watch the auctions carefully for a while, you can pick up an HP 3456a for less than $100 if you’re willing to accept some risk.

This instrument seems to be in great working order. The display in the photograph shows the readout with a 100-ohm resistor across the terminals. The tolerance of the resistor is 5 percent, so at only 1.213 ohms off the resistor’s rated value, the odds are good that both the resistor and the HP 3456a are close to specification.
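
The arithmetic, for the record: 5 percent of 100 ohms is plus or minus 5 ohms, and 1.213 is comfortably inside that. A trivial sketch, assuming the meter read high at 101.213 ohms:

```python
nominal = 100.0     # ohms, the resistor's rated value
tolerance = 0.05    # 5 percent
reading = 101.213   # ohms; assumes the meter read high by 1.213

deviation = abs(reading - nominal)
print("off by %.3f ohms; within tolerance: %s"
      % (deviation, deviation <= nominal * tolerance))
```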

One of the cool things about the HP 3456a multimeter is that it can be controlled (over an HP-IB interface) by an early HP computer — the HP-85, one of which I also happen to have. Getting the two devices talking to each other will be a project for some other rainy day.
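
While I wait for that rainy day, here’s roughly what the same idea looks like with modern gear: a hedged sketch using a GPIB-to-USB adapter and the Python PyVISA library. The GPIB address and the command string are assumptions of mine and would need to be checked against the 3456a’s manual:

```python
import pyvisa  # pip install pyvisa; also requires a GPIB adapter and its driver

rm = pyvisa.ResourceManager()
# Hypothetical address: set this to match the GPIB address switches
# on the back of the instrument (22 is just a guess here).
dmm = rm.open_resource("GPIB0::22::INSTR")

dmm.write("F1")              # assumed command: select the DC volts function
reading = float(dmm.read())  # the 3456a returns readings as ASCII numbers
print("reading:", reading)
```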

It’s a shame that Hewlett-Packard is a mere shadow of its former self. Clearly, years ago, it was a company run by engineers, for engineers, selling to engineers. Cost was not a problem. It was all about the quality of the design and the quality of the build. That’s why their stuff is still working today.

Keeping an eye on the FCC

[Photo: President Roosevelt prepares for a fireside chat.]

A couple of days ago, I posted an item on the importance of keeping an eye on the FCC. The item was focused on the future of over-the-air television, which may not affect your world very much. Still, we all need to keep an eye on the FCC, because decisions made by the FCC are critical to the future of the media, the future of the Internet, the choices we have, and what we pay.

A friend of mine who teaches communications law commented on that post. So that his information doesn’t get lost in a comment, I’m reposting it here.


The libertarians’ absolutist argument against regulation in the communications sector is silly on three particularly ironic points:

1) We already have a largely unregulated system thanks to the Telecommunications Act of 1996, which loosened or erased many longstanding rules, particularly those guarding against monopoly ownership. Among other things, the Act led to an almost overnight consolidation of the radio industry whereby a company like Clear Channel could grow from 60 to 1,200 stations in 18 months. The first order of business in that nationwide takeover was the elimination or decimation of local news staffing at all of those stations.

2) The media and telecom giants, from Time-Warner Cable to Disney, long ago captured the regulatory agencies, along with Congress and state legislatures, and openly and brazenly manipulate the rules they are supposed to live by. Furthermore, the FCC often doesn’t even enforce its own rules, making them meaningless. Just one example here: The FCC has allowed Rupert Murdoch to get around the newspaper-broadcast cross-ownership ban by granting him a waiver year after year; thus, he can control newspapers, television stations and radio stations all in the same market (New York, for one).

3) Media companies WANT there to be rules because the rules help them operate in a necessarily structured and predictable environment, and because, more often than not these days, the rules favor their interests against those of the public. The ban on municipal broadband in North Carolina is a prime example, but industry-friendly — indeed, industry-written — rules stretch to the FCC and the Justice Department, which, for example, is sure to rubber-stamp a merger between Comcast and Time-Warner Cable if the two companies decide to go ahead with it. It would create a monopoly that would control the television and Internet services of about 50 percent of the American population.

There are other reasons that the libertarian dream of a no-rules-at-all utopia is stupid, but those three suffice. I suppose the fundamental point to make is that their position is ahistorical. It is detached from both the technical and legal history of the communications sector. When the federal government first started regulating radio in 1927, it was because the radio owners themselves were screaming FOR regulation — someone to police the wild, wild west of their new industry and sort out the chaos of too many stations chasing too few frequencies. Regulating the technical aspects of radio was at the center of the FCC’s mandate when it was created by the Communications Act of 1934, and it remains a vital part of the agency’s mission today.

An example particular to Acorn Abbey: The only way there will ever be high-speed Internet service in such a rural locale will be through the use of so-called “super wifi,” which harnesses unused “white space” on the “gold-plated spectrum” that television stations enjoy. It can travel for miles and penetrate buildings just like a TV signal. There are even experiments under way to see if television transmitters can be altered so that they also can transmit Internet traffic. It would solve the rural broadband build-out problem overnight because the infrastructure is already in place.

Of course, the same companies that routinely decry regulation of any kind, the likes of Comcast and Time-Warner, will do anything they can to manipulate the rules to prevent the above scenario from happening. And they will try to manipulate the rules at the federal, state and county levels to stop any new efforts to break their monopoly control. And once again, the problem will not be that we have regulations. The problem will be that we have regulations written to benefit the regulated, not us.

For background on the Radio Act of 1927:

http://en.wikipedia.org/wiki/Radio_Act_of_1927#The_Radio_Act_of_1927

For background on the Communications Act of 1934:

http://en.wikipedia.org/wiki/Communications_act_of_1934

For background on the Telecommunications Act of 1996:

http://en.wikipedia.org/wiki/Telecommunications_Act_of_1996

Sousveillance?

[Image source: Stephanie Mann, age 6, via Wikipedia]

Periodically I check out the web site of David Brin, a science fiction writer and futurist, to see what’s on his mind. Brin is the author of the brilliant and classic Startide Rising (1983), which won both the Nebula and Hugo awards. But, smart as Brin is, I find that I usually disagree with him. This is because I put him in the unpleasant category of techno-utopians — people who think that technology will solve all our problems, including our energy problems and even our political problems. I think that is bunk, and dangerous bunk.

Brin had linked to a piece he wrote in “The European” in which he argues that the solution to growing surveillance and invasion of privacy is “sousveillance.” “Sousveillance” is a made-up word, the opposite of surveillance: it means spying up at elites the same way they spy down on us. The French preposition “sur” means over, or above; “sous” means under, or beneath.

This notion that sousveillance is an effective antidote to surveillance seems to me so obviously silly that I’m inclined to think the techno-utopians are even more deluded than I had thought. Just give everyone Google Glass and we’ll fix the world’s surveillance problem!

First of all, there is a straw man fallacy: “… [F]or the illusory fantasy of absolute privacy has to come to an end.” Who said anything about absolute privacy? There has never been such a thing as absolute privacy in American society or American law. The law and the Constitution are almost silent on the issue of privacy. But there have been lots of lawsuits having to do with privacy, and as far as the courts are concerned the issue is pretty settled.

But the second and biggest point of silliness is the notion that we small people have the same power to spy on elites that they have to spy on us. Yes, sometimes it happens. The photo of the cop pepper-spraying a group of already restrained protesters held our national attention for weeks. That was a fine example of sousveillance — someone had a camera ready at the right time. Another brilliant lick of sousveillance was when a waiter (or someone) at a Romney fund-raising event for rich people secretly made a tape of Romney trashing 47 percent of the American people as “takers.” It helped expose Romney as a servant of the rich, and it helped him lose the election.

Edward Snowden’s spying on the spies, then releasing the evidence to the media, is the all-time best example of sousveillance. Because of the actions of one very clever nerd, the elites caught red-handed are still squawking and trying to lie their way out of it. We got some very useful information on how the elites’ surveillance systems operate, though that information will soon enough be obsolete.

But as brilliant as these coups of sousveillance were, such things are always going to be rare and accidental. That is because elites have systems for secrecy that we little people will never have. They are rich, they are ruthless, and they are spending hundreds of billions of dollars (most of it our own tax money) to build walls of secrecy around themselves while monitoring everything we do. The idea that the little cameras in our phones, or built into our glasses, can fix this is seriously dumb. Nevertheless, we need to always keep our cameras handy, and we must be creative in coming up with new ways to spy on elites.

[Photo: Dumb cop: Nailed by the camera!]

[Photo: Dumb politician: Nailed by the camera!]

The future of over-the-air television

[Photo: Hugh Jackman in “Oklahoma!”, broadcast yesterday on WUNC-TV]

One of my regular themes on this blog is beating down the misconception that digital technologies have made radio obsolete. The opposite is true: digital technologies have made radio more important than ever. I am, of course, using the broad definition of radio — the wireless transmission of information using the electromagnetic spectrum. Your wifi router, your cell phone, your car keys — all of them contain radio apparatus, and all of them use some part or other of the radio spectrum. By this definition, even television is radio. It’s just that the radio signal used by television is modulated in such a way that it can carry an image.

The important thing to know here is that there is only so much radio spectrum, and that there is not enough of it. The only way to manage this limited resource is to regulate the living daylights out of it, and to manage the spectrum wisely and frugally and in the public interest, because radio spectrum is a publicly owned natural resource. That’s what the FCC is for.

Most people get their television these days by connecting to cable or satellite. But about 10 percent of Americans — including me — either can’t or won’t pay the high cost of cable or satellite and get television over the air, through an antenna. This is on my mind right now because I finally found a low-cost antenna ($40) that, when placed in my attic and pointed toward Sauratown Mountain, can pick up the nearest PBS television station. Up until now, the abbey’s rarely used television (it mostly plays DVDs and Blu-ray discs) had not been able to receive PBS.

I’m not going to get too nerdy on this point, but because I have a strong interest in radio communications and hold an Extra class amateur radio license, I’m very familiar with the radio spectrum and with how radio waves propagate differently according to their frequency. Most people, of course, don’t care in the least what frequency their cell phone is using. But we nerds care, and given a particular device we can probably tell you pretty precisely what frequency or “band” it is using. Depending on who your carrier is and whether we’re talking about voice or data, your cell phone is using UHF frequencies between 800 MHz and about 2500 MHz.

The UHF television channels (channels 14 through 69) range from about 470 MHz to 800 MHz. This is very valuable radio spectrum, and big players like Verizon want as much of it as they can get. There is no plan at present, as far as I know, to completely toss out broadcast television. But the FCC is working on taking back television spectrum and freeing it up for wireless data.

This is not a bad idea, though I’m always wary when so much money and corporate intrigue are involved. Because radio waves of different frequencies propagate differently, television frequencies have some advantages over cellular frequencies. The lower television frequencies penetrate buildings better and travel farther, so cellular towers wouldn’t have to be so close together. But because the frequencies are lower, antennas need to be longer to be efficient. Television frequencies are ideal for devices that sit in a fixed location with a larger antenna — for example, in your attic rather than in your pocket. That’s why television ended up on those frequencies in the first place: wisdom and technical savvy exercised years ago by the FCC.
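
The physics behind that trade-off is one line of arithmetic: wavelength is the speed of light divided by frequency, and a simple efficient antenna is on the order of a quarter wavelength. A quick sketch comparing a UHF TV frequency with a typical cellular frequency:

```python
C = 299_792_458  # speed of light in meters per second

def quarter_wave_meters(freq_mhz):
    # Wavelength = c / frequency; a simple efficient antenna
    # is roughly a quarter of that.
    wavelength = C / (freq_mhz * 1e6)
    return wavelength / 4

print(quarter_wave_meters(600))   # ~0.125 m for a 600 MHz TV channel
print(quarter_wave_meters(1900))  # ~0.039 m for a 1900 MHz cellular band
```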

I mentioned that the PBS station nearest to me is on Sauratown Mountain, about 18 miles away. The right technology sitting on that mountain, using television frequencies, would be ideal for finally getting true broadband into a bandwidth-deprived rural home like mine.

Will it happen? First the FCC has to get it right. Then someone has to step in and build the infrastructure. Personally I would like to see a publicly owned, nonprofit system using this spectrum and providing broadband to rural homes and businesses at a reasonable cost. But corporations hate this idea, because the few publicly owned broadband systems in the country are delivering data much faster and cheaper. In North Carolina, corporations even lobbied for, and got, a state law that all but eliminates competition from publicly owned systems. And yes I’m still angry about this, because it shows how easily politicians can be corrupted into serving profit rather than the public interest.

Our job is to keep a close eye on what the FCC is doing and make sure that the public interest is served. Most people don’t realize that the radio spectrum is owned by the public. It is a natural resource, and it is scarce and limited. That’s why it can’t be used without a license, and that’s why it must be closely regulated to prevent misuse and interference. If we don’t keep an eye on the FCC, they’re all the more likely to sell out to profit and betray the public interest. Yes, we “auction” radio spectrum and permit it to be used for profit, but that right always comes with a license and strict terms. There’s always a way of taking the spectrum back if the terms of the license are violated. The radio spectrum ultimately does not belong to Verizon or to any other corporation. It belongs to us.

Now back to PBS for a moment. One of the needs that PBS ought to be serving is keeping the public aware of issues like this. Lord knows the local news won’t. But as far as I can tell, WUNC-TV — North Carolina’s public television system — is falling down on the job. Of course, its budget has been blown apart by the current regime in Raleigh, which I believe prefers that the public be kept in the dark. It appears to me that most of WUNC-TV’s state-produced programming has been heavily featurized and dumbed down. One of this blog’s regular readers is an academic who specializes in this area, so maybe he can comment on the current state of WUNC-TV’s public affairs programming.


An afterword about why regulation is not a violation of individual rights but is absolutely critical: Anyone who holds an amateur radio license is aware of the terms of that license and of what kinds of violations would cause the FCC to revoke it. If I tried to use my license to broadcast, as opposed to talking to one other station, I would lose my license. If I repeatedly tried to use the ham bands for political speech or profanity, I would lose my license. If I accepted money for anything I transmitted on the ham bands, I would lose my license. If I got caught even once transmitting on a frequency that I am not authorized to use, I would lose my license. (Try transmitting on a frequency used for law enforcement and see how fast the FCC hunts you down and throws the book at you.) If I interfered with another ham radio operator’s lawful right to use our frequencies according to the legal terms under which we share them, I would lose my license. If the use of the radio spectrum were not closely regulated, all your devices that depend on radio — your GPS device, your cell phone, the navigation systems of the airplane you’re on — would become unreliable, because there would be no legal means of preventing interference and abuse. I can’t resist getting in the occasional dig at “libertarians.” Mostly, they’re crazy.