
Microsoft: We can't release code because of National Security

Yet another reason why infrastructure should be as open as possible. eWeek goes over last week's Microsoft testimony, in which Microsoft VP Jim Allchin tried to explain to the court that Microsoft should be allowed to keep certain APIs secret because they are so vulnerable that disclosing them would constitute a threat to National Security, including issues in its digital rights management systems and an enterprise system called Message Queuing. To quote from the article:

When pressed for further details, Allchin said he did not want to offer specifics because Microsoft is trying to work on its reputation regarding security. "The fact that I even mentioned the Message Queuing thing bothers me," he said.

The mind boggles. So Microsoft's own screw-up is an acceptable defense for keeping its protocols secret? Wow. This has to be one of the most Alice-in-Wonderland turns in recent memory. It would only be stranger if the judge actually bought the argument, I suppose.

It's as if aspirin could actually be turned into poison when taken while a particular high-pitched tone was sounded, and the aspirin manufacturers argued that, since aspirin is part of the Army's standard medical kit, disclosing the frequency of that tone would put our military at risk. Rather than just FIXING THE PROBLEM, of course.

And another group plans to offer free roaming

openhotspots.net has just announced its existence. Started by a pair of German CS graduate students, it is, they say, not just another database of open APs. Rather, they are working on an authentication system based on NoCatAuth (sound familiar yet?) that will allow free roaming across registered hotspots. A first Java-applet-based client is done, and they are working on integrating it into NoCatAuth. Rock on, guys.

NoCat team proposes handing out the 10.x network for a "Virtual Public Network"

Cory Doctorow, blogging from the O'Reilly Emerging Technologies Conference, notes that the NoCat folks are proposing to hand out slices of the 10.*.*.* network space to people who operate radios that use NoCatAuth.

Interesting idea - but there are a few problems:

(1) You've got a limited IP address space.

(2) No one considers you the authority, so others are going to use the space as well.

(3) How do you deal with rogues?

(4) It is going to require a huge VPN, or at least an IP-in-IP tunnelling system, to allow for routing between nodes in the private space over the public internet, along with all the associated routing headaches.

These are not small problems.

It is an interesting idea, though, akin to Sputnik's roaming capabilities. By placing authentication databases at a number of centralized points, Sputnik already provides this free roaming capability, without the need for tunnelling or dedicated IP space. We're solving somewhat different problems: Sputnik, for example, is not built to give each wireless user his own unique IP address; rather, one is allocated from the pool that each Gateway delivers, and therefore some of the whiz-bang routing features of the NoCat proposal aren't implemented. However, at the same time, the smaller problem set allows us to avoid the big problems mentioned above.
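To put some numbers on the "limited address space" objection above, here is a small Python sketch using the standard ipaddress module. The choice of /24 slices per operator is my own illustrative assumption, not part of the NoCat proposal:

```python
import ipaddress

# The RFC 1918 private block the NoCat proposal would carve up.
private_block = ipaddress.ip_network("10.0.0.0/8")

# Total addresses available across the whole space: 2**24.
print(private_block.num_addresses)  # 16777216

# If each operator got a /24 slice (an assumption for illustration),
# the space holds only 2**16 slices -- a hard ceiling on participants.
slices = list(private_block.subnets(new_prefix=24))
print(len(slices))  # 65536

# Checking whether a given client address falls inside the block:
print(ipaddress.ip_address("10.42.7.1") in private_block)     # True
print(ipaddress.ip_address("192.168.1.1") in private_block)   # False
```

Sixteen million addresses sounds like a lot until you remember that anyone else can squat on the same space, which is exactly points (1) and (2) above.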

The "tragedy" of the radio spectrum commons?

Glenn Fleishman posts a great commentary on the recent discussion about the "tragedy of the commons" of the 2.4GHz space, prompted by some comments on the matter by Dewayne Hendricks, member of the FCC Technology Advisory Council and chairman of the FCC's Spectrum Management Working Group. Definitely worth a read: Glenn points out the real issues involved, the technology limitations, and the emerging industry standards designed to promote shared use of the spectrum.

This is an interesting thought experiment: will the 2.4GHz spectrum (and any other unlicensed spectrum) fail due to its own success? Will illegal amplifiers turn the spectrum into another Citizens Band? Even without illegal amps, is it doomed because the density of devices will increase too quickly?

I don't think so. But it does remain an open question: how much is enough? In other words, as 802.11h and other standards that help to reduce interference become more popular, at what density of use do even those methods fail? Surely there is a transmission power and device density at which the spectrum becomes unusable. The question is, can technological advances outpace the bandwidth needs of the public? As more bandwidth becomes available over the airwaves, whether by spectrum allocation, frequency increases, or new standards for interoperable devices, at what point will the spectrum be rendered effectively unusable? To what extent is legislation or regulation needed here?

Maybe the answer lies in the fact that the unlicensed spectrum (2.4GHz, used by 802.11b and others, and 5GHz, used by 802.11a), while unlicensed, IS NOT UNREGULATED. Among other things, all devices have to follow FCC Part 15 rules, which means that they must be approved by the FCC before the manufacturers can offer them for sale. Perhaps the key to saving the commons is to ensure that these devices are interoperable and, well, for lack of a better term, polite to other users of the spectrum.

Of course, this would be an extension of the FCC's regulatory capacity, essentially asking it to endorse certain protocols at a layer above the radio. However, I think that by focusing on protocols rather than products, it (a) does not act anti-competitively, and (b) promotes the public good (remember, WE own the airwaves!) by enabling more functionality and usability of the spectrum we use. And if we are smart, we can agree on a level of interference protection that all higher-level radio protocols can use, allowing for even greater flexibility.

Think of it as a new layer, sitting between layers 1 and 2 of the network stack - the new layer would provide for interference detection, channel switching, and possibly even automated changes in the spread spectrum algorithms to make sure that different devices, running different higher-level protocols, would automatically detect each other and not interfere with each other.
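The interference-detection and channel-switching behavior described above can be sketched in a few lines of Python. Everything here is hypothetical: the threshold, the channel list, and the function names are invented for illustration, and no real radio driver API is assumed:

```python
# Hypothetical sketch of the proposed "politeness" layer: measure the
# noise floor on each channel and hop away from a noisy one.

NOISE_THRESHOLD_DBM = -80   # assumed cutoff: switch if noise rises above this
CHANNELS = [1, 6, 11]       # the non-overlapping 2.4GHz 802.11b channels

def pick_quietest(noise_by_channel):
    """Return the channel whose measured noise floor is lowest."""
    return min(noise_by_channel, key=noise_by_channel.get)

def maybe_switch(current, noise_by_channel):
    """Stay on the current channel unless it is too noisy; if it is,
    hop to the quietest channel observed."""
    if noise_by_channel[current] > NOISE_THRESHOLD_DBM:
        return pick_quietest(noise_by_channel)
    return current

# Example: channel 1 is noisy (-60 dBm), so the radio hops to channel 6.
print(maybe_switch(1, {1: -60, 6: -92, 11: -88}))   # 6

# Example: channel 11 is quiet enough (-88 dBm), so the radio stays put.
print(maybe_switch(11, {1: -60, 6: -92, 11: -88}))  # 11
```

A real inter-layer standard would also have to coordinate *when* devices measure and hop, since two radios fleeing the same noise could chase each other around the band, but the core loop is no more complicated than this.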

Good Technology announces wireless e-mail service on BlackBerry

The company is backed by Kleiner Perkins and Benchmark Capital, whose partners John Doerr and Bruce Dunlevie are on the board. Good had raised $6 million in its first round in May 2000 and $53.5 million in several tranches between November 2000 and July 2001, none of which were disclosed at the time.

Good Technology's wireless e-mail service functions on BlackBerry devices as well as on Good Technology's own G-100 devices, which the company plans to release later this summer.

What's cool about this? Its server software sits in your NOC and talks to your Exchange server. It gives you up-to-date e-mail, calendar, and to-dos in real time. So if your assistant schedules you for a last-minute meeting while you're on the road, the device alerts you on the fly.

OpenOffice 1.0 for Linux - not bad!

I've been playing with the latest OpenOffice (which has just gone to 1.0) and I must say that it is pretty darn good, giving me no problems with common documents, both .doc and .xls files. Yes, it is a hefty program, weighing in at about 60MB, but it works pretty well.

You can download it from the main site, or there are Debian binaries available by putting the following in your /etc/apt/sources.list:

deb http://apt-proxy.sf.net/openoffice unstable main contrib

and then doing "apt-get install openoffice.org". Kudos to the OpenOffice team.