If anyone has the skill, time and inclination to be funded to write a free-as-in-freedom 802.1X supplicant for Microsoft Windows XP and beyond, supporting EAP-TTLS+PAP and perhaps EAP-PEAP+MSCHAPv2 and EAP-GTC, then now would be a good time to get in touch with an estimated time and cost and a portfolio of your previous work.

The motivation for this request is the end of the happy hour for an until-now widely-used free-as-in-beer supplicant. This is a serious request, but at a tentative stage.

Sometimes things don't follow the Grand Strategy. Such as Sun slipping SCO some money for a license Sun says was required to open source Solaris, and incidentally paying for litigation against IBM and Linux. The Grand Strategy says this is a marvellous one-two punch.

But the related SCO v Novell litigation has just found that SCO had no rights it could sell to Sun -- they were Novell's rights. So Novell can accept US$2.5m from SCO to correct that "conversion" or it can accept nothing, invalidating the sale and leaving Sun with no license which Sun claimed was required to release OpenSolaris.

Doubtless we're about to see some back-pedalling from Sun about just what the SCO license was for. But not too much back-pedalling, lest IBM's lawyers get very upset.

Novell gets to make a very interesting choice. It has the chance to doom OpenSolaris. That sort of move could have bad PR, but Novell has already damned itself with Linux true believers anyway. Personally if I were Novell I'd be asking Sun for $2.5m and a GPL license for OpenSolaris.

Consider the last mile of cabling from the exchange to my house: that little bit of the network which Telstra owns, charges me a fortune each month to rent, and is using to shut out competition.

Why does Telstra own that? I paid for its installation. I pay for its maintenance. What sort of bad deal doesn't give me ownership of this wire?

My house's plumbing doesn't work like this, my electricity doesn't work like this. I have to pay for the build to the major street facility, but I then own that build. What makes telcos so special?

That would be the decisions of the regulators who privatised Telstra in the first place. I really should do some research and find out who's responsible for giving Telstra my cable.

A lesson in the wisdom of arguing with the media.

London council unimpressed with Microsoft MOU
Mark Ballard 2008-04-18

Newham London Borough Council has scrapped the controversial 10-year Memorandum of Understanding (MOU) it signed with Microsoft in 2004 and drawn up a new agreement with a new set of deliverables.

It appears Microsoft's flagship government contract failed to demonstrate its value, four years after it was signed.

Should I trust the press again?
Richard Steel 2008-04-22 [sic]

...A little later, 'though, I was contacted by another journalist asking about the story that appeared in The Inquirer (a journal I hadn't heard of) which completely misrepresents my comments.

Far from saying that we had "scrapped" our MOU (Memorandum of Understanding) with Microsoft, what I actually said was that we achieved all of its objectives except one — benchmarking against other Microsoft accounts, industry wide — which was due to be completed within the next 3 weeks or so. We had therefore agreed new actions with Microsoft in a progress review last year.

["This blog appears on the Newham Council intranet as part of CIO Richard Steel's communications strategy with his team. It is repeated on Computerworld UK a week later." Which can't be true, since the response would have appeared on the intranet before the original.]

War of words breaks out over Microsoft MOU
Mark Ballard 2008-04-23

The salutary lesson to draw from our dealings, Richard, is not whether you can trust the press. It is rather a lesson in managing expectations, a process every CIO should know well.

The expectations you invested in your 2004 deal with Microsoft, as enshrined in the memorandum of understanding, were also unrealistic.

The ABC does peer with Australian ISPs. I know, because AARNet does.

The concept that peering is always cheaper than transit is intellectually dodgy. There's very little difference in price between being a member of the Palo Alto Internet Exchange and taking transit in the PAIX colo facility. That's what you'd expect from competition: all competitors end up near the same price, and those with higher prices fall by the wayside.

The difference in price in Australia between peering and transit lies in lack of competition: in undersea capacity, in long-haul capacity and in last mile cabling.

The vulnerability of undersea cables to damage means there are good non-economic reasons for government to insist on peering of major Australian ISPs. We just haven't seen a government which understands the issues enough to do that: I think the ACCC understands the issues, but the power to direct a particular network design lies with ACMA and they have a very "the market will decide" hands-off approach. As voice moves to VoIP an undersea outage won't just affect communications to the USA, but cross-network Australian voice traffic as well as cross-network Australian Internet traffic.

People don't understand how complex billing systems are. It's much better to have a few categories of charge and let the differences in costs within a category underwrite the headline charge. Then the billing system is much simpler. For the reverse scenario see traditional telephony, where about 25% of a call's cost is incurred by billing for the call. ISPs don't see themselves winning sales by having complex plans which pass on all cost categories to the customer but increase the overall charge by 25%.
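The trade-off can be sketched with a toy model. All dollar figures and the flat-rate margin are invented for illustration; the 25% overhead is the telephony figure quoted above:

```python
# Toy model of the billing trade-off: itemised billing passes each cost
# category on exactly, but the billing machinery itself adds overhead
# (about 25%, the figure quoted for traditional telephony).
# All dollar figures are invented for illustration.

usage_costs = {"local": 4.00, "national": 6.00, "data": 10.00}  # true costs, $

# Itemised plan: exact cost per category, plus 25% billing overhead on each.
itemised_bill = sum(cost * 1.25 for cost in usage_costs.values())

# Flat-rate plan: one headline charge with a small margin; cheaper usage
# within a category underwrites the more expensive usage.
flat_bill = sum(usage_costs.values()) * 1.05  # 5% margin, assumed

print(f"itemised bill: ${itemised_bill:.2f}")  # $25.00
print(f"flat-rate bill: ${flat_bill:.2f}")     # $21.00
```

Even with its cross-subsidies, the flat plan is the cheaper headline price, and the billing system behind it is far simpler.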

This discussion also lacked some understanding of costs. ISPs are paying for the links into Ultimo (or to another peering point which passes on the ABC traffic, such as PIPE). The ABC are just looking at interfaces. A J-series router can present 24 optical interfaces at about $2K each, or you can use a switch at $2K for the chassis plus optics.

On 2008-02-21 Microsoft said in a media release:

Microsoft Corp. today announced a set of broad-reaching changes to its technology and business practices to increase the openness of its products and drive greater interoperability, opportunity and choice for developers, partners, customers and competitors.

Having a life, it took me some time to read the documents. I have now done so, and compared them to what was already required of Microsoft by the European Union. I can now see why the EU was sceptical.

Unlike the EU settlement, this Microsoft documentation release is not compatible with the GNU General Public License, as the patent license is not transferable.

Microsoft are supplying only a small amount of documentation in addition to that which is available under better terms via the EU process. The most interesting of these is documentation of the Microsoft Exchange protocols.

The API promised for Office 2007 to allow document format plugins is new. The proof of this will be in the pudding. In one sense it's a clever way of addressing support for OpenDocument, as Microsoft need not then supply ODF support itself. The current official API is just dreadful -- basically you write a converter to and from Microsoft's rather limited Rich Text Format -- so something better is welcome.

All in all, the cynics who said this was just a clever spin of what the EU were making Microsoft do anyway are in the main right. Those cynics also point to the timing -- immediately prior to the DIS29500 Ballot Resolution Meeting.

For future reference, this is the week HD-DVD died. End-2007 player shipments in the US are 0.75m HD-DVD and 4m Blu-ray.

Warner dropped HD-DVD to force standardisation on the remaining format. Their concern was that with most consumers delaying an HD player purchase the studio would not have enough HD players to make it worthwhile to sell content to. They also made an inscrutable remark about uncertainty over HD formats polluting DVD sales -- presumably some people are not adding to their movie libraries until a winning HD format appears.

Paramount has an escape clause in its deal with player manufacturers and is expected to follow Warner and drop HD-DVD support.

Toshiba, Microsoft's XBox and Universal are left holding the baby.

To be blunt, if the government's Internet censorship proposal is put into practice unchanged then science in Australia is stuffed.

Science has always been a global undertaking. Researchers interested in the same problems share data, theories and conclusions. Victorian scientists wrote letters to journals, 1970's scientists used faxes, 1980's scientists used computer tapes and in the 1990's scientists built the Internet and the world wide web.

To date the operation of science hasn't been overly hurt by censorship laws. The nearest brush was early research into the social behaviours which spread what we then called LAV/HTLV-III. We had posters at shopping centre bus stops explaining how to clean needles using household bleach -- what modern censors would call "instruction in matters of crime". We handed out leaflets explaining the risks of sexual behaviour, asking gay men to stop anal fisting and other grossly bruising activities and to use condoms whilst committing the crime of "buggery".

Could we do the same on the Labor Internet? What we did was a private initiative to fill the gap until the necessarily slower but much more organised government programmes got underway, so it would not fit under the "public health" exemption. We would need a website with age verification. That's somewhat pointless when trying to educate the public and wouldn't work at all for educating IV drug users and gay men (hey, enter your home address here and we'll store it so that the police can get a warrant for it later). Remember that some politicians seriously talked about placing AIDS patients into concentration camps; we had a huge amount of difficulty just getting enough basic information to stop us double-counting infections.


There is another group of scientists also affected by the censorship proposal. Scientists now use the Internet to share datasets. A 10Gbps connection is faster than putting tapes on a plane and is a lot more convenient. All of the new instruments are interactive -- for these applications there is no "tape" option anymore.

But no firewall runs at 10Gbps. Sure, some say they do in the spec sheet, but in reality they are CPU-based and add considerable jitter. You'll recall that TCP interprets jitter as decreasing confidence in its round-trip time estimate, and to avoid potential congestion collapse the transmission rate is lowered. So the researcher wanting to send data at 10Gbps cannot achieve this, even though the firewall claims 10Gbps throughput.
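The mechanism is visible in TCP's standard retransmission-timeout estimator (RFC 6298): path jitter feeds the RTT variance term, which inflates the timeout, and spurious or late timeouts throttle the sender. A minimal sketch with illustrative RTT values:

```python
# Sketch of TCP's retransmission-timeout estimator (RFC 6298), showing
# how jitter inflates the timeout. RTT samples are illustrative.

def rto_estimator(rtt_samples, alpha=1/8, beta=1/4):
    """Feed in RTT samples (seconds); return the final RTO."""
    srtt, rttvar, rto = None, None, 1.0
    for r in rtt_samples:
        if srtt is None:
            srtt, rttvar = r, r / 2               # first measurement
        else:
            rttvar = (1 - beta) * rttvar + beta * abs(srtt - r)
            srtt = (1 - alpha) * srtt + alpha * r
        rto = srtt + max(0.010, 4 * rttvar)       # 10 ms minimum granularity
    return rto

steady = rto_estimator([0.050] * 20)          # stable 50 ms path
jittery = rto_estimator([0.040, 0.060] * 10)  # same mean RTT, 20 ms of jitter
print(f"stable path RTO:  {steady * 1000:.0f} ms")
print(f"jittery path RTO: {jittery * 1000:.0f} ms")
```

The two paths have the same mean round-trip time, but the jittery one ends up with a markedly larger timeout: the firewall's added jitter costs throughput even when no packet is lost.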

The obvious solution is not to run a firewall. But the Labor Internet is a proposal to place a firewall within the ISP's network.

So what projects share datasets at 10Gbps? Basically all the astronomy, physics and biochemistry projects of the coming decade. If the Labor Internet places a firewall in the paths used by those projects then Australian science in those fields is dead. The search for the Higgs Boson will only be claimed by an Australian if that scientist is working outside of Australia. The project to build a half-continent radio telescope -- the Square Kilometre Array -- may as well take place in South Africa, leaving Australia without the top-flight instrument in a field of science it has dominated for 50 years. There will be no rapid access to databases of proteomes (the full sets of proteins expressed by our DNA), reducing Australia's biosciences to a poor joke.

There may well be other projects at risk. One of the advantages of a packet-switched network is that users can just use the bandwidth -- no reservation is necessary. The flip-side of this convenience is that finding the research projects which use the capacity is not straightforward.

We might well be able to use some non-Internet scheme for data transfers, such as 10Gbps SDH channels. The costs are higher since the link (and thus the link's cost) cannot be shared. Each project will need to meet the $30m undersea capacity lease.

But if leasing links is the solution then it is easily seen that the Labor Internet ensures that Australia will never again be at the forefront of some sciences. A research grant takes about three years to obtain. Imagine asking for $30m of bandwidth three years prior to your research's requirement. Research is not construction; it's simply not going to happen -- progress three years prior will be too tentative for approval of such a huge expenditure, and if approved then costs will be controlled by limiting the lease of the bandwidth to the minimum. So if your research progresses well you will need to wait months for the necessary bandwidth, and if it progresses slowly (or not at all, since this is research and not all theories are correct) you will have squandered millions.

The beauty to the researcher of using the Internet for data transfer and other scientific collaboration is that the expensive resource need only be bought once, can be shared, and can be used by the researcher as required, using the equipment that they use every day.


If I sound upset then that is because I am. Like many scientists I have decided not to move to the USA, but to assist solving the problems faced by the country which raised me, encouraged me, and gave me an education in the first place. But if this stupid proposal makes it to actuality then I will either need to move overseas, stay to watch Australian science wither, or take up some non-science job like driving taxis or network engineering for a commercial ISP.

The Internet was originally built as a scientific instrument. The world wide web was originally built as a scientific instrument. Just because they have continued on to revolutionise telecommunications doesn't mean that the Internet and the WWW no longer serve their scientific purpose.

My submission is done. To summarise, I'm upset that a poor quality specification, developed by one vendor, submitted with no consultation is even being considered for an international standard.

I don't think Standards Australia get it. How they treat this will determine the relevance of international standards in technology. Not just because Microsoft has Silverlight lined up to take the same path. But because there is a need for aspirational standards-making and if ISO are not going to be that forum, then it will occur elsewhere.

Doc Searls annoys me; I don't know why, and it's probably unfair. His blog was quoted on Linux Weekly News with a question I know a little about: what would you do with fat fiber?

Once you have 'enough' bandwidth, say 1Gbps, another set of factors comes into play to restrain network performance.

Simple latency is one. Unneeded round-trip times in applications are disliked by network engineers just as applications which make unneeded system calls are disliked by the kernel folk. TCP itself performs poorly once the bandwidth-delay product gets near cross-country gigabit.
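That bandwidth-delay limit is easy to quantify with a back-of-envelope calculation (the link speed and round-trip time below are assumed figures):

```python
# Back-of-envelope bandwidth-delay product: how much data must be "in
# flight" to keep a long, fat pipe full. Link speed and RTT are assumed.

def bdp_bytes(bandwidth_bps, rtt_seconds):
    """TCP window needed to fill the path, in bytes."""
    return bandwidth_bps * rtt_seconds / 8

# 1 Gbps across the country at a ~60 ms round-trip time (assumed)
window = bdp_bytes(1e9, 0.060)
print(f"window needed: {window / 2**20:.1f} MiB")  # ~7.2 MiB

# Without window scaling, TCP's window is capped at 64 KiB, which caps
# throughput on the same path at:
capped_bps = 64 * 2**10 * 8 / 0.060
print(f"64 KiB window gives: {capped_bps / 1e6:.1f} Mbit/s")  # ~8.7 Mbit/s
```

A classic unscaled 64 KiB window fills less than 1% of the gigabit path, which is why window scaling and large socket buffers matter long before the fiber itself is the bottleneck.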

So one thing that people can do with cheap large bandwidth is to help each other reduce latency: content distribution networks, P2P, and other overlay networks.

Having such cheap large bandwidth also makes sharing the results of computation more economic. Say you're down-rating an HDTV broadcast to SDTV. You can share the results of that intensive computational load with other viewers of the broadcast who might desire an SDTV feed.

Of course, it's a short step from there to full distributed computation, as with the Grid. Although that brings significant trust issues in some applications -- your neighbour may give malicious results.

Cheap large bandwidth also moves the line between what is "inside" the computer and what is "outside". Is large mass storage needed "inside" the computer anymore? At the least there's a lot to be said for using a RAID mirror with the one disk located across the network.

At 10Gbps the network finally rivals sending a station wagon full of tapes for throughput. So there's no reason not to do all archiving across the network.
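The station-wagon comparison is easy to check; the tape capacity, tape count and driving time below are all assumptions for illustration:

```python
# Throughput of shipping tapes versus a 10 Gbps link. Tape capacity,
# tape count and driving time are all assumptions for illustration.

tapes, tape_gb = 100, 800   # a crate of 800 GB cartridges (assumed)
trip_hours = 24             # driving time to the destination (assumed)
payload_bits = tapes * tape_gb * 1e9 * 8

wagon_bps = payload_bits / (trip_hours * 3600)
print(f"station wagon: {wagon_bps / 1e9:.1f} Gbit/s effective")  # 7.4 Gbit/s

link_bps = 10e9
hours_on_link = payload_bits / link_bps / 3600
print(f"10 Gbps link moves the same data in {hours_on_link:.1f} hours")
```

With these figures the network edges out the wagon for the first time, and without the latency of waiting a full day for the first byte.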

Large bandwidths also allow distributed sensor networks. The Square Kilometre Array is a massive and early example, but more mundane applications are obvious. If applied to surveillance cameras and microphones these networks would provide a serious intrusion into privacy.

Large bandwidths also allow existing applications to be larger. As the Access Grid and video instant messaging show, teleconferencing can grow from one-on-one to many-to-many. Quality can also increase; the effect can be marginal or dramatic. For example, the improvement in experience moving teleconferencing from SDTV to HDTV is remarkable -- someone can hold up a printed document and you can read the text they are pointing to.

Finally, large bandwidths allow us to reassess the economics of connection-oriented protocols. The connectionless packet protocols of the Internet are very vulnerable to denial of service attacks. For applications where these attacks are undesirable we may be able to use GMPLS controlling optical switches to give dedicated bandwidth on demand between end-points.

Extensively drug resistant tuberculosis. It's time to pay for our over-use of antibiotics.

Every time you use an antibiotic you run the risk of not killing the bacteria, especially if the course of the antibiotic is abandoned when the patient feels better. This gives evolution room to operate: the more resistant survive, and eventually the entire population of bacteria is resistant.

To date, new antibiotics have been found and we set off down the path towards resistance again.

Now it is the End Times. We have no more new antibiotics. Overuse of the antibiotics (even giving them to perfectly fine chickens so they will grow with no individual care) has led to bacteria which resist multiple antibiotics.

XDR bacteria have simply been through enough evolution to be resistant to all antibiotics. Treatment has to return to the pre-Penicillin era. That's fine for some bacteria; they can be treated with old-fashioned sulpha drugs, although this is not as effective and weaker people will die. But there was no effective pre-Penicillin treatment for the common disease tuberculosis.

XDR TB is the new AIDS because it has the attributes that made our response to AIDS so poor.

Complicated by religion. Many religions forbid homosexuality and some forbid condoms. Understanding XDR TB requires accepting evolution, something many Christian religions cannot do. Many of these religions can command more resources than public health agencies. As with AIDS some may use these resources to undermine public health efforts, such as the Catholic Church's view that condoms are ineffective in controlling the spread of HIV via sex.

Long lead times. Politicians had to make unpopular decisions that would not have an immediate effect. The same is true of XDR TB. For example, doctors should be required to justify any use of the second-line antibiotics and be prepared to be audited for all their antibiotic use. That oversight does not sit well with doctors' view of themselves, and any government suggesting it will make itself unpopular with an influential community for no immediate effect.

Confounded traditional public health measures. This was probably more true of AIDS, where the paranoia and lack of honesty of closeted homosexuals made it difficult to make AIDS a notifiable disease and to do contact tracing. But our era has cheap global air travel, which was only in its infancy in the 1980s. We don't have effective means for tracking people across the globe. And, as with AIDS, overblown privacy concerns may well stymie an effective response until it is too late.

As with AIDS, government action is lacking. Last week my daughter brought home a pamphlet saying that you should not use antibiotics for a common cold. That pamphlet is twenty years too late. The horse has long bolted, and the issue is now one of damage control.

A vaccine for TB exists, Bacillus Calmette-Guérin, which is a weakened TB. As such it carries risks greater than other childhood injections and also results in positive diagnostic tests for TB. Western nations do not give the BCG vaccine to infants, because TB is not widespread and a simple test allows for better treatment of the occasional cluster of TB.

But worst of all, we don't know if BCG works anymore, and by how much. Measures of efficacy have wildly differing results, which appear to differ by geography, and we don't know why. BCG is a live bacterium; it is even possible that it has changed since it was widely used after WWII. We simply don't know yet.

There seems reasonable hope for new vaccines. Which is a relief after our ongoing failures with HIV. But if these do not appear within the next five years then we will have a public health disaster of the scale of AIDS.

Before anyone has had a good look at Microsoft Dynamics Live CRM, Microsoft has set the terms for partners. And they are not good. The service is a Version 1, yet it is priced like a Version 3. Worse still, Microsoft wants to sell this thing through partners, who take most of the risk and face most of the cost, but hand 90% of the takings to Microsoft. Hmmm.


Glen Turner
