Wednesday, November 30, 2005

I'm so totally Extreme  

This weekend I finally upgraded my wireless network at home. For the past three or four years I've been using a second-generation AirPort base station, which was 11Mbps with 40-bit WEP. It's been thoroughly out-of-date for a while now, but it met my needs, was rock-solid, and worked like a charm, so I didn't feel the need to upgrade. And after all, the 5.5Mbps or so of real-world throughput you get from an 11Mbps link is still more bandwidth than my actual connection to the internet, so I knew that switching to the higher-bandwidth AirPort Extreme wasn't going to make a difference in simple web browsing.

But this week I finally broke down and made the jump to the latest and greatest. A combination of two things finally pushed me over the edge:

  1. I've been doing a lot more work lately that involves transferring things between machines in my office. Local machine-to-machine transfers are where 54Mbps can really make a difference.

  2. The signal strength wasn't so great everywhere in the house, and I wanted to use WDS with an AirPort Express to extend the range of my network.

Another small factor was that if I upgraded, I could have iTunes send its output directly to my stereo via AirTunes. Personally I think AirTunes is kind of a gratuitous and somewhat goofy feature. It's certainly not enough to make me spend a couple of hundred dollars on new equipment. But I have to admit that once the other (much better) reasons pushed me over the edge, I was looking forward to giving it a shot.

So I took the plunge and bought an AirPort Extreme Base Station and an AirPort Express, and hooked everything up.

Perhaps because I like pain, or maybe just because I wanted to see if it would work, I decided to set up the AirPort network from a Windows laptop. Okay, the real reason is that my old TiBook recently died from catastrophic hinge failure and I couldn't find where I'd left the charger for the iBook, which meant that I didn't have a wireless-capable Mac handy.

Setting up the main base station worked like a charm. I plugged it in, ran the Windows version of the admin utility -- which I have to say is really very nice -- and saved the configuration from the old base station into a file on my desktop, then imported it into the new base station. Happily, it carried over my DSL PPPoE account and password. In the process I upgraded the wireless security from WEP to WPA, set my server box as the default host (DMZ), and configured the base station so that it would syslog to one of my machines. Restarted the base station, connected with the new password, and everything worked flawlessly. Great!

Getting the Express set up to extend my network, however, was a little bit trickier. I started at the obvious place with the AirPort Express Assistant for Windows. But somewhere in the middle of the setup as it jiggled Windows XP and base station settings, it failed while reading from the base station with an error -4: "bad param". Tried again several times with the same result. Hmmm. Not so great, and virtually impossible to diagnose what went wrong.

Why, no problem, says I, I'll just configure it directly with the admin utility.

That's much easier said than done. I did get it done in the end, but it certainly didn't go as smoothly as I was hoping it would. There are a lot of non-obvious details that need to be just right before everything works. Most of the answers can be found in the admin utility's help if you know where and how to look, but it takes some digging.

Here are some tips from my experience with setting up WDS manually (there's a summary sketch of the key settings after the list):

  • Terminology. You have the main, relay, and remote stations.

    • The main base station is the one connected to the internet via Ethernet.
    • Relay stations connect between base stations, and do not have an Ethernet connection.
    • Remote stations provide services to clients, and do not have an Ethernet connection.
    As far as I can tell there's not a lot of practical difference between relay and remote, since both main and relays can be configured to accept client connections too. It may be the case that remote stations are able to dedicate more bandwidth to clients than main or relay stations, but I'm not sure about that. In any event it seems more like something you'd only be concerned about for a large-scale installation with lots of client computers and extremely heavy traffic -- I doubt it matters for home networks.
  • MAC addresses. Before you start, you need to write down the AirPort MAC addresses of all the base stations involved. Each one is printed on the outside of its base station, or you can get it from the base station chooser. If you do that, though, remember that the stations will be broadcasting different wireless networks at first. So you need to join each one in turn, then select it in the chooser (no, not that chooser) and write the AirPort MAC address down somewhere.
  • "Distribute IP addresses" should only be set on the main base station, not on relays and remotes. Thankfully, the admin utility warns you about this.
  • All stations must use the same channel. Pick a channel (I like 3 and 10) and set both base stations to it. The admin utility tells you that you can't use 'Automatic', but neglects to mention that all base stations have to use the same channel -- which is kind of an important detail.
  • Set up the main base station first, then relays, then remotes. When you set up a WDS main base station, you'll need to enter the MAC addresses for the WDS remote and relay stations that will be allowed to connect. It doesn't work if you go the other way, because a remote base station won't be able to connect to the main base station until the main base station has been configured.
  • Use different SSIDs (network names) at first. It doesn't matter whether base stations connected by WDS have the same network name or different names. Toms Networking recommends that you use different SSIDs, while Apple's Designing AirPort Extreme Networks for Windows recommends on page 38 that you use the same SSID. But if you plan on giving them the same name eventually, don't start out that way! Give the networks different names so that you can be sure you're connecting to the right base station when testing to make sure that it works. Once everything is working you may then decide to set them to the same SSID; it's up to you.
  • Know where the reset button is. If you make a mistake and can't find one of the base stations on the wireless network anymore, hold down the reset button for about seven seconds (until it starts flashing quickly) to give it a hard reset. The button is recessed, but can be pushed with a paperclip, staple, ball-point pen, or stereo miniplug.
  • Double-check all of your settings if you are importing settings from an older base station to a newer base station. At one point in the process I noticed that my main base station's transmission strength was apparently set to the lowest setting -- 10%, instead of 100%. I can only speculate about why that happened. The older base station didn't have adjustable signal strength, so perhaps it pulled in a zero value rather than the default of 100% when it imported my old configuration to the new base station.
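
Since most of the pain comes from keeping these details consistent across stations, I found it helpful to write the whole topology down before touching the admin utility -- that's the summary sketch promised above. Here it is as a tiny Python script; the MAC addresses and network names are placeholders, and this isn't any kind of Apple API, just a sanity check for your own notes:

    # Hypothetical WDS planning checklist -- placeholder MACs and SSIDs.
    stations = {
        "main":   {"mac": "00:03:93:AA:BB:CC", "channel": 3,
                   "distributes_ip": True,  "ssid": "HomeNet-Main"},
        "remote": {"mac": "00:11:24:DD:EE:FF", "channel": 3,
                   "distributes_ip": False, "ssid": "HomeNet-Express"},
    }

    # The two rules that bit me: every station on the same channel, and
    # exactly one station (the main one) handing out IP addresses.
    assert len(set(s["channel"] for s in stations.values())) == 1
    assert [s["distributes_ip"] for s in stations.values()].count(True) == 1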

Phew. Anyway, after a little bit of futzing around, I've got it working and I'm happy. The speed on the local network is much better, and the WDS extension has made the signal strength much stronger throughout the house. It wasn't too much of an ordeal and I got it all sorted out in an hour or two, but it was certainly harder than it should have been. Apparently I'm not the only one who thinks so, either.

But as a bonus to reward all my hard work, I can now have iTunes play through my stereo via AirTunes. Dude, I'm like so extreme.

Tuesday, November 08, 2005

Ohio Needs Reform  

[Update 11:20pm: Wow. The full results aren't yet in, but with about 79% of the votes counted it looks like all four of the RON issues got squashed badly. What a shame -- so many Ohioans will be disenfranchised by this, and the state will continue to be mired in economic woes. The current batch of crooks who got us into this mess will continue to rig the system to keep themselves in office. A sad day for Ohio.]

If you live in Ohio, you've probably heard about the Reform Ohio Now proposals that are on the ballot as Issues 2, 3, 4, and 5.

These are at issue in the election on November 8th. That's TODAY! Make sure you get to the polls and vote YES on these issues!

What are these issues about?

I would recommend reading either the summary or the actual text yourself, but in a nutshell:

  • Issue 2 makes it easier for Ohioans to vote by mail. This is a growing trend in voting, since many precincts have waits of several hours and many people can't take that kind of time off from work to vote. It's proven successful in other states, and it's long overdue here: Ohio has some of the most restrictive rules for absentee ballots in the nation.

  • Issue 3 puts limits on state campaign contributions. These limits are similar to the McCain-Feingold limits in place for national campaigns. These are designed as a step towards making sure that the super-rich can't have an undue influence on politics.

  • Issue 4 fights gerrymandering. This is what it's called when elected politicians redraw political boundaries to favor themselves. Gerrymandering leads to fractured districts and unfair elections, and is designed to artificially skew the balance so that the politicians elected do not actually represent the true mix of views in the state. In Ohio, many counties are sliced up into two or three different districts, and two congressional districts (OH-5 and OH-18) cover parts of sixteen different counties. Ohio as a whole votes almost exactly 50-50 Republican vs Democratic, but out-of-control gerrymandering has conveniently adjusted the boundaries to disenfranchise voters so that our Congressional representatives are two-thirds Republican.

  • Issue 5 creates an independent panel to administer elections. Right now we have strongly partisan career politicians like Ken Blackwell in charge of oversight for the elections of themselves and their cronies, and they do a deliberately bad job of it -- willfully disenfranchising thousands of Ohioans in order to push the vote in the direction they want it to go.

Issues 2, 3, 4, and 5 are from the people of Ohio

These issues come from people just like you and me, people who are fed up with the amount of corruption in Ohio. There's a phrase that describes the situation here, where the political machine is making the rules for itself and overseeing itself: "the fox guarding the henhouse". Not surprisingly, that fox has been burying its face in all the eggs it can steal.

Corruption? Our state has plenty. You've heard of Tom Noe, the coin dealer and state Republican crony who got $50 million in unaudited sweetheart investment contracts from the Ohio Bureau of Workers' Compensation, and was disgraced for allegedly embezzling and just outright "losing" up to $10 million of that money. He was recently indicted for money-laundering, accused of circumventing the campaign finance donation limits by giving people money that they would then donate to various campaigns. The hell of it is that the money he used probably came out of that $10 million of the state's money he "lost".

Follow the trail: crony given contract by corrupt Republican officials, crony uses that contract to steal money from the state, crony uses stolen money to make sure corrupt Republican officials get re-elected. It's a vicious circle, and we need a change.

A group named Ohio First has been trying to portray these amendments as coming from "special interests". But they're full of shit -- fact is, they are the special interests. The average donation to Ohio First is $12,941. And remember, that's the average. How many Ohioans do you know who can afford to donate that much to a political campaign? And not even to a candidate -- to a campaign against a couple of amendments. Meanwhile the average donation to Reform Ohio Now is $545. That reflects the large number of small donations (over three thousand and counting) from people like you and me across the state supporting the amendments.

Vote YES on Issues 2, 3, 4, and 5

On November 8th, vote for reform. Vote YES on Issues 2, 3, 4, and 5. While you're at it, I'd suggest voting YES on the following:

  • YES on State Issue 1 -- investment in high-tech and Ohio businesses. Ohio needs to invest and plan for the future, and this seems like a good start.

  • Cuyahoga residents should vote YES for County Issue 6 -- funding for Tri-C.

  • Cuyahoga residents should also vote YES for County Issue 7. If this issue doesn't pass, funding for the Cuyahoga County Board of Mental Retardation and Developmental Disabilities will be cut by 60% and many programs that help retarded and developmentally disabled kids and adults will be lost.

Any other Brecksvillians out there? Here in Brecksville we've got Issue 14, which appears to be an adjustment of zoning height restrictions to make it easier to build McMansions in the new developments. Currently each McMansion has to get individual variances, which are almost always granted. I believe this would just make the variances permanent across the board. Personally, I am probably going to vote against it because I'm not in favor of McMansions in general -- they tend to be energy-wasteful and detract from the small-town atmosphere of Brecksville. But given their popularity around here and the deliberately vague wording of the issue, I'm afraid it's going to pass anyway.

Monday, October 31, 2005

Thick as Thieves  

Have you noticed there've been an awful lot of politicians getting busted lately? I was curious so I started to make a list.

  • Jack Abramoff, famously influential Republican lobbyist with close connections to Tom DeLay. Indicted on five counts of wire fraud and one count of conspiracy to commit wire fraud, August 11, 2005.

  • Adam Kidan, influential Republican lobbyist and Abramoff's partner. Indicted on five counts of wire fraud and one count of conspiracy to commit wire fraud, August 11, 2005.

  • David Safavian, Republican procurement chief at the White House Office of Management and Budget. Arrested and subsequently indicted for five counts of obstruction and related charges for lying to investigators in an attempt to cover up Abramoff's dealings. September 19, 2005.

  • Senator Bill Frist, Republican Senate Majority Leader. Under formal investigation by the SEC for insider trading, for dumping large quantities of HCA stock just before a bad quarterly report. Frist originally claimed the stock was held in a blind trust, but that was proven completely false. No indictment yet, but seems very likely. September 23, 2005 and ongoing.

  • Congressman Tom DeLay, Republican House Majority Leader. Indicted on charges of conspiracy to break campaign finance laws. Indicted again on charges of felony money laundering just a few days later. September 28, 2005 and October 3, 2005.

  • Tom Noe, Ohio rare-coin dealer and top Republican donor. Indicted on three counts for illegally circumventing campaign finance limits to funnel money to the 2004 election campaign of George W. Bush. Also continues to be under heavy investigation for losing up to $12 million belonging to the State of Ohio that Republican buddies got the Workers' Comp Bureau to invest in an unprecedented, unsecured, and unaudited "rare coin mutual fund" that he ran. It's not a big leap to suggest that he may have been embezzling from the state and turning that money directly into campaign contributions to Republicans, but investigations are still pending. October 27, 2005.

  • I. Lewis "Scooter" Libby, Vice-President Cheney's chief of staff. Indicted on charges of perjury and obstruction of justice for lying to investigators in the CIA leak case involving Valerie Plame Wilson. October 28, 2005.

Notice anything? Number one, they're all Republicans. Why do you think that is? If you are the type to foolishly insist that this is some sort of partisan thing, and all of the indictments are frame-ups or something, then why aren't Republicans retaliating by getting Democrats indicted the same way? Is it because the Dems are all evil and bad, and the saintly Republicans are "above all that" and prefer to practice squeaky-clean politics? Pfft. Yeah, that doesn't even pass the laugh test. Pull the other one.

Number two, they're all very big fish in their areas of Republican politics. Frist and DeLay are the Senate and House majority leaders, people. It doesn't get much bigger than that. The lobbyists are huge in their field, and wield vast amounts of influence with the money they command. Safavian was director of procurement at the White House. Libby is no mere functionary in the Vice-President's office, but rather his chief of staff. And Noe is so thoroughly wrapped up in Ohio state politics that five out of the seven Ohio Supreme Court justices had to recuse themselves from a case that came before them as part of Noe's Coingate because they had received campaign funds from him. Look it up; he's connected to nearly every single major Republican office-holder in the state.

Worst of all, that whole list described above does not represent all the criminal activity that's gone on. It's just the stuff where they have enough evidence to go after these guys.

Beyond the criminal activity, don't even get me started on incompetence. Where do you think FEMA is going to be when there's an emergency near you? Will the National Guard have enough troops and equipment on this side of the globe to respond?

These guys are out of control.

Wednesday, September 21, 2005

Hurricane Rita  

Animated Hurricane Rita

Would you believe there is another nasty hurricane threatening the Gulf coast? Hurricane Rita is a CAT 4 (update: now CAT 5) storm moving westward across the Gulf of Mexico. It's currently projected to hit the coast of Texas.

(I've copied the satellite map pictured here to my own web space to reduce bandwidth usage; click to see a live update from Weather Underground.)

[Update 11:50pm: Rita is now a CAT 5. It also has a measured central pressure of 898 millibars. The CP of a hurricane is a rough measure of its strength; the lower the pressure, the stronger the hurricane. Rita's CP makes it the third most intense hurricane ever recorded, and it is likely to intensify over the next 12 hours. The measurement may not have recorded the actual lowest pressure, either:

DROPSONDE DATA FROM AN AIR FORCE RESERVE UNIT RECONNAISSANCE AIRCRAFT AT 623 PM CDT ... 2323Z ... INDICATED THE CENTRAL PRESSURE HAS FALLEN TO BELOW 899 MB...OR 26.55 INCHES.

THE DROPSONDE INSTRUMENT MEASURED 32 KT/35 MPH WINDS AT THE SURFACE...WHICH MEANS IT LIKELY DID NOT RECORD THE LOWEST PRESSURE IN THE EYE OF RITA. THE CENTRAL PRESSURE IS PROBABLY AT LEAST AS LOW AS 898 MB...AND PERHAPS EVEN LOWER.

Rita is now stronger than Katrina was at its strongest, and I repeat, it's likely to intensify tomorrow.]

[Update Thu 1:00pm: Rita is expected to weaken to a CAT 3 or 4 prior to landfall, but the meteorologists at Weather Underground are saying that the storm surge will be equivalent to a CAT 5, just as Katrina's was.]

The Texas coast has a hardier shoreline than the fragile Louisiana/Mississippi/Alabama coasts, but the real concern is further damage to our country's energy infrastructure. A lot of oil drilling and refining capacity was damaged by Katrina, and Rita has the potential to damage a lot more -- and with the damage from Katrina still being repaired, this could be like kicking us when we're down.

We're all familiar with the way that Katrina drove up gas prices around the country to over $3/gallon. Prices have recovered since then, coming back down to a still-painful $2.69/gallon or so. This was not because refinery capacity magically recovered -- Katrina's damage to refineries is still there. What happened is that the undamaged refineries redirected their efforts toward making more gasoline as opposed to other oil products like heating oil, because gasoline was in such high demand and was selling for such a high price. That's a relief at the pump, but doesn't bode well for winter heating costs.

Now Rita is aiming right at a series of drilling platforms and refineries off the shore of Texas. The primary danger for the average person's wallet is the refineries. In the absolute worst-case scenario, Rita could shut down a third of the already-reduced refinery capacity of the United States. Even if that happened for only a few days, it would be absolutely brutal to the economy and to your wallet.

Rita is still well on the east side of the Gulf of Mexico, and won't make landfall for a couple of days. Currently the NWS is projecting a Saturday morning landfall at 7am.

I expect that this will be the best source for Rita information over the next few days:

  • The Oil Drum - a progressive blog about Peak Oil. They have extensive information on Rita already.

This also presents a difficult, perhaps no-win situation for President Bush. Should he act early to avoid potential damage from Rita? Obviously the answer is yes, because to do otherwise would be foolish. But unless he approaches it with extreme humility -- a sort of "we've now learned our lesson" approach -- he's going to be damned by most of the country for acting quickly to save white people and oil in Texas, while letting black people struggle and die for days in New Orleans.

Does the President have a single drop of humility in him? I sure haven't seen any yet. What do you think?

Friday, September 09, 2005

Changing the DNS query timeout in Windows XP  

I've been having some networking trouble lately. When my PC laptop is busy downloading a file, Windows XP starts failing to resolve DNS queries. So even simple lookups that I know must be cached at multiple levels, like www.google.com, start failing to resolve. Windows just times out after fifteen seconds and gives up.

Needless to say, this makes web browsing while downloading a file insanely frustrating.

My Mac laptops don't seem to have the same problem. I have no idea whether this is a problem with my ISP, my wireless router, Windows itself, or some combination of the three. And frankly, as an end user I don't care and shouldn't have to care. I just want it to stop sucking.

I set out to see if I could increase the client-side DNS timeout so that Windows would be a little more forgiving about slow DNS responses. It turns out there is a way to do that, though it's nearly impossible to find via a web search. (Even Windows experts -- and I make no claim to be one -- seem to have trouble with this one because it's so obscure.)

Here's the registry setting to increase the DNS client-side timeout in Windows 2000 and XP:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DNSQueryTimeouts

[Update Nov 3, 2006: Fixed the above link. It used to point here, but that link now redirects you to the main page for the Windows 2000 resource kit. Remember, kids, cool URLs don't change.]

Read the linked documentation for details. The registry entry does not exist by default; you have to create it. Don't do this unless you're comfortable using regedit to tweak parameters.

Screenshot of regedit.exe

The default value when the property isn't present is documented to be "1 2 2 4 8 0", which appears to represent that 15-second total timeout. (15 = 1 + 2 + 4 + 8. It's not clear to me exactly what the other 2 is for; it may be redundant.)

I wanted something a little longer, so I quadrupled all the numbers to "4 8 8 16 32 0".

Screenshot of regedit.exe

Now I have a 60-second total timeout, with the final query given 32 seconds to get through. In practice this has proven to be a long enough timeout that Windows can continue to resolve DNS names even when my network connection is busy.
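
(If you'd rather script the change than click through regedit, here's a minimal sketch using Python's _winreg module. I'm assuming the value is a REG_MULTI_SZ with one retry interval per entry, which is how it shows up in regedit; run it with administrator rights, and expect to reboot before the new timeouts take effect.)

    # Minimal sketch: create the DNSQueryTimeouts value programmatically.
    # Assumes Python 2.x on Windows with admin rights; reboot afterward.
    import _winreg

    key = _winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,
                          r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters",
                          0, _winreg.KEY_SET_VALUE)
    # One retry interval in seconds per entry; the trailing "0" ends the list.
    _winreg.SetValueEx(key, "DNSQueryTimeouts", 0, _winreg.REG_MULTI_SZ,
                       ["4", "8", "8", "16", "32", "0"])
    _winreg.CloseKey(key)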

And that's good news. I'm much happier again, and I can continue to use my PC laptop without wanting to chuck it out the window every time I download a file.

Katrina's Aftermath  

Wow. Well, I've been meaning to post a wrapup to follow up my previous entry, but the unfolding events in New Orleans have left me a little shell-shocked. It's been difficult to gather my focus enough to write a coherent entry. This is going to be a long one, because I have dozens of links and a lot of pieces to put together.

As you probably know, the situation in New Orleans started out bad, and then went from bad to worse. But it didn't even slow down at that point -- it quickly passed through awful, lingered a while in horrific, sped up as it passed through OMG, and ended up somewhere in "WTF is going on!?"

Throughout the past week I've relied upon several sources for information. Many of those were the links I shared with you in my last entry. But I've also been very impressed by how thoroughly the reader-contributed diaries at Daily Kos (in the sidebar on the right side) have covered the situation. Daily Kos is a political blog with a large community from the left and center. As with any community blog, like Slashdot, you have to sometimes take things with a grain of salt. There are occasional wacky conspiracy theorists who read and post there, but there are also economists, scientists, authors, engineers, members of Congress, you name it.

What happened

Southern Louisiana and the city of New Orleans specifically were declared a national emergency on August 27th, just prior to the hurricane's arrival. Then the governor of Louisiana requested specific funds and relief resources from the federal government the day before the hurricane arrived. Here's the governor's request for aid, dated August 28th, and the President's response, dated August 29th.

I originally thought the local authorities had dropped the ball on early evacuation, but it appears that I might have been mistaken. The New Orleans area started evacuation and contraflow on Saturday, two days before the storm's Monday landfall. That's not bad. I think the problem was that people didn't take the hurricane seriously until everyone else was taking it seriously, so for many people evacuation waited until nearly the last minute. There was to my knowledge no coordinated attempt to use buses or trains to evacuate people without cars over the weekend.

Just before the hurricane arrived, many people who either chose not to or could not evacuate (many because they did not have cars, and there was no plan for using public transportation to evacuate) were moved out of the city of New Orleans into temporary shelters at the Superdome and convention center. Other scattered pockets took refuge elsewhere in the city. Some people are angry over this because it was unknown even at the time whether the Superdome could withstand the expected winds of up to 150mph. But I think this anger is misplaced. This was the right move. For the short duration of the hurricane, the Superdome was probably the safest place to be. There is no situation in which the Superdome would have been less safe than a house in New Orleans: if the wind had been strong enough to rip the roof off the Superdome, it would have been more than strong enough to flatten an ordinary home. Evacuating to large, well-built structures like the Superdome and convention center was absolutely the right thing to do for the duration of the hurricane.

I have to qualify that statement with "for the duration of the hurricane", of course, because of what happened next.

Astonishingly, the drowning city of New Orleans was closed off and nobody was allowed to enter or leave. For the people trapped in the flooded, burning city there was literally no place to go. The convention center and Superdome turned from safe havens into hellish deathtraps -- with nobody allowed in or out there was no food, no water, no sanitation, no medicine, no nothing. And this continued, unbearably, for days and days. As people were rescued from other areas of the city they were sent to one of these two locations, which only served to increase the crowding and filth because nobody was being let out. As Geraldo Rivera pointed out in this must-watch emotional TV horror show, they could have just walked right out if they'd been allowed to.

It went on like that not for one day, not for two days, not for three days... but continued and grew worse over FIVE DAYS until finally, mercifully, someone got their act together. The human conditions down there are still awful this week, but the evacuation seems to be finally proceeding forward.

Accountability

So who is responsible?

FEMA: To begin with, there appears to have been an utter breakdown in the effectiveness of FEMA, the Federal Emergency Management Agency. These are the guys that are supposed to move very quickly and mobilize state and local resources to handle emergencies and save lives. Instead of saving lives, we get lists like this from Constructive Interference:

There's even more in the full story on CI. It seems pretty clear that rather than aiding and directing relief efforts to where they were needed most, FEMA acted as an awful hindrance to relief.

Some ugly stories have come out about several of the leaders of FEMA. It's getting tedious to link everything, but essentially a lot of the leadership seems to be Bush cronies with no prior emergency management experience. Here's more on FEMA director Michael Brown (college roommate of Bush's campaign manager), Chief of Staff Patrick Rhode (Bush's campaign event manager), and Deputy Chief of Staff Scott Morris (marketing director and media strategist for Bush's campaign). None have any previous emergency response experience to speak of; their positions appear to be political patronage and nothing more. [Update: TIME magazine reports that Brown's resume appears to be thoroughly padded, making him a great deal less qualified than he claims -- which wasn't very qualified to begin with.] Whether by malice or incompetence, people like these appear to have utterly crippled FEMA's ability to respond.

The Executive Branch of Government: Yes, not the legislative or judicial branches of government... this was a failure of the executive. In our country we have the guys that make the laws, the guys that interpret the laws, and the guys that enforce the laws. This was a complete and utter failure of enforcement and execution.

A lot of people are angry at George Bush. A very few people are still trying to defend him, even though Bush's actions seem more and more indefensible every day. Here's my take: Although Bush wasn't running FEMA, he appointed the people that were. As President he had a responsibility to pick capable directors, and it's becoming clear that he didn't do that. Bush also sets the budget for FEMA, which by all accounts has been slashed considerably. (Why? Take your pick: tax cuts for the rich, or the war in Iraq, or both. But that's another topic.)

The hell of it is that just last year FEMA and New Orleans ran a training scenario that is chillingly similar to what actually happened with Katrina. The scenario was entitled "Hurricane Pam". The first part of the training involved simply running through the scenario and seeing what happens. That was completed, but many loose ends were discovered -- including the fact that evacuees in the Superdome and other areas would be stranded with no place to go. The second part of the training, where plans would have been developed to fix all the things that went wrong during the exercise, was dropped due to a lack of funding.

Like him or not, Bush is the de facto leader of our country. And dammit, the leader of our country should be proactively handling things like a leader. Take responsibility, fire people that screw up, get in there and do it yourself if you have to. That's the American way. Don't cower and hide and blame everything on your subordinates. (Unfortunately that's sometimes the American way too, but we're not as proud of that one.)

But Bush and the people under him have shown absolutely no leadership -- in fact, negative leadership. Really what we're seeing is that they have bungled it just absolutely beyond all recognition, and are too busy patting each other on the back to notice what the entire rest of the country has noticed -- that it's a giant steaming clusterfuck. I'm half afraid that we're just going to see Medals of Freedom all around and Michael Brown will be rewarded with a Supreme Court nomination. That really might make me lose it.

Some of Bush's defenders have pointed out that hey, the mayor of New Orleans and the governor of Louisiana may share some of the blame. That's very possible. But all the documents and interviews I've seen suggest that they were screaming upward in the hierarchy for help even before the hurricane hit, and everything went to FEMA, who proceeded to drop the ball. FEMA could only be overridden by DHS (represented by Chertoff) or by Bush himself, neither of whom did anything. Under the mayor and governor prior to the hurricane, New Orleans achieved about 80% evacuation, which isn't bad. And moving everyone who was left to the Superdome for the duration of the hurricane was the right thing to do. It really sounds like FEMA was the big screwup. Still, the mayor and governor are part of the executive branch too, so they will have to be open and accountable to the public for their actions as well.

(Some of the dumber or more desperate Bush apologists have been trying to claim that the mayor and governor should have ALL of the blame, leaving Bush and FEMA scot-free -- which is just a brown, drippy, steaming crock of bullshit.)

Update: Newsweek just published an article entitled "How Bush Blew It". Despite the unflattering title, it does not lay all the problems at Bush's feet. It's actually a pretty fair all-around discussion of the events and mistakes and problems of the executive branch's response to the storm. It reads like a strikingly honest assessment, which is something I'm not used to seeing from the press after several years of Iraq war cheerleading. Neither Bush, Chertoff, Brown, Blanco, nor Nagin come out looking all that good, frankly, though some look better than others.

The Legislative Branch: There were failures in the legislative branch as well. Repeated abominations like the pork-filled transportation bill have diverted federal funds from where they can do the most good. (In this year's version, Alaska was awarded over $400 million to build bridges to a Senator's vanity... for no viable economic reason whatsoever.) Forward-thinking programs that would have slowed or reversed the erosion of the buffer around New Orleans were cut. Those are not as immediate as the failures in the executive branch, but the legislature's problems are more subtle and more entrenched, and probably more difficult to solve.

Southern Racism: [Added Sep 9th at 3:45pm.] It's coming out that the police chief of Gretna, Louisiana, was apparently responsible for trapping people inside New Orleans. The people who were trapped were predominantly black.

In an interview with UPI, Gretna Police Chief Arthur Lawson confirmed that his department shut down the bridge to pedestrians: "If we had opened the bridge, our city would have looked like New Orleans does now: looted, burned and pillaged."

I don't think active 'drown as many darkies as we can' racism was at play here. (In the deep South you never know -- but I'm going to at least attempt to give him the benefit of the doubt. For now.) But if it's not that, then it's absolutely a matter of race-based fear, the fear that comes as a result of practical segregation. That fear is strengthened by what's called 'structural racism', a long-standing problem in the US where race and economic class are tied together, and racial divisions strengthen class divisions and vice versa.

Conclusion

Wow. I haven't even had a chance to get into the economic effects -- such as gas never seeing the underside of $3.00/gallon again because of damage to refineries and drilling, or heating costs quadrupling from 2003 prices because of damage to the natural gas supply from the Gulf. Or the environmental effects -- at least 20 drilling platforms were destroyed and are uncapped, and several large oil storage tanks have started leaking. Or the effect on our status as 'superpower', which appears to have been largely erased as a stunned world sees the country utterly unable to handle a thoroughly predictable disaster. (Although you know, a little dose of realism may be a good thing.) Or the very real possibility that all of the above economic effects will crash the housing bubble and start a real, full-fledged depression in the US and in the world.

But for now at least I can't go on. I need a break, and you probably do too.

I pray for everyone in the region, and I pray for our country. We badly need to regain our senses and do it quickly.

If you haven't donated yet, now is the time. I recommend giving to the Red Cross first. If you have anything left over, the ASPCA is doing what they can for the animals in the region. Both are fine organizations with a lot of able volunteers who will make your dollar go far.

Monday, August 29, 2005

Hurricane Katrina  

[Updates at the end of this post.]

As I write this at about 1:30 AM eastern time, the outer edges of Hurricane Katrina have just started to touch Louisiana and New Orleans. Nobody yet knows how it will turn out.

Katrina is a full-fledged Category Five hurricane with sustained winds of 150-175 miles per hour, gusting up to 200 mph. If that isn't bad enough, 20 to 25 feet of flooding is expected. That much water will basically destroy all buildings up to the third floor. NOAA had an extremely sobering description of the anticipated damage up for a while (the original seems to be gone; this via Atrios):

MOST OF THE AREA WILL BE UNINHABITABLE FOR WEEKS...PERHAPS LONGER. AT LEAST ONE HALF OF WELL CONSTRUCTED HOMES WILL HAVE ROOF AND WALL FAILURE. ALL GABLED ROOFS WILL FAIL...LEAVING THOSE HOMES SEVERELY DAMAGED OR DESTROYED.

THE MAJORITY OF INDUSTRIAL BUILDINGS WILL BECOME NON FUNCTIONAL. PARTIAL TO COMPLETE WALL AND ROOF FAILURE IS EXPECTED. ALL WOOD FRAMED LOW RISING APARTMENT BUILDINGS WILL BE DESTROYED. CONCRETE BLOCK LOW RISE APARTMENTS WILL SUSTAIN MAJOR DAMAGE...INCLUDING SOME WALL AND ROOF FAILURE.

HIGH RISE OFFICE AND APARTMENT BUILDINGS WILL SWAY DANGEROUSLY...A FEW TO THE POINT OF TOTAL COLLAPSE. ALL WINDOWS WILL BLOW OUT.

AIRBORNE DEBRIS WILL BE WIDESPREAD...AND MAY INCLUDE HEAVY ITEMS SUCH AS HOUSEHOLD APPLIANCES AND EVEN LIGHT VEHICLES. SPORT UTILITY VEHICLES AND LIGHT TRUCKS WILL BE MOVED. THE BLOWN DEBRIS WILL CREATE ADDITIONAL DESTRUCTION. PERSONS...PETS...AND LIVESTOCK EXPOSED TO THE WINDS WILL FACE CERTAIN DEATH IF STRUCK.

POWER OUTAGES WILL LAST FOR WEEKS...AS MOST POWER POLES WILL BE DOWN AND TRANSFORMERS DESTROYED. WATER SHORTAGES WILL MAKE HUMAN SUFFERING INCREDIBLE BY MODERN STANDARDS.

THE VAST MAJORITY OF NATIVE TREES WILL BE SNAPPED OR UPROOTED. ONLY THE HEARTIEST WILL REMAIN STANDING...BUT BE TOTALLY DEFOLIATED. FEW CROPS WILL REMAIN. LIVESTOCK LEFT EXPOSED TO THE WINDS WILL BE KILLED.

Most of the area has been evacuated, but about a hundred thousand people who don't own a car have been relocated to the Superdome and other shelters. It's unknown whether the Superdome can withstand that kind of wind and flooding if it gets hit full on -- there's a very frightening possibility that it might become a death trap.

The brand-new (literally, it's nine days old) Weather Channel blog has some excellent commentary from the meteorologists themselves, and so far at least I find it a lot more informal and interesting than watching TV. This personal note from Lucas about his father and grandmother reminds us how many, many people are affected by this storm across the country. If they can keep the blog updated through the day tomorrow then it should be a very good source of information. Highly recommended. (Update: Not much going on there now that Katrina has made landfall -- probably too busy today. Weather Underground has some slightly better info at the link below.)

Please, pray for everyone in the area. If you want to take action, please, give money to the Red Cross. A cash donation is infinitely better than a donation of goods -- especially for an emergency like this. They buy in bulk direct from the appropriate distributor (often at cost) and deliver it right where it's needed.

Update: More blogging sources: Mark Kraft is collecting eyewitness reports of people who didn't evacuate. Sounds like some people are going to be blogging til the power goes out. Teece has more, and ends on the disturbing comment "I would have liked to have seen New Orleans." The eye is now visible on long-range radar, at least until the radar is destroyed by the storm. The Weather Underground blogs are also good sources.

Here's that Red Cross link again: Please donate.

Update (11am ET): More blogs describing it as it happens:

Pieces of the Superdome's roof are peeling away, but so far it appears to be just the foamy membrane and not the structural steel. (Update 11:40am ET: AP is reporting that a few chunks of metal have come off, leaving the interior exposed to the sky, but they are relatively small -- 20' x 5', which is small compared to the size of the dome.) [12:30pm ET: NOLA now has pictures of the roof]. Originally I heard 100,000 souls there, but now the numbers being reported are more like 9,000 or 10,000 (or 26,000, or ... you name it). Why? Part of that may be because they were running security and bag checks at the door (oh my God, have we progressed to this?) and there was still a huge line outside as the winds started to get really, really bad.

Here's that Red Cross link again: Please donate.

Wednesday, August 24, 2005

United Federation of Chat  

Google has just come out with Google Talk, their new IM service.

Google Talk screenshot

The application part of it is very pretty; looking at the screenshots, it's essentially iChat for Windows. Simple, clean, and easy to use.

In fact, it looks so much like iChat that it probably borders on copyright-infringing, but hey -- that's for Google and Apple to work out between themselves. It's Jabber-based, supports SSL, and seems very nice overall.

But an IM service is an IM service, and we've got too many of them. The real news is buried in the middle of a page called "Additional Resources". That's where Google quietly announces that they are starting an initiative to merge all the separate IM networks into one. Emphasis mine:

What is "service choice" and how does Google Talk enable it?

Service choice is something you have with email and, for the most part, with your regular phone service today. This means that regardless of whom you choose as your email service provider (Gmail, Hotmail, Yahoo! Mail, your school or ISP, etc), you can email anyone who is using another service provider. The same applies to phone service. You can call someone even if they do not use the same phone company as you do. This allows you to choose your service provider based on other more important factors, such as features, quality of service, and price, while still being able to talk to anyone you want.

Unfortunately, the same is not true with most popular IM and VOIP networks today. If the people you want to talk to are all on different IM/VOIP services, you need to sign up for an account on each service and connect to each service to talk to them.

We plan to partner with other willing service providers to enable federation of our services. This means that a user on one service can communicate with users on another service without needing to sign up for, or sign in with, each service.

What the... federation? Whoa, you mean I won't need to have a separate ICQ number and AIM account and Yahoo Messenger account and MSN Chat account to keep in touch with all of my friends and family and work contacts? Why, that ... that would actually be good for the end user. Unpossible!

Crazy as it seems, it looks like Google has a shot at doing it. Rather than standardizing on one network, their description makes me think they are taking a network-to-network bridge approach. So you'd still use your existing account and chat application, and you'd still have all the network-specific features you're used to, but you'd be able to talk to more people. Perfect. I would also expect, given Google's track record, that it would be done 'right': i.e., it would be decentralized and all the networks would be peers. Networks that join up would not be beholden to Google in any way.
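
If you're wondering how that can work without one central directory of users, the answer is the same as with email: the domain part of your address determines which server is responsible for you. Here's a toy Python sketch of the idea -- this is not Google's code, and real XMPP servers find each other with DNS SRV lookups rather than the placeholder string I return here:

    # Toy illustration of federated routing, XMPP-style. Not Google's code.
    def route(address):
        user, domain = address.split("@", 1)
        # A real server would do a DNS SRV lookup for _xmpp-server._tcp.<domain>;
        # the point is that the domain alone determines where the message goes.
        return "the XMPP server responsible for " + domain

    print(route("alice@gmail.com"))  # Google's server handles delivery
    print(route("bob@jabber.org"))   # jabber.org's server -- no Google account needed

No server needs to know about any other network in advance, which is exactly why small players could join without asking anyone's permission.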

If that's so, then it seems likely that we'll see consolidation in reverse order of marketshare: all the small players will join up with Google immediately because it's good business sense for them: Their networks suddenly become huge and they are free to differentiate themselves on software alone. Mid-size networks like ICQ will probably follow soon thereafter. Yahoo and MSN might hold out for longer because they would hate to give in to Google, but popular demand from their users will make it necessary.

And then there was one.

The big one, AOL, will probably be harder to win over. I don't have a solid source for current IM marketshare numbers, which is partly because it's a confusing mess since so many people are forced to use multiple IM services. But the data I've been able to find suggest that AOL and AIM have a massive dominance in the IM field, perhaps covering about 50-70% of all IM users. The problem, of course, is that AIM is proprietary and AOL has refused to open it up for a long time now.

I don't know what will happen there. Clearly Google thinks it's worth a shot. And AOL has opened up the AIM network before, for iChat, so it's possible they'll do it again in the name of an open standard. But the company's sluggish history with moving onto the Internet and supporting email and web standards suggests that they will only be dragged into compliance kicking and screaming; as long as there's a buck to be made by being incompatible they would rather be incompatible. It's likely that for quite a while we'll be faced with AIM on one side and everyone else on the other. While not perfect, that would at least be better than what we have now.

So what do you think? Are you concerned that Google has got its sticky fingers into too much stuff? Got any links to actual IM marketshare numbers? Let me know.

Thursday, July 28, 2005

Apple's future with Intel  

Or, how I learned to stop worrying and love x86

It's been a while since I've posted. Work and family have kept me very busy lately. But since my last post the Mac world has received big news that would be impossible to let pass without comment: Apple is switching the Macintosh to Intel processors. The keynote video stream where Steve made the announcement is still worth watching if you haven't seen it yet.

Enough time has passed that I'm going to fast-forward through the obvious comments:

  • Why switch? PowerPC clock speeds were stagnant, and temperatures and power consumption were staying too high. Fixing those problems was possible, but would have required a lot of expensive investment from Apple. And the problems wouldn't stay fixed: Apple would have to keep paying to push the development of the PowerPC, as they've been doing for years now. It finally reached the point where the price just wasn't worth it. Every other reason suggested is secondary.
  • Did I see it coming? Like a lot of people, I knew how much of the OS ran on x86 to begin with, and was well aware that it was an open possibility. Personally I figured it was inevitable that Mac OS X would run on x86 eventually. It's just happening a little sooner and in a more dramatic fashion than I had thought. I didn't anticipate the abandonment of PowerPC.
  • The leaks beforehand had all the hallmarks of being authentic and deliberate. As soon as I saw them coming out I felt that they were probably true. The leaks went to real news sources first, not the rumor sites. I believe the first mention came in the Wall Street Journal, and the WSJ's factual reporting is second to none. They would not have printed it if it hadn't been confirmed in some way. (Their opinion page is another matter.) And the Journal's Walt Mossberg seems to be a friend of Steve Jobs; he's been granted special early access to all sorts of Apple technologies and usually gives them glowing reviews. The WSJ is a friendly voice to Apple and they would be an excellent place to leak and start creating a buzz.

Now that that's all out of the way, let's move on. By the way, although I have friends at Apple and I'm probably covered by a ton of NDAs over the years, I want to emphasize that I'm not disclosing any confidential information here. This is all personal speculation based on publicly-available information.

Why Intel?

Why did Apple switch to Intel specifically? And what about the details? A lot of folks have been freaking out and wondering why Apple didn't say anything specifically about x86-64 or AMD.

First up, I'm pretty sure the cost savings to Apple will be more than just the CPU. The talk I've heard says that Intel offers bulk discounts to vendors who switch to Intel across the board, and Apple seems like a likely candidate for such a switch. Intel makes a lot more than just CPUs, after all -- audio chips, SCCs, ethernet controllers, I/O processors, SATA controllers, and PCI chipsets, just to name a few. Apple already uses a lot of non-CPU Intel chips in Macs, Airport base stations, and iPods. It's certainly feasible that Apple might be going all-Intel-all-the-time.

Nor is it just chips, either. Remember, Intel designs motherboards too. Right now Apple does its board design in-house, which creates a significant lead time for product development as new boards are tested and reworked. Discounted Intel boards -- perhaps fully tested and qualified before they even reach Infinite Loop -- could be yet another potential cost savings. Apple has a penchant for pushing design boundaries with iMacs and laptops, true, but the company still does a brisk business selling desktop machines where space is not at a premium. Besides, Intel has been stuffing Pentiums into tiny spaces lately -- witness their recent clone of the Mac mini.

Could AMD compete with all that Intel has to offer? Frankly, it seems unlikely.

The door will still be open for AMD and other x86 chip vendors. I don't think Apple is going to start using Intel-specific features that AMD can't compete with -- that would be a foolish return to the single-chip-source problems that plagued Apple with the PowerPC. It's likely that the companies have made a deal where Apple commits to Intel for some number of years on negotiated terms, and thereafter the door is open to renegotiate or seek a better deal. So AMD might still get their foot in the door eventually. If Intel starts causing problems you bet Apple will switch to AMD or someone else; but as long as they offer a good deal they will be difficult to beat.

64-bitness

What about 64-bit support? Some folks were upset because Apple had done all this work toward 64-bit PowerPC and seems to now be ditching it. That work absolutely is not wasted. Anyone who thinks Apple is suddenly ditching 64-bit computing should put down the crack pipe -- it's not gonna happen. Sure, after Intel came up with the all-but-failed and incompatible IA-64/Itanium architecture, AMD turned around and created a much more popular backward-compatible architecture called x86-64. Point to AMD. But Intel at least recognized its mistake and cloned x86-64 for use in recent chips.

Apple may not make the jump to 64-bit Intel chips in its very first release, though I think it's at least possible that they might. But if they start with 32-bit chips, I will guarantee that they'll be on 64-bit chips within the year. And further, that they'll be going with the x86-64 architecture rather than the Itanium. It's the only decision that makes sense.

Should you care?

My personal opinion is that as a consumer you probably won't need to care about the transition.

Think about it: why do you buy a Macintosh? Do you really care about the chip that's inside it? No. What you want from a Mac is the nice user experience: you want your apps to work, and you don't want the machine to stink up your desktop and crash and be virus-ridden like Windows.

Apple's switch to x86 will not change anything about the operating system or the applications: it will look and work exactly the same as your current Mac. It will still be just as stable and easy-to-use, all of your software will still run, all of your data will copy over and work just fine, and no viruses are going to magically jump over from Windows onto your Mac just because the hardware is the same.

Should you delay hardware purchases while you wait for the new machines? Well, you could. Apple seems to be expecting a certain number of consumers to make that choice -- they have more cash on hand than ever before ($7.5 billion USD), which it's pretty clear is a buffer against an anticipated temporary decline in Mac sales.

But the new machines won't be out for a couple of years. The first new ones might not be all that great, either; you may want to wait a few months for the second generation. Personally I'm planning on following my normal upgrade schedule: I'll keep my dual 2GHz G5 til it's on its last legs or until I get new free hardware, whichever comes first. The typical Mac lifetime is around four years (as opposed to two for PCs), and that's lengthened a bit by the way PowerPC clock speeds have been lagging, so I probably won't need to upgrade until the new machines are out anyway. As for my laptop, a late-model TiBook, it's getting pretty elderly. I wouldn't mind getting a new PowerPC laptop right now if I could afford it -- but so far furniture and family stuff has delayed me and I don't really need a new one badly enough yet.

So for now I think it's not a big deal: buy a new Mac if you want one and don't worry about it. As we get closer to the release of the first Intel Macs, though, the dropoff in Mac sales will be steeper. People will naturally wait for the new machines.

It'll be interesting to see what Jobs does with the release of the new machines: Will the release date be known or widely anticipated, for example Macworld SF 2007? Or will they get sprung on everyone unexpectedly? Several years of watching how Steve Jobs does things leads me to suspect the latter -- they will probably release the machines at least four months earlier than anyone expects. But it's an open question.

Compatibility and speed

Apple is including a way to run PowerPC apps which it is calling Rosetta. As reported by C|Net and Wired, this is Transitive's multipurpose DR emulator.

In an ironic twist, I've heard that Microsoft is also using Transitive's emulator ... but to translate x86 to PowerPC so that old games can run on the next-gen Xbox. Man. What's next -- cats and dogs living together?

I've tried out the emulator on a development system. Rosetta delivers great performance for simple apps. For a geek like me it's really cool: PowerPC apps just work transparently, and they run fast too!

However, anything that is heavily Altivec-enhanced will probably take a big hit when running under the emulator compared to running on a G5. The emulator does not emulate Altivec, so the app will first fall back to unvectorized floating-point, which could drop its speed to 25% of its vectorized speed. Then it would incur the cost of emulation on top of that. So some specialized operations running through the emulator might in theory be as much as 5x slower than on a comparable G5 system. As soon as the app goes native, however, it will gain all of its performance back and more.
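
To put rough numbers on that back-of-the-envelope guess (both factors here are my own assumptions, not measurements):

    # Back-of-the-envelope only -- these factors are guesses, not benchmarks.
    altivec_penalty   = 4.0   # scalar fallback runs at ~25% of vectorized speed
    emulation_penalty = 1.25  # assumed overhead of translated vs. native code
    print(altivec_penalty * emulation_penalty)  # => 5.0, i.e. ~5x slower than a G5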

But that only affects heavily vectorized applications. Your word processor will continue to work fine under emulation and will in fact probably be faster than before: the extra CPU speed will give it a boost. Games are mostly pumping video textures out using OpenGL, so they will probably not be affected.

Probably the most visible area where you'll see applications slow down is anything that uses QuickTime to compress or decompress video. That includes video playback; frame rates may drop significantly under some codecs. Why? Remember, Rosetta is not a mixed-mode architecture; it runs an entire app from top to bottom. And QuickTime loads and runs inside your application. So even though a native version of QuickTime will be available to native apps, an emulated app will run emulated QuickTime which will be much slower. If I've got all that right, then you can bet Apple is making an extra push to evangelize QuickTime developers to port as quickly as possible.

G4 emulation: Someone asked me the other day whether Apple might upgrade the emulator, which currently emulates a PowerPC G3, to emulate a G4 with Altivec. Frankly I doubt that will happen. It would take a big investment of time and money. And the benefit Apple would get from doing that is only in the short term, during the transition period where you want all the extra speed possible out of the emulator. After a while CPU speeds will increase so much that it just won't matter. The ROI (return on investment) just isn't there; too much work and too little benefit.

Classic

Apple has not announced plans for making the Classic environment run on the Intel machines. As a practical matter, the current release of x86 Tiger does not support Classic. But have you noticed that they haven't loudly announced that Classic is dead, either?

My guess is that Apple may be working on porting Classic, unannounced. There's a not-insignificant minority of Mac software that still needs it. And Apple has a long history of unbroken compatibility which it would be a shame to end.

In a lot of ways Classic is just a normal application: a lot of hardware-specific details were abstracted out of OS9 during its final years. And Classic's own 68K-to-PPC emulator doesn't need to be ported, after all: there's no reason why you couldn't run it inside the new PPC-to-x86 emulator.

Of course, in other ways Classic is still very 'special'; there are many hacks all over xnu for it that give it special access to supervisor-level PowerPC stuff which Rosetta isn't going to emulate. Still, it seems like it'd be well worth Apple's time to sic a couple of guys on it for a year or two -- the potential gain is huge.

I suspect there are some doubts about whether it's technically possible, and that's why there hasn't been a big announcement one way or the other. But I'll go out on a limb and say there's a good chance someone will figure out a way to make it work.

Windows emulation

Here's an interesting one. Running on x86 will make it a lot easier to run Windows applications on your Mac. Things like Virtual PC will run at just about full speed. Of course, Microsoft might not be very interested in porting Virtual PC to the new architecture, since Mac OS X just got a lot closer to becoming a competitor to Windows. If history is any indication they'll probably do it eventually, but I bet they will drag their heels.

Other options exist, of course, such as VMware and WINE. It seems very likely that VMware is working on porting their software to x86 Tiger right now, and DarWINE is already underway.

Personally I think this would be an interesting thing for Apple to investigate. They might choose to be hands-off and just bundle VMware, Virtual PC, or DarWINE, but all of these are a little clunkier than what Apple prefers. It's possible that Apple might choose to enter the market themselves and deliver integrated Win32 emulation in some way.

Dual-booting into Windows: This sort of falls into the same category. As others have noted, in addition to running virtual PCs, it will be possible to dual-boot or triple-boot your Mac. Darwin can easily support the partition map styles used on Windows, so there's no reason you couldn't have a Linux partition, a Windows partition, and a Mac OS X partition. Mac OS X's built-in BSD and X11 are so good, however, that there's probably not a lot of reason for anyone who's not a Linux developer to dual-boot into Linux.

HP, Dell, and more

What about all the talk about whether Apple could or should make Mac OS X generally available to other PC makers, like HP and Dell? Some people think that would've been a better idea. This attitude is exemplified by the somewhat goofy article Apple's Colossal Disappointment, which was posted to Slashdot recently.

I call the article goofy because the author pays no attention whatsoever to business realities, nor does he seem to grasp the concept that policies may change over time. His basic complaint is that Apple is limiting its OS to run on its own hardware for the time being, and he thinks that's a mistake. (Or a "colossal disappointment", to use his exaggerated phrase.)

Here's my take. There were three possible ways Apple could have used its x86 code.

  1. Switch the Mac hardware to x86.

  2. Allow a select few PC vendors to ship hardware that runs Mac OS X.

  3. Release Mac OS X widely for anyone and everyone with a PC, the way Microsoft releases Windows.

Apple started with item one for the reasons mentioned above -- the PowerPC has been weighed in the balance and found wanting. But the other two courses of action are not immediately feasible, nor would they be wise business decisions right now.

Sure, they are potentially desirable goals that are at least on Apple's radar. I guarantee you Steve Jobs is thinking about them; when interviewed by Fortune magazine he mentioned that three of the biggest PC makers have repeatedly asked him to license Mac OS X for them to bundle. (My guess: the big three are HP/Compaq, Dell, and Sony.)

But it makes absolutely no sense to do all of those things at once.

What do you think would happen if Apple switched to x86 and immediately let others release x86 machines with Mac OS X too? It would undercut Mac hardware sales in a big way. Apple has been burned by cloning before, back in the 1990s. The circumstances are different enough now that they may try it again, but they will definitely start out slowly and carefully. It will be several years before you see anyone but Apple shipping Mac OS X.

How about making a widespread general release? First of all, you have the same problem as above, where Mac hardware sales are undercut. But okay, perhaps you might make that up with increased software sales. No big whoop. The real cost comes in the support infrastructure. Mac hardware is fairly dependable and predictable, and Apple does a good job with the small amount of tech support that is needed. But a widespread release would suddenly send the number of support calls skyrocketing -- people would buy it for their JalopyTech 3000 computer, something would break, and Apple would get a call. Frankly, Apple doesn't have the support infrastructure to handle a twenty-fold increase in phone calls. It might be possible to get there in the long run, but it won't happen overnight.

The correct course of action from a business perspective is to do exactly what Apple is doing -- switch their own machines only at first. The other things will still remain possible future directions, and I think that both are likely in the long run. But Apple is an "old" company by tech industry standards, and doesn't plan on biting off more than it can chew at any one time.

The Lockdown

Will Apple use LaGrande as part of a scheme to make sure Mac OS X runs only on Mac hardware? Maybe. But it seems very likely to me that Apple won't try TOO hard. They want to discourage the casual user from running an unsupported configuration, and they will. But it's not worth the effort (and would frankly be counterproductive) to lock out the hardcore geeks.

A small amount of 'geek piracy' will cost Apple practically nothing in hardware sales. For all their huge presence on the internet, geeks' actual real-world market share is fairly small. The vast majority of Apple's and Microsoft's markets are people who wouldn't know how to perform that sort of hack, and wouldn't want to, because they know they wouldn't get support for it. Overall it's far simpler to just buy a Mac in the first place.

Honestly, it's like free advertising... let the geeks hack OSX to run on their machines. Once most geeks start using OSX their prejudices evaporate and they quickly get hooked on it, talk to their friends about it, recommend it, and so on. And that will ultimately grow the Mac's marketshare.

Further Reading

A few interesting articles in case you missed them:

Friday, May 13, 2005

A brief history of purity at Apple  

I was reading Ben Goodger's take on the recent Safari-KHTML kerfuffle. He's basically right. What I find particularly amusing is that ten years ago Apple would have been on the opposite side of the debate... with Apple fighting for purity and the open-source guys just putting in a quick hack to make it work.

It took Apple quite a while to get over its obsession with purity in software design. Don't get me wrong; purity isn't a bad thing at all. In fact it's an important goal to strive for. But sometimes you have to make compromises to ship a useful product.

Ten years ago, more or less, Mac programmers like me were trying to deal with lovely gems that came out of Apple like these:

  • Apple Events -- in theory a great concept, built around an incredibly flexible and extensible tagged-data interface. The problem was that the API and implementation were so annoyingly pure that they required allocating a lot of temporary objects and needlessly copying a lot of memory around. In the end it was slower than just about any contemporary RPC/IPC mechanism you could name. (There's a sketch of what the API felt like below, after this list.)

    (Today: Apple Events have been kept alive, although with the implementation optimized and the API expanded to be faster and less pure. Several even-less-elegant but faster-and-easier methods of RPC/IPC became available with OSX and were quickly adopted by many grateful programmers.)

  • AppleScript -- in theory another great concept, designed around an abstracted open scripting architecture which could support multiple scripting dialects, and which allowed those flexible and extensible bits of tagged data from the Apple Event manager to be passed around. But only one dialect was ever completed, and the architecture wound up making it painful to implement from the developer's point of view. Heck, even the one-and-only dialect was fairly clunky from the user's point of view. This might make you wonder whom exactly it was designed for.

    (Today: AppleScript has stayed alive too, although the way developers and users actually use it in practice frequently disobeys the pure object-verb syntax of the original design. And what AppleScript support is out there generally comes for free from AppKit these days. Do you know anyone who's actually written code to properly parse a 'whose' clause lately?)

  • AOCE, or Apple Open Collaboration Environment -- in theory a great concept, trying to create a system that integrated mail, address books, digital signatures, networking, and more before all of these things were widespread. Its problem was that rather than attacking each problem individually, it tried to do them all at once. And it had a nicely object-oriented design, but rather than allowing direct access to any of the dozens of classes or hundreds of members, it created accessors for every single operation. Ultimately it collapsed under its own weight. It was monolithic, huge, and incredibly difficult to understand. One of the Inside Macintosh: AOCE books was almost 1400 pages long -- big, 8.5" x 11" pages too. My phone book was smaller. And that book was only one of at least three...

    (Today: Deceased, just like all those trees.)

  • Open Transport -- a networking stack and API with a theoretically pure design that was supposed to deliver rockin' performance if you used it right. It did deliver better performance than the older MacTCP, but it was generally much too obnoxious for most developers to use directly. The most popular way to use it was via a wrapper library such as GUSI that made it more socket-like.

    (Today: Deceased. On OSX, Apple re-implemented the APIs with glue that calls through to Unix sockets.)

  • Newton -- another very pure concept. A tablet that you can just write on, and it will just recognize your handwriting! Cool! Electronic book! Electronic paper! The problem was that the technology and/or the design clearly wasn't quite up to the task. The device was big, clunky, expensive, and slow. The handwriting recognition was good in theory but notoriously imperfect in practice, yet Apple stuck with it anyway.

    (Today: Deceased. A few years later, when Palm did basically the same product, they put in a hack called Graffiti so that they didn't have to solve the problem of generalized handwriting recognition.)

  • OpenDoc -- a very nice idea that was (and probably still is) ahead of its time. Rather than documents being things created by applications, they were just collections of objects which were essentially peers and had defined relationships to each other. If anything, OpenDoc was less a victim of its design than of its circumstances: it was C++ and required SOM (IBM's System Object Model), which was burdensome in terms of both performance and licensing, and it had to run on a rather lackluster and nonstandard OS, which limited its portability.

    (Today: Deceased. But interestingly, Apple is just now starting to regain OpenDoc-like functionality with things like KVC, bindings, and CoreData. Thirteen years later.)
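Since I griped above about the Apple Event Manager's fondness for temporary objects, here's roughly what sending even a trivial event looked like in C. This is from memory, so treat the details as a sketch; the point is how many descriptors get allocated, copied into, and disposed of by hand just to say "open this file":

    #include <Carbon/Carbon.h>

    /* Send a bare-bones 'open documents' event to SimpleText.
       Every AEDesc below is a temporary object with its own
       allocation, copy, and disposal. */
    static OSErr SendOpenDocEvent(const FSSpec *file)
    {
        AEAddressDesc target = { typeNull, NULL };
        AppleEvent    event  = { typeNull, NULL };
        AppleEvent    reply  = { typeNull, NULL };
        OSType        sig    = 'ttxt';   /* SimpleText's creator code */
        OSErr         err;

        err = AECreateDesc(typeApplSignature, &sig, sizeof(sig), &target);
        if (err == noErr)
            err = AECreateAppleEvent(kCoreEventClass, kAEOpenDocuments,
                                     &target, kAutoGenerateReturnID,
                                     kAnyTransactionID, &event);
        if (err == noErr)   /* the file spec gets copied yet again */
            err = AEPutParamPtr(&event, keyDirectObject, typeFSS,
                                file, sizeof(*file));
        if (err == noErr)
            err = AESend(&event, &reply, kAEWaitReply, kAENormalPriority,
                         kAEDefaultTimeout, NULL, NULL);

        AEDisposeDesc(&target);
        AEDisposeDesc(&event);
        AEDisposeDesc(&reply);
        return err;
    }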

I think you can see what I'm getting at here. Apple being chastised for putting in quick hacks to make things work, rather than going for purity? By a bunch of Linux coders? Ah, sweet sweet irony.

Branching and Integration

I definitely feel the pain of the KHTML guys. But in the end what happened was inevitable.

See, one way or another Apple wound up on a branch. I don't know whether Apple chose to deliberately make incompatible changes, or whether KHTML refused to accept some of their changes and forced them out onto a branch. But regardless of your interpretation, that's the way it ended up. And that was the first step towards KHTML's problems.

There should be (but probably isn't) a lesson in Software Engineering 101 about what happens when you have work proceeding on two different branches of a source tree. Integration -- the unpopular gruntwork everyone loves to hate -- starts to rear its ugly head. As long as the teams devote roughly the same amount of time to developing their branches, the branches grow in parallel and each spends a roughly equal amount of time integrating changes back and forth.

But what happens when one of those branches suddenly has a great deal of work invested in it, and the other doesn't? The team maintaining the less-vigorous branch starts spending more and more of their time on integration and less on development. Integration sucks; it's a necessary evil, but nobody likes doing it. Quickly it becomes less like fun and more like work. So the less-vigorous branch is in danger of withering even further.

It's made exponentially worse if the less-vigorous branch ever refuses some of the changes in the more-vigorous branch, because that causes the source bases to diverge even further. Now not only is there more integration work, but it's harder too. If the team is made up of part-time volunteers it can kill their enthusiasm for the project completely.

In a nutshell, that's pretty much what is happening (or has happened) between Apple's full-time engineering team and KHTML's part-time one. Apple exacerbated the problem with what some are calling "code bombs", i.e. releases of an entire tree at once, but you're fooling yourself if you think the same problem would not exist regardless. The problem is really in the sheer quantity of changes being made. Is it Apple's obligation to go back and do all the work to make their changes work on Konqueror? Not at all.

It kinda sucks for the KHTML guys, but the solution is clear even if they don't want to admit it: transition the upper levels from the less-vigorous branch over to the more-vigorous branch. Rather than always integrating lots of changes from B back into A, start using B and port a few changes from A forward into B. It sounds like this is what Maciej suggested.

Happily, it sounds like they are pursuing something like this even as we speak.

Sunday, April 10, 2005

Hey check me out -- I'm Dancin'! I'm Dancin'!  

Get Perpendicular

Hitachi recently announced perpendicular recording technology, which is meant to improve the density at which bits can be put down on hard disks.

Storage density is exactly that -- the density at which information can be packed onto a storage medium like a hard disk. As density increases, that means that you can fit more data into the same size space, or fit the same amount of data into a smaller space, or both.

Over the years storage density has been increasing steadily. This has led to larger-capacity drives in physically smaller containers. Twenty years ago hard disks were clunkier and held a lot less data. 20MiB hard disks could be purchased but were expensive and rare. Today if you purchase a computer you'll probably get a 50GiB or larger hard disk, or over 2500 times the capacity, and it will take up a much smaller space. That's why the tiny little iPod mini can hold 6GiB of data.
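For the curious, the "over 2500 times" figure is just the round numbers from above divided out. A two-line check in C:

    #include <stdio.h>

    int main(void)
    {
        double old_mib = 20.0;           /* circa-1985 drive, in MiB */
        double new_mib = 50.0 * 1024.0;  /* 50 GiB, converted to MiB */

        /* 51200 / 20 = 2560, hence "over 2500 times the capacity" */
        printf("capacity ratio: %.0fx\n", new_mib / old_mib);
        return 0;
    }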

But this rapidly increasing density has led to a problem. Magnetic storage technology is nearing its limits; today we are making bits so small and packing them so closely together that if we go much smaller, bits are in danger of flipping spontaneously from ordinary thermal energy and other external factors. Since the bits are where your data is stored, you can't have them flipping arbitrarily -- because then you can't read back what you just wrote. It'd be like trying to write on an etch-a-sketch that someone is shaking.

The point at which this begins to happen and data is lost is called the superparamagnetic limit, and storage technology is getting very close to it. Since bits need to be kept large enough that this doesn't happen, that in turn means that magnetic hard drives are close to their theoretical maximum density.

Hitachi has come up with a dodge that staves off the superparamagnetic limit for a little while longer. Rather than arranging data bits so that their magnetic poles are aligned end-to-end along the surface of the disk, pointing at each other (longitudinal recording), Hitachi is arranging them so that their poles are parallel to each other and point away from the surface of the disk (perpendicular recording), with corresponding innovations in how the bits are read and written. This vertical alignment makes the bits less vulnerable to flipping, which raises the achievable storage density again. The result is that magnetic bits can be packed several times more densely than before, which means drives can keep getting smaller and higher-capacity. For example, rather than 6GiB iPod minis, this will eventually make it possible to have 60GiB iPod minis.

It's all very serious and interesting and geeky.

So why exactly did they choose to promote this technology with a crazy Schoolhouse-Rock-style video, complete with "Actuator Man" and little rectangular bits disco dancing?

Good question. Don't get me wrong, I think the video's great... I'm just a little baffled. Then again, it fits right in with Sanyo's HD-BURN promotion. Mr. CD-R, doubling of charming points!

Sometimes this is a bizarre industry I work in. But I'm glad people have a sense of humor about it.

Tuesday, April 05, 2005

Google Satellite Maps  

Google Maps just added an option for satellite photos. And I have to say ... wow. That is just wicked cool.

Not every location is covered by a high-resolution satellite photo, but a lot of places are. In particular, my town of Brecksville is available at the highest zoom level. I can see my house!

Play around with it for a while. Zoom in and out. It's incredibly fun. Here are a few neat locations to start with:

I'm incredibly stoked to see this feature in Google Maps. Yes, there are other sources for satellite and aerial imagery... but up until now it's either been slow and clunky, or a pay service. Integrating it with Google's terrific map interface makes all of that data extremely accessible for the first time.

Got any neat satellite image finds of your own? Let me know in the comments!

Monday, April 04, 2005

Nixon Approval Ratings  

From time to time as I look at current Presidential approval ratings, I find myself wanting to look at the week-by-week job approval ratings of former President Nixon. They are out there and available, of course, but they're surprisingly hard to find.

The best source I've found so far is the Roper Center at the University of Connecticut. They have a page with the data, but their graph quite frankly stinks. The value axis is compressed, there aren't any gridlines, and it just doesn't tell you much. So I compiled their data and made my own graphs, which I'll share with you now.

First, some background and the raw data:

And now the charts:

  • Stacked charts of the poll results, one with approval as the base and one with disapproval as the base
  • Approval-only ratings for Nixon's entire presidency
  • Approval-only ratings for Nixon's second term
  • Net approval (approve minus disapprove) for Nixon's entire presidency, in plain and annotated versions
  • Net approval (approve minus disapprove) for Nixon's second term, in plain and annotated versions

I think there are a few very interesting points that are visible in the raw data.

First of all, Nixon started with a substantial number of "no opinion" ratings -- as high as 35%. But the ambivalence quickly disappeared: there was substantial movement from "no opinion" to "disapprove" during the first few months of his Presidency. I haven't done a thorough analysis, but from the few other presidencies I've checked, the same trend shows up, so this doesn't seem to be specific to Nixon.

Second, Nixon was much more popular than Bush throughout most of his Presidency. President Bush's approval shot up after 9/11, but has steadily eroded ever since. Bush spent most of 2004 hovering under 50% approval and with a net approval of less than zero. Nixon didn't get that low until his former counsel was testifying before the special Senate Watergate panel describing the political espionage that he'd personally taken part in.

Third, to me it looks like Nixon was seen as "above the fray" at the start of Watergate. Everything was somebody else's fault and he didn't know about any of it. His approval rating was at an all-time high as the Watergate burglars were convicted, despite the fact that it was a fairly high-profile case and some of them were former Nixon aides. It wasn't until one of the convicted former aides, James McCord, started making allegations of obstruction of justice and pointing fingers higher up that things took a turn for the worse.

Fourth, July 1973 was an incredibly bad month for Nixon. His net approval rating dropped by a whopping 25 points. At the end of June, his former counsel testified that he had been personally involved in the Watergate cover-up and the obstruction of justice. Then in July Nixon refused to testify before the Watergate committee, citing executive privilege; the existence of the White House tapes was revealed; and he refused to hand over the tapes. All of this contributed to a serious drop in public approval from which he never recovered.

Fifth, even as the worst came out there was still a solid core of about 25% of the country who never abandoned Nixon and continued to approve of him.

All in all, a very interesting set of data indeed.

[Update: December 1st, 2005] Probably because of the plight of our current president, a whole lot of people have been coming here to look at the gory details of Nixon's approval ratings. I should point out that Professor Pollkatz has a pretty nice chart up comparing raw numbers from Bush, Nixon, and Clinton. And he updates his Bush numbers regularly so you can keep track of the progress. Go check it out!

And keep in mind that despite all the recent indictments, President Bush himself has not really had a decisive Watergate moment yet, and probably won't while the Republicans control Congress.