Ted Leung on the air: Open Source, Java, Python, and ...
- I like to eat fish eyeballs
- I love Mangos
- I am scared silly of spiders
- I hate talking on the telephone
- I didn't have my first glass of wine until sometime after I turned 30
Now for the hard part, who to tag...
Our contribution to the celebration was definitely a family affair. If you've seen some of the other photos floating around, you'll know the part that Julie and the girls played. My part's probably not much of a surprise, and I've started uploading some of the photos that I took to this set on Flickr. There are a ton of photos, and I'm going to spread out the upload over a few days to give people time to digest.
Recently a number of people have asked for my advice on buying a digital SLR (DSLR) camera. I'm not going to go into the pros and cons of DSLRs versus point and shoot cameras - I'm assuming that you already know that you want a DSLR.
The first thing that I would advise is that you go and read Philip Greenspun's excellent article, Building a Digital SLR Camera System. I found earlier versions of this article to be very helpful. Not only does Greenspun suggest what to buy, he suggests a possible acquisition plan as well as photographic projects that you can do given what you have got.
Beyond Greenspun's article, I have some additional comments based on my experiences during the last 18 months or so. Call it
"Things I wish people had warned me about"
1. It's the system, stupid
If you think that there is any chance that you are going to get bitten by the photobug, you need to know that it's about the whole camera system, and that you are going to get poor very fast. That said, you can also get very very far on "cheap" equipment.
2. Canon vs Nikon vs ???
You can take great pictures no matter which brand of DSLR you buy. Canon and Nikon are the most popular brands. Part of the reason that I chose Canon was that most of the people that I know also shoot Canon, and so there is the theoretical opportunity to borrow equipment from your friends. I've only done this a handful of times, so this partially depends on how many friends you have and so on.
Other big factors in the Canon vs Nikon debate:
- Lens quality - different people have different opinions on the quality of the lenses. So far, I've never been able to tell the difference.
- Lens selection - which manufacturer has the focal length/aperture combinations that you are likely to want
- Low light sensor performance - Canon has traditionally been the leader here, but it appears that Nikon wants to catch up
- Possibility of full frame sensors/cameras - Canon makes both cropped and full frame sensor SLRs. For the moment, Nikon only makes cropped frame SLRs. Personally, it will be many years before I can justify paying what Canon is charging for a full-frame SLR, unless I find a way to support my habit, er, hobby
- Flash systems - Nikon builds a wireless flash controller into most of their DSLRs. I wish I had paid attention to this. I don't know that it would have caused me to switch over to Nikon, but it is a very nice feature.
There are real differences in "feel" and control ergonomics, which you should be aware of. If you think you'll care, then you need to find a way to physically hold the camera in your hands.
3. Body vs lenses
My take on this: a lot of the expense of bodies is the electronics (compare prices with film bodies) and electronics are on Moore's law. Lenses are made of metal (or plastic) and glass and are not on Moore's law. The body you like today will be cheaper next year, often by a lot. You should know that Canon and Nikon release new models in a price category every 18 months or so, around the dates of the two big photo shows, PMA and Photokina. It's like Macs and Macworld.
I bought the least expensive body that I could, the Digital Rebel XT (EOS 350D for the rest of the world), and it's served me well. The only major limitation that I ran into was the absence of a PC-Sync cord connector, which makes it tough to control external strobes or non-Canon flashes. There's a workaround for this, but I didn't learn about it until I tried to fire a strobe from my camera. I'm ready to move up, but the body that I'm ready to move up to doesn't exist, and there isn't enough benefit to moving up to the 30D to be worth the money.
I think it's better to save some of the money that Moore's law will ultimately recover, and invest in lenses. To wit:
4. Primes vs Zooms
When I started out, I mostly wanted zoom lenses, because of the convenience of not having to switch lenses. When I shoot events, I prefer zooms. After you use a lens where you control the zoom manually, you will go insane waiting for the zoom motor in a point and shoot camera. Ok, sorry, that was off topic. I am finding that I am getting more interested in shooting people, and some of my best people shots have been with one of my prime lenses. I'd suggest not going hog wild buying lenses until you figure out what you really want to shoot, and then figure out whether zooms or primes are better suited to your style. Most good photo processing software will give you a way to figure out which focal lengths you use most often.
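As a sketch of what that focal-length facility boils down to, here is a toy tally, assuming the EXIF `FocalLength` values have already been pulled out of your files (the function name and the sample values below are invented for illustration):

```python
from collections import Counter

def tally_focal_lengths(focal_lengths_mm):
    """Count how often each focal length (in mm) appears,
    most frequently used first."""
    return Counter(round(f) for f in focal_lengths_mm).most_common()

# Hypothetical EXIF FocalLength values from a batch of photos
shots = [50.0, 50.0, 85.0, 24.0, 50.0, 85.0, 35.0, 50.0]

for focal, count in tally_focal_lengths(shots):
    print(f"{focal}mm: {count} shot(s)")
```

If the 50mm bucket dominates a tally like this, a fast 50mm prime is probably a better buy than another zoom.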
5. External flash
We have all seen tons of hideous flash pictures. This turns lots of people off flash photography, and the notion that you would spend money on an external flash seems kind of crazy. I would point out that lots of pictures that you and I see every day were created using artificial light, either flashes or strobes, and knowing how to use external lights is worthwhile. Getting a good external flash definitely changed my attitude about flash pictures.
See David Hobby's Strobist blog for great lessons and cheap ideas.
6. Cases
Once you start getting gear, you'll start getting cases. So far, I've managed to escape with 2 cases, but the combination of cameras, lenses, laptop or no, easy access during a shoot, weather proofing, padding, and sufficient capacity is just likely to make you nuts. Every camera forum that I've visited has innumerable threads about the "right case". We're just doomed on this one.
7. Tripods and other supports
Camera motion contributes to fuzzy pictures. I've discovered that I move a lot more than I thought when I shoot a picture. I have some lenses with image stabilizers in them, and that helps. I have a cheapo video tripod that I use in a pinch, and it definitely makes a difference. Getting a good tripod is pretty high on my list of camera purchases. This article by Thom Hogan is in the "we're doomed" category. I don't know that you have to go all out for what Hogan suggests, but I do know that I am in serious fiddling-with-the-head land.
8. Filters
Before I knew anything about photography, I assumed that photographers just went out there and snapped their shots. Now I know that many pictures are made via modifications, either with filters or in the darkroom/post processing. In the digital age, there are a large number of effects that can be done in post processing, which reduces the need for filters (yay, less to carry).
Lots of people will give you their opinion on whether you need a skylight/UV filter to protect the front element of your lens. I have a UV filter for my smaller diameter lenses, but not for my larger diameter lenses. I can't see much of a difference.
The one filter that I think you need to have is a circular polarizing filter, because it's a filter you want a lot -- it makes a big difference in lots of outdoor situations. It's also impossible to achieve the effect of a polarizer in post production.
9. Flash Cards/Readers
It seems like there's no such thing as enough flash cards. I started with a 1GB card. That was a lot for a while. Then I started shooting RAW, and I needed another 1GB card. Now I'm finding that I'm shooting even more at events, and that filling up cards at the wrong moment is bad. More cards? At least Moore's law works on flash cards.
I'm probably going to get a flash card reader - USB to start with because they are cheap. This means I can pop a card out of the camera and let it upload while I keep shooting. Also it would mean that I am not using the camera battery to download, which is a factor when traveling - anything to avoid packing another battery charger.
10. Computer Software/RAW
Most cameras come with photo processing software, but everyone I know uses some other software. There's Picasa (PC) which is free, and iPhoto on the Mac. There's also the public beta of Adobe's Lightroom (Mac and PC), and Apple's Aperture on the Mac. I am using Aperture and am pretty satisfied with it now, but I am probably going to buy a copy of Adobe's Photoshop when it comes out native on Intel Macs. Why? Selective image editing. If you want to perform an edit to just a part of an image, you need Photoshop or something like it, and most of the something like its don't run on the Mac.
Phil Askey's Digital Photography Review is the best source of information on cameras. He does the most thorough reviews that I have seen. There are other sites that I consult, but I'll go to dpreview first. They have RSS, go subscribe. There is also a huge selection of photography related sites and blogs.
Having a nice camera is not enough. You must learn to use it. If you only buy one camera book then I'd suggest "Understanding Exposure: How to Shoot Great Photographs with a Film or Digital Camera (Updated Edition)" (Bryan Peterson). I've recently published a list of good books. The photo section includes the most helpful of the various photo books that I've read.
Another thing that's been really helpful to my development as a photographer is Flickr's RSS features. I subscribe to RSS feeds of both individual photographers and the photo pools of Flickr groups. That gives me a personalized daily flow of inspiration.
One of the things that I've been meaning to do is to put up a page of book recommendations. I'm the kind of person that learns well from books - it's a learning style that works for me. If you're a book lover, or perhaps more accurately, addict, you never met a book that you didn't like. The problem is that you have limited time and attention, so good book reviews and recommendations can be very valuable.
The list that I've put up is the result of culling book review entries from the blog. Right now there are entries for open source, programming, and photography. The programming section is a little light, and I hope to be adding more entries there over the next few months.
As a point of disclosure: the book urls on the list are Amazon Associates urls, so if you buy a book by way of those links, you'll be helping support my book habit.
I'd appreciate knowing if you find the list useful. Please feel free to leave a comment.
Of all the conferences that I attend, ApacheCon is different, because I am an "insider". As with all conferences, the technical program is a piece of superstructure that facilitates the human part of the program. Since ApacheCon is one of the few times for Apache folks to gather in person, I find that the human track is much more important than the technical track. It's a time to have those high bandwidth conversations that don't happen over e-mail, to catch up with old friends, and to find some perspective on what is happening all around the Apache Software Foundation.
This year in the "official technical track", I worked with David Recordon, Paul Querna, and Justin Erenkrantz (thanks!) to get all of the Heraldry committer accounts created and Jira accounts setup. That process has been dragging out and it was one of my big goals to get that unstuck so that we can get going. That work paid off handsomely, because a bunch of code showed up in SVN on Wednesday. So now we can get on to the business of getting the community going.
I also talked about Heraldry in the Incubator Fast Track, a set of lightning talks focused on projects that are currently in the ASF Incubator. This is the first time that I've attended / participated -- I'm not sure if this was done at ApacheCon Europe this year or not. It's the kind of thing that just obviously makes sense, and you wonder afterwards why it took so long. The session took up two session lengths, and there still wasn't room for everyone who wanted to participate. I heard the best quote of the conference during this track. It was during one of the web services talks, and the presenter described the WS-* stack of web services protocols as the "WS Death Star".
I attended Sally Khudairi's media training tutorial for an afternoon. I've been interested in getting some kind of media training for a while now, so I jumped at the chance to get in on this one. This was really "basic" media training, which focused on speaking to people, understanding how much information that you (as a technical person) are throwing at a journalist or analyst, and a bit about the world of a journalist or analyst. Sally kept it very interactive and experiential, which I really appreciated. She was able to get Michael Cote from the Redmonk analyst firm to come and do mock press briefings with us, which was great. I've been a follower of the Redmonk blogs for quite some time, and it was great to meet Cote. He and I had several good conversations during the course of the Con.
Brian Moseley from OSAF did a great job talking about Cosmo. When we submitted the presentation earlier in the year, it was directly applicable to Apache since we were using Jackrabbit as our storage engine for Cosmo. Unfortunately, since then we've had to replace Jackrabbit with a Hibernate based storage layer, so the relationship to Apache projects was not as obvious. Nonetheless, there was a decent turnout (especially for the first talk on the last day), and people asked engaging questions.
On the human/social track, I participated (as usual) in the PGP key signing (don't worry folks, cabot will be filling up your mailboxes soon). This was a little depressing for me. Before my laptop was stolen this year, I had one of the most highly cross-signed keys in the foundation, including signatures with/from people who only attended a single ApacheCon. Having to revoke that key and start over was one of the most bitter pills to swallow on the laptop scene.
The photography walkabout/BOF never happened -- the biggest cause for this was that sunset was around 7pm, and this year the social scene at ApacheCon was really active. During the conference proper there was at least one event (sometimes two) every night. Wednesday night was the keysigning, which I couldn't miss, and Thursday there was the Lightning Lottery Talks, which are a must see. So we ended up with nothing. That doesn't mean that there wasn't a lot of shooting going on. I saw a good number of SLR's and lots of point and shoots. The active social scene provided lots of photo opportunities as well. In fact, this year, most of my shots are from the social activities and not the conference. There are only so many photos that you can take of people sitting in a room listening to someone talk -- same goes for the exhibit halls. In addition, I wanted to do a mini photography project showing various ASF folks in a more human setting. So as we made our way up and down Sixth Street each night, there were plenty of opportunities to shoot, and to interact with other shooters. Torsten Curdt took a bunch of really nice photos and Andrew Savory was around a lot with his Rebel XT. I met Debbie Moynihan of IONA when I noticed a camera strap with "EOS Digital" hanging out of her handbag - another Rebel XT.
Several people have asked me about my shooting at the show, so this next bit is for them. I shot a total of 733(!) frames and posted 159 of those. That includes test shots that I took to figure out the exposure for some of the club/party shots. The whole set of photos is here (Leo, I remembered to change the license this time). Thanks to Ken Coar for annotating the shots of his amazing lightning talk.
- Get my new PGP key cross signed
- Make some good progress on Heraldry stuff
- Hook up with some Abdera folks
- Talk to David about mod_sparql
- Add my FOAF to the committers info
- Go to sk's media training tutorial
- Do a photography walkabout - the Austin bat bridge is on my list
This weekend I had some stuff I wanted to write (on paper) and couldn't find any space on the desk in my office. That pushed me over the edge, and I spent Saturday afternoon cleaning my office instead of doing that work. So I was amused to read Kathy Sierra's post about her new office, and James Duncan Davidson's post about the Aeron that he bought when he went solo.
I've been working at home since January of 2001, and I am fortunate to have a dedicated office. When I worked in the Valley, I worked at many companies that believed in hardwalled offices: Taligent, Apple, the IBM Cupertino office. I can easily say that my home office is the nicest office that I've ever had. I have good office furniture, a small (too small to sleep on) sofa, doors to the outside/deck, and nice views of trees. But my office can't touch Kathy's trailer -- sorry no pictures, because I still didn't finish cleaning it.
Like James, I bought a nice office chair when I set up the home office. We had Aerons at the previous job, but even though they were trendy, I didn't like them. Being small, I found them to be cold and relatively uncomfortable. Having had repetitive stress injuries (tendonitis), I knew that things like correct sitting posture and so forth could make the difference between being able to work and being in pain. There wasn't a question in my mind that I needed a good chair, the only question was what it would be, since Aerons were out. In the end, I got a Steelcase Leap Chair, which is adjustable in all the right ways. It doesn't have the brand recognition of the Aeron, but for me it is far more comfortable.
A few bits on Aperture 1.5
I'm glad to see Apple addressing the issues around the Library system. I wasn't particularly bothered by this, but I know that a lot of other people were. I do think that the changes in 1.5 will make it easier for me to do things like write projects (or parts of projects) to removable media. I've just been using a second hard drive as a vault volume, which works well, but doesn't help with off site backup.
The new edge sharpen and color tools are nice -- I've come to realize that I am going to want some selective editing tools -- the kinds of things that you can do in Photoshop with masking layers and such. I probably won't want to do this to every photo, but there are some photos where I probably will want to be able to apply such treatments. The more that I learn about photography, the less adamant I get about doing adjustments to pictures. It turns out that lots of things have been done to pictures via filter, darkroom or other techniques over the years. Alain Briot has an interesting essay on these and related matters.
I'm also pleased to see that Apple has taken steps to integrate Aperture with iLife/iWork and the rest of OS X. I've been using some of my top rated pictures as screensaver images, and the new support is welcome, although ideally, I'd be able to use an album or smart album as the source for the screensaver (right now you can only use projects).
My favorite two improvements in Aperture 1.5 are the performance boosts, and the plugin API. The last time I saw James Duncan Davidson, we were swapping Aperture experiences, and we both sort of agreed that all that Apple would have to do for a decent 2.0 would be to fix the performance. Performance of 1.1.x was okay, but not super snappy, and I usually had to quit any RAM hogging applications before I could really crank up Aperture. No longer. It took Aperture over a day and a half to make all the previews for the contents of my photo library, but I was still able to use my machine. Going in and out of full screen mode is much faster, and other operations appear noticeably faster as well.
I sort of lied about the plugin API -- the actual improvement is that Fraser Speirs has done an Aperture version of his FlickrExport. Getting stuff up onto Flickr has been a pain for me ever since I got the MacBook and stopped using iPhoto. While Fraser hasn't yet hit all the items on my Flickr uploader wishlist, he has done some things that I didn't think to put on the list, like adding photos to a pool (now let me do more than one...). I'm not sure how some of the things on my wishlist would work as an export plugin, particularly scheduled/batch uploading, but being able to upload from Aperture is going to keep me happy for a while.
Here's the first photo I uploaded using FlickrExport for Aperture:
Huge thanks to Adriaan for putting out a universal version of ecto 2.4.1. Don't get me wrong, Rosetta is great, but I'd prefer to keep my whole system native.
This year's ApacheCon US starts in less than a week in Austin, Texas. In addition to the regular ApacheCon activities, I'm interested in organizing a photo walkabout, similar to the ones that James Duncan Davidson has started doing at the conferences that he's attending. So if you're interested, or you know good spots to shoot in Austin, leave a comment or send me mail -- please say whether you are around for the tutorial days.
In the comments to my post about IronPython and JRuby there were some comments about C libraries for Python or Ruby which would be unusable in a CLR or JVM based implementation. This is correct, of course, and it is a problem for people trying to port software across implementations.
Earlier this week, Joel Spolsky made some comments about Ruby performance which triggered a bunch of posts, including a lesson from Avi on the 20 year old technique of in-line method caching. David Heinemeier Hansson weighed in with a post titled "Outsourcing the performance-intensive functions", where he argues that one of the benefits of scripting languages is that you can "cheat" by calling functions written in some other language.
Of course, that capability isn't limited to scripting languages. Languages like Smalltalk, Lisp, and Dylan have foreign function interfaces that let them talk to C code, and SWIG, which is a favorite tool for binding C libraries to scripting languages, also works for those languages. Reusing existing C code is a fine and worthwhile thing to do.
I don't agree, however, that users of dynamic languages should just agree to outsource high performance functions to C. The whole point of using a dynamic language is developer productivity, and that should be the case for performance critical code as well. And it's not like this is an impossible task either. There are implementations of dynamic languages which are very efficient, and applying those techniques to "scripting languages" is a worthwhile endeavor, which is being pursued by folks like the PyPy team. As Avi also points out, the StrongTalk VM has now been open sourced, which may make it easier for language implementors to adopt some of the rich body of work that has been done on dynamic language performance.
Will there be cases where even the most advanced implementation techniques won't yield enough performance? Sure. That's why C compilers have a feature called in-line assembly code. But you rarely see it used. Having to rewrite my performance critical dynamic language code in C should be a rarity. The better the VMs get, the rarer those occasions will be, and that's a good thing. Let's not throw up our hands and say "yeah, you're right, we're slow, but it doesn't matter because we can cheat".
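Avi's lesson on in-line method caching can be illustrated with a toy sketch (the class and method names here are invented for illustration, and a real VM does this in compiled code, not in Python): a call site remembers the class of the last receiver and the method it resolved to, and skips the full method lookup whenever the class matches.

```python
class InlineCache:
    """Toy monomorphic in-line cache: remember the last receiver class
    and the resolved method, and take a fast path on a class hit."""
    def __init__(self, selector):
        self.selector = selector
        self.cached_class = None
        self.cached_method = None
        self.lookups = 0  # number of full (slow) method lookups performed

    def call(self, receiver, *args):
        cls = type(receiver)
        if cls is not self.cached_class:  # cache miss: do the slow lookup
            self.cached_method = getattr(cls, self.selector)
            self.cached_class = cls
            self.lookups += 1
        # cache hit: reuse the previously resolved method
        return self.cached_method(receiver, *args)

class Dog:
    def speak(self):
        return "woof"

site = InlineCache("speak")
results = [site.call(Dog()) for _ in range(3)]
print(results, site.lookups)  # three calls, but only one slow lookup
```

The payoff is that most call sites in real programs only ever see one receiver class, so the slow lookup happens once and every later call is cheap.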
Several days ago I alluded (obscurely) to the possibility of 8 core Mac Pro's based on an upcoming quad-core Xeon. Tuesday, Anandtech demonstrated that the existing Mac Pro hardware is capable of the feat (memory bus speed notwithstanding). Their benchmarks also show that many applications are not able to exploit 4 cores very well, never mind 8. Now, where did I leave that Erlang disk image....
Today marks a major milestone for Mike Jones and myself.
Microsoft announced a new initiative that I hope goes a long way towards making life easier for all of us working together on identity cross-industry.
It’s called the Open Specification Promise (OSP). The goal was to find the simplest, clearest way of assuring that the broadest possible audience of developers could implement specifications without worrying about intellectual property issues - in other words a simplified method of sharing “technical assets”. It’s still a legal document, although a very simple one, so adjust your spectacles:
This is a big step for Microsoft, and so far the details look good. The OSP covers a raft of web services specs, including a few that are important for digital identity. The promise extends not just to spec implementors but down the distribution chain, which is essential to being open source friendly, and there's no registration or notification of Microsoft required.
In the coming days, I am sure more legally savvy folks will look the document over, but so far Larry Rosen and Mark Webbink [Deputy General Counsel at Redhat] believe (via the Microsoft OSP FAQ) that the OSP is compatible with FOSS community requirements.
People that are aware of OSAF are usually aware of Chandler, not as many are aware of Cosmo. Cosmo started its life as the sharing server for Chandler, but over time Cosmo is going to bring quite a few ideas from Chandler into a web based UI. Our goal is to have both rich desktop client and rich web client access to your Chandler data, so that you have a choice of whichever interface appeals to you the most. The Cosmo project is much younger than Chandler, so it is going to take some time to reach that goal.
Several weeks ago, we found ourselves in need of a new manager for the OSAF engineers working on the Cosmo project, and I agreed to take over those responsibilities. Lots of people who read this blog have talked to me about Chandler in the past, so I wanted you to be aware of what is happening with me and Chandler. You can keep talking to me (and any other contributor to the Chandler project) about Chandler, but you can also talk to me about Cosmo and stuff in the web space.
I also wanted to make you aware that we have two openings for people to work full time on Cosmo. The last time I posted about jobs at OSAF, we got PJE, who has helped tremendously on Chandler, so here I am again. Please use the link if you are interested.
As always, if you are looking to stay up to date on what is happening with Chandler and Cosmo, you should subscribe to one or more of our mailing lists.
I think that these announcements are very significant and should be welcomed by people in both the Python and Ruby communities, because I believe that Microsoft and Sun's support of these languages will make it much easier to persuade people to look at Python and Ruby. Today people's biases are still against dynamic languages as a whole, as opposed to particular languages, so I think that getting "corporate legitimacy" for either Ruby or Python helps both.
IronPython is already faster than CPython, and JRuby appears to be headed in a similar direction, although we won't actually know until JRuby beats one of the C-based VMs. There is a huge amount of effort being expended on the performance of the JVM and CLR implementations, and if that effort starts to benefit Ruby and Python users, then I think that is a good thing too.
I've read some postings speculating on Microsoft and Sun anointing either Python or Ruby over the other, and/or over all other dynamic languages. I don't believe that this is the case. At OSCON in 2003, I attended a BOF organized by Microsoft people who were interested in improving support for dynamic languages on the CLR. If I recall, many of the "major" dynamic languages were represented. Also I know that Microsoft has been talking to folks like John Lam and others who are working on getting Ruby onto the CLR. As for Sun, JSR-223 is aimed at all scripting languages, Sun accepted a JSR for Groovy, and Tim Bray (who helped the JRuby thing get done) also helped organize a meeting at Sun for lots of dynamic language folks. I think that in part, IronPython and JRuby got picked up because the people involved were willing to work with the companies involved.
Other commentary has focused on whether or not Sun or Microsoft is ahead of/behind the other in this area. I suppose this makes sense if you are a partisan of one language over another. It's probably more true if you look at Python, since IronPython's baseline for comparisons is Python 2.3, while Jython is still catching up to Python 2.2. Overall, I think that we are still early in this game, and that neither side has an insurmountable lead over the other. If you look at the pace of VM support, I think that it's not so one-sided. Yes, Microsoft has been at this longer, but they also seem to have a longer cycle time to pick stuff up, since the pace of CLR improvements is gated by releases of Windows. Yes, you can download new versions of the CLR, but that makes deployment a harder deal. Sun still has to get its extensions specified, much less implemented in the JVM, and the cycles on the JVM are also long, but I also think that the window for broad adoption of dynamic languages still has not arrived, so both companies still have time, which also blunts the potential advantage of being first.
I'm happy to see all this going on, but the CLR stuff is far away from me, since my primary platform doesn't really have good CLR support. There's Mono, but it doesn't seem to be getting much uptake on OS X. I am basing this on the amount of buzz and/or actual Mac apps being developed on Mono, not on actual statistics, and I am sure that Miguel will be quick to disprove me with facts... I can at least see a world where I might use something like JRuby or Jython, since I have done a bunch of Java in previous lives.
These announcements also create some interesting points for observation. Here are some things that I am going to be keeping an eye on as these projects march forward:
- Community building around the implementation - I will feel most comfortable if these language implementations are community driven, and not vendor driven. I know from listening to Jim at PyCons that this is a goal, and the JRuby guys have been very clear about this as well. The recent buzz about these two projects gives them that PR bump that might allow them to draw more people into their communities. It will be interesting to see if they can convert attention into participation.
- Performance - The IronPython team has shown that they can beat the performance of CPython. The JRuby folks have yet to do that, and both the Python and Ruby communities have higher performance VM implementations underway. This situation reminds me a lot of the situation with x86, Alpha, Sparc, and PowerPC, where you had different architectural approaches which were supposed to produce performance benefits. But in the end, large amounts of money, process technology and non-architectural considerations produced an outcome that was different from what you might have expected by just analyzing the processor architectures.
- Velocity - Having people who are working full time on these implementations is going to make a difference in the velocity of these projects. The question is how much, and at what expense versus creating a sustainable community?
- Tooling - Much has been made about the JRuby folks being chartered to work on tooling in some way. There's been speculation about NetBeans versus Eclipse, and there are also other Ruby IDE's. I haven't heard much about tooling on the CLR side, but it seems plausible that you could see Visual Studio support for IronPython and/or one of the CLR Rubys should people at Microsoft decide it was worthwhile.
In the end, I think that having languages like Python and Ruby be "legitimized" by the recognition of big industry players makes things easier for me. It gives me one more argument to use when talking to people, which I hope reduces the amount of time I have to spend trying to convince people of the merits. That leaves me more time to work in a language that I like. Then again, we have Erlang, Scala, and Io just around the corner...
Michael McCracken is pondering the merits of laptops
I’ve been thinking of how I’d work if I didn’t have a laptop. One thing’s for sure: I wouldn’t spend as much time rubbing my neck while waiting for builds, for a couple of reasons.
I’m beginning to wonder if a laptop is really any good at all, let alone necessary. Wouldn’t I rather not carry that thing around all the time? Should my hands really sweat when my computer is working hard? Doesn’t having a laptop just give me an excuse to pretend I’ll be able keep working “later”, even though that never really works? Does anyone really gain more productivity from working at a coffee shop than they would using a fast desktop computer?
For years, I have wanted a laptop. It dates all the way back to the Apple PowerBook Duo days. I've always wanted to have one machine, which had everything in it, which could be with me at all times, and which could take advantage of the environment that I found myself in.
Laptops have always lagged behind the performance of desktops, and for a long time this kept me off of them as a primary machine. When I started at OSAF, I needed a laptop because I was going to be traveling, and I switched back to the Mac, which meant that the laptop was my primary machine, although I frequently wished for a desktop machine for performance reasons. I was eagerly looking forward to the Mac on Intel announcements, because I believed that the gap between the desktop and laptop Intel processors was much smaller than the gap had been on PowerPC. For most things, this has turned out to be true. iPulse tells me that there are very few times when I am CPU bound, and I am on the slowest MacBook Pro configuration. Instead, I'm finding that lots of the time that I spend waiting is due to lots of paging/swapping, for which the solution ought to be "more RAM". Unfortunately, I already have 2G of RAM in the machine, and that's all you can get. I've talked to many people who also would like to drop more RAM into their MacBook Pro's. The other area where performance is a problem is video card performance, because Aperture relies heavily on video card performance and photo manipulation has become my number one performance limited application.
I could probably also get some more responsiveness by installing a 7200RPM disk in the machine (mine is a 5400), but then you have a different problem. I want to take everything with me on my laptop (although having a laptop stolen definitely gives you second thoughts about the wisdom of this idea). The problem is that laptop hard disks are just not big enough, and taking a faster drive means less capacity, hence the stack of external 7200RPM Firewire drives.
Lastly, there's the issue of taking advantage of the environment. Most of the time, my laptop is tethered to a large external display and keyboard. I occasionally "undock" it and use it around the house, but I don't do it as much as I'd like to, because once I "undock", I have to spend a ton of time putting the windows into some usable state again. I wrote some AppleScripts to help manage this problem, but it's still annoying enough that I avoid doing it unless I have to go somewhere with the machine. It's quite likely that I'd go mad if I actually had to commute every day.
So when you stack all those things up, a desktop, especially the new Mac Pros, starts to look appealing again. Even more so when you ponder the Xeon version of Kentsfield.