Category Archives: computers

MySQL Conference 2009

I spent most of this week at the MySQL Conference. I was giving a talk on Python and MySQL, which came about as a favor to some folks in the marketing department at Sun. This was a fair exchange, because I’ve been curious about the MySQL community. MySQL is at the other end of the open source spectrum from the ASF, so I wanted to see for myself what it was like. The MySQL conference is the MySQL community’s equivalent of ApacheCon. There is a mix of talks, some aimed at users of MySQL, and others aimed at developers of MySQL or related products.

There is a sizeable ecosystem around MySQL. There are extension patches from Google and Percona, which came up in many of the talks that I attended. There’s MariaDB, Monty’s community-oriented fork of MySQL. There’s the Drizzle project, which looks really interesting. There’s lots going on, and I got the feeling that there’s real innovation happening in various parts of the ecosystem. It feels energetic and fun, which is what I would expect of a big open source community, despite it being a long way from Apache or Python.

I attended all kinds of talks. I went to a number of talks about analyzing performance and monitoring, including 3 talks on DTrace. Sadly, these talks were sparsely attended, which is a symptom of some of the problems that Solaris/OpenSolaris has been having. What was interesting was that all of these talks were given by former MySQL employees, and all of them were genuinely enthusiastic about DTrace. The best of these talks was Domas Mituzas’ Deep-inspecting MySQL with DTrace, where he showed some very cool MySQL specific DTrace scripts. If DTrace got ported to Linux as a result of the Oracle/Sun acquisition, that would be a good outcome for the world.

I also went to several cloud computing talks, where the topic was how to run MySQL in the cloud. These were pretty interesting, because it turns out that there is a bunch of stuff that you need to do and be aware of when running the current versions of MySQL in a cloud environment. I hope that the Drizzle folks are aware of some of these issues and are able to solve some of these problems, so that running in the cloud is pretty simple.

Here are my 3 favorite talks:

  • Don MacAskill’s The SmugMug Tale – I’m a photo guy, but not a SmugMug customer. Don’s been tweeting his experiences using Amazon Web Services to build SmugMug, and he’s been blogging his experiences with ZFS, the Sun Storage 7000, and so forth. I’ve been following his stuff for a while, so this was mostly a chance to see an in-person rendering of an online personality.
  • One talk that I didn’t expect to enjoy was Mark Madsen’s Using Open Source BI in the Real World. I’m not really a Business Intelligence guy per se, but the world of blogging and twittering and so forth starts to make you attuned to the usefulness of various kinds of analytics. Anyone building any kind of non-trivial web software needs analytics capabilities, so having open source solutions for this is good. It probably also didn’t hurt that I talked to several BI vendors on the expo floor the night before. What I really enjoyed about the talk was the beginning section on how to be an analyst and how to think about and project the future. I’m given to a bit of that now and then, so I found this part of the talk pretty interesting.
  • The best talk that I went to was Yoshinori Matsunobu’s Mastering the Art of Indexing. The speaker pretty much covered all the kinds of indexing in MySQL, which indexes work best in which conditions (both for selecting and inserting — there were some interesting surprises for insert), and even tested the differences between hard disks and solid state drives. Maybe I loved this talk because it brought back all the research that I did in query optimization back in graduate school. But that wouldn’t explain all the other people in the room, which was standing room only.

Based on what I saw this week, I’m not in any way worried about the future of MySQL.

Evernote and other applications that are getting a workout

It’s been a while since I reported on the state of my Macintosh. Here are a few apps that I’ve been using a lot recently.

Evernote

I’ve had Evernote installed for quite some time, but I didn’t really start using it until after I got my iPhone. So I was interested to read Ars Technica’s report that 57% of Evernote’s users are using the iPhone client. Evernote is a great example of the “rich application architecture of the future”. Evernote’s family of applications includes desktop clients for Mac OS X and Windows, a web application, and mobile clients, most notably on the iPhone. All of these pieces work together to make a great integrated solution. This is the kind of ecosystem that we were building around Chandler, although we never got to the mobile part, and as the Evernote data suggests, we would have been fine just creating an iPhone client. Of course, hindsight is 20/20.

Apple helped Evernote tremendously by providing a barely functional notes application on the iPhone, and then providing no way to sync notes back to a Mac. So the iPhone Evernote client fills a great hole in the iPhone application suite. That got me started using Evernote for information that might need to move back and forth between desktop and device. The next step was using Evernote to take notes at conferences. I used to use Ecto for that, and I would then rewrite my notes into a blog post. But I missed having the raw notes, so I decided that instead of creating a billion drafts in Ecto to hold the raw notes, I would take all the notes in Evernote and then write the posts in Ecto. This, of course, had the added benefit of letting me use other features of Evernote. I definitely think that the Evernote team is doing something that desktop and mobile software developers ought to be paying attention to.

1Passwd

Another good example of this desktop/web/mobile trend is the fantastic 1Passwd password manager for Mac OS X and iPhone. I got 1Passwd as part of a MacUpdate software bundle some time back. It took me quite some time to start using it, because I was happily using Firefox’s built-in password manager. 1Passwd has the advantage of working with Firefox, Safari, and NetNewsWire on my desktop, and it does a much better job than Firefox of dealing with odd web site logins. It does a great job of managing my ridiculous number of passwords. It also has a great password generator built in, which makes it easy to stop the common practice of reusing a few relatively easy-to-remember passwords everywhere, which is just plain bad security. 1Passwd also has an iPhone version, which means that accessing sites from my iPhone is no problem at all either. Great piece of software.
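The idea behind that kind of generator is simple enough to sketch. Here’s a minimal Python illustration (this is not 1Passwd’s actual algorithm, and the alphabet and length are arbitrary choices of mine): draw every character from a cryptographically secure random source so that each site gets its own unique password.

```python
# A toy sketch of per-site password generation; NOT 1Passwd's algorithm.
# Each character is drawn from a cryptographically secure RNG.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def generate_password(length=16):
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# Example: a fresh, never-reused password for each site.
print(generate_password())
```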

PathFinder

The last piece of software is PathFinder, which is PODS (plain old desktop software). PathFinder is a great replacement for the Finder, and the latest version, 5.0, adds a dual pane feature that makes file management tasks much easier. You can also manage sets of tabs. I use this feature to manage projects, by creating a set of tabs for each project. I can then flip a PathFinder pane into exactly the configuration that I want for working on that project. It’s a shame that Apple has been so lackadaisical about improving the Finder. Maybe this will improve with the rewrite of the Finder for Snow Leopard. In the meantime, PathFinder is a good solution for those of us who need a little more than what the Finder provides.

Lazyweb: Virtualization software

I am looking at building a bunch of virtualized machines, and I have no idea what software I should be using.

  1. I want to create and run images on Linux and OS X.
  2. I want those images to be runnable on Linux, OS X, and Solaris (optional).
  3. I want to make images that run Linux, Windows, and Solaris.
  4. I want to run one set of images on a box connected to the internet, and have those images appear as separate machines.
  5. Bonus round: I want there to be some ISP/hosting provider that I can send images to and have them hosted.

I am assuming that my choices are VMware, Parallels, and VirtualBox.

Useful pointers and advice appreciated.

The Sun is going to shine on Python

Today is my first day as a Sun employee.

How?

Tim Bray was one of the first people to respond to my “looking for a job” blog post. I have not written about it much, but I’ve been very impressed with how Sun has handled the JRuby project. Tim told me that Sun was interested in ramping up their support for Python in a similar fashion, and asked if I would be interested in coming to Sun to lead such an effort.

Why?

After a bunch of talking and interviewing and so forth, it turns out that I was very interested. Longtime readers know that I am a dynamic languages guy, going back to the original dynamic language, Lisp. I spent 2.75 of the last 4 years at OSAF working on a big desktop application written in Python (contrary to some recent blog posts, Python was not a factor in the difficulties that we had with Chandler). The prospect of doing something that would help Python was very attractive. However, Sun has been slow to embrace dynamic languages (whether atop the JVM or not), and Sun’s history in open source has been somewhat checkered in my view. So there were some questions that I had to answer for myself before deciding to go to Sun (especially since I had 3 other very good options):

1. Can Sun actually work with an open source community?

It’s no secret that I have not been a fan of Sun’s handling of the open sourcing of Java, and it seems like OpenSolaris is having some governance problems of its own at the moment. However, if you look at the way that JRuby has been handled, you’ll see that there are parts of Sun that are learning how to work with a community, and doing a very good job of it. Sun hired two of the leading JRuby contributors and gave them license to keep doing what they had been doing. The JRuby guys have been well received by the “C” Ruby community and even the CLR/.NET Ruby community. In addition, Sun has been investing in Ruby via support in NetBeans and via some collaborations with the University of Tokyo on the C VM for Ruby. Over the years, I’ve met many people at Sun who understand a collaborative development style. Many of those folks are committers on Apache projects.

2. How serious is Sun about dynamic languages and how deep does that support go?

Sun is (finally?) very serious about this. As part of Sun’s new direction, Sun wants to give developers the ability to use whatever tool sets they want. Ruby, Python, PHP, Java. On or off OpenSolaris. On or off the JVM. There is an official project, John Rose’s DaVinci Machine project, to modify the JVM to support dynamic languages. As far as Python goes, Frank Wierzbicki, the maintainer of Jython, started at Sun last Monday, so there will be at least two of us working on Python related stuff. That includes Jython, Python support for NetBeans, and some other stuff that we haven’t quite figured out yet. We definitely will be looking for things that we can do to support CPython and the Python language as a whole. This is not just about Python on the JVM. Sun will try to make its platforms, OpenSolaris and the JVM, the best place to develop and deploy Python applications. But at the moment that’s a goal and not a reality, so there is lots to do.

What’s Next?

Frank and I will be at PyCon in Chicago in a week or so. One of my goals (besides hooking back up with people, since I missed PyCon last year) will be to sit down and talk to anyone who has ideas about sensible things that Sun could do to help Python. In the meantime, my e-mail address will be <FirstName>.<LastName>@Sun.com

Oh, one more thing. My new job title is “Principal Engineer, Dynamic Languages and Tools”, so expect to see me dinking around with other dynamic language stuff as well.

My thanks to Tim Bray for helping to make this happen.

Update:
It looks like it’s going to take a little longer to get my e-mail address fully operational…
Update 2:
Ok, e-mail is set and ready to go.

On Scoble’s Chandler interview

A couple of days ago, Scoble paid a visit to the OSAF office in San Francisco and did a video interview with Mimi Yin, the product designer for Chandler, and Katie Parlante, the General Manager of OSAF (and my boss). Of course, I heard about how the interview went, but I was curious to see it for myself. Robert and I have talked briefly about Chandler over the years, but not in any detail, and I wanted to know what he thought about what we have done so far. When someone says “I want it”, that’s generally a good indicator, and I was glad to hear that phrase pop out of Robert’s mouth. Also, he asked almost all the questions that I could have wanted him to ask, so if you watch the interview, you’ll get a pretty good idea about some of the most important ideas in Chandler. It should be no surprise that I want to expand/clarify some of the things in the interview, so here goes:

Where’s my Outlook?
If you watch the video, it’s clear that Chandler Desktop today is not very much like Outlook, in the sense that it is not an e-mail centric application. If you believed the Wired-induced hype about Chandler being an Outlook killer, you’re probably disappointed. How did this happen? When we sat down and looked at what people were using PIMs for, how they worked, and what they needed the most support for, we discovered that there was a big need for supporting groups of people working together, as opposed to individuals just managing their own information. That’s why you see an emphasis on sharing and collaborating. A bunch of the infrastructure that we built early on for supporting customized personal information is supporting what you see, so the capability for individual personal information management is still there, but we haven’t focused on those capabilities. Developers take note.

Web-based Chandler
This came across unevenly in different parts of the interview, so I want to make this clear. Chandler Server/Cosmo, which powers our free Chandler Hub service, is a web-based version of Chandler. It doesn’t yet have all the features of the desktop version, but we are getting there, and we plan to get all the way there. Not only that, the back end of Chandler Server can provide you with data in a variety of formats/protocols. We want to make sure that you can get data into and out of Chandler Server as easily as possible.

Edit-Update / Sharing (or turning e-mail into a Wiki)
In the interview, Robert latched onto the edit/update features of Chandler. These are still in a primitive state, but you can see the value of them already. He had a great summary of how it works – “you turn e-mail into a wiki”. Exactly. You can create and share a collection with any number of people, and they can all edit/update items in that collection and see each other’s changes, without groveling through endless e-mail reply chains. At one point in the interview, Mimi said something about e-mail being the hub of people’s usage. Truth of the matter is that e-mail is more like the glue that holds batches of information together. Collections of items with edit/update is a different kind of glue.

Plugins/APIs
Robert asked about a Chandler that could slurp data out of Facebook/Twitter/blogs/blog searches and manage all that stuff via the dashboard (the triaging workspace) and Chandler collections. This is a topic near and dear to my heart, and solving that problem is actually the biggest reason that I came to OSAF to work on Chandler. Personal Information Management is no longer about the tiny set of data types that Outlook typically manages. Today, most of my personal information (by volume) lives in the cloud, so any system that is going to manage that information must be integrated with the cloud.

If you look at the Plugins menu of Chandler Desktop, you will see hints at being able to do what Robert asked for. There are demo-quality (read: proof of concept) plugins to yank data out of Amazon.com wishlists, EVDB/eventful.com calendars, RSS feeds, and Flickr. We had a plugin for grabbing your del.icio.us bookmarks, but it got way out of sync; it wouldn’t be too much work to put it back. All these parcels turn their data into Chandler items, which can then be stuck into collections or managed via the dashboard.
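To give a flavor of what such a plugin does, here’s a schematic Python sketch. To be clear, this is not the real Chandler parcel API, and the feed URL is a placeholder; it just shows the shape of the job: pull entries from a feed and map them onto generic items that a dashboard or collection could manage.

```python
# A schematic sketch of an RSS-style plugin; NOT the real Chandler parcel API.
# It maps feed entries onto generic "items" that a triage dashboard or a
# collection could manage alongside mail, tasks, and events.
import feedparser  # third-party: pip install feedparser

def feed_to_items(url):
    feed = feedparser.parse(url)
    for entry in feed.entries:
        yield {
            "title": entry.get("title", ""),
            "body": entry.get("summary", ""),
            "date": entry.get("published", ""),
            "source": url,
        }

# Example with a placeholder feed URL: each entry becomes a manageable item.
for item in feed_to_items("http://example.com/feed"):
    print(item["title"])
```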

On the server, we’re a bit further behind on data type extensibility. It’s possible (I mean, it’s code, right?), but it’s going to be a bit more difficult to do because of the server environment. The server does provide good access to calendar data via a number of calendar protocols, including webcal, CalDAV, and Atom feeds. In addition, the AJAXy web UI talks to the rest of the server using Atom feeds and AtomPub, so in theory you could implement a different client by using those feeds and AtomPub. I am quite sure that we will be doing more work on data access APIs for Chandler Server in the months ahead. If you have ideas, suggestions, or code, come by the cosmo-dev mailing list or the cosmo IRC channel.
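As a rough illustration of that “different client” idea, here’s a minimal Python sketch that reads the item titles out of a collection’s Atom feed. The URL layout and the use of basic auth are my assumptions for illustration, not the documented Cosmo endpoints.

```python
# A minimal sketch of an alternate Chandler Server client: read one
# collection's Atom feed and list the item titles. The /atom/collection/<uuid>
# path and basic auth are assumptions, not the documented Cosmo API.
import base64
import urllib.request
from xml.etree import ElementTree

ATOM = "{http://www.w3.org/2005/Atom}"

def collection_titles(base_url, uuid, user, password):
    url = f"{base_url}/atom/collection/{uuid}"  # hypothetical URL layout
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:
        feed = ElementTree.parse(resp)
    # Each Atom entry corresponds to one Chandler item in the collection.
    return [e.findtext(ATOM + "title") for e in feed.findall(ATOM + "entry")]
```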

What’s Left to do?
If you watched the interview, you’ll know that there are a bunch of things that Robert asked about which are not there yet. This is not the 1.0 release of Chandler, and there’s plenty to do. At the same time, the desktop and server are done enough that people can use them and developers can get an idea of what we are trying to do and where we are trying to go. In the server project we’ve already had some good contributions from people outside of the OSAF staff (the Hibernate-based storage engine, and the minicalendar in the web UI). We’d (the desktop and server projects) love to see even more people get involved.

To keep up on Chandler happenings, visit the Chandler Project blog.

Twitter in Scala

David Pollak shows a simple Twitter clone written in Scala. Last night was our first Bainbridge reading group meeting on Programming Erlang, so this is timely, as Scala’s actor libraries are modeled after Erlang. Also of interest is the use of David’s lift web framework for Scala, which includes ideas lifted from Seaside, Django, Rails and Erlyweb.

Silverlight and the DLR

Microsoft has announced that it is embedding a version of the CLR into their Silverlight RIA technology. Blogging machine Ryan Stewart had some of the initial details, and Sam Gentile has a good pile of links. The CLR-enabled version of Silverlight will run inside Firefox (both on Windows and OS X) and inside Safari. This is a good step toward cross-platform support, but the omission of Linux, while not surprising, reduces the reach of Silverlight versus Flash or regular AJAX. Also, it appears that there are no Mac development tools for Silverlight, although presumably there are always text editors.

DLR
The most interesting part of the whole business is the Dynamic Language Runtime, which is the project that Jim Hugunin has been working on since he arrived at Microsoft. The DLR currently supports JavaScript, a dynamic version of Visual Basic, IronPython, and IronRuby. John Lam’s work at Microsoft also appears to be paying off. eWeek had three good articles on DLR technology, and all three articles include conversations with Jim and John. It’s nice (and interesting) to see that two people could have such a large impact on Microsoft. The DLR is being made available under a BSD-style license. While I have to give props to Microsoft for choosing an unrestrictive license, I’d point out that a license is not a governance system, and while the DLR might technically be open source, the “Core CLR” definitely is not, and neither is the XAML portion of the Silverlight runtime — no surprise there. I wonder if we will be seeing a port of the DLR on top of Mono. I also wonder if IronRuby can run Rails, although that seems like a weird thing to want to do inside of Silverlight.

Linq
Another part which I find interesting is the inclusion of Linq as part of the Core CLR. I like Linq, and if Microsoft is going to try to define a new platform for inside the browser, I’m happy that they’re including Linq as part of the core.

Impacts
Here are some of the potential impacts of this announcement:

Since Silverlight will include the CLR, it will benefit from the CLR JIT and garbage collector, which, together with Mozilla’s Tamarin, will raise the bar for JavaScript performance in the browser. It’s unclear whether regular AJAX apps running in a Silverlight-enhanced browser would benefit from CLR acceleration of JavaScript. I’m in favor of the browser vendors getting into a JavaScript performance race with each other.

Allowing people to write browser-side applications in multiple languages fragments the technology on the browser side. You could argue that the benefits of IronPython or IronRuby over JavaScript are sufficiently large that such fragmentation is okay, but I’m not so sure that this is a good thing.

If there is significant uptake of IronPython or IronRuby for Silverlight development, that could have interesting impacts on the Python and Ruby communities. The Ruby community is already dealing with a proliferation of different Ruby runtimes, so there probably isn’t much new there other than a change in the mix of adoption of the various runtimes. On the Python side, it’s less clear, since the CPython implementation is the most heavily used.

The inclusion of facilities like Linq will boost the semantic level of the platform running in the browser. Granted, it only does that for Silverlight, but I hope that this puts some pressure on the other players to provide more leverage in the platform. If we are going to be building much richer applications inside the browser, we are going to need all the help that we can get.

So what?
In the end, though, I probably won’t be doing much with Silverlight, for the same reasons that I’ve written about before. The technology has definitely gotten stronger, but the other issues haven’t really changed much: there are no tools for the Mac or Linux, and as far as influencing the technology, you’re just standing outside the Big House, pressing your nose up against the window.

Adobe open sources Flex

Last week while I was in San Francisco, I sat down for an hour with David Wadhwani, the VP of product development for Flex, and Ely Greenfield, one of the Flex architects. After I wrote my original post about open sourcing Flash, I got a note from David asking if I would be willing to spend some time to help him understand the issues that I raised in that post and its follow-ons. This afternoon David called to tell me that Adobe was announcing that it was open sourcing Flex v3. I was especially happy when he said that my posts and our conversation had an impact on his thinking about open source and Flex. There is a press release with the announcement as well as a FAQ on the basics.

The Basics
The basics of the announcement are that Adobe will open source Flex v3, due later this year, under the Mozilla Public License (MPL), which is sensible given that they have already open sourced their Tamarin JavaScript engine via Mozilla. Before that happens, Adobe will make daily builds of Flex available (the source is already available, but daily builds give better visibility). Also, they will open their bug tracker to the public in preparation for the open source version of Flex.

Adobe is taking a slow approach on governance. Unsurprisingly, the initial set of committers will be folks from Adobe, and the governance model is underspecified. Right now, the FAQ says that the schedule and roadmap for Flex will continue to be defined by Adobe. There are stated plans to create a subproject process; subprojects could be managed by people outside Adobe and incorporated into the Flex tree. The full governance model is not yet determined, and will be influenced by feedback and by what actually happens between now and the end of 2007, which is the target for the transition to being a full open source project.

I think that there are likely to be some concerns around use of the Flex trademark. Unlike Java, where (in theory anyway) an open source Java could pass a compatibility test suite and gain access to the trademark, the open source version of Flex cannot be called Flex. It remains to be seen whether this will actually impact participation in the project.

Flex, but Not Flash
This is a good first step for Adobe, but it’s just the first step. The Flash player is not being open sourced at this time, but when I talked with David he told me that Adobe had been telegraphing the fact that they were going to open source Flex for about 20 months, since the opening of Adobe Labs. When I asked him about the Flash player, he said that open sourcing Flex should be viewed as a telegraphing of Adobe’s intentions. Of course, there’s a big difference between intentions and actual follow-through, so we’ll have to wait and see how the Flex project ends up working out.

Bottom Line
Adobe is moving pretty quickly. When I met with David a week and a half ago, I got the impression that he and Ely had decided that they wanted to open source Flex, but hadn’t cleared it with their management chains. A week and a half later, they are making an announcement. As I’ve mentioned, this is just a first step for Adobe, and there are plenty of opportunities for things to go sideways. Nonetheless, I think that Adobe has understood the importance of openness and is taking some initial exploratory steps to do what’s necessary.

If you think that an open source Flex is important, then you should go to the new discussion forum that Adobe is setting up for open source Flex. There are a lot of things which are intentionally unspecified, and there is still lots of time to give Adobe feedback on this move. I know that I’m going to keep giving them feedback for as long as they continue to solicit it.

Update:
Scoble has a video interview that lets you hear some of what I’ve heard from David and Ely.

Everything is dead, except Apple and the Web

Or so it would seem.

A few weeks back, Dare Obasanjo said “Open Source is Dead”. The crux of his argument:

This is why Open Source is dead, as it will cease to be relevant in a world where most consumers of software actually use services as opposed to installing and maintaining software that is “distributed” to them.

If the only valuable property of open source were as a distribution mechanism/channel, I’d be inclined to agree. But open source is a means of production, not just a means of distribution and of routing around lock-in. And of course, his argument applies to all distributed software, not just open source software, which would make Microsoft dead as well.

This would no doubt please Paul Graham, who earlier this month wrote that “Microsoft is dead”, repeating the idea that software delivered via the web is in the process of displacing desktop software. Although for him to announce this in 2007, “to be the first one to call it”, seems somewhat late. Also, he weakens the case for web versus desktop software by tossing Apple into the mix, and the last time I looked, Apple was a desktop software company.

To complete the trifecta, Jeremy Wagstaff [via Marc Orchant] clarified that ‘It’s Not the “Death” of Microsoft, it’s the “Death” of Software’. That doesn’t seem right either, since there’s a lot of software running all those web apps that are killing off everybody else. Of the three prognosticators of doom, his comments resonate the most with me:

We somehow demand less and less from our software, so that we can declare a sort of victory. I love a lot of Web 2.0 apps but I’m not going to kid myself: They do one simple thing well — handle my tasks, say — or they are good at collaboration. They also load more quickly than their offline equivalents. But this is because, overall, they do less. When we want our software to do less quicker, they’re good. Otherwise they’re a pale imitation of more powerful, exciting applications in which we do most of our work.

But all this just proves to me that there has been little real innovation in software in the sense of making programs do more. Web 2.0 has excited us because we lowered our expectations so much. Of course web apps will get better, and one day will deliver the functionality we currently get from desktop software. They may even do more than our desktop applications one day. But isn’t it a tad strange that we think this is all a huge leap forward?

Perhaps it’s a Great Leap Forward.

More thoughts on Ambient Intimacy and Twitter

After several months of Twitter usage, Leisa Reichelt’s characterization of Twitter as Ambient Intimacy still resonates with me. I have some more thoughts on ambient intimacy in the context of Twitter, and I’m going to take them in the reverse order of the catchphrase.

Intimacy

From dictionary.com:
2. a close, familiar, and usually affectionate or loving personal relationship with another person or group.
3. a close association with or detailed knowledge or deep understanding of a place, subject, period of history, etc.

For me, the intimacy comes from the fact that I choose whose Twitter streams to subscribe to, and the fact that the content that people are putting in their Twitter streams tends toward the more personal. So there’s a technology part (subscribe to people) and a social part, the content of the streams.

Ambient

From dictionary.com:
1. of the surrounding area or environment.
2. completely surrounding.

It seems to me that the ambience is largely a function of which modality you use to access your Twitter stream.
I run an odd Twittering configuration (at least I think so). My Twitter following is multi-application and multi-modal: I have the Jabber Twitterbot in my Adium contact list, I’m running Twitterrific, and I use the Twitter web page. My Twitter posting is similarly multimodal, using whichever Twitter input box is closest, as well as a Quicksilver action.

If I use the web page, the degree of ambience is low. I don’t sit there with a Firefox tab focused on the page; usually I go to the page when I am trying to catch up after being away from the computer for a while. I also use it from my cell phone, since a day’s worth of tweets would blow my text messaging plan. If the IM bot didn’t die so much, or if Twitterrific saved an arbitrary amount of history, I probably wouldn’t use the web page at all. When the IM bot was working, I liked it because it showed the full text of all the tweets. Usually I didn’t care about getting the tweets in real time, and most of the time it was annoying to have Adium playing the message-received sound constantly. The only time I really cared about getting tweets in real time was when I was using Twitter as a real-time back channel. At the moment I’m relying on Twitterrific, but I don’t like that I can see only one tweet at a time, with very limited history. It appears that the next version will allow you to see the text of multiple tweets, which would be a big improvement.

Twitter interfaces
In an ideal world, I’d like to have a single app (on my computer, anyway – mobile devices are something else), which would allow me to deal with tweets at a degree of ambience that corresponds to my mental state. I’m not sure that this is possible, although it might be fun to play with some heuristics related to how many messages were received recently, perhaps with some measure of burstiness. That might be interesting or it might turn out to be worthless. I’d like a “shut up for the next 3 hours while I work” type of button — and of course, I want to be able to see what I missed without switching to a different app.
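Just to make that concrete, here’s a toy Python sketch of the kind of heuristic I mean; the window sizes and the interpretation are invented for illustration:

```python
# A toy "ambience" heuristic: compare the recent tweet rate to the longer-term
# baseline rate. The window sizes and thresholds here are invented.
import time

def burstiness(timestamps, now=None, short_window=5 * 60, long_window=60 * 60):
    """Ratio of the short-window message rate to the long-window rate.

    Well above 1.0 suggests a burst (surface tweets more prominently);
    well below 1.0 suggests a lull (stay quiet and batch the updates).
    """
    if now is None:
        now = time.time()
    recent = sum(1 for t in timestamps if now - t <= short_window)
    baseline = sum(1 for t in timestamps if now - t <= long_window)
    if baseline == 0:
        return 0.0
    return (recent / short_window) / (baseline / long_window)
```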

There’s also a set of features unrelated to ambience:

  • Another feature that I’d like is a personal Twitterbuzz, so that I could see what my friends think is important. The problem that I have with a lot of social aggregation services (del.icio.us, digg, and so forth) is that someone else controls the group making the recommendations. I’d like a way to specify that group myself.
  • Something else that would be useful is streamlining the situation where I am conversing with someone — it’s a pain typing @name all the time during those moments when you are using Twitter in an IM-like fashion. Maybe I’d even want to be able to start an IM or Skype session with that person.
  • Quite often I wish that I could search my Twitter stream. A good client would have a way to do that without forcing me to the web page.

Epilog
A major way that I’ve noticed my computing environment changing over the years is the introduction of more and more ambient data of various kinds. Perhaps there’s more understanding to be had by looking at various technological changes through the lens of ambience…