Off to a horrible start (http://www.openstreetmap.org/browse/changeset/11638818), and I'm not sure if it's my fault or not (user error or technical issue?). Uploading set 1 just stopped at chunk 46 of 50 (chunk size 1000; too low?). It was going slowly but ticking along until #46, which I let try to upload until the changeset's idle timeout closed it. Now all the nodes were uploaded but only a few thousand of the ways.

I tried to revert the changeset, or at least download the objects and salvage some of the import, but no changeset downloads are working (http://www.openstreetmap.org/api/0.6/changeset/11638818/download). Finally I was able to grab everything from Overpass (http://overpass-api.de/api/xapi?*[@user=seattle-buildings][@meta]), but now the challenge is filtering out everything that has already been imported from what I have left of this set, about 1800 ways, and trying to merge them.

So I got everything back into QGIS and deleted all polygons that intersect the successful upload. That leaves me with the unique polygons from my set that were not uploaded. Now the problem is connecting the existing nodes that did upload to these polygons. This sounds simple enough, but I can't find anything that preserves that data, the osm_id and whatever tags are associated with it; everything seems to treat one node like any other node, discarding any possible data. JOSM isn't any help here either: I tried it out and it just wants to upload new nodes on top of the old ones.
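One way the reconnection step could be scripted, as a sketch: since the nodes that made it up and the leftover polygon corners sit at the same coordinates, you can match them by (lat, lon) and rewrite the leftover ways' node refs to the real OSM IDs, so a re-upload reuses the existing nodes instead of stacking new ones on top. This is just an illustration, not how QGIS or JOSM handle it; the rounding precision and the two helper functions are my own assumptions.

```python
# Sketch: reuse already-uploaded node IDs by matching coordinates.
# Inputs are plain OSM XML strings (e.g. the Overpass dump and the
# leftover set); the precision of 7 decimals is an assumption.
import xml.etree.ElementTree as ET

def coord_index(osm_xml, precision=7):
    """Map rounded (lat, lon) -> real OSM node id from the uploaded dump."""
    index = {}
    for node in ET.fromstring(osm_xml).iter("node"):
        key = (round(float(node.get("lat")), precision),
               round(float(node.get("lon")), precision))
        index[key] = node.get("id")
    return index

def remap_nodes(leftover_xml, index, precision=7):
    """Rewrite placeholder node ids in the leftover set to existing OSM
    ids wherever a node at the same coordinates was already uploaded."""
    root = ET.fromstring(leftover_xml)
    id_map = {}
    for node in root.iter("node"):
        key = (round(float(node.get("lat")), precision),
               round(float(node.get("lon")), precision))
        if key in index:
            id_map[node.get("id")] = index[key]
            node.set("id", index[key])
    # Fix up the way membership references to point at the real ids.
    for nd in root.iter("nd"):
        if nd.get("ref") in id_map:
            nd.set("ref", id_map[nd.get("ref")])
    return ET.tostring(root, encoding="unicode")
```

The remapped nodes would still need their version attributes before the API accepts them as modifications, but the coordinate-keyed lookup is the part none of the GUI tools seem to preserve.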
I think what I'm left doing is deleting all the unclosed nodes (19300) and using my resulting QGIS selection of unique polygons (the ones that didn't make it in the upload) to try again. The original set was ~49000 nodes, or 4700 buildings, so it wasn't a complete wash (2988 got up fine), but it's not something I want to repeat. I think I'm going to do smaller imports next time; I'll probably just split my existing sets in half, to about 25000 objects each.
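The splitting could also be done mechanically rather than by hand, so each upload stays small and self-contained. A minimal sketch, assuming plain OSM XML input; the chunk size is arbitrary, and a real run would want the ways' tags and node order preserved exactly as exported:

```python
# Sketch: split one big OSM file into smaller self-contained uploads.
# Each output chunk carries every node its ways reference, so a failed
# chunk only loses that chunk, not half the set. Chunk size is a guess.
import xml.etree.ElementTree as ET

def split_osm(osm_xml, ways_per_chunk=500):
    """Yield OSM XML strings, each with <= ways_per_chunk ways plus
    exactly the nodes those ways reference."""
    root = ET.fromstring(osm_xml)
    nodes = {n.get("id"): n for n in root.iter("node")}
    ways = list(root.iter("way"))
    for i in range(0, len(ways), ways_per_chunk):
        chunk = ET.Element("osm", version="0.6")
        batch = ways[i:i + ways_per_chunk]
        # Collect only the nodes this batch of ways actually uses.
        needed = {nd.get("ref") for w in batch for nd in w.iter("nd")}
        for ref in needed:
            if ref in nodes:
                chunk.append(nodes[ref])
        chunk.extend(batch)
        yield ET.tostring(chunk, encoding="unicode")
```

Buildings that share a node would end up with that node duplicated across chunks, so a dedup pass (or chunking along non-adjacent areas) would still be needed, but it beats babysitting a 50-chunk upload.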
I'm going to sleep on it to try and come up with a better solution and get cracking on this in the morning.