I have some computer/video game stuff that I don’t want anymore. I’d feel wrong if I threw it away and I don’t really feel right selling it since I got it so cheaply. So, I am giving this stuff away. If you or someone you know wants it let me know. You only have to provide money for shipping and handling.
Multiple Sets of SUN Cables, Keyboards, and SUN Mice
19″ SUN Monitor
19″ SUN Monitor
NeXTStep Turbo Color Slab
17″ NeXTStep Monitor
NeXTStep Cables / Keyboard / Mouse
Handspring Visor Deluxe
Sony MiniDisc Walkman MZ-R70
Sony Playstation (Original)
Of course if you WANT you can pay for them but I just want them to have a home.
Update as of 2003/11/18: You may notice that all the SparcStation IPXs are taken. If you have an interest in one, I direct your attention to Adam Keys:
I’ve got a Sparcstation IPX with monitor, keyboard, mouse and ethernet transceiver that I’m looking to find a new home for. Having expended your supply, if you get anyone looking for one, could you send them my way? I would greatly appreciate that, and so would the local landfill.
His contact info is on his blog.
Matt May describes why the iPod mini is priced perfectly and purposely.
As for the huge gap between the 4GB mini and the 15GB iPod for $50 more, that’s a selling point. Here’s the inner monologue they’re looking for in users: “$249 for the iPod mini. That’s not so bad. It’s only $50 more than this flash-based player. But then, if I’m going to spend $249 for the mini, I could just spend another $50 and get 15GB, if I trade up in size. But then, that one doesn’t have the remote and dock, and that’ll cost me another $78 if I want it. And for that much, I might as well buy the 20GB iPod for $399.” And there you have it. Filling that gap in the price spectrum opens up a whole new set of users who walk in hoping to spend $200, and walk out spending $400. If you’re ready to hate Apple for this technique, again, look around. Everyone practices this approach, from the leather cases on the handhelds to the 12″ sauté pan to the rack that goes with the grill.
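The ladder Matt May describes is really just a chain of small deltas. Here is a quick sketch of it; the $199 flash-player price is inferred from “only $50 more,” and everything else is quoted above:

```python
# The upsell ladder from the quote above, step by step.
# ($199 for the flash player is an inference from "only $50 more".)
ladder = [
    ("flash player", 199),
    ("iPod mini (4GB)", 249),
    ("iPod 15GB", 299),
    ("iPod 15GB + dock/remote", 299 + 78),
    ("iPod 20GB", 399),
]

# Each step on its own looks small; the sum is a $200 jump.
for (name_a, price_a), (name_b, price_b) in zip(ladder, ladder[1:]):
    print(f"{name_a} -> {name_b}: +${price_b - price_a}")
```

No single step is bigger than $78, which is exactly the point of the inner monologue.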
RPGamer has new screenshots from the Dragon Quest V remake for PSX.
Aaron Swartz on downloading music.
Downloading may be illegal. But 60 million people used Napster and only 50 million voted for Bush or Gore. We live in a democracy. If the people want to share files then the law should be changed to let them.
And there’s a fair way to change it. A Harvard professor found that a $60/yr. charge for broadband users would make up for all lost revenues. The government would give it to the affected artists and, in return, make downloading legal, sparking easier-to-use systems and more shared music. The artists get more money and you get more music. What’s unethical about that?
Michael Lucas-Smith points out the benefit and intelligence behind finding numbers to back up claims.
I put in italics the methods that would require a spill object. Only 0.37% of methods in the system will spill! Clearly this indicates that the technique will work even on a system with 8 registers like an x86.
Another possible metric would be how often said methods are called. Suppose there are 30k methods with no arguments that have a 1% likelihood of being called, but 10 methods with six arguments (and thus a spill object) that have a 50% likelihood of being called. Obviously these numbers are made up, but it’s a consideration.
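That weighting is easy to make concrete. A minimal sketch using the made-up numbers above: even though spilling methods are a tiny fraction of the static count, they can be a much larger fraction of the dynamic call mix.

```python
# Weight each method group by how often it is actually called,
# instead of just counting what fraction of methods would spill.
def expected_spill_rate(methods):
    """methods: list of (count, call_probability, spills) tuples."""
    total_calls = sum(count * p for count, p, _ in methods)
    spill_calls = sum(count * p for count, p, spills in methods if spills)
    return spill_calls / total_calls

# The hypothetical numbers from the text above.
methods = [
    (30_000, 0.01, False),  # 30k no-argument methods, 1% chance of being called
    (10, 0.50, True),       # 10 six-argument methods (spill), 50% chance
]

# Statically only ~0.03% of these methods spill, but dynamically:
print(f"{expected_spill_rate(methods):.2%} of calls hit a spill object")  # 1.64%
```

A fifty-fold gap between the static and dynamic numbers, from the same made-up inputs, which is why the metric matters.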
Dave Winer on data formats.
I’ve heard it said that “He who is most liberal in the formats he accepts wins.” I say a couple of things in response. 1. He who says that is probably getting consulting money from a BigCo. And 2. He who has the most happy users wins (and goes to heaven). Users love features, and developers who spend time supporting the most arcane buggy formats aren’t spending time on features that delight users. Formats are there to get the job done, not be pure, not be wonderful, just work, and shut up.
Tim Jarrett responds with additions.
So what happens to a Web browser that only supports XHTML? Or a newsreader that only supports Atom?
Dave’s post is right: it’s about the users and their needs. But sometimes the users need software that is format agnostic to get the job done.
Tom Coates wonders how best to use Wikis for content management.
Now one of the problems with using Wikis generally is that they don’t lend themselves to the creation of clear sectionalised navigation. Nor do they naturally find it easy to use graphic design, colour or layout differently on separate pages to communicate either your context or your location in the site. That’s not to say that Wikis are broken, of course, just that the particularly networked rather than hierarchical model of navigation that they lend themselves towards isn’t suitable for all kinds of public-facing sites (the same could be said of the one-size-fits-all design of the pages). This would clearly be a problem. Wikis sacrifice that kind of functionality on the whole in order to gain advantages in other areas (i.e. collaborative site generation and maintenance). Without those advantages, you’d simply be left with an inferior product.
Richard links to Richard Stokes on Social Networks.
Wait. It gets even worse. Studies show that social networks follow a “power law” distribution. The “average” person has relatively few contacts (say, fewer than 10 other people with whom they regularly associate). However, the “social entrepreneur” has hundreds. The effective functioning of any social network depends on the presence of these social entrepreneurs. These are the same people that refused to participate on Friendster and other social networks. Simply put – there was no incentive for them to participate. I have hundreds of contacts, but the value I derive from introducing people far exceeds any advantage I would gain by entering them into a system somewhere. Moreover, the value I derive from my hard-earned network is sacrificed for the “good” of the system. If anyone, or even just my associates, can find out everyone who I know and everything I know about them, I am no longer indispensable. What would possess me to give away my personal “competitive advantage”?
Andy Duncan is coming to Boston. He should come to a Berkman Thursday Meeting.
Michael O’Connor Clarke writes about why he blogs and how he learned to stop worrying about who might see it.
I know other employed bloggers who’ve wrestled with the disclosure boundaries at the intersection of work and blogging. And I’ve walked a fuzzy line in my own work life – steering an ill-defined course between two sets of work colleagues: those people who know about and read the blog, and the other set of people I’d rather didn’t find out.
The little moment of epiphany came from realising that, of course, the people who wouldn’t grok the blog most likely fall into the category of people who just wouldn’t grok Michael anyway, so what the hell do I care what they think? I’d much rather be working with and for people who grok blogs and the whole “blogosphere” (*ack*) in general, or are open-minded enough to be receptive to this way of working with the world.
The me on this blog is the real deal – not too different from the full and unfiltered offline me.
Grant Henninger writes that the Blogosphere is organizing the web in the way the Semantic Web hopes to, but never will, because it’s humans who care.
Blogs are creating neighborhoods on the web. Links bring sites closer together, making the sites neighbors. Much of the time, the neighborhoods will revolve around a specific topic or specific viewpoint. There are many small groups of blogs that link heavily within themselves—these are the neighborhoods—but there are also blogs that act as major thoroughfares, bridging the various neighborhoods.
It is these neighborhoods of blogs discussing a particular topic that are organizing the information on the web.
Jeremy Bowers points to the new Snopes.com RSS Feed.
Snopes is required reading for people on the Internet. If it sounds too good to be true, if it’s a little too conveniently in favor of (or against) your favorite ideological position, or if it’s a little too horrifying to be true, check it on Snopes before you get upset, or worse, spread the claims further. Because you’ll meet someone who has nearly the entire site indexed in their head, and there’s little that’s more damaging to your point than to have it conclusively rebutted on Snopes.
Liz Lawley links to Mike Axelrod on software development traditions.
However, now we take heed. We realize the strength of the application is so strong that we have created a strong boundary around it that may be becoming too rough and thick. Up until now this has been a benefit. Perhaps our application was becoming like a great tree with rough bark to protect it. But perhaps our new scheme does not integrate well with existing or future planned applications. We pause and reflect. We look to the larger world of applications in our shop and we talk and collaborate. We join our minds with others to find harmony and balance in our workplace. We consider the risks and advantages of having rough edges between our applications. We work together to share a templating scheme. Our applications are already deeply interlocked by shared databases and a shared user population, and now the interlocking deepens with shared tools. Our entire suite of web applications is growing stronger. Our designers work with common tools. Retraining is minimized; collaboration is maximized.
Michael Feldman points to the Sim Mafia story in the Boston Globe.
They lay down the law inside the Sims Online, a multiplayer computer game run by Electronic Arts. “Our job is to basically take those complaints from the normal citizens of the game, who can’t go to EA because EA won’t do anything about it, and do an eye-for-an-eye for them,” Chase said.
If a player feels his character, or Sim, is being ill treated and can get no justice from the game operators at EA, he can arrange to have bad things happen to rival players, by approaching a local Mafia and ponying up some of the game’s currency, called simoleons.
Dave Winer writes about respecting standards so they do their job–making all our lives easier.
So while the user may want to stop taking the drug, the doctor would be irresponsible in prescribing it if he or she felt it was likely that you wouldn’t finish taking it. The same idea applies to reading bad XML files. If my code reads them, then yours has to too. Eventually the XML stops working. The reason we have XML is so we don’t have to scrape HTML. If the XML becomes as hard to deal with as the HTML, then we might as well just scrape the HTML.
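The strict-parsing side of Dave’s argument is easy to demonstrate. A minimal sketch with an invented, malformed feed snippet; Python’s standard XML parser simply refuses it, which is what keeps the format honest:

```python
# A strict parser rejects the malformed feed outright. A "liberal" reader
# would silently guess at the author's intent, and every other reader
# would then have to guess the same way. (Feed snippet is invented.)
import xml.etree.ElementTree as ET

bad_rss = "<rss><channel><title>My Feed</channel></rss>"  # <title> never closed

try:
    ET.fromstring(bad_rss)
except ET.ParseError as e:
    print(f"rejected: {e}")
```

Once one popular tool accepts the bad feed, it effectively becomes part of the format, and everyone else has to reverse-engineer the forgiveness.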
Shimon Rura writes about how great Share Your OPML is.
Feeds.scripting.com is going to be the Amazon of RSS.
This has a lot of potential:
- Significantly better personal recommendations should be possible. Amazon doesn’t do any hard math to figure out “people who bought this also bought that” listings. Feeds.scripting.com could do something similar to provide concise listings of recommended feeds to its users.
- We could learn from users. People might be willing to tell you what feeds they like the most, from which we could form more confident recommendations.
John Gruber writes about the HP/Apple iPod deal.
I think that what Apple has built, with their three-pronged iPod/iTunes/iTMS, is an even higher-level platform, specifically for digital music (and perhaps in the future, digital media).
Platforms exist for different reasons. Low level platforms are perfect for certain tasks. If you run a high-traffic commercial web site, your servers might run nothing but low-level software. Just Linux or BSD, along with software built for Unix-like platforms: Apache, Perl, PHP, MySQL. No GUIs. No desktops. Just servers running low-level software, very quickly, and very reliably.
Matt May is my favourite OpenWeb™ advocate right now: “Stop trying to figure out how to get the horse back into the barn, and start learning to deal with the nature of the Web.”
Flash developers are paranoid about their ActionScript, guarding their new-found skills anywhere they can charge for it: in classes, books, consultancies, wherever. On the other hand, the Web, you’ll remember, was built on code sharing (if not outright theft). View Source has always been, and hopefully will always be, a part of every browser. If Macromedia really wanted Flash to have caught on, they’d have made it that way in Flash early on. Every site a tool, every script a lesson.
I can see that Flash developers want to keep a hold of their intellectual property, but Jesus, man, you’re not curing cancer or accurately predicting the stock market. You’re making things move around and blink on a screen. Get over yourselves. Anyone who applies themselves can figure out and replicate what you do.
Jorrit Wiersma comes up with a great feature for a web browser.
One of the things that’s been bothering me lately is the way that web browsers don’t really allow you to surf their caches. What I mean is this: let’s say that I’m at work and I see an interesting article on the Nature website for example. Only, I don’t have time to read the whole article, so I just read the abstract. Now suppose that I happen to be traveling in the train later that day and I’m looking for something to do. I remember the article and would like to read it. I know that the thing is probably still stored on my laptop because my web browser probably cached it, so what I would like to do is just, say, reenter the URL and then my browser will give me a notice that only the cached version is available (since I don’t have an internet connection at the time) and I say okay and I can read my article. But to my knowledge there are no browsers that do this. Why not? It seems like a reasonable and useful feature.
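The feature Jorrit describes is easy to sketch. A toy model, with a plain dict standing in for the browser’s disk cache and an invented URL:

```python
# Fall back to the cached copy when the network is unavailable,
# and tell the user that's what happened.
cache = {}

def fetch_online(url):
    raise ConnectionError("no internet connection")  # we're on the train

def browse(url, online=True):
    if online:
        page = fetch_online(url)
        cache[url] = page  # normal browsing fills the cache as a side effect
        return page
    if url in cache:
        print("Note: offline; showing cached copy.")
        return cache[url]
    raise LookupError(f"{url} is not cached")

# Earlier at work, the article was read (and therefore cached):
cache["https://nature.com/article"] = "<html>the article...</html>"

# Later on the train, the same URL still works:
print(browse("https://nature.com/article", online=False))
```

The pieces already exist in every browser; what’s missing is only the “okay, show me the stale copy” prompt.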
The Onion’s top story: “Scientists Abandon AI Project After Seeing The Matrix.”
“I saw Revolutions with my 12-year-old son Eric,” Markovitch said. “He saw the look of worry on my face and said, ‘Dad, don’t be scared. It’s only make-believe.’ I had to tell him, ‘No, son, it’s what your father does for a living.’”
“After watching Captain Mifune blast away in his robotic battle exoskeleton as hordes of relentless Sentinels swarmed the dock screaming in battle-frenzied rage, I could no longer put my career before the future of mankind,” Markovitch continued. “Those poor, brave children of Zion—their annoying tolerance of rave culture notwithstanding—did not deserve that horrible fate.”
Bob Stepno describes the difference between blogs and RSS.
Q: Why bother with an RSS aggregator if you can read the same content through the browser? Why have the same wine in two bottles?
A: Wine is the wrong metaphor. Wine improves with age. News is more like fish. RSS syndication is a matter of getting the freshest news.
Matt May got back from a vacation:
More importantly, I didn’t touch a keyboard. (In fact, the keyboard I do use most commonly was on vacation in Cupertino, where it got a shiny new LCD and bezel, so it wouldn’t have been of much use to me anyway.) I didn’t get hammered by MyDoom hourly, didn’t join in on the navel-gazing surrounding Dean’s seeming collapse after New Hampshire, and I didn’t read six dozen blogs and their related links every day. It was pretty neat, and highly recommended.
Dan Wood on the dating aspect of Orkut.
One caveat: There are still “dating” aspects to the service, which you can use if you are so inclined. But don’t make the same mistake I did, which was to stumble upon the personal profiles of people I’ve known for years in a business and computer sense, only to find out what their turn-ons are and what’s in their bedroom. Yuck! I’m scarred for life!
Blogumentary discusses an interesting aspect of Orkster.
You know what though? It’s fun. Lori and I wondered whether Dean Blogger-in-Chief Mathew Gross was gay or straight. Tall, cute, shaved head, earring, and Lori wasn’t picking up any “straight guy vibes” when we interviewed him. Well whaddya know, he’s married! Straight as the day is long. God Bless America, and good night.
Danah‘s friend Jesse comments on the difference between a blog and a journal.
Soon enough, some of these “bloggers” started writing more and more interesting commentary, writing more and linking less. Other “bloggers,” liking what they saw, would link to that interesting piece of commentary, and all of a sudden, what were traditionally (I use that term incredibly liberally) linkers were actually becoming the content providers themselves. The format remained the same, however — generally a long page of content, listed by date, and mostly shorter, bite-sized pieces of content — the web given the MTV treatment.
Soon, the content became the primary focus and the links slipped away, though the feel is still distinct from the journal, in my mind. The journal has been, and always will be, a personal account of the journaler’s life. The blog can be written personally as well, and from a very personal point-of-view, but it will definitely be written about a particular item of public consciousness (even if that population that is actively interested in a particular story is very small).
AS18907458907 writes about anonymous writers and praises them.
And in this one respect, perhaps, lies some nobility in anonymous writing. Unlike some talented writers who have turned blogging into high-profile gigs, how can an anonymous writer ever capitalize on any success? Benjamin Franklin, a longtime writer himself, calls his anonymous brethren “a bunch of misguided souls who don’t understand that the whole point of writing is self-promotion.”
Commentary: I find this whole criticism of anonymity bizarre. Anonymous writers are no less accountable than any other form of writer. If I don’t like what Atrios (a popular anonymous blogger mentioned in the original piece) says, I can respond and I can attack Atrios. The only thing I can’t do is scare his children, or try to lose him his job. (The latter, I’ve heard, is common retaliation for Internet enemies.) There’s no reason why folks like Atrios should have to put their jobs and families on the line to share their thoughts. Salon should be cheering for the increasing popularity of anonymity, as it only helps thoughtful debate — it does nothing to hurt it.
NI3 on a Y2K-ish bug in EMC software.
This entire incident was missed by the media in the storage industry. How could EMC miss a bug like this given the experiences of Y2K? This seriously calls into question quality control within EMC’s software development organization. If EMC aspires to be a serious software company, issues like this can’t be missed. This is a serious setback for their software aspirations.
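NI3 doesn’t say exactly what EMC’s bug was, so this is only the generic class of failure the “Y2K-ish” label suggests: code that keeps a two-digit year and mis-orders dates once the century rolls over.

```python
# Hypothetical illustration of the generic Y2K failure class, not EMC's
# actual bug: truncating years to two digits breaks ordering across 2000.
def year_2digit(year):
    return year % 100  # 1999 -> 99, 2004 -> 4

# Sorting by the truncated year puts 2004 *before* 1999:
years = sorted([1999, 2004], key=year_2digit)
print(years)  # [2004, 1999]
```

Any date comparison, expiry check, or log rotation built on the truncated value fails the same way, which is why these bugs were supposed to have been hunted down once and for all in 1999.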