Just like the title sounds: an African decides to visit Greenland and lives among the Eskimos. I read it mostly to find out about the character of the man who wrote it and made the journey.
Here’s a really good blog post by Jon Aquino on what makes Rails great. The short answer: it’s the directory structure. The best part of his post is an example Model/View/Controller application in PHP, laid out with the same familiar convention-over-configuration structure, showing that Ruby isn’t what makes Rails great.
Azul Systems has a brilliant product that solves a major problem — Java’s slowness. They basically have a server appliance that has 24 or 48 cores on a single chip. Only it’s not really a server appliance. It’s a coprocessor. You configure your J2EE application server to point to the Azul Appliance as an external JVM.
While cool, starting at $89K apiece, it really misses the boat. Most Java systems I know of run on dual-CPU boxes behind load balancers, maybe with a clustered cache, and the database is almost always the bottleneck. I guess the real selling point is the lack of garbage-collection churn (making up for flaws in the app server, JVM, and business code), thanks to the hardware-accelerated GC. All your I/O, etc., still has to be done off-board, so you still need the app servers.
They claim reductions of 20-95%, and boast what seems like a really cool potential benefit: power reduction. A 768-core appliance draws 3500W of electricity. Which is great, but I don’t know how many “real” cores that is. Performance-wise, I’m going to guess it’s something like 1/10th, and getting rid of 36 dual-CPU systems sucking 250W each (plus cooling and floor space) is a big gain. But I’m going to guess that even if you cut your app servers by 80%, you’ve still got your web and DB layers running full on, and storage still needs to be handled (especially if you’re holding everything in memory). So the power savings is in the hundreds, not hundreds of thousands, of dollars per year.
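A back-of-the-envelope sketch of that power arithmetic, using the figures above (the 36-server count, the 250 W draw, the 80% cut, and the electricity rate are all rough guesses, not Azul’s numbers):

```python
# Rough power comparison: one 3500 W Azul appliance vs. retiring
# ~80% of 36 dual-CPU app servers drawing ~250 W each.
# The $0.10/kWh utility rate is an assumed round number.

appliance_watts = 3500
servers_replaced = int(36 * 0.8)   # ~28 of 36 app servers retired
server_watts = 250
price_per_kwh = 0.10               # assumed rate
hours_per_year = 24 * 365

net_watts_saved = servers_replaced * server_watts - appliance_watts
kwh_saved = net_watts_saved * hours_per_year / 1000
dollars_saved = kwh_saved * price_per_kwh

print(f"Net power saved: {net_watts_saved} W")
print(f"Annual savings:  ${dollars_saved:,.0f}")
```

Even before counting cooling, the electricity savings under these assumptions come out in the low thousands of dollars a year, nowhere near six figures.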
I suspect that the real benefit of a fat Azul system is that you’ve got a honking load of memory directly attached to your JVM, so you can cache everything (twice), and that’s where your performance boost comes from.
Bill Burke, of JBoss (now Red Hat), has an interesting post on his blog about Radiohead’s experimental release of their new album In Rainbows. They bypassed the record label and offered the album directly on the internet, with no DRM. There is an “order form” where purchasers name the amount (in pounds and pence) they’re willing to pay, if anything, and are then given a link to the download in zipped MP3 format.
He quotes from a BBC article with some interesting statistics (and asinine conclusions). He rightly points out that if open source projects were failures as wildly successful as Radiohead’s album, they’d be very profitable indeed. Of the 1.2 million visitors to the site (no figure is given for the number of downloads), only 38% chose to pay. But the average price paid (including, I assume, those who chose to pay zero) was over $6.
The break-even point was said to be $1.50 per download, which can’t be accurate unless someone is getting a royalty. The download costs themselves couldn’t amount to more than a few pennies apiece, and even granting that overhead (development, hosting, etc.) might bring the total to $1.50 per download, it scales: at some volume (say 1.2 million), $1.50 per download covers the total cost including overhead, and past that point the cost per download drops by at least 95%.
I do, however, suspect that some distributor agreed to make it available for something like a $1.50-per-download royalty, which is three times Apple’s 49-cent royalty for iTunes. Fair enough, given the high risk of a completely DRM-free download. So Radiohead may have a fixed cost of $1.50 per download, with some distributor taking the gamble of needing some number of downloads (say 1.2 million) to break even, though in a competitive marketplace I’m sure others could do it for a smaller royalty at a smaller volume and still profit.
Regardless, no artist on earth has ever gotten $4.50 per record, or its rough equivalent, 25% of gross sales. If you assume 1 in 10 visitors downloaded the album (and there is no reason to believe it isn’t higher, since there was only a brief form to fill out and an unencumbered download), then they’ve managed a cool half million in sales in the first month. Not too shabby, and probably in line with their most popular albums.
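The numbers above fit together in a quick sketch (the $6 average payment and $1.50 per-download cost are the post’s figures; the 1-in-10 download rate is the assumption made above):

```python
# Back-of-the-envelope Radiohead revenue, per the figures in the post.
# Assumptions: 1.2M site visitors, 1 in 10 download, $6 average paid,
# $1.50 break-even cost per download.

visitors = 1_200_000
download_rate = 0.10            # assumed: 1 in 10 visitors download
avg_paid = 6.00                 # average payment per download
cost_per_download = 1.50        # quoted break-even figure

downloads = int(visitors * download_rate)
net_per_download = avg_paid - cost_per_download
net_revenue = downloads * net_per_download

print(f"{downloads:,} downloads netting ${net_per_download:.2f} each")
print(f"First-month net: ${net_revenue:,.0f}")
```

That $4.50 net per download, across 120,000 downloads, is where the “cool half million” comes from.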
I suspect as the marketplace opens up that the price will go down, and perhaps even the biggest bands with the most loyal followings will only net a few hundred thousand dollars in such a scheme, and that the “labels” that promote them, taking the volume risk, will be happy with even a 10% margin of that.
A popular band will still make millions on concerts and merchandise: physical items, where the rules of scale apply, unlike a half million or so copies of electronic bits reproduced nearly infinitely.
A less popular band, claims the BBC, could not make a profit. But that’s ridiculous. If you only manage to sell a few hundred copies of your CD at $10 (assume $5.50 in production costs), a few hundred more at $4.50 don’t hurt your bottom line. And if you’re not making a lot of money off of CD sales, millions of downloads and pirated copies can only increase potential merchandise and concert sales. In any case, it couldn’t hurt.
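The small-band case works out neatly under the numbers assumed above (the count of 300 is my illustrative stand-in for “a few hundred”):

```python
# Small-band margin comparison, per the assumptions in the post:
# CDs sold at $10 with $5.50 production cost, downloads netting $4.50.
# The 300-copy count is an illustrative stand-in for "a few hundred".

cd_copies, cd_price, cd_cost = 300, 10.00, 5.50
download_copies, download_net = 300, 4.50

cd_profit = cd_copies * (cd_price - cd_cost)
download_profit = download_copies * download_net

print(f"CD profit:       ${cd_profit:,.0f}")
print(f"Download profit: ${download_profit:,.0f}")
```

Under these assumptions the margins are identical: $10 minus $5.50 in production costs is the same $4.50 a download nets, so every extra download is pure upside for a band that sells few CDs.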
On a final note, the BBC published a quote claiming that Radiohead (and their fans) are stealing from the record label, because “Radiohead have been bankrolled by their former label for the last 15 years.” And while Radiohead might not have gotten famous without the marketing of the label, the label did very well by them, and the opposite is obviously true: Radiohead has been bankrolling the label for most of the last 15 years.
It’s a tricky business, selling QA. Testing doesn’t actually add anything to your product. It subtracts from it. It takes time and resources (read $$$), and what does it give you?
I was going to say testing doesn’t add anything, it takes away something: bugs. But that’s not completely accurate either. It identifies potential bugs. Fixing them is left as an exercise. I always hated getting my homework back with the red check marks, but not knowing why I got it wrong.
Everyone knows what testing does, they know the benefits, but they don’t always see that it outweighs the cost.
Take the term Quality Assurance. It’s selling you peace of mind, “assuring quality”, but it doesn’t really do that. It can only demonstrate the lack of quality, and you can’t prove a negative. Or something like that.
Sounds like you’re buying a warranty. But how can someone guarantee something they didn’t build? It makes a bit more sense if the same entity developing the product is also testing it. Then, they are “guaranteeing” their own product.
But what if you’re external? You can’t answer for the product. You can, however, let the developers know they did a bad job, and how. But nobody likes to be told their work isn’t good enough. Nobody thinks they make mistakes. And those that do are probably wise enough to think that, since they know they’re fallible, they don’t need someone else to tell them.
But separating testing from development makes sense. Everyone realizes that. Any developer can tell you how they overlooked obvious flaws that the first two-year-old who uses their product can point out. (I think it helps to have testers with an inner two-year-old.)
Does separating the testing organization from the developing organization make sense? I don’t know. There is the appeal of dispassionate analysis, which will more than likely lead to impassioned defense. But is there the sense of ownership?
I guess it’s the same problem with consultants of any kind. They want to protect their reputation and get continued business. So do I.
But I think the most value in external QA comes not from the testing, but from the analysis. It’s got to be a partnership. Tools and training are the bread and butter of consulting, and I don’t think that’s a bad thing. Reports and opinions are the jelly, and I don’t think that’s bad either.
Organizations just need to remember what they’re buying. If you don’t want a PowerPoint slide showing the synergies of transitional analysis, and you just want someone to improve your bottom line, don’t hire a consultant. He doesn’t care about your bottom line. But he might be able to tell you what he thinks might improve it, if you do it yourself.
And QA is just trying to improve your bottom line. Or rather, your top line. Because it makes the bottom line move up, no doubt. But quality is a differentiator that should be able to translate into higher prices, more customers, or better brand loyalty.
In a commodity, quality is perhaps the only differentiator, besides price. It’s better to compete on quality.
Android is Google’s new Java mobile phone SDK. The idea is a common API and a sort of cross-compiler, to make it possible to port applications across a variety of different mobile devices.
It looks like they use a custom branch of Apache’s Harmony Java class libraries, an Eclipse plugin with a mobile device emulator, and a tool that takes Java bytecode and transforms it into bytecode for the device’s Dalvik VM. I’m not really into mobile development, but of course it’d be fun if I ever had the time. And Android is the big buzz on the internet this week.
The application market for mobile devices is potentially huge, but still strangled by the proprietary hold carriers and manufacturers have. Every so often a handheld Linux device comes out, gets press, and then fizzles, because no matter how cool it is, if you can’t make phone calls or get music and games from the distributors, it’s really just a toy.
Android is an attempt to work around this from the other direction, via the application development software. If it works, manufacturers and carriers will end up more concerned about not breaking Android compatibility, in order to keep access to the open marketplace of applications, the same way hardware manufacturers make sure their hardware works with Windows. Even if not, it just takes updating one “driver” to gain access to all the Android apps.
Yesterday was Thanksgiving, and we had a great one in Ecuador. I spent the morning futzing with my website design while Kelsey slaved away in the kitchen preparing the traditional American feast (Ecuadorian style). Before she got up, I had started some dough for bread, but I neglected it while writing updates in my other blogs. There was an overflowing bowl of yeast to deal with by the time I ran out of internet.
So I made the bread, and Kelsey made everything else: turkey, stuffing, mashed potatoes, fruit salad, pumpkin pie (baked Wednesday night at Roseros), and all the trimmings. We had carrot (zanahoria) and cucumber (pepino) sticks for appetizers, with ranch and the leftover ají that Kelsey had made a couple days before.
We took a break around 10:30 for breakfast: hard-shell tacos with homemade salsa and guacamole.
I cooked the rolls in the microwave (something I’ve gotten good at) and Kelsey cooked the turkey (pavo) and stuffing in it too. It all turned out great.
Among the things to be thankful for: our wireless internet connection from TVCable was installed yesterday. I got to go up on the roof, then watch them drill a hole through the brick wall, nearly shattering the dining room windows.
It’s not the best connection, but so far (sometimes) it’s been more reliable. I’m enjoying the long cord we asked to be left behind, because it’s now strung across the dining room, down the hall and into the living room — so I can get wifi with a view. Even with my WRT54G juiced up to 100mW, I can’t get connected (with 2 bars) 20 feet away.
I’m working on menus today, and have found a good CSS menu at brothercake.com, but I can’t seem to get it to behave well (in IE7.) Altering the presentation (position, color, etc.) is a hack too.
Now if only we’d get that washer and dryer that Salgosa (the landlord company) has been promising.