electric heating

There is something disturbing about electric heating, especially if the electricity used is generated by thermodynamic processes, such as burning coal or natural gas. Lots of heat is sacrificed at the power plant to be able to turn a fraction of the input energy into this superb high-quality electricity that can do mechanical work. Then at the other end, an electric heater just turns it right back into waste heat without doing anything else useful.

But something useful can be done. Instead of straight heating elements, I suggest a server farm. Maybe box it up like an electric heater, sell the CPU cycles back while still getting the same heat out.
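The intuition above can be put in rough numbers. Here is a minimal sketch, with all efficiency figures being illustrative assumptions (typical ballpark values, not measurements), comparing burning fuel directly for heat against burning it at a thermal power plant and heating resistively at the other end:

```python
# Rough energy accounting for heating with electricity from a thermal plant.
# All efficiency numbers below are illustrative assumptions, not measurements.

FUEL_ENERGY = 1.0          # 1 unit of chemical energy in natural gas

# Path A: burn the gas in a home furnace (assumed ~90% efficient).
furnace_heat = FUEL_ENERGY * 0.90

# Path B: burn the gas at a power plant (~40% thermal efficiency),
# transmit the electricity (~95%), then run a resistive heater (100%).
plant_electricity = FUEL_ENERGY * 0.40
delivered = plant_electricity * 0.95
resistive_heat = delivered * 1.00

print(f"furnace heat per unit fuel:   {furnace_heat:.2f}")
print(f"resistive heat per unit fuel: {resistive_heat:.2f}")

# A server-farm "heater" turns the same delivered electricity into the
# same amount of heat, but the CPU cycles do useful work along the way.
server_heat = delivered * 1.00
assert abs(server_heat - resistive_heat) < 1e-9
```

Under these assumed numbers, the electric path delivers less than half the heat per unit of fuel, which is exactly why wasting the electricity on a bare heating element stings; the server-farm version at least recovers the compute.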

Cell synthesized

Scientists create synthetic cell, version 1.0 | [paper]

Our synthetic genomic approach stands in sharp contrast to a variety of other approaches to genome engineering that modify natural genomes by introducing multiple insertions, substitutions, or deletions (18–22). This work provides a proof of principle for producing cells based upon genome sequences designed in the computer. DNA sequencing of a cellular genome allows storage of the genetic instructions for life as a digital file.

This seems significant, equivalent to booting up the first stored-program computer.

Scientists who were not involved in the study are cautioning that the new species is not a truly synthetic life form because its genome was put into an existing cell.

That’s sour grapes, because the fraction of original cell cytoplasm decays to zero exponentially fast in the number of replications, a point well made in the paper. It’s only needed for booting. What’s more useful to know is how much of the 1.08 Mbp genome consists of existing genes. The paper says it’s a close copy of M. mycoides:

The synthetic genome described in this paper has only limited modifications from the naturally occurring M. mycoides genome. However, the approach we have developed should be applicable to the synthesis and transplantation of more.

The next step will be a basic cell with a minimal genome, a bare-bones cell OS, if you will. Then, on to synthetic functions. Pretty soon we’ll have cell APIs, fancy-pants programming frameworks, and bugs and viruses. I mean real ones.
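The dilution argument about the original cytoplasm is worth a one-line sketch. Assuming the cell’s contents split roughly evenly at each division and no original material is synthesized anew (a simplification for illustration):

```python
# Sketch: fraction of the original (non-synthetic) cytoplasm remaining
# after n cell divisions, assuming contents split roughly evenly and no
# original material is made anew. Purely illustrative.

def original_fraction(n_divisions: int) -> float:
    """Each division halves the share of inherited original cytoplasm."""
    return 0.5 ** n_divisions

for n in (1, 10, 30):
    print(f"after {n:2d} divisions: {original_fraction(n):.2e}")
```

After a few dozen divisions the inherited fraction is below one part per billion, i.e. effectively nothing of the booting cell remains.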


Must have been a piece of work by MIT students… Windsor Street near Mass. Ave.

employer of last resort

I’ve been reading about these “job guarantee” or “employer of last resort” theories, and they seem interesting. Basically, the government provides employment at a wage some delta below the legal minimum for anyone who is unemployed, thereby absorbing excess labor into the public sector. The advantages are clear: it is certainly better than welfare, and it doesn’t compete with the private sector.

Why is this? Let’s reason about it in a crude way.
(Read the article)
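Here is a toy version of the no-competition claim, with entirely made-up numbers (the demand curve, wages, and labor force size are all invented for illustration): because the public job pays below the minimum wage, any private employer paying the legal minimum can always outbid it, yet nobody is left idle.

```python
# Toy model of an "employer of last resort" floor, under made-up numbers.
# Private-sector labor demand falls with the wage; the government hires
# anyone unemployed at (minimum wage - delta), so nobody is idle and the
# private sector can always outbid the public job by paying the minimum.

MIN_WAGE = 10.0
DELTA = 1.0
ELR_WAGE = MIN_WAGE - DELTA

LABOR_FORCE = 1000

def private_demand(wage: float) -> int:
    """Hypothetical downward-sloping private demand for labor."""
    return max(0, int(1500 - 80 * wage))

private_jobs = min(LABOR_FORCE, private_demand(MIN_WAGE))
public_jobs = LABOR_FORCE - private_jobs   # everyone else takes the ELR job

print(f"private jobs at ${MIN_WAGE:.2f}: {private_jobs}")
print(f"ELR jobs at ${ELR_WAGE:.2f}:      {public_jobs}")
```

The point of the sketch is only the mechanism: the ELR wage sits strictly below the private floor, so it soaks up slack without bidding workers away.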

Toward a multi-input interface

I was recently shown this collaborative editor called Gobby (there are others) and was reminded of an idea I’ve been toying with for a long time.

A lot of work these days goes into novel human-computer interfaces (think coffee table displays, networked whiteboards, etc.) and gestures (think various touch responses like multi-touch zooming), but in my opinion (and metaphor), this work is like lightly scratching the skin when there is a deep, deep itch.
(Read the article)

human deCAPTCHA service

About 10 years ago, when .NET was put out as a strategy for providing software services over the internet, I jokingly quipped that across the API boundary it’s just a black box: you’ll never know whether actual humans are answering your queries and passing the data back, as long as it’s in the right format! Imagine if “Jeeves” were an actual person answering what you “Ask”ed, or if some translation tool were actually human-powered. It’d be pretty cool in a horrible way, like a reverse Turing test. Students of the humanities may even call it “dehumanizing,” but we’re all evil engineers, so who cares… hohoho

But guess what, this is an actual industry. Here is an article that shows, to my great amazement, that people have not only taken this concept to heart, using low-wage humans to solve the real problem (for spammers and hackers) of automated CAPTCHA decoding, but have even managed to load-balance the whole thing to reduce latency! What … the hell!
(Read the article)
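The load-balancing part is the mundanely impressive bit. A minimal sketch of how such a dispatcher might work, assuming a shortest-expected-wait policy (the solver names, job names, and per-job time are all invented for illustration, not taken from the article):

```python
# Sketch of load balancing a human-solver pool: each incoming CAPTCHA job
# goes to whichever "solver" will be free soonest, to keep worst-case
# latency down. All names and numbers here are invented.

import heapq

def dispatch(jobs, solvers, seconds_per_job=10):
    """Assign each job to the solver who will be free soonest.

    Returns {solver: [job, ...]}. A min-heap keyed on each solver's
    next-free time implements shortest-expected-wait balancing.
    """
    heap = [(0, name) for name in solvers]     # (next_free_time, solver)
    heapq.heapify(heap)
    assignment = {name: [] for name in solvers}
    for job in jobs:
        free_at, name = heapq.heappop(heap)    # soonest-free solver
        assignment[name].append(job)
        heapq.heappush(heap, (free_at + seconds_per_job, name))
    return assignment

work = dispatch([f"captcha-{i}" for i in range(7)], ["ann", "bo", "cy"])
print({name: len(jobs) for name, jobs in work.items()})
```

With uniform per-job times this degenerates to round-robin; the heap earns its keep once solvers differ in speed or jobs differ in difficulty.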

Simonyi’s comment

Technology Review had a discussion of Charles Simonyi’s intentional programming work. Such a frustrating article: it said almost nothing, and certainly too little about how intentional programming is implemented. Most of the article was just saying, yes, cross-layer design is always difficult, abstractions leak (sometimes intentionally, to preserve performance), and so on and so forth.

No, the reason this was an interesting article was the biographical part, and within the biographical part was a nugget of a quote by Simonyi:

Simonyi was born in Budapest in 1948. The son of a physics professor, he fell in love at 15 with his first computer, a mammoth Russian Ural II in Hungary’s Central Statistical Office. By the 1960s, the Ural, which received its instructions through cash-register-style keys and had a roomful of vacuum tubes to perform calculations, would already have been a relic anywhere else in the world. But Hungary’s Communist leaders were trying to use the Soviet castoff to optimize rail and trucking schedules. The Ural wasn’t up to the task: there was no way to input real-time data on shipments. “It was completely hopeless,” Simonyi recalls. “It could have been done very easily by supply and demand. Unfortunately, that was politically incorrect.”

An apt observation that the free market is but a machine of humans running an optimization algorithm.
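The observation can be made literal with a toy tâtonnement loop: a price adjusting in proportion to excess demand is a distributed optimization algorithm searching for the market-clearing point. The linear supply and demand curves below are arbitrary illustrations, not anything from the article:

```python
# Simonyi's point, as a toy: a price feeling its way to where supply
# meets demand, i.e. the market as an optimization algorithm.
# The linear supply/demand curves are arbitrary illustrations.

def demand(p): return 100 - 2 * p
def supply(p): return 10 + 3 * p

price = 1.0
for _ in range(200):                      # tatonnement: iterate the price
    excess = demand(price) - supply(price)
    price += 0.05 * excess                # raise price when demand exceeds supply

print(f"equilibrium price ~ {price:.2f}")  # analytic answer: 90/5 = 18
```

Each iteration shrinks the gap by a constant factor here, so the loop converges to the equilibrium a central planner with no shipment data could never find; with the step size too large, the same loop oscillates, which is its own economics lesson.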