coding block length in action

This article talks about the switch to 4096-byte sectors from the current 512-byte sectors for PC hard disks. This section is interesting:

4096 byte sectors don’t solve the analog problem—signals are getting weaker, and noise is getting stronger, and only reduced densities or some breakthrough in recording technology are going to change that—but it helps substantially with the error-correcting problem. Due to the way error correcting codes work, larger sectors require relatively less error correcting data to protect against the same size errors. A 4096 byte sector is equivalent to eight 512 byte sectors. With 40 bytes per sector for finding sector starts and 40 bytes for error correcting, protecting against 50 error bits, 4096 bytes requires (8 x 512 + 8 x 40 + 8 x 40) = 4736 bytes; 4096 of data, 640 of overhead. The total protection is against 400 error bits (50 bits per sector, eight sectors), though they have to be spread evenly among all the sectors.

With 4096 byte sectors, only one sector start is needed, and to achieve a good level of protection, only 100 bytes of error correcting data are required, for a total of (1 x 4096 + 1 x 40 + 1 x 100) = 4236 bytes; 4096 of data, 140 of overhead. 100 bytes per sector can correct up to 1000 consecutive error bits; for the foreseeable future, this should be “good enough” to achieve the specified error rates. An overhead of just 140 bytes per sector allows about 96% of the disk’s capacity to be used.
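Just to sanity-check the quoted arithmetic, here is a minimal sketch in Python (the layout constants come straight from the quote; a real drive format has more fields than this):

    # Sizes in bytes, as given in the quoted passage.
    DATA = 512   # user data per legacy sector
    SYNC = 40    # start-of-sector marker

    # Eight legacy 512-byte sectors, each with its own marker and 40 bytes of ECC:
    legacy = 8 * (DATA + SYNC + 40)
    print(legacy, 8 * DATA / legacy)      # 4736 bytes, ~86.5% usable

    # One 4096-byte sector: a single marker and 100 bytes of ECC:
    advanced = 8 * DATA + SYNC + 100
    print(advanced, 8 * DATA / advanced)  # 4236 bytes, ~96.7% usable

The numbers agree with the quote: the same 4096 bytes of data cost 640 bytes of overhead in the old format and only 140 in the new one.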

With longer block lengths, the error correction capability generally goes up for the same coding overhead; however, the reality seems rather more complicated than this. First of all, I don’t think every manufacturer uses the same code or coding structure. (They used to just use Reed-Solomon codes, though later they tried concatenating them with LDPC codes, and now I hear some are switching to pure LDPC with iterative decoding.) But even if we assume they use some non-exotic block code, with interleaving to handle bursts, the math still seems very strange: 40 error correction bytes can currently correct only 50 consecutive bits? I think not.
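For what it’s worth, here is the back-of-the-envelope calculation behind my skepticism, assuming a plain Reed-Solomon code over GF(2^8) with no fancy structure (the traditional disk ECC; actual firmware parameters aren’t public):

    def rs_burst_bits(parity_bytes: int) -> int:
        """Longest single error burst (in bits) guaranteed correctable.

        An RS code with p parity symbols corrects t = p // 2 symbol
        errors.  A burst of b bits touches at most ceil((b - 1) / 8) + 1
        consecutive 8-bit symbols, so any burst of up to 8 * (t - 1) + 1
        bits is guaranteed correctable.
        """
        t = parity_bytes // 2          # correctable symbol errors
        return 8 * (t - 1) + 1

    print(rs_burst_bits(40))   # 153 bits, i.e. far more than 50
    print(rs_burst_bits(100))  # 393 bits, i.e. well short of 1000

So under this assumption 40 parity bytes already guarantee bursts of ~150 bits, not 50, while 100 bytes guarantee nowhere near 1000; whatever the article is describing, it doesn’t look like the worst-case guarantee of a single plain block code.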

employer of last resort

I’ve been reading about these “job guarantee” or “employer of last resort” theories, and they seem interesting. Basically, the government provides employment at some delta below the legal minimum wage to anyone who is unemployed, thereby absorbing excess labor into the public sector. The advantages are clear: it is certainly better than welfare, and it doesn’t compete with the private sector.

Why is this? Let’s reason about it in a crude way.

Musing on discrimination in private employment

Title VII of the Civil Rights Act of 1964 limits discrimination in private employment. At first glance it seems like a reasonable blanket law with no problems, but consider Hollywood: how does it get around the fact that it must discriminate in hiring actors to fit certain roles? Then consider a more subtle case: an ethnic restaurant looking for a chef or a front-desk worker.

The second issue is that, if you want, you can always structure your hiring criteria to statistically select for the group you want without violating the letter of the law. In other words, you can find proxies: language ability for national origin, perhaps, or experience for age, and so forth. Unless all forms of discrimination are banned (including things like height, weight, image, etc.; and why not?), there will always be correlated variables.

It seems that without genuine bottom-up cooperation, such laws won’t amount to much unless they become unreasonably draconian.