Info Theory and DNA/Evolution

We’re reviewing the IEEE papers.

After reviewing the first batch of papers, we came up with some questions to answer in the future, in rough order of difficulty/openness:

1. Given the robustness of a code (e.g. due to a many-to-one codon->AA mapping), can we calculate bounds on the channel capacity of such a code? How does the empirical R (information transmission rate) of the codon->AA code compare with the empirical C (channel capacity, e.g. estimated from mutation rates)? (A toy version of this comparison is sketched in the first snippet after this list.)

2. How do the length/entropy/complexity of the code bits (e.g. parity bits, the non-message bits used for error correction) relate to the difficulty of the error-correcting task, and to the entropy and length of the data bits (the actual message being sent), given the requirement R ≤ C? (The second snippet after this list works this out for a familiar textbook code.)

3. Is the von Neumann entropy,
[tex]H = -{\rm Tr\,}(\rho\log\rho) = -\sum_i \lambda_i\log\lambda_i,[/tex]
where {λ_i} are the eigenvalues of the density matrix ρ, useful for discussing network robustness? (There’s a paper where they use Kolmogorov-Sinai/Shannon entropy to do this, which Blake has somewhere…) If so, can we apply this to a genetic-regulatory network, and tie the error-correcting or homeostatic abilities of such a network to the VNE or other network metrics? (The third snippet below computes the VNE for two toy graphs.)
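
For question 1, here is a minimal back-of-the-envelope sketch in Python. It treats single-base substitution as a 4-ary symmetric channel with an illustrative per-base error probability p (a placeholder, not a measured mutation rate), and uses the entropy of the amino-acid distribution under uniform codon usage as a crude stand-in for R; both of those modeling choices are our assumptions, not something taken from the papers.

```python
import math

def base_channel_capacity(p, m=4):
    """Capacity (bits per base) of an m-ary symmetric substitution channel:
    a base survives with probability 1-p, or is replaced by one of the
    other m-1 bases with probability p/(m-1)."""
    if p == 0:
        return math.log2(m)
    h_noise = -(1 - p) * math.log2(1 - p) - p * math.log2(p / (m - 1))
    return math.log2(m) - h_noise

# Degeneracy of the standard genetic code: {codons per amino acid: how many
# amino acids have that degeneracy}; the three stop codons are added separately.
degeneracy = {6: 3, 4: 5, 3: 1, 2: 9, 1: 2}
codon_counts = [n for n, k in degeneracy.items() for _ in range(k)] + [3]  # + stop
assert sum(codon_counts) == 64

# Entropy of the translated symbol (amino acid or stop) per codon, assuming
# uniform codon usage (a crude proxy for the empirical rate R of the code).
R = -sum((n / 64) * math.log2(n / 64) for n in codon_counts)

p = 1e-3  # illustrative per-base substitution probability (placeholder value)
C = 3 * base_channel_capacity(p)  # capacity per codon = 3 bases
print(f"R ~ {R:.2f} bits/codon   C ~ {C:.2f} bits/codon   R <= C: {R <= C}")
```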
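
For question 2, the cleanest toy case is the (7,4) Hamming code: its 3 parity bits are exactly enough to index the 8 possible syndromes (no error, or a single-bit error in any of the 7 positions), and the resulting rate R = 4/7 can be checked against the binary symmetric channel capacity C = 1 - h2(p). This is standard textbook material, nothing assumed from our papers:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# (7,4) Hamming code: 4 data bits + 3 parity bits per block.  The 3 parity
# bits distinguish 2**3 = 8 syndromes, i.e. "no error" plus a single-bit
# error in any of the 7 positions, which is the task the code is built for.
R = 4 / 7

for p in (0.01, 0.05, 0.11, 0.15):
    C = 1 - h2(p)  # BSC capacity at crossover probability p
    print(f"p = {p:.2f}   C = {C:.3f}   R = {R:.3f}   R <= C: {R <= C}")
```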
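
For question 3, one common recipe in the network literature is to turn the graph Laplacian into a density matrix via ρ = L / Tr(L) and then apply the formula above; that particular construction is our assumption here, not something taken from the Demetrius and Manke paper. A minimal sketch comparing two toy 6-node topologies:

```python
import numpy as np

def von_neumann_entropy(adj):
    """H = -sum_i lambda_i log2 lambda_i, with rho = L / Tr(L) and L = D - A
    (the combinatorial Laplacian), so rho is positive semidefinite, trace 1."""
    adj = np.asarray(adj, dtype=float)
    L = np.diag(adj.sum(axis=1)) - adj
    rho = L / np.trace(L)
    lam = np.linalg.eigvalsh(rho)   # eigenvalues of the density matrix
    lam = lam[lam > 1e-12]          # convention: 0 log 0 = 0
    return float(-np.sum(lam * np.log2(lam)))

# Two toy 6-node networks: a hub-and-spoke (star) and a ring.
star = np.zeros((6, 6))
star[0, 1:] = star[1:, 0] = 1
ring = np.roll(np.eye(6), 1, axis=1) + np.roll(np.eye(6), -1, axis=1)

print("VNE(star):", round(von_neumann_entropy(star), 3))
print("VNE(ring):", round(von_neumann_entropy(ring), 3))
```

On these toys the ring comes out with higher VNE than the hub-and-spoke star; whether that kind of spectral spread actually tracks robustness in a gene-regulatory network is exactly what we would want to check against the entropic principle in the paper below.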

Next week:

We meet Monday at BU for group theory. Ben will be discussing SU(2) and SO(3) from Artin. Friday, Blake will present on the symmetry-breaking-in-genetics paper, and possibly on the information-theoretic considerations for BLAST.

BLAKE SEZ: I believe the paper which caught my eye was L. Demetrius and T. Manke (15 February 2005), “Robustness and network evolution — an entropic principle”, Physica A 346 (3–4): 682–696.