And the rest of it is that you're arguing in favor of a science fiction explanation for the capabilities of the WIV lab.
Using this figure, we could put some parameters around how much time such a project must have taken and its latest possible start date (assuming it evolved from known or closely-related-to-known viruses).
Using this figure, and a theoretical timetable, we can also specify how close a cousin virus we would need for such a process to develop SARS-CoV-2.
This could give us a more specific understanding of the feasibility of such research.
For example, RaTG13, the closest known virus, was discovered in 2013. Does the rate of serial passage allow evolution from RaTG13 over ~7 years? If not, this gives us a factual argument: even if gain-of-function research had been applied to RaTG13 from the moment of its discovery, it could not have been developed into SARS-CoV-2. (I don't know how to evaluate the actual accuracy of this, but it's provided as an example of how knowing the serial passage rate would be helpful.)
In the lab you'll get a handful of mutations, not thousands.
After serial passage through some large fraction of a billion humans, with large evolutionary pressures due to the recent species jump, the delta variant is still well over 99% homologous to the reference Wu-1 strain.
A 96.1% similarity (~3.9% difference) would require serial passage through billions of organisms; measured in terms of years of evolution in nature, that difference is on the order of 30-40 years.
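The 30-40 year figure is consistent with a back-of-envelope molecular-clock calculation. A sketch in Python, where the substitution rate (~30 substitutions per lineage per year) is an assumed round number for illustration, not a measured value:

```python
# Back-of-envelope: years of natural evolution needed to accumulate the
# observed RaTG13 vs Wu-1 divergence. The clock rate is an assumption.

GENOME_LEN = 29_903        # SARS-CoV-2 Wu-1 reference genome length (nt)
IDENTITY = 0.961           # RaTG13 is ~96.1% identical to Wu-1
SUBS_PER_YEAR = 30         # assumed clock rate (~1e-3 subs/site/year)

substitutions = GENOME_LEN * (1 - IDENTITY)
years = substitutions / SUBS_PER_YEAR
print(f"~{substitutions:.0f} substitutions, ~{years:.0f} years")
# prints "~1166 substitutions, ~39 years"
```

Change the assumed rate and the answer scales accordingly, but any plausible rate lands in the multi-decade range.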
* If a wild strain is too far away from covid, then it would have taken too long to do the passage.
* If a wild strain is almost identical to covid, then obviously covid derived from that without seeing the interior of a lab.
Under what circumstances would you consider that serial passage work done in a lab might have had something to do with covid?
Wouldn’t this count as parallel passage though? Sure there’s more variability and evolutionary pressure on the mutations, but the speed of evolution (the number of nucleotide mutations along any one lineage) is the same over the same period of time regardless of how many billions of people it infects in parallel.
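The depth-vs-breadth point can be put as a toy model (the per-transmission rate is a made-up illustrative number): divergence along any single line of descent depends on how many serial generations deep the chain is, not on how many chains run in parallel.

```python
# Toy model: per-lineage divergence scales with chain depth (serial
# generations), not with how many infections happen in parallel.

SUBS_PER_GENERATION = 0.5  # assumed average substitutions per transmission

def lineage_divergence(depth: int) -> float:
    """Expected substitutions along one chain of `depth` transmissions."""
    return depth * SUBS_PER_GENERATION

# A billion infections spread across parallel chains of depth 60 diverge
# no faster along any one lineage than a single chain of depth 60:
print(lineage_divergence(60))  # 30.0 either way
```

Parallelism widens the search (more variants to select among) but does not deepen any single chain faster.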
Look how many infections were needed to go from Covid Classic to the Delta Variant. What's that, a billion?
A comment I read from someone who ought to know what they're talking about pointed out that gain-of-function work in a lab produces viruses that are good at infecting cells in petri dishes, not ones that are good at infecting vertebrates with full-fledged immune systems trying to kill them.
The rate of evolution is a function of the mutation rate and the effective population size. There is no reason to believe that a lab setting, with highly parallel evolution on a very large, diverse population, will be as slow as passage through hosts in nature. The application of mutagens, and also the lack of selection by a host immune system can support much higher rates of change. And in coronaviruses, recombination is also very frequent, and this could easily give rise to multiple % levels of divergence in a single step. Finally, it's trivial to synthesize a genome of this size, and also to synthesize pools of related viruses based on common backgrounds.
Could you give the page # for this?
The 30-40 year figure assumes the related virus is a direct ancestor and it stayed within the same species, which is quite a big if. It's useful as a metric within a single population, but not exactly evidence hard enough to play genetic detective.
If they just share an ancestor, that time is basically halved toward the most recent common ancestor, which puts it back somewhere in the mid 2000s. When evolving in parallel, within different species, the divergence grows really quickly. Also, when viruses jump species the mutation rate skyrockets at the beginning[0][1] to adapt to the novel host, which could easily account for most of the difference between RaTG13 and Wu-1 anyway.